This archive contains answers to questions sent to Unidata support through mid-2025. Note that the archive is no longer being updated. We provide the archive for reference; many of the answers presented here remain technically correct, even if somewhat outdated. For the most up-to-date information on the use of NSF Unidata software and data services, please consult the Software Documentation first.
>To: address@hidden
>From: James Garnett <address@hidden>
>Subject: Re: slow write on large files
>Organization: Raytheon
>Keywords: 200406092031.i59KVktK012876 performance

Hi Jim,

> I'm using NetCDF to stream video to disk in real time. I'm doing this on
> a PC running Windows, using the Windows C implementation of netCDF with
> Visual C++ 6.0.
>
> The data is coming in at about 7.5 MB/second. I'm using successive calls
> to nc_put_vara_short() to write each frame of video data.

How much data, in bytes, is in a frame of video data as you are
representing it? It might help to see the output of "ncdump -h" on your
file, to see how it's structured.

> When my file size is small (1000 frames of data or less), I have no
> problem. When my file size is much larger (1500 frames or more), my
> application runs horribly. It appears that the calls to
> nc_put_vara_short() are not keeping up with the incoming data. Yes, I'm
> buffering in RAM, but my buffer is limited, and eventually I overflow
> it. The real bugger is that the problems start occurring very early in
> a large file; it's not as if it's fine for the first 1000 frames and
> then starts lagging. I'm seeing problems very early on in a large file.
>
> Is there something about the way that nc_put_vara_short() is coded that
> makes it slow down based on the TOTAL SIZE of the file (or just the
> variable portion of the file)?

No, as far as I know there is nothing like that in the netCDF library.
However, to help diagnose the problem, it would be useful to have more
information:

- Are you getting close to your disk capacity with 1500 frames?

- Are you using an nc_set_fill() call with NC_NOFILL mode to prevent the
  library from initializing a variable with fill values before you write
  to it? (See the sketch at the end of this message.)

- Are your frames using the unlimited dimension, or is the total
  collection of frames a variable of fixed size (that is, is there a
  dimension that's the number of frames, and it's not the unlimited
  dimension)?

- Do you have a relatively small example that demonstrates the problem,
  perhaps with dummy data, so we can try to reproduce it here?

> When I write the data out to disk, using just CFile::Write() instead of
> using the NetCDF library, I have no problems.

Does that kind of write create a significantly smaller file?
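To make the second and third questions above concrete, here is a minimal
sketch of the write pattern I have in mind. It is not taken from your
code: the file name "video.nc", the 640x480 frame size, and the names
"frame", "row", "col", and "video" are all made-up placeholders.

    /* Sketch only: made-up names and sizes, adjust to your real schema */
    #include <stdio.h>
    #include <stdlib.h>
    #include <netcdf.h>

    #define NROWS 480   /* hypothetical frame height */
    #define NCOLS 640   /* hypothetical frame width */

    static void check(int status) {
        if (status != NC_NOERR) {
            fprintf(stderr, "netCDF error: %s\n", nc_strerror(status));
            exit(1);
        }
    }

    int main(void) {
        int ncid, frame_dim, row_dim, col_dim, video_var, old_mode;
        int dimids[3];
        /* static: zero-filled dummy data, kept off the stack */
        static short frame[NROWS][NCOLS];
        size_t start[3] = {0, 0, 0};
        size_t count[3] = {1, NROWS, NCOLS};  /* one frame per call */
        size_t f;

        check(nc_create("video.nc", NC_CLOBBER, &ncid));

        /* Make "frame" the unlimited (record) dimension, so the file
         * grows one record at a time as frames arrive. */
        check(nc_def_dim(ncid, "frame", NC_UNLIMITED, &frame_dim));
        check(nc_def_dim(ncid, "row", NROWS, &row_dim));
        check(nc_def_dim(ncid, "col", NCOLS, &col_dim));

        dimids[0] = frame_dim;
        dimids[1] = row_dim;
        dimids[2] = col_dim;
        check(nc_def_var(ncid, "video", NC_SHORT, 3, dimids, &video_var));

        /* NC_NOFILL: don't prefill each new record with fill values.
         * With the default fill mode, every record is effectively
         * written twice: once with fill values, once with your data. */
        check(nc_set_fill(ncid, NC_NOFILL, &old_mode));

        check(nc_enddef(ncid));

        /* Append 1500 frames, one nc_put_vara_short() call each */
        for (f = 0; f < 1500; f++) {
            start[0] = f;
            check(nc_put_vara_short(ncid, video_var, start, count,
                                    &frame[0][0]));
        }

        check(nc_close(ncid));
        return 0;
    }

If your file is structured like this, "ncdump -h video.nc" would print
something like:

    netcdf video {
    dimensions:
            frame = UNLIMITED ; // (1500 currently)
            row = 480 ;
            col = 640 ;
    variables:
            short video(frame, row, col) ;
    }

The point of the NC_NOFILL question is that with the default fill mode
the library writes fill values into each new record before your data
overwrites them, which roughly doubles the I/O for a write-once stream
like yours.

--Russ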