This archive contains answers to questions sent to Unidata support through mid-2025. Note that the archive is no longer being updated. We provide the archive for reference; many of the answers presented here remain technically correct, even if somewhat outdated. For the most up-to-date information on the use of NSF Unidata software and data services, please consult the Software Documentation first.
Hi Jim,

> On a whim [*], I added the nc_set_fill() call with NC_NOFILL, and it fixed
> my problem.
>
> This doesn't make sense to me. I'd think it would take longer to do the
> individual writes when the file isn't already allocated on the disc.
> Wouldn't you?

It turns out that initializing a variable with fill values doesn't just allocate disk blocks; it also writes the default fill value for that variable's type into every element of the variable, which takes time. In no-fill mode, neither happens: the disk blocks aren't allocated and the fill values aren't written. Allocating disk blocks is cheap compared to writing to them, so skipping the allocation saves little, but skipping the writes saves a lot of time.

The disadvantage of no-fill mode is that if you later inadvertently read data that was never written, you won't be able to detect it. So if you are going to use no-fill mode, you should make sure you actually write all the values.

> [*] When you asked about nc_set_fill(), I went and looked it up (I wasn't
> aware of this call previously). I decided the answer to my problem was to
> create the file using nc_set_fill(), then close the file through netCDF.
> Then re-open the file as a CFile, and append my video data that way,
> since I could already do that with no problem. Just for giggles, I tried
> letting netCDF write the file, and it worked great, right up to the 2GB
> limit. Also this has the added benefit of speeding up the time it takes to
> initially create the file.

You probably don't have to close and reopen the file, either.

> dimensions:
>         5000 = 5000 ;
>         strFrameWidth = 256 ;
>         strFrameHeight = 256 ;
> variables:
>         short Images(5000, strFrameHeight, strFrameWidth) ;

I'm glad this solved the problem, but I'm still mystified by the original slowdown you observed.
When you first created the file, it should have initialized all 5000 frames with fill values, so the file would appear at its final size even before any real data was written to it. Subsequent writes should merely have overwritten the fill values, and there's no reason I can think of why writing 1500 frames would be a lot slower than writing 1000 frames.

--Russ