This archive contains answers to questions sent to Unidata support through mid-2025. Note that the archive is no longer being updated. We provide the archive for reference; many of the answers presented here remain technically correct, even if somewhat outdated. For the most up-to-date information on the use of NSF Unidata software and data services, please consult the Software Documentation first.
> Dear All,
>
> I would like to draw your attention to a problem that occurred while using
> the netCDF 4.1.1 library (with HDF5 1.8.4_patch1) to write netCDF-4 files.
> I have tried to sketch the problem in the C++ test program attached to this
> e-mail: the test creates a netCDF file (test_file.nc), defines an unlimited
> dimension "rec" and 50 variables of type double, and then tries to write
> 10^5 double values to each variable, one at a time, using the
> nc_put_var1_double() function. Monitoring the memory used by the test, you
> find that memory consumption steadily increases until the program crashes
> at about 3 GB of memory used (roughly 37% of the available memory on my
> machine). Setting and tuning the variables' chunking seems to be
> irrelevant: the behavior does not change. Yet the problem disappears if the
> dimension "rec" is not declared unlimited: setting the "rec" dimension size
> to 10^5, a netCDF file of about 40 MB (10^5 x 50 x 8 bytes) is generated
> correctly. It seems that a memory leak occurs when using unlimited
> dimensions and writing values one by one. Do you already know about this
> problem?
>
> I'm looking forward to hearing from you soon.
> Thanks in advance.
>
> Best regards,
> Marica Antonacci

Howdy Marica!

I believe this is fixed in the daily snapshot release:

ftp://ftp.unidata.ucar.edu/pub/netcdf/snapshot/netcdf-4-daily.tar.gz

Can you get it and try it?

Thanks,
Ed

Ticket Details
===================
Ticket ID: FSP-229232
Department: Support netCDF
Priority: Normal
Status: Closed
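The reporter's attached test program is not included in the archive. The following is a minimal sketch of the scenario described in the report, written against the netCDF C API (the functions named in the e-mail are C API calls): one unlimited dimension "rec", 50 double variables, and 10^5 values written to each variable one call at a time with nc_put_var1_double(). The variable names var_0 .. var_49 and the written values are invented here for illustration; the original program may have differed in those details.

    /* Sketch of the reported scenario: unlimited dimension, 50 double
     * variables, one-value-at-a-time writes with nc_put_var1_double(). */
    #include <stdio.h>
    #include <stdlib.h>
    #include <netcdf.h>

    #define NVARS 50
    #define NRECS 100000           /* 10^5 records per variable */

    static void check(int status)
    {
        if (status != NC_NOERR) {
            fprintf(stderr, "netCDF error: %s\n", nc_strerror(status));
            exit(EXIT_FAILURE);
        }
    }

    int main(void)
    {
        int ncid, dimid, varids[NVARS];
        char name[NC_MAX_NAME + 1];
        size_t index[1];
        double value;
        size_t rec;
        int v;

        /* Create a netCDF-4 (HDF5-based) file, as in the report. */
        check(nc_create("test_file.nc", NC_CLOBBER | NC_NETCDF4, &ncid));

        /* Define the unlimited dimension "rec" and 50 double variables. */
        check(nc_def_dim(ncid, "rec", NC_UNLIMITED, &dimid));
        for (v = 0; v < NVARS; v++) {
            snprintf(name, sizeof(name), "var_%d", v);   /* hypothetical names */
            check(nc_def_var(ncid, name, NC_DOUBLE, 1, &dimid, &varids[v]));
        }
        check(nc_enddef(ncid));

        /* Write 10^5 values to each variable, one value per call. */
        for (v = 0; v < NVARS; v++) {
            for (rec = 0; rec < NRECS; rec++) {
                index[0] = rec;
                value = (double)rec;
                check(nc_put_var1_double(ncid, varids[v], index, &value));
            }
        }

        check(nc_close(ncid));
        return 0;
    }

With a library affected by the bug, the resident memory of this program would be expected to grow steadily during the write loop; replacing NC_UNLIMITED with NRECS in the nc_def_dim() call reproduces the fixed-dimension case the reporter says behaves correctly.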