This archive contains answers to questions sent to Unidata support through mid-2025. Note that the archive is no longer being updated. We provide the archive for reference; many of the answers presented here remain technically correct, even if somewhat outdated. For the most up-to-date information on the use of NSF Unidata software and data services, please consult the Software Documentation first.
> Here is an example code and its output on Ubuntu 9.04. As you can
> see, the memory usage keeps growing. The versions of HDF5 and netCDF
> are 1.8.4 and 4.1-snapshot2010020200, respectively. I suspect either
> nc_close() does not clean up all the references to HDF5 objects or
> H5garbage_collect() is not doing a great job.
>
> -Huaiyu

Howdy Huaiyu!

Thanks to you and several others, I did a lot of work with HDF5 objects in netCDF-4, making sure that I closed them all. But there were a few activities that caused memory growth within the HDF5 library, even though I was closing everything correctly. In those cases I submitted bug reports to the HDF5 team, including sample programs that duplicate the problem. We believe all of the remaining problems are pretty minor.

However, I have just added code to call H5close() when all netCDF files are closed. This will prevent the steady consumption of memory by the netCDF/HDF5 libraries, unless some files are always kept open. (This change will be available in the snapshot release that is being generated now: ftp://ftp.unidata.ucar.edu/pub/netcdf/snapshot/netcdf-4-daily.tar.gz)

When the HDF5 team addresses the remaining known memory problems, they will no doubt do another release. At that time we can check whether we got all the leaks, or if there are still any left.

Thanks!

Ed

Ticket Details
===================
Ticket ID: RPP-420427
Department: Support netCDF
Priority: Critical
Status: Closed
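Huaiyu's original test program and its output were not preserved in this archive entry. For reference, a minimal sketch of the kind of open/close loop that exposed this leak pattern might look like the following; the file name, dimension sizes, and iteration count here are illustrative placeholders, not taken from the original report.

    #include <stdio.h>
    #include <netcdf.h>

    /* Sketch of a leak reproducer (not Huaiyu's original program):
     * repeatedly create and close a netCDF-4 file. Before the fix
     * described above, some HDF5-internal allocations were never
     * released, so resident memory grew with each iteration; with the
     * fix, closing the last open netCDF file triggers H5close(),
     * which frees them. */
    int main(void)
    {
        int i, ncid, dimid, varid;

        for (i = 0; i < 10000; i++) {
            /* NC_NETCDF4 selects the HDF5-based format that exhibited
             * the memory growth. */
            if (nc_create("leak_test.nc", NC_CLOBBER | NC_NETCDF4, &ncid))
                return 1;
            if (nc_def_dim(ncid, "x", 10, &dimid))
                return 1;
            if (nc_def_var(ncid, "v", NC_INT, 1, &dimid, &varid))
                return 1;
            /* After this close, no netCDF files remain open. */
            if (nc_close(ncid))
                return 1;
        }
        printf("done; watch the process's memory (e.g. with top) while this runs\n");
        return 0;
    }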