This archive contains answers to questions sent to Unidata support through mid-2025. Note that the archive is no longer being updated. We provide the archive for reference; many of the answers presented here remain technically correct, even if somewhat outdated. For the most up-to-date information on the use of NSF Unidata software and data services, please consult the Software Documentation first.
> Hello,
>
> This looks like a bug in the NC_64BIT_OFFSET big variable handling.
>
> I have a test here that creates a 3000 * 1000 * 2000 float variable, 24 GB.
>
> After nc_enddef, the file size is suspiciously 4 GB.
>
> After writing the data, the size is 16.0 GB (17,184,000,120 bytes)
> instead of 24 GB.
>
> Reading fails to produce expected results at strange offsets 0, 516, 704.
>
> I attached my version of netcdf/nc_test/nc_test.c; it has a new function,
> test_big_cube_without_unlimited_dim(). It should be easy to copy that over
> and run it.
>
> It runs very slowly, since it creates 24 GB of dummy data.
>
> Kari

Howdy Kari!

It looks like you found a bug of some kind. I have reproduced your problem in the netCDF build with a new test, libsrc4/tst_large2.c. Now we will work on a fix. I will let you know when we have it.

Meanwhile, as a workaround, I note that the NC_NETCDF4 format works fine for your test program. If you are going to use netCDF-4, make sure you get the HDF5-1.8.4-patch1 release and the recent 4.1.1 release of netCDF.

Thanks,
Ed

Ticket Details
===================
Ticket ID: DFM-455316
Department: Support netCDF
Priority: Normal
Status: Open
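
[Archive editor's note: for reference, below is a minimal sketch of the NC_NETCDF4 workaround Ed suggests, written against the standard netCDF C API. Only the 3000 x 1000 x 2000 float variable and the NC_NETCDF4 create mode come from the exchange above; the file name, dimension and variable names, CHECK macro, and the slab-by-slab write loop are illustrative assumptions, not Kari's attached test or the libsrc4/tst_large2.c test Ed mentions.]

    /* Sketch: write a 24 GB variable using NC_NETCDF4 instead of
     * NC_64BIT_OFFSET, sidestepping the size bug reported above.
     * All names here are hypothetical, chosen for illustration. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <netcdf.h>

    #define FILE_NAME "big_cube.nc"
    #define NX 3000
    #define NY 1000
    #define NZ 2000

    /* Bail out with the netCDF error message on any failure. */
    #define CHECK(e) do { int r = (e); if (r) { \
        fprintf(stderr, "%s\n", nc_strerror(r)); exit(1); } } while (0)

    int main(void)
    {
        int ncid, varid, dimids[3];
        size_t start[3] = {0, 0, 0};
        size_t count[3] = {1, NY, NZ};   /* one x-slab per write */
        float *slab;
        size_t i, j;

        if (!(slab = malloc((size_t)NY * NZ * sizeof(float))))
            return 1;

        /* NC_NETCDF4 is the workaround; NC_64BIT_OFFSET triggers the bug. */
        CHECK(nc_create(FILE_NAME, NC_NETCDF4 | NC_CLOBBER, &ncid));
        CHECK(nc_def_dim(ncid, "x", NX, &dimids[0]));
        CHECK(nc_def_dim(ncid, "y", NY, &dimids[1]));
        CHECK(nc_def_dim(ncid, "z", NZ, &dimids[2]));
        CHECK(nc_def_var(ncid, "data", NC_FLOAT, 3, dimids, &varid));
        CHECK(nc_enddef(ncid));

        /* Write the 24 GB of dummy data one 8 MB slab at a time,
         * keeping memory use modest. Like Kari's test, this is slow. */
        for (i = 0; i < NX; i++) {
            for (j = 0; j < (size_t)NY * NZ; j++)
                slab[j] = (float)i;
            start[0] = i;
            CHECK(nc_put_vara_float(ncid, varid, start, count, slab));
        }

        CHECK(nc_close(ncid));
        free(slab);
        return 0;
    }

After nc_enddef and a full write, the file should be roughly 24 GB on disk (plus HDF5 metadata overhead), and reads should return the slab values written, rather than failing at the odd offsets seen with the 64-bit-offset format.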