This archive contains answers to questions sent to Unidata support through mid-2025. Note that the archive is no longer being updated. We provide the archive for reference; many of the answers presented here remain technically correct, even if somewhat outdated. For the most up-to-date information on the use of NSF Unidata software and data services, please consult the Software Documentation first.
>To: address@hidden
>cc: Bjarne Büchmann <address@hidden>,
>cc: Peter Gylling Jørgensen <address@hidden>
>From: Karsten Bolding <address@hidden>
>Subject: restoring headers in large netcdf files
>Organization: Bolding & Burchard Hydrodynamics
>Keywords: 200502190949.j1J9ncv2029855 netCDF large file header restore

Karsten,

I wrote:

> I've just tried this to produce a CDF1 file that's bigger than 2 GiB
> for you to test with NCO.  Put this in "large.cdl":
>
>     netcdf large {
>     dimensions:
>         x = 1024;
>         y = 1024;
>         z = 1024;
>         time = UNLIMITED;
>     variables:
>         byte var(time, x, y, z);    // 1 GiB per record
>         int time(time);
>     data:
>         time = 1, 2, 3;
>     }
>
> Then run
>
>     ncgen -b large.cdl; ls -l large.nc
>
> For me this took a little over 3 minutes to produce a file over 3 GiB
> in size:
>
>     -rw-r--r--  1 russ  ustaff  3221225648 Feb 22 16:47 large.nc
>
> Now this is a CDF1 (classic) netCDF file, so even older versions of
> NCO should be able to handle it, as long as NCO is built with large
> file system support.  If you attempt the NCO operation
>
>     ncks -v time large.nc
>
> I would think it should work, if NCO is built with large file support.

Having just built NCO 2.9.9 with large file support (I used
--enable-largefiles as a configure argument when building, but that may
be the default), I can now verify that ncks built this way does work
fine on the above 3 GiB netCDF classic format (CDF1) file:

  $ ncks -v time large.nc time.nc
  $ ncdump time.nc
  netcdf time {
  dimensions:
          time = UNLIMITED ; // (3 currently)
  variables:
          int time(time) ;

  // global attributes:
                  :history = "Wed Mar  2 14:58:55 2005: ncks -v time large.nc time.nc" ;
  data:

   time = 1, 2, 3 ;
  }

If the ncks you have works the same way on the same file generated by
ncgen, then I think it should work fine on the large CDF1 files you
generated with the netCDF 3.6.0 library.  If ncks fails on this file,
then you need to upgrade your NCO installation.

I also generated a 64-bit offset (CDF2) netCDF file from the above
large.cdl file, using ncgen:

  $ ncgen -v 2 -b large.cdl; ls -l large.nc
  -rw-r--r--  1 russ  ustaff  3221225656 Mar  2 15:08 large.nc

and tested the same ncks, to make sure it would work with both CDF1 and
CDF2 files.  It worked identically, so for ncks (and any other
application compiled with netCDF 3.6), it shouldn't matter whether you
have CDF1 or CDF2 files.  But I would recommend just using CDF1 files
for as long as you can, until applications are upgraded to version 3.6.

--Russ

_____________________________________________________________________
Russ Rew                                            UCAR Unidata Program
address@hidden                      http://www.unidata.ucar.edu/staff/russ
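
[Editor's note: the following example is not part of the original exchange.
It is a minimal C sketch of what an application linked against a netCDF
3.6 (or later) C library does when it reads the "time" coordinate variable
from the large.nc file built above, roughly the operation that
"ncks -v time large.nc" exercises.  The file name and program name are
assumptions for illustration; the netCDF calls themselves are the standard
C API.]

  /* check_time.c -- hedged sketch: read the "time" variable from a
   * large (>2 GiB) classic or 64-bit offset netCDF file.  Assumes a
   * netCDF library built with large file support; compile with
   * something like:  cc check_time.c -o check_time -lnetcdf
   */
  #include <stdio.h>
  #include <stdlib.h>
  #include <netcdf.h>

  static void check(int status, const char *msg)
  {
      if (status != NC_NOERR) {
          fprintf(stderr, "%s: %s\n", msg, nc_strerror(status));
          exit(1);
      }
  }

  int main(void)
  {
      int ncid, varid, dimid;
      size_t nrecs, i;
      int *vals;

      check(nc_open("large.nc", NC_NOWRITE, &ncid), "nc_open");
      check(nc_inq_dimid(ncid, "time", &dimid), "nc_inq_dimid");
      check(nc_inq_dimlen(ncid, dimid, &nrecs), "nc_inq_dimlen");
      check(nc_inq_varid(ncid, "time", &varid), "nc_inq_varid");

      vals = (int *) malloc(nrecs * sizeof(int));
      check(nc_get_var_int(ncid, varid, vals), "nc_get_var_int");
      for (i = 0; i < nrecs; i++)
          printf("time[%lu] = %d\n", (unsigned long) i, vals[i]);

      free(vals);
      check(nc_close(ncid), "nc_close");
      return 0;
  }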
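
[Editor's note: a second hedged sketch, also not from the original
exchange, showing how to tell whether a given file is CDF1 (classic) or
CDF2 (64-bit offset) without the netCDF library.  A classic-format file
begins with the four bytes 'C' 'D' 'F' \001, a 64-bit offset file with
'C' 'D' 'F' \002; the program name is an assumption for illustration.]

  /* format_check.c -- print whether file.nc is CDF1 or CDF2,
   * based only on the 4-byte magic number at the start of the file. */
  #include <stdio.h>

  int main(int argc, char **argv)
  {
      unsigned char magic[4];
      FILE *fp;

      if (argc != 2 || (fp = fopen(argv[1], "rb")) == NULL) {
          fprintf(stderr, "usage: format_check file.nc\n");
          return 1;
      }
      if (fread(magic, 1, 4, fp) == 4 &&
          magic[0] == 'C' && magic[1] == 'D' && magic[2] == 'F') {
          if (magic[3] == 1)
              printf("%s: CDF1 (classic format)\n", argv[1]);
          else if (magic[3] == 2)
              printf("%s: CDF2 (64-bit offset format)\n", argv[1]);
          else
              printf("%s: CDF magic, unknown version byte %d\n",
                     argv[1], magic[3]);
      } else {
          printf("%s: not a netCDF classic-family file\n", argv[1]);
      }
      fclose(fp);
      return 0;
  }

The same check can be done by hand by dumping the first four bytes of the
file, for example with "od -c -N 4 large.nc".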