- Subject: [netCDF #JGD-817199]: Issue with netcdf classic format
- Date: Wed, 06 Jun 2007 16:30:38 -0600
Hi Richard,
When I run your CDL through ncgen to try to generate a netCDF file, I
get the following error message:
$ ncgen -b x.cdl
ncgen: x.cdl line 209: duplicate attribute
Segmentation Fault(coredump)
The segmentation fault indicates a bug in ncgen, but the message about
a duplicate attribute appears correct, since the line

:Version = "$Name$" ;

appears twice: once immediately after the declaration

double TS3(time, lat, lon) ;

and again later in the file. Both occurrences try to define the global
attribute "Version". When I remove the first, spurious "Version"
definition, I see what you are seeing:
$ ncgen -b x.cdl
ncgen: One or more variable sizes violate format constraints
Adding up the bytes that one record's worth of data takes for each
record variable, I get a running total of 2470182980 bytes after the
TKE declaration:

double TKE(time, ilev, lat, lon) ;

so any record variable declared after this one will make the total
per-record size of all but the last record variable exceed 2**31
bytes. As it says in the FAQ "What does Large File Support have to do
with netCDF?":
... The 32-bit file offset in the classic format limits the total
sizes of all but the last non-record variables in a file to less
than 2 GiB.
That's the constraint that your data violates. Even if you move the
TKE variable to the end of the record variables, the other record
variables together still violate the constraint above (see the sketch
below).
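
To make the arithmetic concrete, here is a minimal C sketch of the
check ncgen is doing. The dimension sizes are hypothetical
placeholders, not taken from your CDL; substitute your own values and
add one term per record variable:

#include <stdio.h>

int main(void) {
    /* Hypothetical dimension sizes -- replace with the values from
       your CDL. */
    long long ilev = 27, lat = 361, lon = 720;

    /* Bytes of one record of each record variable, in declaration
       order; sizeof(double) is 8. */
    long long per_record_total = 0;
    per_record_total += 8 * ilev * lat * lon;  /* TKE(time, ilev, lat, lon) */
    /* ... one term for each remaining record variable ... */

    /* Classic format: every record variable except the last must
       start within 2**31 bytes of the start of the record. */
    long long limit = 1LL << 31;
    printf("per-record total: %lld bytes (%s the 2 GiB limit)\n",
           per_record_total,
           per_record_total < limit ? "under" : "over");
    return 0;
}

With the totals in your file, any variable declared after TKE starts
past the 2470182980-byte mark, which is what triggers the error.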
If you made some of the variables float instead of double, I think
you could easily get under these classic format limitations: a float
takes 4 bytes rather than 8, so converting the record variables would
roughly halve that 2470182980-byte total, bringing it well under
2**31.
Alternatively, you could use the 64-bit offset format variant or the
netCDF-4 beta release for this data.
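
In case it helps, here is a minimal sketch of creating a 64-bit
offset file from C (the file name here is made up, and error handling
is abbreviated); recent versions of ncgen can also select this format
directly, so check your ncgen man page for the relevant option:

#include <stdio.h>
#include <netcdf.h>

int main(void) {
    int ncid, status;

    /* NC_64BIT_OFFSET selects the version-2 (64-bit offset) format,
       which relaxes the 2 GiB limits of the classic format. */
    status = nc_create("bigfile.nc", NC_CLOBBER | NC_64BIT_OFFSET, &ncid);
    if (status != NC_NOERR) {
        fprintf(stderr, "nc_create: %s\n", nc_strerror(status));
        return 1;
    }

    /* ... define dimensions, variables, and attributes as usual ... */

    nc_close(ncid);
    return 0;
}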
--Russ
Russ Rew UCAR Unidata Program
address@hidden http://www.unidata.ucar.edu
Ticket Details
===================
Ticket ID: JGD-817199
Department: Support netCDF
Priority: Normal
Status: Closed