This archive contains answers to questions sent to Unidata support through mid-2025. Note that the archive is no longer being updated. We provide the archive for reference; many of the answers presented here remain technically correct, even if somewhat outdated. For the most up-to-date information on the use of NSF Unidata software and data services, please consult the Software Documentation first.
Hi Jeff,

> when accessing an openDAP dataset
>
>   ncdump -v prmslmsl http://nomads.ncep.noaa.gov:9090/dods/gens/gens20140123/gep_all_12z
>
> This works fine with netcdf 4.3.0, but segfaults with 4.3.1 (and git
> master).

I can duplicate the symptoms, so it looks like a bug in ncdump, related to the apparent file size (over 140 GB). nccopy can't be used on it either, as it reports:

  NetCDF: One or more variable sizes violate format constraints
  Location: file ../../ncdump/nccopy.c; line 1437

which would be true if it were a netCDF classic format file. I assume the original data on the DAP server is a netCDF-4 classic model file.

Thanks for reporting the bug. Will the file stay at the specified URL for a while, until we've determined the cause of the problem? If not, we'll have to get a copy or synthesize another large file that demonstrates the problem.

--Russ

Russ Rew                                       UCAR Unidata Program
address@hidden                                 http://www.unidata.ucar.edu

Ticket Details
===================
Ticket ID: ZQX-155900
Department: Support netCDF
Priority: Normal
Status: Closed
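For readers puzzled by the nccopy error: the netCDF classic format (CDF-1) caps each fixed-size variable at roughly 2 GiB, so a dataset whose variables appear far larger (as with this ~140 GB aggregated DAP dataset) cannot be copied to classic format. The sketch below is an illustration of that size check, not the actual libnetcdf code; the function name and the example dimensions are hypothetical.

```python
# Hedged sketch of the classic-format size constraint that nccopy enforces.
# CLASSIC_MAX approximates the CDF-1 per-variable limit (about 2 GiB);
# the real library rules are more nuanced (record variables, 64-bit-offset
# CDF-2 files, etc.).
CLASSIC_MAX = 2**31 - 4  # bytes, approximate CDF-1 limit per variable

def fits_classic(shape, elem_size):
    """Return True if a variable with the given shape and element size
    (in bytes) would fit within the approximate classic-format limit."""
    size = elem_size
    for dim in shape:
        size *= dim
    return size <= CLASSIC_MAX

# A single 2-D global field easily fits:
print(fits_classic((181, 360), 4))            # small lat/lon grid of floats

# A hypothetical ensemble variable (members x times x levels x lat x lon)
# is far over the limit, which would trigger the nccopy error above:
print(fits_classic((21, 65, 26, 181, 360), 4))
```

This is also why the reply distinguishes formats: a netCDF-4 classic *model* file (HDF5 storage with classic-style metadata) has no such per-variable size limit, so the data can exist on the server even though a classic-format copy is impossible.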