Ben,
> I had sent a message to the netcdf-group about a crash we were seeing
> with versions of netcdf past 4.2.1.
>
> The original title of the message I sent was "unlimited dimension and
> chunking breaking in 4.3.1.1"
>
> Russ suggested I send the output of ncdump -sh on a file we were able
> to create with 4.2.1, as that might be helpful.
>
> In any case, I am attaching that output.
Thanks for the ncdump output. In starting to try to reproduce the crash
you're encountering, it looks like you are dealing with truly huge files.
From your netcdfgroup posting, it sounds like your time dimension has
1048576 values.
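
(For reference, a minimal C sketch of the kind of file under discussion:
a netCDF-4 file with an unlimited time dimension and a chunked 4D float
variable. The dimension lengths match the CDL quoted below; the file
name "test.nc" and the chunk sizes are illustrative assumptions, since
the actual chunking is only in the attached ncdump -sh output. This only
defines the variable; it does not write any data.)

    #include <stdio.h>
    #include <stdlib.h>
    #include <netcdf.h>

    /* Abort with a message on any netCDF error. */
    #define CHECK(e) do { int s = (e); if (s != NC_NOERR) { \
        fprintf(stderr, "%s\n", nc_strerror(s)); exit(1); } } while (0)

    int main(void)
    {
        int ncid, dimids[4], varid;
        /* Illustrative chunk sizes: one record per chunk. */
        size_t chunks[4] = {1, 48, 91, 180};

        CHECK(nc_create("test.nc", NC_CLOBBER | NC_NETCDF4, &ncid));
        CHECK(nc_def_dim(ncid, "time", NC_UNLIMITED, &dimids[0]));
        CHECK(nc_def_dim(ncid, "lev", 48, &dimids[1]));
        CHECK(nc_def_dim(ncid, "lat", 91, &dimids[2]));
        CHECK(nc_def_dim(ncid, "lon", 180, &dimids[3]));
        CHECK(nc_def_var(ncid, "H", NC_FLOAT, 4, dimids, &varid));
        CHECK(nc_def_var_chunking(ncid, varid, NC_CHUNKED, chunks));
        CHECK(nc_enddef(ncid));
        CHECK(nc_close(ncid));
        return 0;
    }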
In that case, I compute the size of the H variable
float H(time, lev, lat, lon) ;
to be 1048576*48*91*180*4 bytes, which is about 3.3 TB. And your file has
10 such 4D variables, so the file size is about 33 TB. Does the crash
also occur with a smaller number of time steps, or do you see it only
when there are on the order of a million?
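
(To spell out that arithmetic, a short C sketch using the dimension
lengths above and assuming 4-byte floats with no compression:)

    #include <stdio.h>

    int main(void)
    {
        /* Dimension lengths from the posting and ncdump output. */
        unsigned long long ntime = 1048576, nlev = 48, nlat = 91, nlon = 180;
        unsigned long long bytes = ntime * nlev * nlat * nlon * 4;  /* 4-byte floats */

        printf("one 4D variable:  %.1f TB\n", bytes / 1e12);       /* ~3.3 TB */
        printf("ten 4D variables: %.1f TB\n", 10 * bytes / 1e12);  /* ~33 TB */
        return 0;
    }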
--Russ
Russ Rew UCAR Unidata Program
address@hidden http://www.unidata.ucar.edu
Ticket Details
===================
Ticket ID: LBV-326209
Department: Support netCDF
Priority: Normal
Status: Closed