This archive contains answers to questions sent to Unidata support through mid-2025. Note that the archive is no longer being updated. We provide the archive for reference; many of the answers presented here remain technically correct, even if somewhat outdated. For the most up-to-date information on the use of NSF Unidata software and data services, please consult the Software Documentation first.
> Organization: Atmospheric Environment Service
> Keywords: 199407292015.AA19305 netCDF max number variables

Hi Jim,

> I have written a small fortran program called "aes2cdf.f"
> which translates our data format into a netCDF format file.
> I am using single dimensions for each variable I wish to
> read in.
>
> The problem occurs when I try to read in large numbers
> of variables for each record. The following output
> is included.

   ...
> ncvardef: maximum number of variables 512 exceeded
   ...

> Is there a workaround for this? (Other than reading in
> less variables). Should I create hyperslabs of groups
> of variables which are more related to each other?

There are several possible workarounds. One possibility is to recompile the netCDF library, changing the definitions in the netcdf.h include file that set various maximums (for example, changing MAX_NC_VARS to 1000):

    /*
     * These maximums are enforced by the interface, to facilitate writing
     * applications and utilities.  However, nothing is statically allocated
     * to these sizes internally.
     */
    #define MAX_NC_DIMS  32          /* max dimensions per file */
    #define MAX_NC_ATTRS 512         /* max global or per variable attributes */
    #define MAX_NC_VARS  512         /* max variables per file */
    #define MAX_NC_NAME  128         /* max length of a name */
    #define MAX_VAR_DIMS MAX_NC_DIMS /* max per variable dimensions */

As the comment says, these limits are defined purely for the convenience of application writers. In particular, Fortran programmers would have a harder time writing generic netCDF applications if there weren't some such limits, because it is difficult to allocate arbitrarily large arrays in Fortran at run time.

If you change these limits for your local library, you will be able to create and use netCDF files with more than 512 variables, but those files may not be readable by applications or utilities at other sites linked against libraries compiled with the original limits. Hence you might lose some portability of your data.
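As a minimal sketch of working against a locally rebuilt library (the raised limit of 1000 and the helper function `can_define_vars` are hypothetical, not part of the netCDF API), you could check your planned variable count against the compiled-in maximum before calling ncvardef, so the program fails with a clear message instead of partway through defining variables:

    #include <stdio.h>

    /* Local copy of the netcdf.h limit; assumes the library was
     * rebuilt with MAX_NC_VARS raised from 512 to 1000. */
    #define MAX_NC_VARS 1000

    /* Hypothetical helper: returns 1 if nvars_wanted fits under the
     * limit the library was compiled with, 0 otherwise. */
    static int can_define_vars(int nvars_wanted)
    {
        if (nvars_wanted > MAX_NC_VARS) {
            fprintf(stderr, "need %d variables but library allows only %d\n",
                    nvars_wanted, MAX_NC_VARS);
            return 0;
        }
        return 1;
    }

    int main(void)
    {
        printf("%d\n", can_define_vars(600));   /* fits under the raised limit */
        printf("%d\n", can_define_vars(2000));  /* still too many */
        return 0;
    }

Note that the check only mirrors the limit your local library was built with; a file created this way is still subject to the limits compiled into whatever library reads it elsewhere.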
Perhaps you should try to make more use of arrays rather than naming each value with a unique variable. The netCDF interface is not much different from programming languages in this respect; few subroutines or functions have more than 500 uniquely named variables. Instead, arrays, records (structures), and arrays of records are used to organize the data. Since netCDF must support Fortran as well as C, it has no direct support for C structures, but you can use records much like C structures to store groups of values of different types. Groups of values of the same type are appropriate for arrays.

By using a "single dimension for each variable", I assume you mean you are not using scalar variables, but record variables with a single unlimited dimension on everything. If sets of these values are of the same type and always used together, that's a good candidate for a multidimensional array.

> While on this topic of large datasets, does netCDF have
> some limitations for the number of records written out?

The only limit on the number of records is what can be held in a "long" integer used to index the records, which is typically 2147483647.

--
Russ Rew                        UCAR Unidata Program
address@hidden                  P.O. Box 3000
http://www.unidata.ucar.edu/    Boulder, CO 80307-3000
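To illustrate the grouping advice above with a sketch (the struct, dimension size, and names are hypothetical, chosen only for the example): instead of hundreds of separately named record variables of the same type, one record can hold a single array indexed by parameter, which in netCDF would become one variable dimensioned (record, param) and so count as one variable against the limit rather than hundreds.

    #include <stdio.h>

    #define NPARAMS 600   /* hypothetical: one slot per formerly named variable */

    /* One record holding all 600 values of the same type; in netCDF this
     * maps to a single variable with dimensions (record, param). */
    struct obs_record {
        float param[NPARAMS];
    };

    int main(void)
    {
        struct obs_record rec;
        for (int i = 0; i < NPARAMS; i++)
            rec.param[i] = (float)i;      /* fill one record with sample values */
        printf("%.0f %.0f\n", rec.param[0], rec.param[NPARAMS - 1]);
        return 0;
    }

Values of different types that always travel together can likewise be grouped as the fields of one record, in the style of a C structure, rather than as independently named netCDF variables.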