This archive contains answers to questions sent to Unidata support through mid-2025. Note that the archive is no longer being updated. We provide the archive for reference; many of the answers presented here remain technically correct, even if somewhat outdated. For the most up-to-date information on the use of NSF Unidata software and data services, please consult the Software Documentation first.
>To: address@hidden
>cc: address@hidden
>From: Marc Guyon <address@hidden>
>Subject: NC_MAX_VAR and netcdf 3.4
>Organization: Institut du Developpement & des Ressources en Informatique
>Scientifique, Equipe Support Utilisateurs
>Keywords: 200105101046.f4AAkHp08052 netCDF NC_MAX_VAR

Hi Marc,

> I'm working in a Computer Center and I have implemented netcdf 3.4 on several
> platforms at IDRIS (vector and parallel scalar platforms, graphic platform,
> mass storage platform...). Today one of our users would like to use a netcdf
> lib with the limit NC_MAX_VAR equal to 10,000 instead of 2,000.
>
> I found comments in your "netCDF support email archive". I would like to keep
> with the release 3.4 to have homogeneity between our platforms. I want to
> avoid side effects for all the jobs running today in operational use, and our
> main throughput platform is rather exotic for US software (NEC SX5).
>
> I would like to know:
>
> 1. if there are other variables to change with the NC_MAX_VAR?

If you are also using the Fortran interface, you should change the
corresponding limit in the fortran/netcdf.inc file as well:

    parameter (nf_max_vars = 10000)

along with the macro definition in the libsrc/netcdf.h file:

    #define NC_MAX_VARS 10000  /* max variables per file */

> 2. if it's correct that fixing the number NC_MAX_VAR involves losing
> portability?

Files that use fewer variables than the previous maximum will still be
portable. Files that use more than the previous maximum may be portable to
any programs that don't use the NC_MAX_VARS macro and that don't create new
variables. For example, an old "ncdump" utility compiled with a library
using a smaller NC_MAX_VARS will still work fine on a file with more
variables, because "ncdump" makes no use of NC_MAX_VARS and doesn't create
new netCDF variables. Just calling nc_open() on such a file and accessing
data from it still works OK.
However, programs that use NC_MAX_VARS (or NF_MAX_VARS in Fortran) to
allocate space may fail when asked to deal with a file containing too many
variables. Also, programs that define new variables by calling nc_def_var()
will get an error returned when they try to exceed NC_MAX_VARS variables in
the output dataset. So, for example, the "ncgen" utility compiled with an
old library will fail when fed the output from ncdump on a file containing
more than NC_MAX_VARS variables, with the error message:

    ncgen: NC_MAX_VARS exceeded

when it would create too many variables. A generic utility that copied
netCDF files, renaming their variables or adding attributes, would likewise
fail when it tried to create too many variables in the output, unless it
was recompiled with the new limits.

> 3. What are the other known consequences of changing the value of
> NC_MAX_VAR?

Programs that allocate static arrays using NC_MAX_VARS will occupy more
memory. Also, functions that access variables by name, such as
nc_inq_varid(), will be somewhat slower, since a simple linear search among
the named variables is used to find the ID. However, most functions use the
variable ID, so you typically pay this lookup penalty only once per
variable.

You might be interested to know that we are increasing the limits in the
next minor release of netCDF (3.5.1) at the request of another user who
found the current limits too small, though we had only intended to increase
the maximum number of variables to 4096. You are the first user who has
suggested a need for as many as 10,000 variables.

--Russ

_____________________________________________________________________
Russ Rew                                        UCAR Unidata Program
address@hidden                          http://www.unidata.ucar.edu