This archive contains answers to questions sent to Unidata support through mid-2025. Note that the archive is no longer being updated. We provide the archive for reference; many of the answers presented here remain technically correct, even if somewhat outdated. For the most up-to-date information on the use of NSF Unidata software and data services, please consult the Software Documentation first.
Hi Amanda,

> I am generating a netcdf file containing multi-dimensional variables:
>
> dimensions:
>         time = 1 ;
>         cell = UNLIMITED ; // (23 currently)
>         level = 2 ;
>         contour = 178 ;
>         pt_traj = 8 ;
>         slice = 62 ;
>
> variable:
>         float temp(cell, pt_traj, slice) ;
>
> In my case, the number of temp data values can vary from one cell to
> another (ex: temp(1,2,*) contains 10 good values while temp(1,4,*)
> contains 178 good values). Is there a way to reduce the number of fill
> values? (attached is the file i generated)

Yes, there are several ways.

One is to reorganize your data to follow a convention developed for
trajectory data, such as the CF Conventions, which provide two
representations for such data:

  H.4.3. Contiguous ragged array representation of trajectories
  http://cf-pcmdi.llnl.gov/documents/cf-conventions/1.6/cf-conventions.html#idp8393872

or

  H.4.4. Indexed ragged array representation of trajectories
  http://cf-pcmdi.llnl.gov/documents/cf-conventions/1.6/cf-conventions.html#idp8399648

Another solution is to use compression, as supported in netCDF-4. In
that case you can preserve the current multi-dimensional representation
of your data, but you need to specify that you want the variables
compressed when you create them. The C function to call is
nc_def_var_deflate, which sets the "deflate" compression level; there is
an analogous Fortran function. The data can then be written and read
with the same code you are using now, but the reading application must
be linked against the netCDF-4 library, which uncompresses the data
chunks as they are read.
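The contiguous ragged array idea can be sketched without the netCDF API
at all: store only the good values for every cell back-to-back in one
1-D array, plus a per-cell count that says how many values belong to
each cell. A minimal self-contained illustration (the names row_size and
temp_packed, and the sample values, are illustrative, not taken from
your file):

```c
#include <stdio.h>

#define NCELLS 3

/* Per-cell count of good values (the "count" variable in CF terms). */
static const int row_size[NCELLS] = {2, 4, 1};

/* All good values concatenated cell by cell -- no fill values stored. */
static const float temp_packed[] = {
    273.1f, 273.4f,                  /* cell 0 */
    280.0f, 280.2f, 280.5f, 280.9f,  /* cell 1 */
    268.7f                           /* cell 2 */
};

/* Start index of a cell's values: the sum of the earlier counts. */
static int cell_offset(int cell) {
    int off = 0;
    for (int c = 0; c < cell; c++)
        off += row_size[c];
    return off;
}

int main(void) {
    for (int c = 0; c < NCELLS; c++) {
        printf("cell %d:", c);
        for (int i = 0; i < row_size[c]; i++)
            printf(" %.1f", temp_packed[cell_offset(c) + i]);
        printf("\n");
    }
    return 0;
}
```

The cost of the scheme is that cell lookups need the running sum of the
counts; the benefit is that the 178-slot rows padded with fill values
disappear entirely.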
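For the compression route, a minimal sketch of defining a deflated
variable with the netCDF-4 C API follows. It assumes the netCDF-4
library is installed (compile and link with -lnetcdf); the output
filename is illustrative, the dimensions match the CDL above, and
nc_def_var_deflate is the real API call mentioned earlier:

```c
#include <stdio.h>
#include <stdlib.h>
#include <netcdf.h>

/* Abbreviated error handling for the sketch. */
static void check(int status) {
    if (status != NC_NOERR) {
        fprintf(stderr, "netCDF error: %s\n", nc_strerror(status));
        exit(EXIT_FAILURE);
    }
}

int main(void) {
    int ncid, dimids[3], varid;

    /* Compression requires a netCDF-4 format file. */
    check(nc_create("compressed.nc", NC_NETCDF4, &ncid));
    check(nc_def_dim(ncid, "cell", NC_UNLIMITED, &dimids[0]));
    check(nc_def_dim(ncid, "pt_traj", 8, &dimids[1]));
    check(nc_def_dim(ncid, "slice", 62, &dimids[2]));
    check(nc_def_var(ncid, "temp", NC_FLOAT, 3, dimids, &varid));

    /* shuffle = 1 (byte shuffle filter on), deflate = 1 (enabled),
     * deflate_level = 1 (1 = fastest, 9 = smallest). */
    check(nc_def_var_deflate(ncid, varid, 1, 1, 1));

    check(nc_close(ncid));
    return 0;
}
```

Writing and reading temp then uses the same nc_put_vara/nc_get_vara
calls as before; only the definition step changes.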
For more information about this, see this section of the netCDF
workshop on chunking and compression:

  http://www.unidata.ucar.edu/netcdf/workshops/2012/nc4chunking/

You can test how well compression works on your data by using the
"nccopy" utility program to copy and compress a typical data file. For
example:

  nccopy -d1 RDT_output_200905251845.nc RDT_output_200905251845-d1.nc
  ls -l RDT_*.nc
  -rw-rw-r-- 1 russ ustaff 285251 Jun 18 10:02 RDT_output_200905251845-d1.nc
  -rw-rw-r-- 1 russ ustaff 313586 Jun 18 10:01 RDT_output_200905251845.nc

shows that level 1 deflation made the file about 91% of the size of the
original.

--Russ

Russ Rew
UCAR Unidata Program
address@hidden
http://www.unidata.ucar.edu

Ticket Details
===================
Ticket ID: IXU-921318
Department: Support netCDF
Priority: Normal
Status: Closed