This archive contains answers to questions sent to Unidata support through mid-2025. Note that the archive is no longer being updated. We provide the archive for reference; many of the answers presented here remain technically correct, even if somewhat outdated. For the most up-to-date information on the use of NSF Unidata software and data services, please consult the Software Documentation first.
Also, based on the Python interface that Jeff Whitaker wrote and his test programs (the calls are in Python, but the bulk of the dirty work is done by the C libs), read access to a large compressed array was roughly 8-10 times slower than reading the same array uncompressed, while the savings in size were roughly one order of magnitude. Does this sound like what you would expect? The compression feature is very attractive given the size of our files, but that is a reasonably large speed hit.
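For what it's worth, the deflate filter that netCDF-4 inherits from HDF5 is zlib-based, so the size/speed tradeoff can be sketched in isolation with Python's standard zlib module. This is a rough illustration only, not a measurement of netCDF itself; the synthetic array here is made up, and real netCDF read performance also depends on chunking and the shuffle filter:

```python
import time
import zlib
from array import array

# A synthetic "large array": 1 million doubles of smoothly repeating data,
# which compresses well, much like many geophysical fields.
values = array("d", (i % 1000 / 1000.0 for i in range(1_000_000)))
raw = values.tobytes()

# netCDF-4 deflate levels run 0-9; 4 is a common middle setting.
compressed = zlib.compress(raw, level=4)

start = time.perf_counter()
restored = zlib.decompress(compressed)
decompress_time = time.perf_counter() - start

print(f"raw size:        {len(raw)} bytes")
print(f"compressed size: {len(compressed)} bytes")
print(f"ratio:           {len(raw) / len(compressed):.1f}x")
print(f"decompress time: {decompress_time * 1000:.1f} ms")
assert restored == raw  # deflate is lossless
```

In netcdf4-python the corresponding knobs are the `zlib=True` and `complevel` arguments to `createVariable`; every read of a compressed variable pays the decompression cost, so a standalone sketch like this only bounds the zlib part of the slowdown, not the whole of it.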
Now to use some of these new features, I am either going to have to learn to program in C for real or get the wrappers I am working on to work.
Thanks again,

-Roy

On Dec 27, 2007, at 10:00 AM, Russ Rew wrote:
Roy,

> The new features of netcdf-4, such as user-defined types, compound types, and variable length arrays, open up a lot of possibilities for storing data, such as profile data that is taken at a variable number of depths. But as is often the case, when there are a lot of possibilities, some uses of these features make for better or more efficient design than others. We are looking for example files and/or code that use some of these new features in netcdf-4 to see different approaches that people have taken. Any help greatly appreciated.

I was away from my email when you posted this question, but I have a few suggestions.

First, I have a draft of a document on "Developing Conventions for NetCDF-4" that has some examples and discusses some issues in the use of new netCDF-4 features:

  http://www.unidata.ucar.edu/software/netcdf/docs/nc4-conventions.html

Second, I'm trying to construct some more realistic examples for testing with the netCDF-4 version of ncdump, including these:

  tst_comp.c
  tst_vlen_data.c
  tst_string_data.c
  tst_opaque_data.c
  tst_enum_data.c
  tst_group_data.c

each of which creates a netCDF file that ncdump is tested against, with the expected output in:

  ref_tst_comp.cdl
  ref_tst_vlen_data.cdl
  ref_tst_string_data.cdl
  ref_tst_opaque_data.cdl
  ref_tst_enum_data.cdl
  ref_tst_group_data.cdl

These should all be available in the ncdump directory of recent releases.

Third, a talk I gave in June on "Best Practices" contained some suggestions for netCDF-4:

  http://www.unidata.ucar.edu/presentations/Rew/bp-seminar.pdf

I hope these are of some help. In any case, I'd appreciate feedback. I realize I still need some more realistic examples, e.g. for the Dapper conventions for soundings or for some of John Caron's observational data conventions for netCDF-3. We need real users trying out some of these features and reporting back on what's actually useful.

--Russ
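As a concrete illustration of the variable-length feature mentioned above (profiles taken at a varying number of depths), here is a hypothetical CDL sketch in the style of the ref_tst_vlen_data.cdl test files. All names here are invented for illustration, not taken from the actual test files:

```
netcdf profiles {
types:
  float(*) depth_profile_t ;   // variable-length list of floats
dimensions:
  station = 3 ;
variables:
  depth_profile_t temperature(station) ;
    temperature:units = "degrees_C" ;
data:
  // each station stores a different number of depth samples
  temperature = {12.1, 11.8, 9.5, 7.2}, {13.0, 12.4}, {11.7, 10.9, 8.8} ;
}
```

The design choice here is the one Roy's question raises: a vlen type stores the ragged profiles directly, instead of padding every station out to the deepest profile with fill values, as a netCDF-3 design would require.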
**********************
"The contents of this message do not reflect any position of the U.S. Government or NOAA."
**********************
Roy Mendelssohn
Supervisory Operations Research Analyst
NOAA/NMFS Environmental Research Division
Southwest Fisheries Science Center
1352 Lighthouse Avenue
Pacific Grove, CA 93950-2097
e-mail: address@hidden (Note new e-mail address)
voice: (831)-648-9029
fax: (831)-648-8440
www: http://www.pfeg.noaa.gov/

"Old age and treachery will overcome youth and skill."