This archive contains answers to questions sent to Unidata support through mid-2025. Note that the archive is no longer being updated. We provide the archive for reference; many of the answers presented here remain technically correct, even if somewhat outdated. For the most up-to-date information on the use of NSF Unidata software and data services, please consult the Software Documentation first.
Brent,

>Date: Thu, 22 Mar 2001 16:47:27 -0500 (EST)
>From: Brent A McDaniel <address@hidden>
>Organization: Georgia Institute of Technology
>To: Steve Emmerson <address@hidden>
>Subject: Re: 20010322: netcdf question: NCEP reanalysis: precision loss
>Keywords: 200103222003.f2MK3hL23783

The above message contained the following:

> Thanks for the response. In regards to your question, yes and no.
> The data was constructed as follows: NCEP reanalysis data was used (say
> air.1957.nc), from which the climatology was subtracted out and written to
> the new file aanom.1957.nc. Ncdump -h on the original (air.1957.nc)
> shows:
>
>     omega:add_offset = 29.765f ;
>     omega:scale_factor = 0.001f ;
>
> The data file that was actually dumped, altered, and regenerated
> (aanom.1957.nc) is not packed, nor is the regenerated file (air.1957.cdf).
> Can you briefly describe why that's changing things? If I added the
> offsets/scale factors, would that fix the problem? Or cause more?

In that case, I suspect that what you're seeing is simple truncation error introduced by the ncdump utility. The utility isn't designed to dump values with the maximum possible precision.

Regards,
Steve Emmerson <http://www.unidata.ucar.edu>
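For readers finding this in the archive: the add_offset and scale_factor attributes quoted above follow the standard netCDF packing convention, in which an unpacked value is recovered as packed * scale_factor + add_offset. A minimal sketch of that arithmetic in Python, using the omega attribute values shown above (the pack/unpack helper names are illustrative, not part of any netCDF library):

```python
# Standard netCDF packing convention:
#   unpacked = packed * scale_factor + add_offset
#   packed   = round((unpacked - add_offset) / scale_factor)
# The constants are the omega attributes from air.1957.nc above.

ADD_OFFSET = 29.765
SCALE_FACTOR = 0.001

def pack(value):
    """Pack a float into an integer using the scale/offset convention."""
    return int(round((value - ADD_OFFSET) / SCALE_FACTOR))

def unpack(packed):
    """Recover an approximation of the original float value."""
    return packed * SCALE_FACTOR + ADD_OFFSET

original = 30.0
recovered = unpack(pack(original))
# The round-trip error is bounded by half the scale factor.
assert abs(recovered - original) <= SCALE_FACTOR / 2
```

If packed data are dumped without applying these attributes (or dumped with too few significant digits, as in the truncation Steve describes), the regenerated file will differ slightly from the original. The ncdump utility also accepts a -p option to increase the number of significant digits printed for float and double data.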