20010322: netcdf question: NCEP reanalysis: precision loss
- Subject: 20010322: netcdf question: NCEP reanalysis: precision loss
- Date: Thu, 22 Mar 2001 14:52:00 -0700
Brent,
>Date: Thu, 22 Mar 2001 16:47:27 -0500 (EST)
>From: Brent A McDaniel <address@hidden>
>Organization: Georgia Institute of Technology
>To: Steve Emmerson <address@hidden>
>Subject: Re: 20010322: netcdf question: NCEP reanalysis: precision loss
>Keywords: 200103222003.f2MK3hL23783
The above message contained the following:
> Thanks for the response. In regard to your question, yes and no.
> The data was constructed as follows: NCEP reanalysis data was used (say
> air.1957.nc), from which the climatology was subtracted out and written to
> the new file aanom.1957.nc. Ncdump -h on the original (air.1957.nc)
> shows:
>
> omega:add_offset = 29.765f ;
> omega:scale_factor = 0.001f ;
>
> The data file that was actually dumped, altered and regenerated
> (aanom.1957.nc) is not packed, nor is the regenerated file (air.1957.cdf).
> Can you briefly describe why that's changing things? If I added the
> offsets/scale factors would that fix the problem? Or cause more?
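For reference, the two attributes quoted above implement the standard netCDF packing convention: the numbers stored on disk are integers that a reader expands as packed * scale_factor + add_offset. A minimal Python sketch of that expansion, using the attribute values from air.1957.nc but made-up packed integers:

    import numpy as np

    # Attribute values quoted from air.1957.nc above
    scale_factor = np.float32(0.001)
    add_offset = np.float32(29.765)

    # Made-up packed integers of the kind stored in a packed variable
    packed = np.array([-12000, -3500, 0, 4200], dtype=np.int16)

    # Standard netCDF unpacking convention:
    #   unpacked = packed * scale_factor + add_offset
    unpacked = packed * scale_factor + add_offset
    print(unpacked)    # physical values in the variable's units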
In that case, I suspect that what you're seeing is simply truncation
error introduced by the ncdump utility. The utility isn't designed to
dump values with the maximum possible precision.
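To illustrate, a minimal Python sketch (assuming numpy is available; ncdump's default float output uses on the order of seven significant digits):

    import numpy as np

    # Two adjacent float32 values; they differ only past the 7th significant digit
    a = np.float32(1.0)
    b = np.nextafter(a, np.float32(2.0))   # next representable float32 above 1.0

    # Text written with ~7 significant digits (similar to ncdump's default float
    # formatting) cannot tell them apart
    print("%.7g" % a, "%.7g" % b)          # both print as 1

    # Reading that text back yields the same number for both, so the small
    # difference is lost in a dump / edit / regenerate cycle
    print(np.float32("%.7g" % b) == a)     # True

Where available, ncdump's -p option can be used to request more significant digits for float and double output.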
Regards,
Steve Emmerson <http://www.unidata.ucar.edu>