This archive contains answers to questions sent to Unidata support through mid-2025. Note that the archive is no longer being updated. We provide the archive for reference; many of the answers presented here remain technically correct, even if somewhat outdated. For the most up-to-date information on the use of NSF Unidata software and data services, please consult the Software Documentation first.
Marston,

> > After reading the data, do you apply the unpacking formula to the
> > values before looking at them? The library does not apply the
> > unpacking formula for you. If you're looking at the max and min of
> > the packed values, those would not be the max and min of the data
> > values you expect.
>
> Yes, the scale_factor and add_offset are applied before the max and
> min are checked. The data is also plotted, which confirms that the
> netCDF array was written incorrectly in the file.
>
> I used the function nc_put_var() and still the data is garbled.
> There is, however, one exception: the example in the manual has the
> function call like this:
>
>   /* write values into netCDF variable */
>   status = nc_put_var_double(ncid, rh_id, rh_vals);
>   if (status != NC_NOERR) handle_error(status);
>
> where the array is sent in as just rh_vals. When I try to do this, my
> program compiles but I get an incompatible pointer type warning. My
> call is like this:
>
>   status = nc_put_var_short(ncid, qsid, &swd->qs[0][0][0]);
>   if (status != NC_NOERR) handle_error(status);
>
> and the program compiles without a warning.
>
> Could this be the problem?

No, I don't think so. For a 1-D array in C, such as rh in the example, the expressions rh and &rh[0] mean the same thing, but that's not true for higher-dimensional arrays. qs means the same thing as &qs[0], but that's not the same as &qs[0][0] or &qs[0][0][0]. Only the latter expression refers to the beginning of a contiguous block of data that holds all the values of the 3-dimensional qs array. And that's only true if qs is declared as a conventional statically allocated 3-dimensional array, with a declaration such as

  short qs[nlev][nlon][nlat];

If qs is instead a dynamically allocated array, where the rows or columns or slices are pointers to separately allocated memory, then the expression &qs[0][0][0] no longer points to a single contiguous block of nlev*nlon*nlat values.
So if qs is a dynamically allocated array, as described in this FAQ:

  http://c-faq.com/aryptr/dynmuldimary.html

that would also explain what you are seeing. Other than that, I can't see what's causing the problem.

Have you tried running ncdump after writing the file to verify that what's written is not what you intended? That would at least eliminate the possibility of a bug in your reading and unpacking code.

Another approach is to use a text editor to create a CDL file, foo.cdl, that has the dimensions, variables, and attributes you want, as well as a little data, and then generate C code with

  ncgen -c foo.cdl > foo.c

Look at foo.c, compile it, and link it with your netCDF library. Run it to create a netCDF file, then use ncdump to see whether that file contains what you expect. If it does, you may be able to discover the problem by comparing foo.c with your program.

If that doesn't help, try isolating the problem in a small, complete example that we could run here to reproduce and explain what you are seeing. It's possible you've discovered a bug in the library, but that's not the most likely explanation ...

--Russ

Russ Rew                    UCAR Unidata Program
address@hidden              http://www.unidata.ucar.edu

Ticket Details
===================
Ticket ID: ZPJ-875297
Department: Support netCDF
Priority: Normal
Status: Closed
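As a concrete starting point for the ncgen workflow described above, a minimal foo.cdl might look like this (the dimension and variable names here are illustrative, not taken from the original program):

```
netcdf foo {
dimensions:
    lev = 2 ;
    lon = 3 ;
    lat = 4 ;
variables:
    short qs(lev, lon, lat) ;
        qs:scale_factor = 0.01 ;
        qs:add_offset = 0. ;
data:
    qs = 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12,
         13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24 ;
}
```

Running ncgen -c foo.cdl > foo.c on a file like this produces C code that creates and writes the file, which you can then compare against your own program's calls.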