This archive contains answers to questions sent to Unidata support through mid-2025. Note that the archive is no longer being updated. We provide the archive for reference; many of the answers presented here remain technically correct, even if somewhat outdated. For the most up-to-date information on the use of NSF Unidata software and data services, please consult the Software Documentation first.
Marston,

> I noticed in the examples that the dimensions of the arrays were
> already known before writing the script. I wanted to dynamically
> allocate the memory, which is what I did. This would explain why the
> data is mixed up like it is. I used ncdump to check, but the arrays
> were so large that I didn't see it right away. Earlier, I used CDI as
> an interface to write the netCDF file, and then it worked fine. Using
> the C API, I thought, made the script faster and cleaner, with more
> flexibility, since the file I'm making will be over 0.5 GB. Still,
> Russ, there must be a way to dynamically allocate the memory and make
> this work with just the C API interface? I'm not a programmer at
> heart, so this level of detail was over my head. But I understand
> now. How can I make it work, Russ, when dynamically allocating the
> memory?

Just make sure you allocate all the memory for the array with a single
call to malloc (or whatever you are using for memory allocation), so
that you get a single contiguous block of memory rather than a bunch
of scattered blocks. So if you want memory for an array shaped like

    short gs[nlev][nlon][nlat];

use something like

    short *gs = (short *) malloc(nlev * nlon * nlat * sizeof(short));

Of course, you would then have to access individual elements of the
array using some index arithmetic, so instead of

    gs[ilev][jlon][klat]

you would use something like

    gs[((ilev * nlon + jlon) * nlat) + klat]

in your code.

It's actually possible to set up auxiliary pointer arrays so that
gs[ilev][jlon][klat] would still work, pointing to the right element
of the allocated memory block, but it's complicated. For details,
again see

    http://c-faq.com/aryptr/dynmuldimary.html

--Russ

> On May 26, 2009, at 11:20 PM, Unidata netCDF Support wrote:
>
> > Marston,
> >
> >>> After reading the data, do you apply the unpacking formula to the
> >>> values before looking at them? The library does not apply the
> >>> unpacking formula for you.
> >>> If you're looking at the max and min of the packed values, those
> >>> would not be the max and min of the data values you expect.
> >>
> >> Yes, the scale_factor and add_offset are applied before the max
> >> and min are checked. The data is also plotted, and the plots
> >> confirm that the netCDF array was written incorrectly in the file.
> >>
> >> I used the function nc_put_var() and still the data is garbled.
> >> There is, however, one exception: the example in the manual has
> >> the function call like this:
> >>
> >>     /* write values into netCDF variable */
> >>     status = nc_put_var_double(ncid, rh_id, rh_vals);
> >>     if (status != NC_NOERR) handle_error(status);
> >>
> >> where the array is sent in as just rh_vals. When I try to do this,
> >> my program compiles, but I get an incompatible pointer type
> >> warning. My call is like this:
> >>
> >>     status = nc_put_var_short(ncid, qsid, &swd->qs[0][0][0]);
> >>     if (status != NC_NOERR) handle_error(status);
> >>
> >> and the program compiles without a warning.
> >>
> >> Could this be the problem?
> >
> > No, I don't think so. For a 1D array in C, such as rh in the
> > example, the expressions rh and &rh[0] mean the same thing, but
> > that's not true for higher-dimensional arrays. qs means the same
> > thing as &qs[0], but that's not the same as &qs[0][0] or
> > &qs[0][0][0]. Only the latter expression refers to the beginning of
> > a contiguous block of data that has all the values of the
> > 3-dimensional qs array.
> >
> > And that's only true if qs is declared as a conventional statically
> > allocated 3-dimensional array, with a declaration such as
> >
> >     short qs[nlev][nlon][nlat];
> >
> > If qs is instead a dynamically allocated array, where the rows or
> > columns or slices are pointers to separately allocated memory, then
> > the expression &qs[0][0][0] would no longer point to a single
> > contiguous block of nlev*nlon*nlat values.
> > So if qs is a dynamically allocated array, as described in this
> > FAQ:
> >
> >     http://c-faq.com/aryptr/dynmuldimary.html
> >
> > that would also explain what you are seeing.
> >
> > Other than that, I can't see what's causing the problem. Have you
> > tried running ncdump after writing the file to verify that what's
> > written is not what you intended? That would at least eliminate the
> > possibility of a bug in your reading and unpacking code.
> >
> > Another approach is to use a text editor to create a CDL file,
> > foo.cdl, that has the dimensions, variables, and attributes you
> > want, as well as a little data, then generate C code with
> >
> >     ncgen -c foo.cdl > foo.c
> >
> > Look at foo.c. Compile it and link it with your netCDF library.
> > Run it to create a netCDF file, and see if that has what you expect
> > in it, using ncdump. If that works, you may be able to discover the
> > problem by comparing foo.c with your program.
> >
> > If that doesn't help, try isolating the problem in a small,
> > complete example that we could run here to try to reproduce and
> > explain what you are seeing. It's possible you've discovered a bug
> > in the library, but it's not the most likely explanation.
> >
> > --Russ
> >
> > Russ Rew                         UCAR Unidata Program
> > address@hidden                   http://www.unidata.ucar.edu
> >
> > Ticket Details
> > ===================
> > Ticket ID: ZPJ-875297
> > Department: Support netCDF
> > Priority: Normal
> > Status: Closed