
Re: 980722: netCDF problem



  • Subject: Re: 980722: netCDF problem
  • Date: Wed, 22 Jul 1998 17:09:09 -0600

>To: address@hidden
>From: address@hidden (Carrie Gonzalez)
>Subject: Re: netCDF problem
>Keywords: 199807222032.OAA10741

Carrie,

>    Here is the netCDF file we created and are trying to read from in
> order to create the new netCDF file with data triplets.

A quick look suggests that the data_matrix array has only been
allocated to hold 1536 floats:

   NumDims = 3;
   bytes = NumDims * sizeof (float) * num_steps;
   if ((tmp_ptr = malloc (bytes)) == NULL)
    {
      cout << "Error allocating memory to hold data triplets.\n";
      exit (-1);
    }
   data_matrix = (float *) tmp_ptr;

but then you are trying to write 128*128*128 values from the data_matrix
array into the netCDF file:

   status = ncvarput (id_new, data_id, three_start, three_count, data_matrix);

where three_count is (128, 128, 128).

This would cause a segmentation violation and a core dump, because
ncvarput reads 128*128*128 = 2,097,152 floats from the data_matrix
array, far beyond the end of what was allocated.
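
If you really do intend to write the full 128x128x128 grid, the
allocation has to match the counts you pass to ncvarput.  Here is a
rough sketch of that idea, reusing the variable names from your
program (so treat it as illustrative, not a drop-in patch):

   // Sketch only: size the buffer from the same counts that will be
   // passed to ncvarput, so the write cannot run past the allocation.
   long three_count[3] = {128, 128, 128};
   size_t nvals = (size_t) three_count[0] * three_count[1] * three_count[2];
   size_t bytes = nvals * sizeof (float);

   void *tmp_ptr;
   float *data_matrix;
   if ((tmp_ptr = malloc (bytes)) == NULL)
    {
      cout << "Error allocating memory to hold the 128x128x128 grid.\n";
      exit (-1);
    }
   data_matrix = (float *) tmp_ptr;

Alternatively, if you only meant to write num_steps triplets, the fix
goes the other way: keep the small allocation and shrink three_count
so that it describes only the 3 * num_steps values you actually have
in memory.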

Hope this helps.

--Russ