- Subject: Re: 950217: handling multidimensions
- Date: Fri, 17 Feb 1995 09:52:19 -0700
> Keywords: 199502171321.AA05850
Hi Maureen,
> I am unclear on how to proceed in writing data into a netCDF file, given
> the structure of my data set. The attached test ascii file (ess.csv) is
> representative of the real data set - columns of data where:
> column 1 = latitude
> column 2 = longitude
> column 3 = depth
> column 4 = conductivity
> The desired structure of the netCDF file is defined in the attached CDL
> file ess.cdl (I'm assuming I don't know the EXACT number of
> records in the test ascii file). Also included here is my attempt at
> writing into the netCDF file (ess.f).
First, looking at the test file and program, it doesn't look like you really
have multidimensional data.
Do you really have one conductivity measurement for each possible
combination of latitude, longitude, and depth? If you did, then you would
have, for example, 10 lats, 15 lons, and 20 depths that would define a 10 by
15 by 20 grid of points where you would have 10*15*20 = 3000 conductivity
measurements.
If what you really have is a series of n measurements of
conductivity at a fixed set of depths for a series of lat/lon points that
don't form a regular grid, the structure would be something like
dimensions:
    n = unlimited;   // number of measurement locations
    dep = 10;        // number of depths used at each measurement location
variables:
    float cond(n,dep);
        cond:long_name = "Conductivity";
        cond:units = "millimho/cm";
        cond:missing_value = 999.99;
    float lat(n);    // latitude at each location
    float lon(n);    // longitude at each location
    float dep(dep);  // depths at which measurements taken at each location
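If you'd rather define that structure straight from a Fortran program
instead of running ncgen on the CDL, a rough, untested sketch with the
Fortran interface you're already using would look something like the
following.  Note that the Fortran calls take the dimensions in the reverse
of the CDL order, so dep comes before n for cond:

      program defess
      include 'netcdf.inc'
      integer ncid, ndim, depdim, condid, latid, lonid, depid
      integer vdims(2), rcode
      real miss
C create the file; n is the unlimited (record) dimension
      ncid = nccre('ess.nc', ncclob, rcode)
      ndim = ncddef(ncid, 'n', ncunlim, rcode)
      depdim = ncddef(ncid, 'dep', 10, rcode)
C coordinate variables
      vdims(1) = ndim
      latid = ncvdef(ncid, 'lat', ncfloat, 1, vdims, rcode)
      lonid = ncvdef(ncid, 'lon', ncfloat, 1, vdims, rcode)
      vdims(1) = depdim
      depid = ncvdef(ncid, 'dep', ncfloat, 1, vdims, rcode)
C cond(n,dep) in CDL is cond(dep,n) in Fortran dimension order
      vdims(2) = ndim
      condid = ncvdef(ncid, 'cond', ncfloat, 2, vdims, rcode)
      call ncaptc(ncid, condid, 'long_name', ncchar, 12,
     +            'Conductivity', rcode)
      call ncaptc(ncid, condid, 'units', ncchar, 11,
     +            'millimho/cm', rcode)
      miss = 999.99
      call ncapt(ncid, condid, 'missing_value', ncfloat, 1,
     +           miss, rcode)
C leave define mode and close; the file is then ready for ncvpt calls
      call ncendf(ncid, rcode)
      call ncclos(ncid, rcode)
      end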
If what you really have is a series of n measurements of conductivity at a
varying number of varying depths at each lat/lon location and the lat/lon
locations don't form a regular grid, the structure would be something like
dimensions:
    n = unlimited;   // number of measurements
variables:
    float cond(n);
        cond:long_name = "Conductivity";
        cond:units = "millimho/cm";
        cond:missing_value = 999.99;
    float lat(n);    // latitude for each measurement
    float lon(n);    // longitude for each measurement
    float dep(n);    // depth for each measurement
> I guess I just haven't grasped the hyperslab concept and how to
> deal with the start and count vectors ... perhaps defining one dimension
> only, a record dimension, would best handle this particular type of
> data set?
I think that's right. That's the dimension I've called "n" above, but
"record" might be a better name for it, and it should probably be unlimited,
if you think you might ever append data to these datasets.
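If you do append later, you can ask the file how many records it already
contains before writing more; a rough, untested sketch using the ncinq and
ncdinq inquiry calls:

      include 'netcdf.inc'
      integer ncid, ndims, nvars, natts, recdim, nrecs, rcode
      character*128 dimname
      ncid = ncopn('ess.nc', ncwrite, rcode)
C recdim is the id of the unlimited (record) dimension
      call ncinq(ncid, ndims, nvars, natts, recdim, rcode)
C nrecs is the current number of records; start appending at nrecs+1
      call ncdinq(ncid, recdim, dimname, nrecs, rcode)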
> netcdf ess{
>
> dimensions:
> lat = 10, lon = 10, dep = 10;
>
> variables:
>
> float cond(lat,lon,dep);
> cond:long_name = "Conductivity";
> cond:units = "millimho/cm";
> cond:missing_value = 999.99;
>
> float lat(lat), lon(lon), dep(dep);
> lat:long_name = "Latitude";
> lat:missing_value = 99.99;
> lon:long_name = "Longitude";
> lon:missing_value = 99.99;
> dep:long_name = "Depth";
> dep:missing_value = 99.99;
>
> }
> 26.4623, -33.2813, 1.092, 53.157
> 26.4633, -33.2813, 10.05, 53.064
> 26.6743, -33.4833, 1.092, 53.048
> 26.5653, -33.2843, 24.032, 53.029
> 26.7663, -33.2853, 32.048, 52.995
> program ess
>
> C-----------------------------------------
> C ????
> parameter (lats=10,lons=10,idep=10)
> C-----------------------------------------
>
> real lat(10),lon(10),dep(10),cond(10)
> integer rcode,var(4)
> double precision vlat,vlon,vdep,vcond
> character*4 par(4)
> character*1 cm
> data par /'cond','lat','lon','dep'/
>
> C-----------------------------------------
> C ????
> data start /1,1,1/
> data count /lats,lons,idep/
> C-----------------------------------------
>
> open(unit=100,file='ess.csv',status='old')
> ncid=ncopn('ess.nc',ncwrite,rcode)
>
> C Obtain variable ids
>
> do i = 1,4
> var(i) = ncvid(ncid, par(i),rcode)
> end do
It would make your program clearer later if you used mnemonic names for
these variable IDs, such as condid, latid, lonid, and depid, but then you
couldn't use a loop for the ncvid calls.
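That is, something like this (they need an integer declaration, since names
like condid wouldn't default to integer):

      integer condid, latid, lonid, depid
      condid = ncvid(ncid, 'cond', rcode)
      latid = ncvid(ncid, 'lat', rcode)
      lonid = ncvid(ncid, 'lon', rcode)
      depid = ncvid(ncid, 'dep', rcode)

which makes the ncagt and ncvpt calls later on easier to read.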
> C Obtain missing values
>
> call ncagt(ncid,var(1),'missing_value',vcond,rcode)
> call ncagt(ncid,var(2),'missing_value',vlat,rcode)
> call ncagt(ncid,var(3),'missing_value',vlon,rcode)
> call ncagt(ncid,var(4),'missing_value',vdep,rcode)
>
> C fill arrays with missing values
>
> do i = 1,10
> cond(i)=vcond
> lat(i)=vlat
> lon(i)=vlon
> dep(i)=vdep
> end do
>
> C read and write
>
> 10 do j=1,999
> do i=1,5
> read(100,'(f7.4,a1,f9.4,a1,f8.3,a1,f7.3)',
> + iostat=num,end=15)
> + lat(i),cm,lon(i),cm,dep(i),cm,cond(i)
> end do
> 15 continue
>
> C ** NOT QUITE CERTAIN HOW TO HANDLE NCVPT HERE ... **
>
> if(num.eq.0) then
> ncvpt(ncid,var(1), ?, ?,cond,rcode)
If you're just writing out the 5 values you read in for this value of "j",
and cond is defined as a 1-dimensional record variable, something like this
will work:
      start(1) = 1 + 5*(j-1)    ! record number at which to start writing
      count(1) = 5              ! number of records to write
      call ncvpt(ncid, var(1), start, count, cond, rcode)
If you always have the same set of 5 depths and cond is a 2-dimensional
netCDF variable (float cond(n,dep)), remember that the Fortran interface
presents the dimensions in the opposite order from the CDL declaration, so
the dep index comes first and the record index last; use something like:
      start(1) = 1              ! which depth in the record to start at
      start(2) = j              ! which record to start writing at
      count(1) = 5              ! how many depths to write in the record
      count(2) = 1              ! how many records to write
      call ncvpt(ncid, var(1), start, count, cond, rcode)
> ncvpt(ncid,var(2), ?, ?,lat,rcode)
      call ncvpt(ncid, var(2), 1 + 5*(j-1), 5, lat, rcode)  ! 5 values
(or use the start and count arrays), and so on for lon and dep.
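To pull the pieces together for the simplest case (the second structure
above, with just the record dimension and one value of each variable per
record), an untested sketch of the whole read/write loop would be roughly
as follows; the cond1, lat1, etc. names are just made up, and the
list-directed read is there to cope with the commas in ess.csv:

      program esswrt
      include 'netcdf.inc'
      integer ncid, rcode, irec
      integer condid, latid, lonid, depid
      integer start(1), count(1)
      real lat1, lon1, dep1, cond1
      open(unit=100, file='ess.csv', status='old')
      ncid = ncopn('ess.nc', ncwrite, rcode)
      condid = ncvid(ncid, 'cond', rcode)
      latid = ncvid(ncid, 'lat', rcode)
      lonid = ncvid(ncid, 'lon', rcode)
      depid = ncvid(ncid, 'dep', rcode)
      count(1) = 1
      irec = 0
C read one line at a time and write it out as one netCDF record
 10   read(100, *, end=20) lat1, lon1, dep1, cond1
      irec = irec + 1
      start(1) = irec
      call ncvpt(ncid, condid, start, count, cond1, rcode)
      call ncvpt(ncid, latid, start, count, lat1, rcode)
      call ncvpt(ncid, lonid, start, count, lon1, rcode)
      call ncvpt(ncid, depid, start, count, dep1, rcode)
      go to 10
 20   call ncclos(ncid, rcode)
      end

Writing one record at a time is a little slower than writing blocks of 5,
but it keeps the bookkeeping simple, and the unlimited dimension just grows
as you write.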
If you have small arrays and put some data in your ess.cdl, you can also use
ncgen -f ess.cdl
to see what automatically generated Fortran would look like to write the
data as a netCDF file.
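From memory (check the ncgen man page for the exact options), that would be
something like
ncgen -f ess.cdl > ess.f
to capture the generated Fortran program, and
ncgen -b ess.cdl
to create the binary netCDF file (ess.nc) directly from the CDL.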
Hope this helps.
______________________________________________________________________________
Russ Rew UCAR Unidata Program
address@hidden P.O. Box 3000
http://www.unidata.ucar.edu/ Boulder, CO 80307-3000
______________________________________________________________________________