Re: Binary encoding of coverages in netcdf
- Subject: Re: Binary encoding of coverages in netcdf
- Date: Mon, 13 Feb 2006 16:30:10 -0700
Lorenzo Bigagli wrote:
See comments inline...
John,
"A netCDF dataset often contains many different variables (a hundred or
more is not uncommon for large model simulations). Because each
variable has unique units, each becomes a separate coverage." (We could
say something here about using multiple ranges when the domain is the
same, if we could have separate units. Stefano, do you know if WCS 1.1
allows that?)
Yes, it is possible to have a common domain (i.e. space and time)
associated with multiple ranges characterized by different units.
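A sketch of what that could look like in netCDF terms: the CDL fragment below (all names and dimension sizes are invented for illustration) shows two range variables with different units sharing one time/lat/lon domain.

```cdl
netcdf shared_domain_example {
dimensions:
    time = 12 ; lat = 90 ; lon = 180 ;
variables:
    double time(time) ;
        time:units = "days since 2001-01-01 00:00:00" ;
    float lat(lat) ;
        lat:units = "degrees_north" ;
    float lon(lon) ;
        lon:units = "degrees_east" ;
    // Two ranges over the same domain, each with its own units:
    float sst(time, lat, lon) ;
        sst:units = "K" ;
    float wind_speed(time, lat, lon) ;
        wind_speed:units = "m s-1" ;
}
```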
I'm not familiar with the "point sets" case, only the grid (regular and
irregular). Stefano, do you have an example?
Raob datasets (radiosonde observations, i.e. atmospheric vertical profiles) are good examples.
"I would recommend that the subsets of netCDF encoded coverages returned
by a WCS be consistent netCDF files themselves (not just data chunks); a
server may also prune unneeded dimensions, coordinate variables, etc."
I strongly agree with this. How about:
"NetCDF encoded coverages returned by a WCS must be valid netCDF files
(not just data chunks). The server must return the requested coverage(s)
as a netCDF variable and its associated coordinate variables. If the WCS
request asks for a subset, the variable and its coordinates must be
appropriately subsetted. The server should remove dimensions,
variables, etc. not needed by the requested coverages. The exact
We may relax this requirement, as in "The server MAY remove unneeded
dimensions, variables, etc. (...)", since there is ambiguity in the meaning
of "unneeded", which may make such a recommendation difficult to
implement. Leaving it as an option is probably more appropriate to our
current understanding of the implications involved (see also
http://rfc.net/rfc2119.html).
Yes, I SHOULD have said MAY ;^)
encoding must be documented and published as a NetCDF Convention; we
recommend the CF-1.0 Convention, or variants of it as recommended by the
CF working groups."
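For concreteness, here is a hypothetical CDL listing (names and values are illustrative only) of a response that honors the rule above: the subsetted variable travels with its coordinate variables, and the file declares the convention it follows.

```cdl
netcdf sst_subset {
dimensions:
    time = 2 ; lat = 80 ; lon = 180 ;
variables:
    double time(time) ;
        time:units = "days since 2001-01-01 00:00:00" ;
        time:standard_name = "time" ;
    float lat(lat) ;
        lat:units = "degrees_north" ;
        lat:standard_name = "latitude" ;
    float lon(lon) ;
        lon:units = "degrees_east" ;
        lon:standard_name = "longitude" ;
    // The requested coverage, subset to the requested bbox/time:
    float sst(time, lat, lon) ;
        sst:units = "K" ;
        sst:standard_name = "sea_surface_temperature" ;

// global attributes:
        :Conventions = "CF-1.0" ;
}
```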
The use of OPeNDAP URLs is also a nice way to do it. I'll have to look
into it.
We also think it is a very elegant and practical way to access data.
Actually, we have designed one "flavour" of ncML-GML data access to
leverage OPeNDAP, by referencing (subsets and/or resamplings of) published
datasets.
The online implementation of WCS-G serves such datasets (see for
instance scalarRangeSet/DataURL in
http://athena.pin.unifi.it:8080/galeon/WCS-v1.0?REQUEST=GetCoverage&VERSION=1.0.0&TIME=2001-01-16T00:00:00Z,2002-12-07T00:00:01Z&SERVICE=WCS&COVERAGE=sst(time-lat-lon)&RESPONSE_CRS=EPSG&CRS=WGS84(DD)&FORMAT=ncML-GML&BBOX=1.0,-69.5,359.0,89.5&RESY=1.0&RESX=2.0
).
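The GetCoverage request above is a plain key-value-pair URL, so a client can assemble one programmatically. The Python sketch below (the function name and defaults are my own; the parameter names are taken from the example URL above) builds and round-trips such a request using only the standard library:

```python
from urllib.parse import urlencode, urlsplit, parse_qs

# Endpoint taken from the example URL in the message above.
ENDPOINT = "http://athena.pin.unifi.it:8080/galeon/WCS-v1.0"

def build_getcoverage_url(coverage, bbox, time_range, fmt,
                          resx, resy, crs="WGS84(DD)",
                          response_crs="EPSG"):
    """Assemble a WCS 1.0.0 GetCoverage request as a KVP URL."""
    params = {
        "SERVICE": "WCS",
        "VERSION": "1.0.0",
        "REQUEST": "GetCoverage",
        "COVERAGE": coverage,
        "TIME": ",".join(time_range),
        "BBOX": ",".join(str(v) for v in bbox),
        "CRS": crs,
        "RESPONSE_CRS": response_crs,
        "FORMAT": fmt,
        "RESX": str(resx),
        "RESY": str(resy),
    }
    return ENDPOINT + "?" + urlencode(params)

url = build_getcoverage_url(
    coverage="sst(time-lat-lon)",
    bbox=(1.0, -69.5, 359.0, 89.5),
    time_range=("2001-01-16T00:00:00Z", "2002-12-07T00:00:01Z"),
    fmt="ncML-GML",
    resx=2.0,
    resy=1.0,
)

# Round-trip check: parse the query string back into its KVP parameters.
query = parse_qs(urlsplit(url).query)
```

Note that `urlencode` percent-escapes the commas and colons inside TIME and BBOX; WCS 1.0.0 servers generally accept either form.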
Yes, this is a good way to do it for clients who know OPeNDAP. It is also
possible to put that URL into nj22 and view it as a netCDF dataset.
However, then it wouldn't have the coordinate variables in it, so it's not
as "standalone" as you might like. But specifying the coordinates in the
NcML-G still makes it a complete solution.
BTW, we use the Unidata OPeNDAP/THREDDS server, and a few weeks ago we
got the following feedback from Frank Warmerdam:
I tried checking out the WCS-G server. The GetCoverage returns
a GML document with a data URL pointing to a THREDDS
OPeNDAP server. I don't actually support the GML (and am not
currently expecting to, though I might change my mind on that),
but I did try the OPeNDAP URL. Unfortunately it did not work
with my OPeNDAP client, since it seems to depend on DAP 3.x
features that the THREDDS server lacks. I get the following on my end.
Warning 1: I connected to the URL but could not get a DAP 3.x version string
from the server. I will continue to connect but access may fail.
gdalinfo: Error.cc:310: void Error::set_error_code(int): Assertion
`OK()' failed.
<crash on uncaught C++ exception>
Do you have an idea of what may be wrong with it?
I'll check this out, thanks.