This archive contains answers to questions sent to Unidata support through mid-2025. Note that the archive is no longer being updated. We provide the archive for reference; many of the answers presented here remain technically correct, even if somewhat outdated. For the most up-to-date information on the use of NSF Unidata software and data services, please consult the Software Documentation first.
John,

I sent you the e-mail below back in late September regarding a problem using netCDF-Java to open HDF datasets that have large attributes, i.e., character-string attributes with very long lengths. Although my earlier e-mail indicated that the threshold seemed to be at 64 kB, I recently heard from someone who is having trouble with an HDF dataset where one of the attributes is 54 kB in length.

This problem is being brought to my attention by a group at NASA-JPL involved with an upcoming Earth-observing mission. They have so far been using Panoply (and thus netCDF-Java) to work with test datasets but will be working with "real" data in a few months. Use of these long attributes is apparently specified by mission requirements, so it's not something they are going to stop doing.

In my prior e-mail I mentioned this problem occurring with NJ 4.3.22. I have since updated Panoply to work with the NJ 4.5.* branch and found that the problem occurs there also. It does look like there has been some (minor?) change between 4.3.22 and 4.5.3, as the line number where the exception happens has shifted a bit. Using NJ 4.5.3, I now see:

java.io.IOException: java.io.IOException: Negative seek offset
  at ucar.nc2.NetcdfFile.open(NetcdfFile.java:431)
  at ucar.nc2.dataset.NetcdfDataset.openOrAcquireFile(NetcdfDataset.java:744)
  at ucar.nc2.dataset.NetcdfDataset.openDataset(NetcdfDataset.java:426)
  at ucar.nc2.dataset.NetcdfDataset.acquireDataset(NetcdfDataset.java:527)
  at ucar.nc2.dataset.NetcdfDataset.acquireDataset(NetcdfDataset.java:504)
  at gov.nasa.giss.data.nc.NcDataset.initMe(NcDataset.java:109)
Caused by: java.io.IOException: Negative seek offset
  at ucar.unidata.io.RandomAccessFile.seek(RandomAccessFile.java:404)
  at ucar.nc2.iosp.hdf5.FractalHeap.readDirectBlock(FractalHeap.java:444)
  at ucar.nc2.iosp.hdf5.FractalHeap.<init>(FractalHeap.java:174)
  at ucar.nc2.iosp.hdf5.H5header$DataObject.processAttributeInfoMessage(H5header.java:2392)
  at ucar.nc2.iosp.hdf5.H5header$DataObject.<init>(H5header.java:2380)
  at ucar.nc2.iosp.hdf5.H5header$DataObject.<init>(H5header.java:2230)
  at ucar.nc2.iosp.hdf5.H5header.getDataObject(H5header.java:2077)
  at ucar.nc2.iosp.hdf5.H5header.access$600(H5header.java:70)
  at ucar.nc2.iosp.hdf5.H5header$DataObjectFacade.<init>(H5header.java:2123)
  at ucar.nc2.iosp.hdf5.H5header.readGroupNew(H5header.java:3998)
  at ucar.nc2.iosp.hdf5.H5header.access$900(H5header.java:70)
  at ucar.nc2.iosp.hdf5.H5header$H5Group.<init>(H5header.java:2203)
  at ucar.nc2.iosp.hdf5.H5header$H5Group.<init>(H5header.java:2168)
  at ucar.nc2.iosp.hdf5.H5header.readSuperBlock2(H5header.java:355)
  at ucar.nc2.iosp.hdf5.H5header.read(H5header.java:203)
  at ucar.nc2.iosp.hdf5.H5iosp.open(H5iosp.java:130)
  at ucar.nc2.NetcdfFile.<init>(NetcdfFile.java:1529)
  at ucar.nc2.NetcdfFile.open(NetcdfFile.java:821)
  at ucar.nc2.NetcdfFile.open(NetcdfFile.java:428)
  ... 15 more

I realize that updating NJ to deal with this sort of HDF dataset might be complex and potentially time-consuming, but do you have any idea if doing so is something that will happen in the near or foreseeable future?

Thanks,
rbs
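For context, the call path in the trace above is the standard NetcdfDataset route that Panoply uses. The following is a minimal sketch, not taken from Panoply itself, of opening a file along that path and catching the failure; the file name is a placeholder, and the attribute-size report only runs if the HDF5 header can actually be parsed.

import java.io.IOException;
import java.util.List;

import ucar.nc2.Attribute;
import ucar.nc2.dataset.NetcdfDataset;

public class LargeAttributeCheck {
    public static void main(String[] args) {
        String path = "jpl_test_granule.h5";  // placeholder for one of the affected test files

        NetcdfDataset ds = null;
        try {
            // Same route Panoply takes: NetcdfDataset delegates to NetcdfFile.open,
            // which hands the file to the HDF5 IOSP to parse the header.
            ds = NetcdfDataset.openDataset(path);

            // If the header parses, report the length of each global string
            // attribute so oversized (tens-of-kB) XML attributes stand out.
            List<Attribute> globals = ds.getGlobalAttributes();
            for (Attribute att : globals) {
                if (att.isString() && att.getStringValue() != null) {
                    System.out.printf("%s : %d chars%n",
                            att.getShortName(), att.getStringValue().length());
                }
            }
        } catch (IOException e) {
            // With the affected files, this is where "Negative seek offset" surfaces,
            // thrown from FractalHeap.readDirectBlock while the attribute-info
            // message is being processed during header parsing.
            System.err.println("Could not open " + path + ": " + e.getMessage());
        } finally {
            if (ds != null) {
                try { ds.close(); } catch (IOException ignore) { /* best effort */ }
            }
        }
    }
}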
On Sep 23, 2014, at 17:30, Robert Schmunk <address@hidden> wrote:

> John,
>
> A group at NASA JPL is gearing up to begin serving a new collection of Earth
> observations data next year and plan to provide HDF5 datasets. They have run
> into a problem accessing test datasets using my Panoply app, and the
> exception traces back to code in netCDF-Java's reading of the HDF5 header.
>
> One thing that the JPL group is doing that is out of the ordinary is that
> they are stuffing a very large XML string into a single global attribute. In
> some cases, the string is longer than 64 kB. When NJ tries to read the file
> header for those cases, a "Negative seek offset" exception gets thrown. The
> stack trace that results looks like the following...
>
> java.io.IOException: java.io.IOException: Negative seek offset
>   at ucar.nc2.NetcdfFile.open(NetcdfFile.java:425)
>   at ucar.nc2.dataset.NetcdfDataset.openOrAcquireFile(NetcdfDataset.java:699)
>   at ucar.nc2.dataset.NetcdfDataset.openDataset(NetcdfDataset.java:421)
>   at ucar.nc2.dataset.NetcdfDataset.acquireDataset(NetcdfDataset.java:516)
>   at ucar.nc2.dataset.NetcdfDataset.acquireDataset(NetcdfDataset.java:493)
>   at gov.nasa.giss.data.nc.NcDataset.initMe(NcDataset.java:107)
> Caused by: java.io.IOException: Negative seek offset
>   at ucar.unidata.io.RandomAccessFile.seek(RandomAccessFile.java:402)
>   at ucar.nc2.iosp.hdf5.FractalHeap.readDirectBlock(FractalHeap.java:463)
>   at ucar.nc2.iosp.hdf5.FractalHeap.<init>(FractalHeap.java:180)
>   at ucar.nc2.iosp.hdf5.H5header$DataObject.processAttributeInfoMessage(H5header.java:2344)
>   at ucar.nc2.iosp.hdf5.H5header$DataObject.<init>(H5header.java:2332)
>   at ucar.nc2.iosp.hdf5.H5header$DataObject.<init>(H5header.java:2180)
>   at ucar.nc2.iosp.hdf5.H5header.getDataObject(H5header.java:2027)
>   at ucar.nc2.iosp.hdf5.H5header.access$600(H5header.java:70)
>   at ucar.nc2.iosp.hdf5.H5header$DataObjectFacade.<init>(H5header.java:2073)
>   at ucar.nc2.iosp.hdf5.H5header.readGroupNew(H5header.java:3911)
>   at ucar.nc2.iosp.hdf5.H5header.access$900(H5header.java:70)
>   at ucar.nc2.iosp.hdf5.H5header$H5Group.<init>(H5header.java:2153)
>   at ucar.nc2.iosp.hdf5.H5header$H5Group.<init>(H5header.java:2118)
>   at ucar.nc2.iosp.hdf5.H5header.readSuperBlock2(H5header.java:354)
>   at ucar.nc2.iosp.hdf5.H5header.read(H5header.java:206)
>   at ucar.nc2.iosp.hdf5.H5iosp.open(H5iosp.java:128)
>   at ucar.nc2.NetcdfFile.<init>(NetcdfFile.java:1521)
>   at ucar.nc2.NetcdfFile.open(NetcdfFile.java:813)
>   at ucar.nc2.NetcdfFile.open(NetcdfFile.java:422)
>   ... 15 more
>
> Note: The above comes from NJ 4.3.22. I have not yet switched up to NJ 4.5.X,
> although I plan to do so after the next Panoply release.
>
> rbs

--
Robert B. Schmunk
Webmaster / Senior Systems Programmer
NASA Goddard Institute for Space Studies
2880 Broadway, New York, NY 10025
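For anyone wanting to reproduce the condition described above without one of the JPL test files, the following is a minimal sketch of one way to create a netCDF-4/HDF5 file whose single global attribute is longer than 64 kB. It assumes netCDF-Java's NetcdfFileWriter with the netcdf4 format, which requires the netCDF-4 C library to be installed and visible to the JVM; the resulting file will not necessarily match the exact on-disk attribute layout of the JPL products, so it is only an approximation. The class name, output file name, and attribute name are placeholders.

import java.io.IOException;

import ucar.nc2.Attribute;
import ucar.nc2.NetcdfFileWriter;

public class MakeLargeAttributeFile {
    public static void main(String[] args) throws IOException {
        String path = "large_attribute_test.nc4";  // placeholder output name

        // Build an XML-like string comfortably past the ~64 kB threshold
        // mentioned in the report above.
        StringBuilder sb = new StringBuilder("<metadata>");
        while (sb.length() < 70 * 1024) {
            sb.append("<granule id=\"test\">padding</granule>");
        }
        sb.append("</metadata>");

        // netcdf4 output goes through the netCDF-4 C library (Nc4Iosp).
        NetcdfFileWriter writer =
                NetcdfFileWriter.createNew(NetcdfFileWriter.Version.netcdf4, path);
        try {
            // A null group means the root group, i.e. a global attribute.
            writer.addGroupAttribute(null, new Attribute("xml_metadata", sb.toString()));
            writer.create();
        } finally {
            writer.close();
        }
    }
}

Opening the resulting file with the read-side sketch earlier in this thread (or with Panoply) is one way to check whether a given netCDF-Java version handles such an attribute.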