This archive contains answers to questions sent to Unidata support through mid-2025. Note that the archive is no longer being updated. We provide the archive for reference; many of the answers presented here remain technically correct, even if somewhat outdated. For the most up-to-date information on the use of NSF Unidata software and data services, please consult the Software Documentation first.
First, the HDF5 spec allows for an offset superblock. A Panoply user was trying to open an HDF file in which the superblock was offset 1,048,576 (512 * 2048) bytes, and the NJ library was throwing back an exception that the dataset did not look like valid CDM. I've made a mod that will keep reading until the end of the file if you open the file specifically as an HDF5 file, which admittedly is not all that easy to do. We've never seen a file with this kind of offset before (it will actually fail when opened with the netCDF-4 library). Do you know who created it?
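For context, the HDF5 format spec says the superblock may start at byte 0 or at any offset of the form 512 * 2^n (512, 1024, 2048, ...), which is why a reader has to probe successive offsets rather than just byte 0. Below is a minimal sketch of that probing loop; the class and method names are hypothetical, not the actual netcdf-java implementation (which caps the search rather than reading to end of file).

```java
import java.io.IOException;
import java.io.RandomAccessFile;
import java.util.Arrays;

/**
 * Hypothetical sketch: locate an HDF5 superblock that may be offset
 * from the start of the file. Per the HDF5 file format spec, the
 * superblock begins at byte 0 or at 512 * 2^n for some n >= 0.
 */
public class Hdf5SuperblockLocator {

    // The 8-byte HDF5 format signature: \211 'H' 'D' 'F' \r \n \032 \n
    private static final byte[] SIGNATURE = {
        (byte) 0x89, 'H', 'D', 'F', '\r', '\n', 0x1a, '\n'
    };

    /** Returns the superblock offset, or -1 if no signature is found. */
    public static long findSuperblock(RandomAccessFile raf) throws IOException {
        long pos = 0;
        byte[] buf = new byte[SIGNATURE.length];
        while (pos + SIGNATURE.length <= raf.length()) {
            raf.seek(pos);
            raf.readFully(buf);
            if (Arrays.equals(buf, SIGNATURE)) {
                return pos;
            }
            // Spec: try offset 0, then 512, then keep doubling.
            pos = (pos == 0) ? 512 : pos * 2;
        }
        return -1;
    }
}
```

A file with the superblock at 1,048,576 bytes is still valid by this rule (it is 512 * 2^11), so a reader that stops probing too early will wrongly reject it.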
Second, in a different case, the NJ library was throwing back just a NullPointerException, with stack trace:

java.lang.NullPointerException
  at ucar.nc2.iosp.hdf5.H5header.extendDimension(H5header.java:536)
  at ucar.nc2.iosp.hdf5.H5header.findDimensionLists(H5header.java:476)
  at ucar.nc2.iosp.hdf5.H5header.makeNetcdfGroup(H5header.java:378)
  at ucar.nc2.iosp.hdf5.H5header.makeNetcdfGroup(H5header.java:390)
  at ucar.nc2.iosp.hdf5.H5header.makeNetcdfGroup(H5header.java:390)
  at ucar.nc2.iosp.hdf5.H5header.read(H5header.java:182)
This one is probably my bug, but I'm still trying to figure it out.