Hi Tom: the last dimension of int[] pt is the length of the element in bytes, which is ignored.
The values of pt[] in this case come from the DataBTree structure, read from the file, which holds the offsets of each chunk.
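As a concrete illustration of that key layout (a hypothetical standalone snippet, not the actual DataBTree reading code; the numbers are the suspicious key from this file):

```java
import java.util.Arrays;

// Hypothetical illustration of the chunk-key layout described above.
// Each DataBTree key for a rank-4 variable carries rank+1 offset values;
// the last entry is the element length in bytes, not a coordinate,
// and is ignored by the index math.
public class ChunkKey {
  public static void main(String[] args) {
    int rank = 4;
    int[] key = {4, 48, 240, 624, 4};             // suspicious key from this file
    int[] chunkOrigin = Arrays.copyOf(key, rank);  // {4, 48, 240, 624}
    int elementSize = key[rank];                   // 4 bytes, dropped before comparing
    System.out.println(Arrays.toString(chunkOrigin) + " elemSize=" + elementSize);
  }
}
```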
In this code we are searching through the chunks to find the one that contains wantOrigin; offset[currentEntry + 1] is the pt[] you see further in:
  for (currentEntry = 0; currentEntry < nentries; currentEntry++) {
    if ((wantOrigin == null) || tiling.compare(wantOrigin, offset[currentEntry + 1]) < 0) {
      currentNode = new Node(childPointer[currentEntry], this.address);
      currentNode.first(wantOrigin);
      break;
    }
  }

There are 3 chunks, and their offsets (values of the DataBTree) look suspicious to me:

  0  0   0   0  0
  4  0 120   0  0
  4 48 240 624  4

but I'm not sure if they are wrong or not. The chunk size is: 2 24 120 208 4. An h5dump on it produces NaNs, so my guess is that the file is damaged.

On 4/28/2010 7:51 AM, Tom Margolis wrote:
Hi,

> is this netcdf4 or hdf5 direct ?

We are using the native/C NetCDF 4 libraries rather than the native/C HDF5 libraries to write out this file.

> Can you try in c library?

Sorry, I don't have the means to do that.

I really don't think this is a problem with your code. If you can just tell me why 'int[] pt' would have five values instead of the expected four (x,y,z,t), and why the fourth value would be so large (624), that would help: the 'int[] pt' variable is {4, 48, 240, 624, 4}.

Thanks,
Tom

On 4/27/2010 2:49 PM, Tom Margolis wrote:

Sorry, I don't have a file written from the pre-patch days. And now I suspect that this isn't a problem with your code. The problem is almost certainly caused by the HDF5 patch which we use to write out the test data that I sent you. So really I guess I'm asking you to replicate the problem and explain where the odd 'int[] pt' value is coming from (and what it means). But if you confirm that this isn't a bug with your code, just tell me and I'll let you off the hook!

Thanks,
Tom

I'm not sure yet where the bug is, although it looks suspicious - can you try in c library? Is this netcdf4 or hdf5 direct?

On 4/27/2010 1:39 PM, Tom Margolis wrote:

Hi John,

(I looked up Reed College by the way - now I want to go.)

We recently upgraded our HDF5 libraries to version 1.8.4-patch1, and now one of our NetCDF API tests is failing.
Here is simple test code to run, pointing to the attached 'gtgGrib.nc' file:

  GridDataset gridDataset = new GridDataset( new NetcdfDataset( NetcdfFile.open("/path/to/gtgGrib.nc") ) );
  GridDatatype grid = gridDataset.findGridByName( "Turbulence_SIGMET_AIRMET" );
  grid.readDataSlice( 4, 13, 176, 216 ); // FAILS

The call to grid.readDataSlice fails with:

  java.lang.AssertionError
    at ucar.nc2.iosp.hdf5.Tiling.tile(Tiling.java:84)
    at ucar.nc2.iosp.hdf5.Tiling.order(Tiling.java:96)
    at ucar.nc2.iosp.hdf5.Tiling.compare(Tiling.java:110)
    at ucar.nc2.iosp.hdf5.H5header$DataBTree$Node.first(H5header.java:4242)
    at ucar.nc2.iosp.hdf5.H5header$DataBTree$DataChunkIterator.<init>(H5header.java:4142)
    at ucar.nc2.iosp.hdf5.H5header$DataBTree.getDataChunkIterator(H5header.java:4085)
    at ucar.nc2.iosp.hdf5.H5tiledLayoutBB.<init>(H5tiledLayoutBB.java:110)
    at ucar.nc2.iosp.hdf5.H5iosp.readData(H5iosp.java:142)
    at ucar.nc2.iosp.hdf5.H5iosp.readData(H5iosp.java:121)
    at ucar.nc2.NetcdfFile.readData(NetcdfFile.java:1775)
    at ucar.nc2.Variable._read(Variable.java:1010)
    at ucar.nc2.Variable.read(Variable.java:809)
    at ucar.nc2.dataset.VariableDS._read(VariableDS.java:503)
    at ucar.nc2.Variable.read(Variable.java:809)
    at ucar.nc2.Variable.read(Variable.java:755)
    at ucar.nc2.dt.grid.GeoGrid.readDataSlice(GeoGrid.java:613)
    at ucar.nc2.dt.grid.GeoGrid.readDataSlice(GeoGrid.java:524)

Here is the method in Tiling which throws the AssertionError on line 84:

  line 80:  public int[] tile(int[] pt) {
  line 81:    // assert pt.length == rank;
  line 82:    int[] tile = new int[rank];
  line 83:    for (int i = 0; i < rank; i++) {
  line 84:      assert shape[i] >= pt[i];
  line 85:      tile[i] = pt[i] / tileSize[i];
  line 86:    }
  line 87:    return tile;
  line 88:  }

The shape[] is correct: {5, 67, 340, 587}. The AssertionError is thrown when the submitted 'int[] pt' variable is {4, 48, 240, 624, 4} - note that '624' is out-of-bounds of the shape[].
Why would requested indices of {4, 13, 176, 216} - which are all within bounds of the data - ever generate an out-of-bounds 'int[] pt' of {4, 48, 240, 624, 4}?

Thanks,
Tom

Thanks Tom, I am reproducing the error and trying to figure it out. You say this used to work? Do you have an equivalent file written before 1.8.4-patch1?
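To make the failure concrete, here is a minimal standalone re-creation of the bounds check in Tiling.tile, using the shape, chunk size, and pt values quoted in this thread. TileCheck is a hypothetical sketch, not the ucar.nc2 source; the trailing element-size value (the fifth entry of pt and of the chunk size) is dropped before checking, as described above.

```java
// Minimal standalone sketch of the Tiling.tile bounds check
// (hypothetical re-creation, not ucar.nc2.iosp.hdf5.Tiling itself).
public class TileCheck {
  public static int[] tile(int[] pt, int[] shape, int[] tileSize) {
    int rank = shape.length;  // 4 here; the trailing element-size entry is already dropped
    int[] tile = new int[rank];
    for (int i = 0; i < rank; i++) {
      // same condition the "assert shape[i] >= pt[i]" on line 84 enforces
      if (pt[i] > shape[i])
        throw new AssertionError("pt[" + i + "]=" + pt[i] + " exceeds shape " + shape[i]);
      tile[i] = pt[i] / tileSize[i];
    }
    return tile;
  }

  public static void main(String[] args) {
    int[] shape    = {5, 67, 340, 587};  // variable shape from the stack trace
    int[] tileSize = {2, 24, 120, 208};  // chunk size from the thread, byte-length dropped
    int[] pt       = {4, 48, 240, 624};  // the suspicious chunk origin: 624 > 587
    try {
      tile(pt, shape, tileSize);
      System.out.println("ok");
    } catch (AssertionError e) {
      System.out.println(e.getMessage());  // prints: pt[3]=624 exceeds shape 587
    }
  }
}
```

The first three coordinates pass (4 <= 5, 48 <= 67, 240 <= 340); only the fourth, 624 against a shape of 587, trips the assertion - which is why the error points at a corrupt chunk key rather than at the requested indices {4, 13, 176, 216}.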