Peter,

How are you building your test program? I keep getting a NullPointerException at line 47 (getting the format). I agree with John that this feels like some kind of conflict, though I'm not quite sure how, given that the file is closed between calls to the two libraries.

Ryan

> Hi Peter:
>
> Not sure why the HDFView library messes up, but perhaps a link conflict? Do
> you get the problem when you just use the HDFView library and not the
> netcdf-java library?
>
> If you look at the attributes of a variable that is chunked in a netCDF-4
> file, you will see one called _ChunkSizes that has the chunk sizes in it.
> I'm not sure if there's one for the compression algorithm, though we would
> consider adding it.
>
> John
>
> > Hi Ryan,
> >
> > It works fine for me as well under OS X 10.10 if I remove the HDF Java
> > stuff (which I do have installed; I got it from the HDF web site). My issue
> > is that in my code, I need to use the HDF Java library in some cases. For
> > example, I read and write data in HDF4, and in netCDF-4 / HDF5 the only
> > way I could find to detect the chunk size and compression of a variable in
> > a netCDF-4 file was to use the HDF object library directly. In the test
> > program I've provided, I'm using HDF Java in exactly the way I need to in
> > my actual code. Do you have any recommendations on how to perform the HDF
> > 4/5 operations I need without having to resort to HDF Java?

Ticket Details
===================
Ticket ID: YCN-767203
Department: Support netCDF Java
Priority: Critical
Status: Open
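[Editor's note] As a reference for John's suggestion above, here is a minimal sketch of reading the _ChunkSizes attribute of a chunked variable through netcdf-java alone, without calling into HDF Java. The file name data.nc4, the variable name temperature, and the class name ChunkSizeCheck are placeholders, not names from this thread; the explicit try/finally close is used because older netcdf-java versions of this era did not reliably support try-with-resources on NetcdfFile.

    import ucar.nc2.Attribute;
    import ucar.nc2.NetcdfFile;
    import ucar.nc2.Variable;

    public class ChunkSizeCheck {
        public static void main(String[] args) throws Exception {
            // "data.nc4" and "temperature" are hypothetical names for this sketch.
            NetcdfFile ncfile = NetcdfFile.open("data.nc4");
            try {
                Variable v = ncfile.findVariable("temperature");
                if (v == null) {
                    System.out.println("variable not found");
                    return;
                }
                // For a chunked variable in a netCDF-4 file, netcdf-java
                // surfaces the HDF5 chunk lengths as the _ChunkSizes attribute.
                Attribute chunks = v.findAttribute("_ChunkSizes");
                if (chunks == null) {
                    System.out.println("no _ChunkSizes attribute; variable is probably not chunked");
                } else {
                    System.out.println("chunk sizes: " + chunks);
                }
            } finally {
                ncfile.close();
            }
        }
    }

As John notes, there was no confirmed counterpart attribute for the compression algorithm at the time, so this sketch covers chunk sizes only.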