[netCDFJava #CRA-892492]: Using Java NetCDF With a Large HDF5 Database - Performance Questions
- Subject: [netCDFJava #CRA-892492]: Using Java NetCDF With a Large HDF5 Database - Performance Questions
- Date: Fri, 26 May 2017 10:53:24 -0600
Can you at least tell me the size (in characters) of the header (obtained with
h5dump, ncdump, or NCdumpW)? A very large header can be partly responsible for
performance problems: the header is read once per file open, and your report
is consistent with the cost of reading a large header in Java.
We rarely encounter very large headers, so we have not optimized for that case.
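A quick way to check the header size from the command line is to dump only the header and count the characters. This is just a sketch: it assumes the netCDF C utilities are installed, and `mydata.h5` is a placeholder filename to substitute with your own file.

```shell
# Count the characters in the file header only (no variable data).
# "mydata.h5" is a placeholder; substitute your actual file.
FILE="mydata.h5"

if command -v ncdump >/dev/null 2>&1; then
    # ncdump -h prints only the header (CDL metadata), not the data
    ncdump -h "$FILE" | wc -c
else
    echo "ncdump not found; install the netCDF C utilities" >&2
fi
```

For the raw HDF5 view of the same file, `h5dump -H` similarly prints the header without the data.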
I suspect, however, that your original thought was correct: our Java-native
HDF5 implementation is probably the problem. I have added it to our list of
issues to look at (see https://github.com/Unidata/thredds/issues/847),
but it may take a while.
=Dennis Heimbigner
Unidata
Ticket Details
===================
Ticket ID: CRA-892492
Department: Support netCDF Java
Priority: Normal
Status: Closed
===================
NOTE: All email exchanges with Unidata User Support are recorded in the Unidata
inquiry tracking system and then made publicly available through the web. If
you do not want to have your interactions made available in this way, you must
let us know in each email you send to us.