- Subject: [netCDFJava #QQO-385265]: Unable to read grib data using OPENDAP
- Date: Tue, 30 Jun 2015 15:35:48 -0600
Hi Ann,
It's hard for me to answer your question, as it depends greatly on the actual
processing you're doing. As a general rule, though, if you can fit all of the
data into memory, processing it is much simpler; if the data is too large, you
don't have that option and will need to work with it on disk. It's also hard to
define what "too large" means, since it depends on your hardware and on how
you've configured the JVM heap, but I tend to get uneasy around the 1 GB mark.
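As a quick sanity check, you can ask the JVM at runtime how much heap it will
actually give you. Here's a minimal sketch; the -Xmx value in the comment is
only an example:

    public class HeapCheck {
        public static void main(String[] args) {
            // Maximum heap the JVM will try to use, as set by -Xmx,
            // e.g. run with: java -Xmx2g HeapCheck
            long maxBytes = Runtime.getRuntime().maxMemory();
            System.out.printf("Max heap: %.1f MB%n",
                    maxBytes / (1024.0 * 1024.0));
        }
    }

Comparing that number against your dataset size should tell you whether
holding everything in memory is even an option.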
Do you need all of the data available at once, or is only a small chunk needed
to compute each part of the result? For example, maybe you only need one time
slice loaded into memory at a time? That would make things much easier.
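If so, a loop like the following would do it. This is only a sketch against
the netCDF-Java GridDataset API; the URL and grid name are made up, so
substitute your own:

    import ucar.ma2.Array;
    import ucar.nc2.dt.grid.GeoGrid;
    import ucar.nc2.dt.grid.GridDataset;

    public class OneSliceAtATime {
        public static void main(String[] args) throws Exception {
            // Hypothetical OPeNDAP URL and grid name -- replace with yours.
            String url = "http://example.com/thredds/dodsC/some.grib2";
            GridDataset gds = GridDataset.open(url);
            try {
                GeoGrid grid = gds.findGridByName("Temperature_isobaric");
                int ntimes = grid.getTimeDimension().getLength();
                for (int t = 0; t < ntimes; t++) {
                    // -1 means "all" along that axis, so this reads one
                    // full time step (all z, y, x) into memory at a time.
                    Array slice = grid.readDataSlice(t, -1, -1, -1);
                    // ... process this time step, then let it be collected
                }
            } finally {
                gds.close();
            }
        }
    }

That way only one time step is ever resident in memory at once.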
I don't expect caching to help here, unless you plan to have a lot of local
netCDF files open. NCO might also be useful, but only if part or all of your
processing can be done with its operators; I expect that NCO has been coded
to properly handle very large datasets as well.
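If your processing does map onto the operators, they are simple one-liners;
for example (file names here are hypothetical):

    # pull out just the first time step
    ncks -d time,0 big_input.nc one_step.nc

    # average over the record (time) dimension
    ncra big_input.nc time_mean.nc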
Cheers,
Christian
Ticket Details
===================
Ticket ID: QQO-385265
Department: Support netCDF Java
Priority: Normal
Status: Closed