Re: Thredds out of memory
- Subject: Re: Thredds out of memory
- Date: Tue, 29 Mar 2005 12:14:56 -0700
Tennessee Leeuwenburg wrote:
I guessed as much. Can you give me a back-of-the-envelope calculation
as to the relationship between request size and memory requirements?
Say I want to get the whole 500 MB file. Do I need 500 MB, or 1 GB
(500 for the file and 500 for the DODS object)?
I'm pretty sure it reads the data into memory once and manipulates it
from there, so about 500 MB.
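As a back-of-the-envelope illustration (the grid shape and element size
below are made up for the example, not taken from your file), the heap
needed is roughly the product of the requested dimension sizes and the
bytes per element:

    // Rough request-size arithmetic; the grid shape and element size
    // here are hypothetical, for illustration only.
    public class RequestEstimate {
        public static void main(String[] args) {
            long ntime = 365, nlat = 361, nlon = 576; // hypothetical grid
            long bytesPerValue = 4;                   // 32-bit floats
            long bytes = ntime * nlat * nlon * bytesPerValue;
            System.out.println("request needs roughly "
                    + (bytes / (1024 * 1024)) + " MB of heap");
        }
    }

So a request for the whole 500 MB file needs roughly 500 MB of heap for
the data itself.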
On top of that, how much is needed for the rest of THREDDS to do its
business in terms of building the DDS / DAS, etc.?
This is pretty small: a few tens of KB, and constant (not proportional
to the size of the data request).
I can imagine streaming would be a medium-hard problem, especially if
the netCDF structure is significantly different from the DODS
structure, which I suppose could sometimes happen. But it would
definitely make a difference to the performance of the system I am
building...
Yeah, as I said, it will take a while to get there. Also, it will be
done in the version 2.2 library.
Cheers,
-T
It's the size of the data request that determines the memory needed,
not the file size per se.
Unfortunately, we currently have to bring the whole request into
memory before sending it out. Eventually we will modify both
netcdf-java and netcdf-dods to allow data to be streamed; however, I
doubt we can get to it before the end of the year.
Meanwhile, your only recourse is to increase the Java heap space. You
could also modify the code to test the data request size and reject
anything that's too big (see the sketch below).
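The heap is raised with the JVM's standard -Xmx flag (e.g. -Xmx1024m
for a 1 GB heap, set via JAVA_OPTS if you run under Tomcat). And here
is a minimal sketch of the size guard, assuming the caller has already
computed the request size in bytes; the class name and the limit are
hypothetical, not actual THREDDS API:

    import java.io.IOException;

    public class RequestSizeGuard {
        // Hypothetical cap; tune to whatever your heap can absorb.
        private static final long MAX_REQUEST_BYTES = 100L * 1024 * 1024;

        // Reject requests that would not fit in memory.
        public static void check(long requestBytes) throws IOException {
            if (requestBytes > MAX_REQUEST_BYTES) {
                throw new IOException("request of " + requestBytes
                        + " bytes exceeds server limit of "
                        + MAX_REQUEST_BYTES + " bytes");
            }
        }
    }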