Re: Memory problems with UnidataStationObsDataset
- Subject: Re: Memory problems with UnidataStationObsDataset
- Date: Tue, 04 Jul 2006 17:11:38 -0600
Hi Eric: my profiler claims the following code consumes about 5 MB on the heap.
Not sure if you are doing something a lot more complicated. Are you keeping
pointers to the data? Anything non-standard with garbage collection?
public void testMetarDataset() throws IOException {
  long start = System.currentTimeMillis();
  StationObsDataset sod = (StationObsDataset) PointObsDatasetFactory.open(
      topDir + "ldm/Surface_METAR_20060701_0000.nc", null, null);
  DataIterator iter = sod.getDataIterator(0);
  double sum = 0.0;
  int count = 0;
  while (iter.hasNext()) {
    PointObsDatatype obs = (PointObsDatatype) iter.nextData();
    StructureData sdata = obs.getData();
    sum += sdata.getScalarDouble("wind_speed");
    count++;
  }
  long took = System.currentTimeMillis() - start;
  System.out.println(" that took = " + took + " msec");
  System.out.println("sum= " + sum + " count = " + count);
}
that took = 6405 msec
sum= -1.4362039E8 count = 112763
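[Editor's note: for anyone without a profiler handy, the JDK alone can give a rough cross-check of a figure like the 5 MB above. This is a sketch, not part of the original test: it estimates retained heap around a block of work using Runtime, with a plain allocation loop standing in for the DataIterator loop. System.gc() is only a hint to the JVM, so treat the numbers as approximate.]

    public class HeapCheck {
        // Approximate used heap after suggesting a collection. System.gc()
        // is a hint only, so this is an estimate, not a measurement.
        static long usedHeap() {
            System.gc();
            Runtime rt = Runtime.getRuntime();
            return rt.totalMemory() - rt.freeMemory();
        }

        public static void main(String[] args) {
            long before = usedHeap();
            double sum = 0.0;
            // Stand-in for the DataIterator loop: each pass allocates a
            // small object and drops it, like reading one obs and
            // discarding it.
            for (int i = 0; i < 112763; i++) {
                double[] obs = new double[8];
                sum += obs[0];
            }
            long after = usedHeap();
            System.out.println("retained ~ " + ((after - before) / 1024) + " KB");
            System.out.println("sum = " + sum);
        }
    }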
If I open through the catalog and read via DODS, it takes ~20 MB:
public void testMetarDataset() throws IOException {
  long start = System.currentTimeMillis();
  ThreddsDataFactory fac = new ThreddsDataFactory();
  ThreddsDataFactory.Result result = fac.openDatatype(
      "http://motherlode.ucar.edu:8080/thredds/catalog/station/metar/catalog.xml#NWS/METAR/Surface_METAR_20060701_0000.nc",
      null);
  StationObsDataset sod = (StationObsDataset) result.pobsDataset;
  long took = System.currentTimeMillis() - start;
  System.out.println(" open took = " + took + " msec");
  start = System.currentTimeMillis();
  // StationObsDataset sod = (StationObsDataset) PointObsDatasetFactory.open(
  //     topDir + "ldm/Surface_METAR_20060701_0000.nc", null, null);
  DataIterator iter = sod.getDataIterator(0);
  double sum = 0.0;
  int count = 0;
  while (iter.hasNext()) {
    PointObsDatatype obs = (PointObsDatatype) iter.nextData();
    StructureData sdata = obs.getData();
    sum += sdata.getScalarDouble("wind_speed");
    count++;
  }
  took = System.currentTimeMillis() - start;
  System.out.println(" read took = " + took + " msec");
  System.out.println("sum= " + sum + " count = " + count);
}
open took = 2498 msec
read took = 22249 msec
sum= -1.4362039E8 count = 112763
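[Editor's note: to illustrate the "keeping pointers to the data" question above: if the caller stores each obs in a collection, heap use grows with record count even though the iterator itself is well behaved. A standalone sketch, with no netCDF dependency; Obs is a hypothetical stand-in for PointObsDatatype:]

    import java.util.ArrayList;
    import java.util.List;

    public class RetainDemo {
        // Hypothetical stand-in for one PointObsDatatype record.
        static final class Obs {
            final double windSpeed;
            Obs(double windSpeed) { this.windSpeed = windSpeed; }
        }

        public static void main(String[] args) {
            int n = 112763; // record count from the METAR file above
            List<Obs> retained = new ArrayList<Obs>();
            double sum = 0.0;

            for (int i = 0; i < n; i++) {
                Obs obs = new Obs(i * 0.1);
                sum += obs.windSpeed; // obs is collectable after this pass...
                retained.add(obs);    // ...unless a line like this keeps it live
            }
            // With the add() call, all n records stay reachable; without it,
            // the loop's peak live set is a single Obs regardless of n.
            System.out.println("records held: " + retained.size());
            System.out.println("sum: " + sum);
        }
    }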
Eric Russell wrote:
At 1:26 PM -0600 7/4/06, John Caron wrote:
Hi Eric:
Can you make the file you are working on available through ftp or http? thanks
I was using the link from the catalog you sent me:
http://motherlode.ucar.edu:8080/thredds/catalog/station/metar/catalog.xml
(the third on the list, because the "latest" one is usually in the process
of growing and is not yet large enough to cause problems), and opening using
ThreddsDataFactory.openDatatype(InvDataset, CancelTask). Thanks,
Eric
Eric Russell wrote:
I've been having OutOfMemoryError issues with UnidataStationObsDataset.
Even when I iterate through the file without retaining a reference to
anything, like this:
DataIterator iter = _dataset.getDataIterator(0);
while (iter.hasNext()) {
  PointObsDatatype obs = (PointObsDatatype) iter.nextData();
}
I get an OutOfMemoryError, even with a 512MB max heap size. I did some
investigation with a profiler, and the culprit seems to be the ArrayList
of Station objects defined on line 45 of StationObsDatasetImpl. Is it
necessary to keep the Station objects around? If so, could you provide
an alternate means of iterating through the PointObsDatatypes that
doesn't maintain those references? Thanks,
Eric
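[Editor's note: the "alternate means of iterating" Eric asks for is essentially a streaming iterator. This sketch is not the netCDF-Java API, just an illustration of the pattern: the iterator keeps only a cursor and no backing list, so heap use stays flat no matter how many records pass through.]

    import java.util.Iterator;

    // Illustration only (not the netCDF-Java API): a streaming iterator
    // that materializes one record at a time and keeps no backing list,
    // so heap use is independent of record count.
    public class StreamingIter implements Iterator<double[]> {
        private final int total;
        private int cursor = 0;

        StreamingIter(int total) { this.total = total; }

        public boolean hasNext() { return cursor < total; }

        public double[] next() {
            // Build the record on demand; nothing references it afterwards,
            // so each one is collectable as soon as the caller drops it.
            return new double[] { cursor++ * 0.1 };
        }

        public void remove() { throw new UnsupportedOperationException(); }

        public static void main(String[] args) {
            StreamingIter it = new StreamingIter(112763);
            double sum = 0.0;
            while (it.hasNext()) sum += it.next()[0];
            System.out.println("sum = " + sum);
        }
    }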