Re: distributing mesonet data for use in IDV
- Subject: Re: distributing mesonet data for use in IDV
- Date: Mon, 18 Jun 2007 13:49:02 -0600 (MDT)
John,
We have a netCDF subsetting service that the IDV folks plan to use to
read the point data. Once the design is complete, the IDV will be able to
present a widget where a user enters their parameters, which are then
passed to the subsetting service.
http://www.unidata.ucar.edu/projects/THREDDS/tech/interfaceSpec/StationDataSubsetService.html
Currently John Caron has implemented a METAR subsetting service on the
THREDDS side. The service is implemented by StationObsServlet, along with
StationObsCollection.java and QueryParams.java, which I'll attach so
you have a sort-of template to work from. The METAR service reads from
netCDF files, but you will have to change the code to read from
your MySQL database instead. This should at least get you started in the
right direction. Sometime in the future you will have to get all the source
code. It's not available on the web because we don't want hackers reading
the code looking for break-in spots.
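The subsetting service is driven entirely by URL query parameters, which is what the attached QueryParams class parses on the server side. Here is a minimal sketch of building such a request; the path matches the servlet below, but the parameter names are illustrative assumptions, not the published interface:

```java
// Sketch of the kind of request a client might build against the METAR
// subsetting service. The path matches the servlet's getPath() below;
// parameter names here are illustrative assumptions.
public class SubsetUrlSketch {
    public static String buildQuery(String[] stns, String timeIso, String accept) {
        StringBuilder sb = new StringBuilder("/thredds/ncss/metars?");
        for (String s : stns) {
            sb.append("stn=").append(s).append("&"); // one stn= pair per station
        }
        sb.append("time=").append(timeIso).append("&");
        sb.append("accept=").append(accept); // raw | csv | xml | netcdf
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(buildQuery(new String[]{"KDEN", "KSLC"},
                "2007-06-18T00:00:00Z", "csv"));
    }
}
```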
Robb...
On Wed, 13 Jun 2007, John Horel wrote:
Robb-
We use MySQL for storing the data. So the goal would be to have a widget in
IDV that issues a MySQL query against our database. Basically, when someone
wants to sync surface obs to radar or whatever else in IDV, the widget would
create a query and then we serve up the data in a way that is compatible with
IDV. We can handle all the query software and the serving, but we'll
definitely need some help with being IDV/THREDDS compliant.
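One way to picture the widget-to-database step: the IDV-side widget produces subset parameters (stations, time window), and the server turns them into a parameterized SQL query. The following is a sketch under invented assumptions; the table and column names are not the real MesoWest schema:

```java
import java.util.List;

// Hypothetical mapping of subset parameters onto a MySQL query.
// Table and column names are invented for illustration.
public class MesonetQuerySketch {
    // Build a parameterized SQL string for a station/time subset;
    // the caller would bind the start time, end time, then each station id.
    public static String buildSql(List<String> stns, String startIso, String endIso) {
        StringBuilder sql = new StringBuilder(
                "SELECT station_id, obs_time, temperature, wind_speed"
                + " FROM surface_obs WHERE obs_time BETWEEN ? AND ?");
        if (!stns.isEmpty()) {
            sql.append(" AND station_id IN (");
            for (int i = 0; i < stns.size(); i++) {
                sql.append(i == 0 ? "?" : ", ?");
            }
            sql.append(")");
        }
        return sql.toString();
    }

    public static void main(String[] args) {
        System.out.println(buildSql(java.util.Arrays.asList("KSLC", "KDEN"),
                "2007-06-01T00:00:00Z", "2007-06-03T00:00:00Z"));
    }
}
```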
John
Robb Kambic wrote:
John,
Congrats on getting the equipment grant; now you should have the disk space
to store your data.
It's been a while since our conversation at the AMS about serving up your
surface obs using THREDDS and the IDV. Could you elaborate on how you
expect it to work, what type of displays, and what type of software you are
using? That would give us some idea of how to approach the problem.
I did a prototype implementation using the db4o database to store the
METAR reports and distribute them through the THREDDS server, but the
db4o license was too restrictive, so I dropped the effort. At this point,
I don't know whether any of that work would help yours.
Thanks,
Robb...
On Fri, 8 Jun 2007, John Horel wrote:
Robb-
Got our letter yesterday saying we're funded by Unidata to move forward
on the IDV hook to MesoWest. How have things progressed on your end as far
as having the capability to query a surface ob database from within IDV?
Regards,
John
Robb Kambic wrote:
On Wed, 14 Feb 2007, John Horel wrote:
Robb-
Starting to put together the equipment proposal to Unidata, which, in
part, would include a data server to serve up mesonet observations. So
I'd like to follow up on our conversations in San Antonio about
possible approaches. I'm still a bit fuzzy as to what would be the best
way to do it. My preference would be to develop a query directly against our
database rather than having to store the data in an externally hosted
format; that way we don't have to keep all 10 years of data around in a
netCDF file format or some such. Have you had a chance to proceed with
your middleware to handle METARs? And could you describe that approach
a bit more for me?
Regards,
John
John,
Your timing is perfect; yesterday our group had a meeting on this. In
fact, over the last month I created a realtime database to store the METARs.
At this time there is a simple URL servlet interface that permits one to
make queries against it, so your idea of maintaining the data in a
database is the correct approach. I'm just starting on a general
station observation dataset adapter that would create the link from some
data repository (i.e., a database) into the Java netCDF library, and then
into the IDV. The adapter would carry enough information to know how to
query the proper data repository; at this time, I don't know the
type or amount of info needed. I'll keep you informed on our progress.
Do you keep all the data in one database? I was planning to create a new
database file each day for data that was 3 days old, keeping only
~3 days of data in the realtime database. That way performance would be
good for the realtime requests, while archive requests would require
opening the daily database files.
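The rotation scheme described above can be sketched roughly as follows. The 3-day window comes from the description; the class and the file-naming convention are hypothetical:

```java
import java.text.SimpleDateFormat;
import java.util.Calendar;
import java.util.Date;

// Rough sketch of the rotation idea: keep ~3 days of data in the realtime
// store, and roll anything older into one archive file per day.
// The file-naming convention is invented for illustration.
public class RotationSketch {
    static final int REALTIME_DAYS = 3;

    // Observations before this cutoff belong in a daily archive file.
    public static Date cutoff(Date now) {
        Calendar c = Calendar.getInstance();
        c.setTime(now);
        c.add(Calendar.DAY_OF_MONTH, -REALTIME_DAYS);
        return c.getTime();
    }

    // One archive file per calendar day, named by the observation date.
    public static String archiveFileFor(Date obsDate) {
        return "metar_" + new SimpleDateFormat("yyyyMMdd").format(obsDate) + ".db";
    }

    public static void main(String[] args) {
        System.out.println(archiveFileFor(cutoff(new Date())));
    }
}
```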
Machine requirements guess...
The THREDDS Data Server would have to be installed on a machine with enough
CPU power to handle the volume of requests, plus good network connectivity.
I'm sure you have a better handle on the disk space requirements. You
might want to look at the THREDDS page for the TDS requirements.
http://www.unidata.ucar.edu/projects/THREDDS/
Robb...
===============================================================================
Robb Kambic Unidata Program Center
Software Engineer III Univ. Corp for Atmospheric Research
address@hidden WWW: http://www.unidata.ucar.edu/
===============================================================================
/*
* Copyright 1997-2007 Unidata Program Center/University Corporation for
* Atmospheric Research, P.O. Box 3000, Boulder, CO 80307,
* address@hidden.
*
* This library is free software; you can redistribute it and/or modify it
* under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation; either version 2.1 of the License, or (at
* your option) any later version.
*
* This library is distributed in the hope that it will be useful, but
* WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser
* General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public License
* along with this library; if not, write to the Free Software Foundation,
* Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
*/
package thredds.server.ncSubset;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import java.io.*;
import java.util.ArrayList;
import thredds.servlet.AbstractServlet;
import thredds.servlet.ServletUtil;
import thredds.servlet.DebugHandler;
import thredds.servlet.ThreddsConfig;
import thredds.datatype.DateRange;
import org.jdom.transform.XSLTransformer;
import org.jdom.output.XMLOutputter;
import org.jdom.output.Format;
import org.jdom.Document;
import org.jdom.JDOMException;
import org.jdom.input.SAXBuilder;
/**
* Netcdf StationObs subsetting.
*
* @author caron
*/
public class StationObsServlet extends AbstractServlet {
private boolean allow = false;
private StationObsCollection soc;
private boolean debug = false;
// must end with "/"
protected String getPath() {
return "ncss/metars/";
}
protected void makeDebugActions() {
DebugHandler debugHandler = DebugHandler.get("NetcdfSubsetServer");
DebugHandler.Action act;
act = new DebugHandler.Action("showMetarFiles", "Show Metar Files") {
public void doAction(DebugHandler.Event e) {
e.pw.println("Metar Files\n");
ArrayList<StationObsCollection.Dataset> list = soc.getDatasets();
for (StationObsCollection.Dataset ds : list) {
e.pw.println(" " + ds);
}
}
};
debugHandler.addAction(act);
}
public void init() throws ServletException {
super.init();
allow = ThreddsConfig.getBoolean("NetcdfSubsetService.allow", false);
if (!allow) return;
//socRewrite = new StationObsCollection("C:/temp2/", false);
//socOrg = new StationObsCollection("C:/data/metars/", false);
//socOrg = new StationObsCollection("/data/ldm/pub/decoded/netcdf/surface/metar/", true);
soc = new StationObsCollection("/opt/tomcat/content/thredds/public/stn/",
"/data/ldm/pub/decoded/netcdf/surface/metar/");
//soc = new StationObsCollection("C:/temp2/", "C:/data/metars/");
}
public void destroy() {
super.destroy();
if (null != soc)
soc.close();
}
protected void doGet(HttpServletRequest req, HttpServletResponse res) throws
ServletException, IOException {
if (!allow) {
res.sendError(HttpServletResponse.SC_FORBIDDEN, "Service not supported");
return;
}
long start = System.currentTimeMillis();
ServletUtil.logServerAccessSetup(req);
if (debug) System.out.println(req.getQueryString());
String pathInfo = req.getPathInfo();
if (pathInfo == null) pathInfo = "";
boolean wantXML = pathInfo.endsWith("dataset.xml");
boolean showForm = pathInfo.endsWith("dataset.html");
boolean wantStationXML = pathInfo.endsWith("stations.xml");
if (wantXML || showForm || wantStationXML) {
showForm(res, wantXML, wantStationXML);
return;
}
// parse the input
QueryParams qp = new QueryParams();
if (!qp.parseQuery(req, res, new String[]{QueryParams.RAW, QueryParams.CSV,
QueryParams.XML, QueryParams.NETCDF}))
return; // has sent the error message
if (qp.hasBB) {
qp.stns = soc.getStationNames(qp.getBB());
if (qp.stns.size() == 0) {
qp.errs.append("ERROR: Bounding Box contains no stations\n");
qp.writeErr(res, qp.errs.toString(),
HttpServletResponse.SC_BAD_REQUEST);
return;
}
}
if (qp.hasStns && soc.isStationListEmpty(qp.stns)) {
qp.errs.append("ERROR: No valid stations specified\n");
qp.writeErr(res, qp.errs.toString(), HttpServletResponse.SC_BAD_REQUEST);
return;
}
if (qp.hasLatlonPoint) {
qp.stns = new ArrayList<String>();
qp.stns.add(soc.findClosestStation(qp.lat, qp.lon));
} else if (qp.fatal) {
qp.errs.append("ERROR: No valid stations specified\n");
qp.writeErr(res, qp.errs.toString(), HttpServletResponse.SC_BAD_REQUEST);
return;
}
boolean useAllStations = (!qp.hasBB && !qp.hasStns && !qp.hasLatlonPoint);
if (useAllStations)
qp.stns = new ArrayList<String>(); // empty list denotes all
if (qp.hasTimePoint && (soc.filterDataset(qp.time) == null)) {
qp.errs.append("ERROR: This dataset does not contain the time point= " +
qp.time + " \n");
qp.writeErr(res, qp.errs.toString(), HttpServletResponse.SC_BAD_REQUEST);
return;
}
if (qp.hasDateRange) {
DateRange dr = qp.getDateRange();
if (!soc.intersect(dr)) {
qp.errs.append("ERROR: This dataset does not contain the time range= "
+ dr + " \n");
qp.writeErr(res, qp.errs.toString(),
HttpServletResponse.SC_BAD_REQUEST);
return;
}
}
boolean useAllTimes = (!qp.hasTimePoint && !qp.hasDateRange);
if (useAllStations && useAllTimes) {
qp.errs.append("ERROR: You must subset by space or time\n");
qp.writeErr(res, qp.errs.toString(), HttpServletResponse.SC_BAD_REQUEST);
return;
}
// set content type
String contentType = qp.acceptType;
if (qp.acceptType.equals(QueryParams.CSV))
contentType = "text/plain"; // LOOK why
res.setContentType(contentType);
if (qp.acceptType.equals(QueryParams.NETCDF)) {
res.setHeader("Content-Disposition", "attachment; filename=metarSubset.nc");
File file = soc.writeNetcdf(qp);
ServletUtil.returnFile(this, req, res, file, QueryParams.NETCDF);
file.delete();
long took = System.currentTimeMillis() - start;
System.out.println("\ntotal response took = " + took + " msecs");
return;
}
soc.write(qp, res.getWriter());
long took = System.currentTimeMillis() - start;
System.out.println("\ntotal response took = " + took + " msecs");
}
private void showForm(HttpServletResponse res, boolean wantXml, boolean
wantStationXml) throws IOException {
String infoString;
if (wantXml) {
Document doc = soc.getDoc();
XMLOutputter fmt = new XMLOutputter(Format.getPrettyFormat());
infoString = fmt.outputString(doc);
} else if (wantStationXml) {
Document doc = soc.getStationDoc();
XMLOutputter fmt = new XMLOutputter(Format.getPrettyFormat());
infoString = fmt.outputString(doc);
} else {
InputStream xslt = getXSLT("ncssSobs.xsl");
Document doc = soc.getDoc();
try {
XSLTransformer transformer = new XSLTransformer(xslt);
Document html = transformer.transform(doc);
XMLOutputter fmt = new XMLOutputter(Format.getPrettyFormat());
infoString = fmt.outputString(html);
} catch (Exception e) {
log.error("SobsServlet internal error", e);
ServletUtil.logServerAccess(HttpServletResponse.SC_INTERNAL_SERVER_ERROR, 0);
res.sendError(HttpServletResponse.SC_INTERNAL_SERVER_ERROR,
"SobsServlet internal error");
return;
}
}
res.setContentLength(infoString.length());
if (wantXml || wantStationXml)
res.setContentType("text/xml; charset=iso-8859-1");
else
res.setContentType("text/html; charset=iso-8859-1");
OutputStream out = res.getOutputStream();
out.write(infoString.getBytes());
out.flush();
ServletUtil.logServerAccess(HttpServletResponse.SC_OK, infoString.length());
}
private InputStream getXSLT(String xslName) {
return getClass().getResourceAsStream("/resources/xsl/" + xslName);
}
}
/*
* Copyright 1997-2007 Unidata Program Center/University Corporation for
* Atmospheric Research, P.O. Box 3000, Boulder, CO 80307,
* address@hidden.
*
* This library is free software; you can redistribute it and/or modify it
* under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation; either version 2.1 of the License, or (at
* your option) any later version.
*
* This library is distributed in the hope that it will be useful, but
* WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser
* General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public License
* along with this library; if not, write to the Free Software Foundation,
* Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
*/
package thredds.server.ncSubset;
import ucar.ma2.StructureData;
import ucar.ma2.Array;
import ucar.nc2.dt.*;
import ucar.nc2.dt.point.WriterStationObsDataset;
import ucar.nc2.dt.point.StationObsDatasetInfo;
import ucar.nc2.VariableSimpleIF;
import ucar.nc2.NetcdfFile;
import ucar.nc2.units.DateFormatter;
import ucar.unidata.geoloc.LatLonRect;
import ucar.unidata.geoloc.LatLonPointImpl;
import ucar.unidata.util.Format;
import java.io.*;
import java.util.*;
import java.util.concurrent.locks.ReadWriteLock;
import java.util.concurrent.locks.ReentrantReadWriteLock;
import thredds.datatype.DateRange;
import thredds.datatype.DateType;
import thredds.catalog.XMLEntityResolver;
import org.jdom.Document;
import org.jdom.Element;
public class StationObsCollection {
static private org.slf4j.Logger log =
org.slf4j.LoggerFactory.getLogger(StationObsCollection.class);
static private org.slf4j.Logger cacheLogger =
org.slf4j.LoggerFactory.getLogger("cacheLogger");
private static boolean debug = true, debugDetail = false;
private static long timeToScan = 0;
private String archiveDir, realtimeDir;
private ArrayList<Dataset> datasetList;
private List<VariableSimpleIF> variableList;
private DateFormatter format = new DateFormatter();
private boolean isRealtime;
private Date start, end;
private Timer timer;
private ReadWriteLock lock = new ReentrantReadWriteLock();
public StationObsCollection(String archiveDir, String realtimeDir) {
this.archiveDir = archiveDir;
this.realtimeDir = realtimeDir;
this.isRealtime = (realtimeDir != null);
if (isRealtime) { // LOOK what if not realtime ??
timer = new Timer("StationObsCollection.Rescan");
Calendar c = Calendar.getInstance(); // contains current startup time
c.setTimeZone(TimeZone.getDefault()); // local time
c.add(Calendar.HOUR_OF_DAY, 24); // start tomorrow
c.set(Calendar.HOUR_OF_DAY, 1); // at 1 AM
c.set(Calendar.MINUTE, 0);
c.set(Calendar.SECOND, 0);
timer.schedule(new ReinitTask(), 1000 * 5); // do in 5 secs
timer.schedule(new ReinitTask(), c.getTime(), (long) 1000 * 60 * 60 *
24); // repeat once a day
cacheLogger.info("StationObsCollection timer set to run at " +
c.getTime());
}
}
public void close() {
closeDatasets();
if (timer != null)
timer.cancel();
}
private void closeDatasets() {
if (datasetList == null)
return;
for (Dataset ds : datasetList) {
try {
ds.sod.close();
} catch (IOException ioe) {
log.warn("Couldn't close " + ds.sod, ioe);
}
}
if (timer != null)
timer.cancel();
}
private class ReinitTask extends TimerTask {
public void run() {
cacheLogger.info("StationObsCollection.reinit at " +
format.toDateTimeString(new Date()));
init();
}
}
public String getName() {
return archiveDir + "/" + realtimeDir;
}
public ArrayList<Dataset> getDatasets() {
return datasetList;
}
////////////////////////////////////////////
// keep track of the available datasets LOOK should be configurable
private String dateFormatString = "yyyyMMdd_HHmm";
private FileFilter ff = new
org.apache.commons.io.filefilter.SuffixFileFilter(".nc");
private void init() {
CollectionManager archive = new CollectionManager( archiveDir, ff,
dateFormatString);
CollectionManager realtime = new CollectionManager( realtimeDir, ff,
dateFormatString);
makeArchiveFiles(archive, realtime);
try {
lock.writeLock().lock(); // wait till no readers
// close all open datasets
closeDatasets();
// create new list
ArrayList<Dataset> newList = new ArrayList<Dataset>();
initArchive(archive, newList);
initRealtime(archive, realtime, newList);
Collections.sort(newList);
// make this one the operational one
datasetList = newList;
int n = datasetList.size();
start = datasetList.get(0).time_start;
end = datasetList.get(n - 1).time_end;
} finally {
lock.writeLock().unlock();
}
datasetDesc = null; // mostly to create new time range
}
// make new archive files
private void makeArchiveFiles(CollectionManager archive, CollectionManager
realtime) {
// the set of files in realtime collection that come after the archive files
List<CollectionManager.MyFile> realtimeList = realtime.after(
archive.getLatest());
if (realtimeList.size() < 2)
return;
// leave latest one alone
int n = realtimeList.size();
Collections.sort(realtimeList);
realtimeList.remove(n - 1);
// for the rest, convert to a more efficient format
for (CollectionManager.MyFile myfile : realtimeList) {
String filename = myfile.file.getName();
String fileIn = realtimeDir + "/" + filename;
String fileOut = archiveDir + "/" + filename;
try {
long start = System.currentTimeMillis();
cacheLogger.info("StationObsCollection: write " + fileIn + " to archive " + fileOut);
WriterStationObsDataset.rewrite(fileIn, fileOut);
archive.add(new File(fileOut), myfile.date);
long took = System.currentTimeMillis() - start;
cacheLogger.info(" that took= " + (took/1000)+" secs");
} catch (IOException e) {
cacheLogger.error("StationObsCollection: write failed (" + fileIn + " to archive " + fileOut + ")", e);
}
}
}
// read archive files - these have been rewritten for efficiency, and don't change in realtime
private void initArchive(CollectionManager archive, ArrayList<Dataset>
newList) {
// LOOK can these change ?
stationList = null;
variableList = null;
Calendar c = Calendar.getInstance(); // contains current startup time
c.add(Calendar.HOUR, -8 * 24); // 7 or 8 days ago
Date firstDate = c.getTime();
cacheLogger.info("StationObsCollection: delete files before " + firstDate);
int count = 0;
long size = 0;
// each one becomes a Dataset and added to the list
ArrayList<CollectionManager.MyFile> list = new
ArrayList<CollectionManager.MyFile>(archive.getList());
for (CollectionManager.MyFile myfile : list) {
// delete old files
if (myfile.date.before(firstDate)) {
myfile.file.delete();
boolean ok = archive.remove(myfile);
cacheLogger.info("StationObsCollection: Deleted archive file " +
myfile+" ok= "+ok);
continue;
}
// otherwise, add an sobsDataset
try {
Dataset ds = new Dataset(myfile.file, false);
newList.add(ds);
count++;
size += myfile.file.length();
if (null == stationList)
stationList = new ArrayList<Station>(ds.sod.getStations());
if (null == variableList)
variableList = new
ArrayList<VariableSimpleIF>(ds.sod.getDataVariables());
} catch (IOException e) {
cacheLogger.error("Cant open " + myfile, e);
}
}
size = size / 1000 / 1000;
cacheLogger.info("Reading directory " + archive + " # files = " + count + " size= " + size + " Mb");
}
// read realtime files - these may still be changing, so they are opened with mayChange = true
private void initRealtime(CollectionManager archive, CollectionManager
realtime, ArrayList<Dataset> newList) {
long size = 0;
int count = 0;
List<CollectionManager.MyFile> realtimeList = realtime.after(
archive.getLatest());
if (realtimeList.size() == 0)
return;
// each one becomes a Dataset and added to the list
for (CollectionManager.MyFile myfile : realtimeList) {
try {
Dataset ds = new Dataset(myfile.file, true);
newList.add(ds);
count++;
size += myfile.file.length();
if (null == stationList)
stationList = new ArrayList<Station>(ds.sod.getStations());
if (null == variableList)
variableList = new
ArrayList<VariableSimpleIF>(ds.sod.getDataVariables());
} catch (IOException e) {
cacheLogger.error("Cant open " + myfile, e);
}
}
size = size / 1000 / 1000;
cacheLogger.info("Reading realtime directory " + realtime + " # files = " +
count + " total file sizes = " + size + " Mb");
}
class Dataset implements Comparable {
String filename, name;
StationObsDataset sod;
Date time_start;
Date time_end;
boolean mayChange;
Dataset(File file, boolean mayChange) throws IOException {
this.filename = file.getAbsolutePath();
this.name = file.getName();
this.mayChange = mayChange;
this.sod = get();
this.time_start = sod.getStartDate();
this.time_end = sod.getEndDate();
if (debug)
System.out.println(" add " + this);
}
StationObsDataset get() throws IOException {
if (sod == null) {
StringBuffer sbuff = new StringBuffer();
if (debug) System.out.println("StationObsDataset open " + filename);
sod = (StationObsDataset)
TypedDatasetFactory.open(thredds.catalog.DataType.STATION, filename, null,
sbuff);
if (null == sod) {
log.info("Cant open " + filename + "; " + sbuff);
return null;
}
} else if (mayChange) {
NetcdfFile ncfile = sod.getNetcdfFile();
ncfile.syncExtend();
}
return sod;
}
public int compareTo(Object o) {
Dataset od = (Dataset) o;
return time_start.compareTo(od.time_start);
}
public String toString() {
return "StationObsDataset " + filename + " start= " +
format.toDateTimeString(time_start) +
" end= " + format.toDateTimeString(time_end);
}
}
////////////////////////////////////////////////////
// Dataset Description
private String datasetName = "/thredds/ncss/metars";
private Document datasetDesc;
public Document getDoc() throws IOException {
if (datasetDesc != null)
return datasetDesc;
Dataset ds = datasetList.get(0);
if (debug) System.out.println("getDoc open " + ds.filename);
StationObsDataset sod = ds.get();
StationObsDatasetInfo info = new StationObsDatasetInfo(sod, null);
Document doc = info.makeStationObsDatasetDocument();
Element root = doc.getRootElement();
// fix the location
root.setAttribute("location", datasetName); // LOOK
// fix the time range
Element timeSpan = root.getChild("TimeSpan");
timeSpan.removeContent();
DateFormatter format = new DateFormatter();
timeSpan.addContent(new
Element("begin").addContent(format.toDateTimeStringISO(start)));
timeSpan.addContent(new Element("end").addContent(isRealtime ? "present" :
format.toDateTimeStringISO(end)));
// add pointer to the station list XML
Element stnList = new Element("stationList");
stnList.setAttribute("title", "Available Stations",
XMLEntityResolver.xlinkNS);
stnList.setAttribute("href", "/thredds/ncss/metars/stations.xml",
XMLEntityResolver.xlinkNS); // LOOK kludge
root.addContent(stnList);
datasetDesc = doc;
return doc;
}
public Document getStationDoc() throws IOException {
StationObsDataset sod = datasetList.get(0).get();
StationObsDatasetInfo info = new StationObsDatasetInfo(sod, null);
return info.makeStationCollectionDocument();
}
///////////////////////////////////////
// station handling
private List<Station> stationList;
private HashMap<String, Station> stationMap;
/**
* Determine if any of the given station names are actually in the dataset.
*
* @param stns List of station names
* @return true if list is empty, ie no names are in the actual station list
* @throws IOException if read error
*/
public boolean isStationListEmpty(List<String> stns) throws IOException {
HashMap<String, Station> map = getStationMap();
for (String stn : stns) {
if (map.get(stn) != null) return false;
}
return true;
}
public boolean intersect(DateRange dr) throws IOException {
return dr.intersect(start, end);
}
private List<Station> getStationList() throws IOException {
return stationList;
}
private HashMap<String, Station> getStationMap() throws IOException {
if (null == stationMap) {
stationMap = new HashMap<String, Station>();
List<Station> list = getStationList();
for (Station station : list) {
stationMap.put(station.getName(), station);
}
}
return stationMap;
}
/**
* Get the list of station names that are contained within the bounding box.
*
* @param boundingBox lat/lon bounding box
* @return list of station names contained within the bounding box
* @throws IOException if read error
*/
public List<String> getStationNames(LatLonRect boundingBox) throws
IOException {
LatLonPointImpl latlonPt = new LatLonPointImpl();
ArrayList<String> result = new ArrayList<String>();
List<Station> stations = getStationList();
for (Station s : stations) {
latlonPt.set(s.getLatitude(), s.getLongitude());
if (boundingBox.contains(latlonPt)) {
result.add(s.getName());
// boundingBox.contains(latlonPt); debugging
}
}
return result;
}
/**
* Find the station closest to the specified point.
* The metric is (lat-lat0)**2 + (cos(lat0)*(lon-lon0))**2
*
* @param lat latitude value
* @param lon longitude value
* @return name of station closest to the specified point
* @throws IOException if read error
*/
public String findClosestStation(double lat, double lon) throws IOException {
double cos = Math.cos(Math.toRadians(lat));
List<Station> stations = getStationList();
Station min_station = stations.get(0);
double min_dist = Double.MAX_VALUE;
for (Station s : stations) {
double lat1 = s.getLatitude();
double lon1 = LatLonPointImpl.lonNormal(s.getLongitude(), lon);
double dy = Math.toRadians(lat - lat1);
double dx = cos * Math.toRadians(lon - lon1);
double dist = dy * dy + dx * dx;
if (dist < min_dist) {
min_dist = dist;
min_station = s;
}
}
return min_station.getName();
}
////////////////////////////////////////////////////////
// scanning
// scan all data in the file; records that pass the dateRange and predicate match are acted on
private void scanAll(Dataset ds, DateRange range, Predicate p, Action a,
Limit limit) throws IOException {
StringBuffer sbuff = new StringBuffer();
StationObsDataset sod = ds.get();
if (debug) System.out.println("scanAll open " + ds.filename);
if (null == sod) {
log.info("Cant open " + ds.filename + "; " + sbuff);
return;
}
DataIterator iter = sod.getDataIterator(0);
while (iter.hasNext()) {
StationObsDatatype sobs = (StationObsDatatype) iter.nextData();
// date filter
if (null != range) {
Date obs = sobs.getObservationTimeAsDate();
if (!range.included(obs))
continue;
}
StructureData sdata = sobs.getData();
if ((p == null) || p.match(sdata)) {
a.act(sod, sobs, sdata);
limit.matches++;
}
limit.count++;
if (limit.count > limit.limit) break;
}
}
// scan data for the list of stations, in order
// records that pass the dateRange and predicate match are acted on
private void scanStations(Dataset ds, List<String> stns, DateRange range,
Predicate p, Action a, Limit limit) throws IOException {
StringBuffer sbuff = new StringBuffer();
StationObsDataset sod = ds.get();
if (debug) System.out.println("scanStations open " + ds.filename);
if (null == sod) {
log.info("Cant open " + ds.filename + "; " + sbuff);
return;
}
for (String stn : stns) {
Station s = sod.getStation(stn);
if (s == null) {
log.warn("Can't find station " + stn);
continue;
}
if (debugDetail) System.out.println("stn " + s.getName());
DataIterator iter = sod.getDataIterator(s);
while (iter.hasNext()) {
StationObsDatatype sobs = (StationObsDatatype) iter.nextData();
// date filter
if (null != range) {
Date obs = sobs.getObservationTimeAsDate();
if (!range.included(obs))
continue;
}
// general predicate filter
StructureData sdata = sobs.getData();
if ((p == null) || p.match(sdata)) {
a.act(sod, sobs, sdata);
limit.matches++;
}
limit.count++;
if (limit.count > limit.limit) break;
}
}
}
// scan all data in the file, first eliminate any that don't pass the predicate
// for each station, track the closest record to the given time
// then act on those
private void scanAll(Dataset ds, DateType time, Predicate p, Action a, Limit
limit) throws IOException {
StringBuffer sbuff = new StringBuffer();
HashMap<Station, StationDataTracker> map = new HashMap<Station,
StationDataTracker>();
long wantTime = time.getDate().getTime();
StationObsDataset sod = ds.get();
if (debug) System.out.println("scanAll open " + ds.filename);
if (null == sod) {
log.info("Cant open " + ds.filename + "; " + sbuff);
return;
}
DataIterator iter = sod.getDataIterator(0);
while (iter.hasNext()) {
StationObsDatatype sobs = (StationObsDatatype) iter.nextData();
// general predicate filter
if (p != null) {
StructureData sdata = sobs.getData();
if (!p.match(sdata))
continue;
}
// find closest time for this station
long obsTime = sobs.getObservationTimeAsDate().getTime();
long diff = Math.abs(obsTime - wantTime);
Station s = sobs.getStation();
StationDataTracker track = map.get(s);
if (track == null) {
map.put(s, new StationDataTracker(sobs, diff));
} else {
if (diff < track.timeDiff) {
track.sobs = sobs;
track.timeDiff = diff;
}
}
}
for (Station s : map.keySet()) {
StationDataTracker track = map.get(s);
a.act(sod, track.sobs, track.sobs.getData());
limit.matches++;
limit.count++;
if (limit.count > limit.limit) break;
}
}
private class StationDataTracker {
StationObsDatatype sobs;
long timeDiff = Long.MAX_VALUE;
StationDataTracker(StationObsDatatype sobs, long timeDiff) {
this.sobs = sobs;
this.timeDiff = timeDiff;
}
}
// scan data for the list of stations, in order
// eliminate records that don't pass the predicate
// for each station, track the closest record to the given time, then act on those
private void scanStations(Dataset ds, List<String> stns, DateType time,
Predicate p, Action a, Limit limit) throws IOException {
StringBuffer sbuff = new StringBuffer();
StationObsDataset sod = ds.get();
if (null == sod) {
log.info("Cant open " + ds.filename + "; " + sbuff);
return;
}
long wantTime = time.getDate().getTime();
for (String stn : stns) {
Station s = sod.getStation(stn);
if (s == null) {
log.warn("Can't find station " + stn);
continue;
}
StationObsDatatype sobsBest = null;
long timeDiff = Long.MAX_VALUE;
// loop through all data for this station, take the obs with time closest
DataIterator iter = sod.getDataIterator(s);
while (iter.hasNext()) {
StationObsDatatype sobs = (StationObsDatatype) iter.nextData();
// general predicate filter
if (p != null) {
StructureData sdata = sobs.getData();
if (!p.match(sdata))
continue;
}
long obsTime = sobs.getObservationTimeAsDate().getTime();
long diff = Math.abs(obsTime - wantTime);
if (diff < timeDiff) {
sobsBest = sobs;
timeDiff = diff;
}
}
if (sobsBest != null) {
a.act(sod, sobsBest, sobsBest.getData());
limit.matches++;
}
limit.count++;
if (limit.count > limit.limit) break;
}
}
private interface Predicate {
boolean match(StructureData sdata);
}
private interface Action {
void act(StationObsDataset sod, StationObsDatatype sobs, StructureData
sdata) throws IOException;
}
private class Limit {
int count;
int limit = Integer.MAX_VALUE;
int matches;
}
////////////////////////////////////////////////////////////////
// date filter
private List<Dataset> filterDataset(DateRange range) {
if (range == null)
return datasetList;
List<Dataset> result = new ArrayList<Dataset>();
for (Dataset ds : datasetList) {
if (range.intersect(ds.time_start, ds.time_end))
result.add(ds);
}
return result;
}
Dataset filterDataset(DateType want) {
if (want.isPresent())
return datasetList.get(datasetList.size() - 1);
Date time = want.getDate();
for (Dataset ds : datasetList) {
if (time.before(ds.time_end) && time.after(ds.time_start)) {
return ds;
}
if (time.equals(ds.time_end) || time.equals(ds.time_start)) {
return ds;
}
}
return null;
}
////////////////////////////////////////////////////////////////
// writing
//private File netcdfResult = new File("C:/temp/sobs.nc");
public File writeNetcdf(QueryParams qp) throws IOException {
WriterNetcdf w = (WriterNetcdf) write(qp, null);
return w.netcdfResult;
}
public Writer write(QueryParams qp, java.io.PrintWriter pw) throws IOException {
long start = System.currentTimeMillis();
Limit counter = new Limit();
List<String> vars = qp.vars;
List<String> stns = qp.stns;
DateRange range = qp.getDateRange();
DateType time = qp.time;
String type = qp.acceptType;
Writer w;
if (type.equals(QueryParams.RAW)) {
w = new WriterRaw(qp, vars, pw);
} else if (type.equals(QueryParams.XML)) {
w = new WriterXML(qp, vars, pw);
} else if (type.equals(QueryParams.CSV)) {
w = new WriterCSV(qp, vars, pw);
} else if (type.equals(QueryParams.NETCDF)) {
w = new WriterNetcdf(qp, vars, pw);
} else {
log.error("Unknown writer type = " + type);
return null;
}
Collections.sort(stns);
w.header(stns);
boolean useAll = stns.size() == 0;
Action act = w.getAction();
try {
lock.readLock().lock(); // wait till no writer
if (null == time) {
// use range, null means all
List<Dataset> need = filterDataset(range);
for (Dataset ds : need) {
if (useAll)
scanAll(ds, range, null, act, counter);
else
scanStations(ds, stns, range, null, act, counter);
}
} else {
// match specific time point
Dataset ds = filterDataset(time);
if (useAll)
scanAll(ds, time, null, act, counter);
else
scanStations(ds, stns, time, null, act, counter);
}
} finally {
lock.readLock().unlock();
}
w.trailer();
if (pw != null) pw.flush();
if (debug) {
long took = System.currentTimeMillis() - start;
System.out.println("\nread " + counter.count + " records; match and write " + counter.matches + " raw records");
System.out.println("that took = " + took + " msecs");
if (timeToScan > 0) {
long writeTime = took - timeToScan;
// use floating-point math and guard against a zero elapsed time
double mps = (writeTime > 0) ? 1000.0 * counter.matches / writeTime : 0.0;
System.out.println(" writeTime = " + writeTime + " msecs; write messages/sec = " + mps);
}
}
return w;
}
abstract class Writer {
abstract void header(List<String> stns);
abstract Action getAction();
abstract void trailer();
QueryParams qp;
List<String> varNames;
java.io.PrintWriter writer;
DateFormatter format = new DateFormatter();
int count = 0;
Writer(QueryParams qp, List<String> varNames, final java.io.PrintWriter writer) {
this.qp = qp;
this.varNames = varNames;
this.writer = writer;
}
List<VariableSimpleIF> getVars(List<String> varNames, List<VariableSimpleIF> dataVariables) {
List<VariableSimpleIF> result = new ArrayList<VariableSimpleIF>();
for (VariableSimpleIF v : dataVariables) {
if ((varNames == null) || varNames.contains(v.getName()))
result.add(v);
}
return result;
}
}
class WriterNetcdf extends Writer {
File netcdfResult;
WriterStationObsDataset sobsWriter;
List<Station> stnList;
List<VariableSimpleIF> varList;
WriterNetcdf(QueryParams qp, List<String> varNames, final java.io.PrintWriter writer) throws IOException {
super(qp, varNames, writer);
netcdfResult = File.createTempFile("ncss", ".nc");
sobsWriter = new WriterStationObsDataset(netcdfResult.getAbsolutePath(), "Extracted data from Unidata/TDS Metar dataset");
if ((varNames == null) || (varNames.size() == 0)) {
varList = variableList;
} else {
varList = new ArrayList<VariableSimpleIF>(varNames.size());
for (VariableSimpleIF v : variableList) {
if (varNames.contains(v.getName()))
varList.add(v);
}
}
}
public void header(List<String> stns) {
try {
getStationMap();
if (stns.size() == 0)
stnList = stationList;
else {
stnList = new ArrayList<Station>(stns.size());
for (String s : stns) {
stnList.add(stationMap.get(s));
}
}
sobsWriter.writeHeader(stnList, varList);
} catch (IOException e) {
log.error("WriterNetcdf.header", e);
}
}
public void trailer() {
try {
sobsWriter.finish();
} catch (IOException e) {
log.error("WriterNetcdf.trailer", e);
}
}
Action getAction() {
return new Action() {
public void act(StationObsDataset sod, StationObsDatatype sobs, StructureData sdata) throws IOException {
sobsWriter.writeRecord(sobs, sdata);
count++;
}
};
}
}
class WriterRaw extends Writer {
WriterRaw(QueryParams qp, List<String> vars, final java.io.PrintWriter writer) {
super(qp, vars, writer);
}
public void header(List<String> stns) {
}
public void trailer() {
}
Action getAction() {
return new Action() {
public void act(StationObsDataset sod, StationObsDatatype sobs, StructureData sdata) throws IOException {
String report = sdata.getScalarString("report");
writer.println(report);
count++;
}
};
}
}
class WriterXML extends Writer {
WriterXML(QueryParams qp, List<String> vars, final java.io.PrintWriter writer) {
super(qp, vars, writer);
}
public void header(List<String> stns) {
writer.println("<?xml version='1.0' encoding='UTF-8'?>");
writer.println("<metarCollection dataset='"+datasetName+"'>\n");
}
public void trailer() {
writer.println("</metarCollection>");
}
Action getAction() {
return new Action() {
public void act(StationObsDataset sod, StationObsDatatype sobs, StructureData sdata) throws IOException {
Station s = sobs.getStation();
writer.print(" <metar date='");
writer.print(format.toDateTimeStringISO(sobs.getObservationTimeAsDate()));
writer.println("'>");
writer.print(" <station name='" + s.getName() +
"' latitude='" + Format.dfrac(s.getLatitude(), 3) +
"' longitude='" + Format.dfrac(s.getLongitude(), 3));
if (!Double.isNaN(s.getAltitude()))
writer.print("' altitude='" + Format.dfrac(s.getAltitude(), 0));
if (s.getDescription() != null) {
writer.println("'>");
writer.print(s.getDescription());
writer.println("</station>");
} else {
writer.println("'/>");
}
List<VariableSimpleIF> vars = getVars(varNames, sod.getDataVariables());
for (VariableSimpleIF var : vars) {
writer.print(" <data name='" + var.getName());
if (var.getUnitsString() != null)
writer.print("' units='" + var.getUnitsString());
writer.print("'>");
Array sdataArray = sdata.getArray(var.getName());
writer.println(sdataArray.toString() + "</data>");
}
writer.println(" </metar>");
count++;
}
};
}
}
class WriterCSV extends Writer {
boolean headerWritten = false;
List<VariableSimpleIF> validVars;
WriterCSV(QueryParams qp, List<String> vars, final java.io.PrintWriter writer) {
super(qp, vars, writer);
}
public void header(List<String> stns) {
}
public void trailer() {
}
Action getAction() {
return new Action() {
public void act(StationObsDataset sod, StationObsDatatype sobs, StructureData sdata) throws IOException {
if (!headerWritten) {
writer.print("time,station,latitude[unit=\"degrees_north\"],longitude[unit=\"degrees_east\"]");
validVars = getVars(varNames, sod.getDataVariables());
for (VariableSimpleIF var : validVars) {
writer.print(",");
writer.print(var.getName());
if (var.getUnitsString()!=null)
writer.print("[unit=\""+var.getUnitsString()+"\"]");
}
writer.println();
headerWritten = true;
}
Station s = sobs.getStation();
writer.print(format.toDateTimeStringISO(sobs.getObservationTimeAsDate()));
writer.print(',');
writer.print(s.getName());
writer.print(',');
writer.print(Format.dfrac(s.getLatitude(), 3));
writer.print(',');
writer.print(Format.dfrac(s.getLongitude(), 3));
for (VariableSimpleIF var : validVars) {
writer.print(',');
Array sdataArray = sdata.getArray(var.getName());
writer.print(sdataArray.toString());
}
writer.println();
count++;
}
};
}
}
public static void main(String[] args) throws IOException {
//getFiles("R:/testdata/station/ldm/metar/");
//StationObsCollection soc = new StationObsCollection("C:/data/metars/", false);
}
}
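Since the plan is to swap the netCDF-backed DataIterator for reads from your mySQL database, one piece you will need is the per-request query that the scan drives. Below is a hypothetical sketch of that step: the table and column names (obs, stn_id, obs_time) are placeholders for whatever your schema actually uses, and station names and times should be bound through a PreparedStatement rather than concatenated into the SQL.

```java
import java.util.Arrays;
import java.util.List;

// Hypothetical query builder for a mySQL-backed subset service. Emits '?'
// placeholders so the caller can bind values safely with PreparedStatement.
public class MesonetQueryBuilder {
    static String buildQuery(List<String> stns) {
        // time bounds are always bound as the first two parameters
        StringBuilder sb = new StringBuilder("SELECT * FROM obs WHERE obs_time BETWEEN ? AND ?");
        if (!stns.isEmpty()) {
            sb.append(" AND stn_id IN (");
            for (int i = 0; i < stns.size(); i++) {
                if (i > 0) sb.append(',');
                sb.append('?'); // one placeholder per requested station
            }
            sb.append(')');
        }
        // ordering by station then time matches the scan order the writers expect
        sb.append(" ORDER BY stn_id, obs_time");
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(buildQuery(Arrays.asList("KSLC", "KPVU")));
    }
}
```

An empty station list means "all stations", mirroring the convention in QueryParams.stns.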
/*
* Copyright 1997-2007 Unidata Program Center/University Corporation for
* Atmospheric Research, P.O. Box 3000, Boulder, CO 80307,
* address@hidden.
*
* This library is free software; you can redistribute it and/or modify it
* under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation; either version 2.1 of the License, or (at
* your option) any later version.
*
* This library is distributed in the hope that it will be useful, but
* WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser
* General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public License
* along with this library; if not, write to the Free Software Foundation,
* Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
*/
package thredds.server.ncSubset;
import ucar.unidata.geoloc.LatLonPointImpl;
import ucar.unidata.geoloc.LatLonRect;
import ucar.unidata.geoloc.LatLonPoint;
import ucar.unidata.util.StringUtil;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import java.util.List;
import java.util.ArrayList;
import java.util.StringTokenizer;
import java.io.IOException;
import java.io.PrintWriter;
import thredds.servlet.ServletUtil;
import thredds.datatype.DateRange;
import thredds.datatype.TimeDuration;
import thredds.datatype.DateType;
/**
* Query parameter parsing for Netcdf Subset Service
*
* @author caron
*/
class QueryParams {
static final String RAW = "text/plain";
static final String XML = "application/xml";
static final String CSV = "text/csv";
static final String NETCDF = "application/x-netcdf";
// the first in the list is the canonical name, the others are aliases
static String[][] validAccept = new String[][]{
{XML, "text/xml", "xml"},
{RAW, "raw", "ascii"},
{CSV, "csv"},
{"text/html", "html"},
{NETCDF, "netcdf"},
};
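The validAccept table above maps each row's aliases onto the canonical MIME type in column 0; findValid (further down) does the lookup. A self-contained sketch of that resolution, with the table inlined so it can run standalone:

```java
// Sketch of alias-to-canonical accept-type resolution, mirroring findValid:
// each row's first entry is the canonical MIME type, the rest are aliases.
public class AcceptResolver {
    static final String[][] VALID = {
        {"application/xml", "text/xml", "xml"},
        {"text/plain", "raw", "ascii"},
        {"text/csv", "csv"},
        {"application/x-netcdf", "netcdf"},
    };

    // Return the canonical name for userVal, or null if it matches no row.
    static String canonical(String userVal) {
        for (String[] row : VALID) {
            for (String alias : row) {
                if (userVal.equalsIgnoreCase(alias)) return row[0];
            }
        }
        return null;
    }

    public static void main(String[] args) {
        System.out.println(canonical("csv")); // text/csv
        System.out.println(canonical("XML")); // application/xml (case-insensitive)
    }
}
```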
public String queryString;
public List<String> accept;
public String acceptType; // choose one of the accept
public boolean wantAllVariables;
public List<String> vars;
// spatial subsetting
public boolean hasBB = false, hasStns = false, hasLatlonPoint = false; // only one is true
public double north, south, east, west;
public double lat, lon;
public List<String> stns; // for stationObs, empty list means all
public int horizStride, vertStride, timeStride; // 0 = none
public boolean hasVerticalCoord = false;
public double vertCoord;
// temporal subsetting
public boolean hasDateRange = false, hasTimePoint = false; // only one is true
public DateType time_start, time_end, time;
public TimeDuration time_duration;
public int time_latest;
// track errors
public StringBuffer errs = new StringBuffer();
public boolean fatal;
public String toString() {
StringBuffer sbuff = new StringBuffer();
sbuff.append("queryString= " + queryString + "\n\n");
sbuff.append("parsed=\n ");
if (hasBB)
sbuff.append("bb=" + getBB().toString2() + ";");
else if (hasLatlonPoint)
sbuff.append("lat/lon=" + getPoint() + ";");
else if (hasStns) {
boolean first = true;
sbuff.append("stns=");
for (String stnName : stns) {
if (!first) sbuff.append(",");
sbuff.append(stnName);
first = false;
}
sbuff.append(";");
} else {
sbuff.append("spatial=all;");
}
sbuff.append("\n ");
if (hasTimePoint)
sbuff.append("time=" + time + ";");
else if (hasDateRange) {
sbuff.append("timeRange=" + getDateRange() + ";");
} else {
sbuff.append("temporal=all;");
}
sbuff.append("\n ");
if (wantAllVariables)
sbuff.append("vars=all;");
else {
boolean first = true;
sbuff.append("vars=");
for (String varName : vars) {
if (!first) sbuff.append(",");
sbuff.append(varName);
first = false;
}
sbuff.append(";");
}
sbuff.append("\n ");
return sbuff.toString();
}
/**
* Parse request
*
* @param req HTTP request
* @param res HTTP response
* @param acceptOK array of acceptable accept types, in order. First one is default
* @return true if params are ok
* @throws java.io.IOException if I/O error
*/
public boolean parseQuery(HttpServletRequest req, HttpServletResponse res, String[] acceptOK) throws IOException {
queryString = req.getQueryString();
accept = parseList(req, "accept", QueryParams.validAccept, acceptOK[0]);
for (String ok : acceptOK) {
if (accept.contains(ok)) {
acceptType = ok;
}
}
if (acceptType == null) {
fatal = true;
errs.append("Accept parameter not supported = " + accept + "\n");
}
// list of variable names
String variables = ServletUtil.getParameterIgnoreCase(req, "variables");
wantAllVariables = (variables != null) && (variables.equals("all"));
if (!wantAllVariables) {
vars = parseList(req, "var");
if (vars.isEmpty()) {
vars = null;
wantAllVariables = true;
}
}
// spatial subsetting
String spatial = ServletUtil.getParameterIgnoreCase(req, "spatial");
boolean spatialNotSpecified = (spatial == null);
// bounding box
if (spatialNotSpecified || spatial.equalsIgnoreCase("bb")) {
north = parseLat(req, "north");
south = parseLat(req, "south");
east = parseDouble(req, "east");
west = parseDouble(req, "west");
hasBB = hasValidBB();
}
// stations
if (!hasBB && (spatialNotSpecified || spatial.equalsIgnoreCase("stns"))) {
stns = parseList(req, "stn");
hasStns = stns.size() > 0;
}
// lat/lon point
if (!hasBB && !hasStns && (spatialNotSpecified ||
spatial.equalsIgnoreCase("point"))) {
lat = parseLat(req, "latitude");
lon = parseLon(req, "longitude");
hasLatlonPoint = hasValidPoint();
}
// strides
horizStride = parseInt(req, "horizStride");
vertStride = parseInt(req, "vertStride");
timeStride = parseInt(req, "timeStride");
// time range
String temporal = ServletUtil.getParameterIgnoreCase(req, "temporal");
boolean timeNotSpecified = (temporal == null);
// time range
if (timeNotSpecified || temporal.equalsIgnoreCase("range")) {
time_start = parseDate(req, "time_start");
time_end = parseDate(req, "time_end");
time_duration = parseW3CDuration(req, "time_duration");
hasDateRange = hasValidDateRange();
}
// time point
if (timeNotSpecified || temporal.equalsIgnoreCase("point")) {
time = parseDate(req, "time");
hasTimePoint = (time != null);
}
// vertical coordinate
vertCoord = parseDouble(req, "vertCoord");
hasVerticalCoord = !Double.isNaN(vertCoord);
if (fatal) {
writeErr(res, errs.toString(), HttpServletResponse.SC_BAD_REQUEST);
return false;
}
return true;
}
public DateType parseDate(HttpServletRequest req, String key) {
String s = ServletUtil.getParameterIgnoreCase(req, key);
if (s != null) {
try {
return new DateType(s, null, null);
} catch (java.text.ParseException e) {
errs.append("Illegal param= '" + key + "=" + s + "' must be valid ISO Date\n");
}
}
return null;
}
public TimeDuration parseW3CDuration(HttpServletRequest req, String key) {
String s = ServletUtil.getParameterIgnoreCase(req, key);
if (s != null) {
try {
return TimeDuration.parseW3CDuration(s);
} catch (java.text.ParseException e) {
errs.append("Illegal param= '" + key + "=" + s + "' must be valid ISO Duration\n");
}
}
return null;
}
public double parseDouble(HttpServletRequest req, String key) {
String s = ServletUtil.getParameterIgnoreCase(req, key);
if ((s != null) && (s.trim().length() > 0)) {
try {
return Double.parseDouble(s);
} catch (NumberFormatException e) {
errs.append("Illegal param= '" + key + "=" + s + "' must be valid floating point number\n");
}
}
return Double.NaN;
}
public int parseInt(HttpServletRequest req, String key) {
String s = ServletUtil.getParameterIgnoreCase(req, key);
if (s != null) {
try {
return Integer.parseInt(s);
} catch (NumberFormatException e) {
errs.append("Illegal param= '" + key + "=" + s + "' must be valid integer number\n");
}
}
return 0;
}
public double parseLat(HttpServletRequest req, String key) {
double lat = parseDouble(req, key);
if (!Double.isNaN(lat)) {
if ((lat > 90.0) || (lat < -90.0)) {
errs.append("Illegal param= '" + key + "=" + lat + "' must be between +/- 90.0\n");
lat = Double.NaN;
}
}
return lat;
}
public double parseLon(HttpServletRequest req, String key) {
double lon = parseDouble(req, key);
if (!Double.isNaN(lon)) {
lon = LatLonPointImpl.lonNormal(lon);
}
return lon;
}
/**
* parse KVP for key=value or key=value,value,...
*
* @param req HTTP request
* @param key key to look for
* @return list of values, may be empty
*/
public List<String> parseList(HttpServletRequest req, String key) {
ArrayList<String> result = new ArrayList<String>();
// may have multiple key=value
String[] vals = ServletUtil.getParameterValuesIgnoreCase(req, key);
if (vals != null) {
for (String userVal : vals) {
if (userVal.contains(",")) { // comma separated values
StringTokenizer stoke = new StringTokenizer(userVal, ",");
while (stoke.hasMoreTokens()) {
String token = stoke.nextToken();
result.add(token);
}
} else { // single value
result.add(userVal);
}
}
}
return result;
}
/**
* Used for accept
* parse KVP for key=value or key=value,value,...
*
* @param req HTTP request
* @param key key to look for
* @param valids list of valid keywords
* @param defValue default value
* @return list of values, use default if not otherwise specified
*/
public List<String> parseList(HttpServletRequest req, String key, String[][] valids, String defValue) {
ArrayList<String> result = new ArrayList<String>();
// may have multiple key=value
String[] vals = ServletUtil.getParameterValuesIgnoreCase(req, key);
if (vals != null) {
for (String userVal : vals) {
if (userVal.contains(",")) { // comma separated values
StringTokenizer stoke = new StringTokenizer(userVal, ",");
while (stoke.hasMoreTokens()) {
String token = stoke.nextToken();
if (!findValid(token, valids, result))
errs.append("Illegal param '" + key + "=" + token + "'\n");
}
} else { // single value
if (!findValid(userVal, valids, result))
errs.append("Illegal param= '" + key + "=" + userVal + "'\n");
}
}
}
if ((result.size() == 0) && (defValue != null)) {
result.add(defValue);
}
return result;
}
// look for userVal in list of valids; add to result if found
// return true if found
private boolean findValid(String userVal, String[][] valids, ArrayList<String> result) {
for (String[] list : valids) {
String canon = list[0];
for (String valid : list) {
if (userVal.equalsIgnoreCase(valid)) {
result.add(canon);
return true;
}
}
}
return false;
}
/**
* Determine if a valid lat/lon bounding box was specified
*
* @return true if there is a valid BB, false if not. If an invalid BB, set fatal=true, with error message in errs.
*/
boolean hasValidBB() {
// no bb
if (Double.isNaN(north) && Double.isNaN(south) && Double.isNaN(east) && Double.isNaN(west))
return false;
// malformed bb
if (Double.isNaN(north) || Double.isNaN(south) || Double.isNaN(east) || Double.isNaN(west)) {
errs.append("Bounding Box must have all 4 parameters: north,south,east,west\n");
fatal = true;
return false;
}
if (north < south) {
errs.append("Bounding Box must have north > south\n");
fatal = true;
return false;
}
if (east < west) {
errs.append("Bounding Box must have east > west; if crossing the 180 meridian, use east boundary > 180\n");
fatal = true;
return false;
}
return true;
}
LatLonRect getBB() {
return new LatLonRect(new LatLonPointImpl(south, west), new LatLonPointImpl(north, east));
}
LatLonPoint getPoint() {
return new LatLonPointImpl(lat, lon);
}
/**
* Determine if a valid lat/lon point was specified
*
* @return true if there is a valid point, false if not. If an invalid point, set fatal=true, with error message in errs.
*/
boolean hasValidPoint() {
// no point
if (Double.isNaN(lat) && Double.isNaN(lon))
return false;
// malformed point
if (Double.isNaN(lat) || Double.isNaN(lon)) {
errs.append("Missing lat or lon parameter\n");
fatal = true;
return false;
}
return true;
}
/**
* Determine if a valid date range was specified
*
* @return true if there is a valid date range, false if not. If an invalid date range, append error message in errs.
*/
boolean hasValidDateRange() {
// no range
if ((null == time_start) && (null == time_end) && (null == time_duration))
return false;
if ((null != time_start) && (null != time_end))
return true;
if ((null != time_start) && (null != time_duration))
return true;
if ((null != time_end) && (null != time_duration))
return true;
// malformed range
errs.append("Must have 2 of 3 parameters: time_start, time_end, time_duration\n");
return false;
}
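The "2 of 3" rule in hasValidDateRange works because any two of {time_start, time_end, time_duration} determine the third. An illustrative sketch of that arithmetic using java.time (purely for demonstration; the servlet itself uses thredds.datatype.DateType and TimeDuration):

```java
import java.time.Duration;
import java.time.Instant;

// Sketch of the "2 of 3" date-range rule: given any two of
// {start, end, duration}, derive the missing one.
public class DateRangeSketch {
    static Instant endFrom(Instant start, Duration dur) { return start.plus(dur); }
    static Instant startFrom(Instant end, Duration dur) { return end.minus(dur); }
    static Duration durationFrom(Instant start, Instant end) { return Duration.between(start, end); }

    public static void main(String[] args) {
        Instant start = Instant.parse("2007-06-18T00:00:00Z");
        Duration dur = Duration.parse("PT6H"); // ISO-8601 duration, like time_duration
        System.out.println(endFrom(start, dur)); // 2007-06-18T06:00:00Z
    }
}
```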
DateRange getDateRange() {
return hasDateRange ? new DateRange(time_start, time_end, time_duration, null) : null;
}
void writeErr(HttpServletResponse res, String s, int code) throws IOException {
ServletUtil.logServerAccess(code, 0);
res.setStatus(code);
if (s.length() > 0) {
PrintWriter pw = res.getWriter();
pw.print(s);
pw.close();
}
}
}