Great, you're getting close to having a lot of data working! It looks like if you write a script to change the format of the lightning data, it can be processed by EDEX and show up in CAVE. Here's the current parser:

https://github.com/Unidata/awips2/blob/unidata_18.2.1/edexOsgi/com.raytheon.edex.plugin.textlightning/src/com/raytheon/edex/plugin/textlightning/impl/TextLightningParser.java

I was able to change your format to match the first pattern (ignoring the 8, converting the epoch to a date/time string, using the lat/lon, ignoring the 4, which may be the count and could be moved to the end, and using the amplitude of -2.2):

09/22/2021 22:03:51 37.60 -79.48 -2.2 1

and I was able to ingest the file. It does get stored as "UNKN", so you will have to load it from the Product Browser or create a new menu item. Is this a sufficient workaround? A rough conversion sketch is included after the quoted message below.

Thanks,
Tiffany Meyer
AWIPS Lead Software Engineer
UCAR-Unidata

If you're interested, please feel free to fill out a survey about the support you receive:
https://docs.google.com/forms/d/e/1FAIpQLSeDIkdk8qUMgq8ZdM4jhP-ubJPUOr-mJMQgxInwoAWoV5QcOw/viewform

> Greetings. We are successfully ingesting/processing radar, satellite, metars,
> watches/warnings, and WPC surface analysis/fronts (all "near real-time" data)
> and displaying it in CAVE (Windows version). Our sys admins are still looking
> at installing the extensions for glxinfo you had recommended.
>
> In the interim, we have a data feed for cloud-to-ground lightning data in the
> format shown below and were wondering if you had any recommendations or advice
> on how to package this data and submit it via pqinsert so it will be
> renderable in CAVE. The format of the data is simple (comma delimited, with
> the 2nd column being epoch time, the 3rd and 4th columns being lat/lon
> respectively, and the 6th column being kA). We realize this is non-standard
> but were wondering if there is a way to pump this through the LDM so CAVE
> will recognize it and render it as real-time data.
>
> Pete
>
> 8,1632348231376,37.6018,-79.4807,4,-2.2
> 8,1632348232301,40.8563,-79.7869,4,5.1
> 8,1632348232583,40.8735,-79.8227,4,-4.0
> 8,1632348236274,35.6116,-80.5635,4,2.5
> 8,1632348236932,35.5102,-79.5936,4,-2.4
> 8,1632348240656,35.6542,-80.4707,4,5.2
> 8,1632348240814,41.5724,-80.4570,4,-4.9
> 8,1632348240829,41.5721,-80.4930,1,9.8
> 8,1632348240833,41.5937,-80.4700,4,5.0
> 8,1632348241243,41.6714,-80.4345,4,2.3
> 8,1632348241323,41.5473,-80.5464,4,-4.1
> 8,1632348241343,41.5630,-80.4601,1,-20.7
> 8,1632348241343,41.5697,-80.4616,4,-8.6
> 8,1632348241405,41.5219,-80.4335,1,-15.7
> 8,1632348242546,40.9719,-79.8236,4,4.7
> 8,1632348242547,40.9731,-79.8227,4,4.6
> 8,1632348242548,40.9749,-79.8250,4,9.1
> 8,1632348242548,40.9752,-79.8255,4,15.6
> 8,1632348242549,40.9450,-79.8366,4,-13.9
> 8,1632348242549,40.9555,-79.8018,4,6.7
> 8,1632348242549,40.9509,-79.8203,4,15.0
> 8,1632348242549,40.9739,-79.8248,1,34.7
> 8,1632348242550,40.9776,-79.8531,4,7.7
> 8,1632348242551,40.9841,-79.8266,4,8.4
> 8,1632348242552,40.9656,-79.8194,4,7.0
> 8,1632348242554,40.9679,-79.8211,4,5.2
> 8,1632348242561,40.9758,-79.8280,4,6.9
> 8,1632348242775,40.9989,-79.7508,1,-63.0
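
[Editor's note] Below is a minimal Python sketch of the kind of conversion script described above. It assumes the six comma-delimited fields are (id, epoch milliseconds UTC, latitude, longitude, possible stroke count, amplitude in kA) and writes the space-delimited line format that was ingested in the example (MM/DD/YYYY HH:MM:SS lat lon amplitude count). Carrying the fifth input field over as the count is an assumption; the script and any file names are illustrative, not a confirmed implementation.

    #!/usr/bin/env python3
    # Sketch: convert comma-delimited stroke records into the
    # space-delimited text-lightning format shown above.
    import sys
    from datetime import datetime, timezone

    def convert_line(line):
        # Input example:  8,1632348231376,37.6018,-79.4807,4,-2.2
        fields = line.strip().split(",")
        if len(fields) != 6:
            return None                      # skip malformed records
        _, epoch_ms, lat, lon, count, amp = fields
        # Epoch is in milliseconds UTC; render as MM/DD/YYYY HH:MM:SS.
        t = datetime.fromtimestamp(int(epoch_ms) / 1000.0, tz=timezone.utc)
        stamp = t.strftime("%m/%d/%Y %H:%M:%S")
        # Output example: 09/22/2021 22:03:51 37.60 -79.48 -2.2 1
        # The fifth input column is carried over as the count here; this
        # is an assumption -- hard-code 1 if that field means something else.
        return f"{stamp} {float(lat):.2f} {float(lon):.2f} {float(amp):.1f} {count}"

    if __name__ == "__main__":
        for raw in sys.stdin:
            out = convert_line(raw)
            if out:
                print(out)

A converted file could then be inserted into the LDM queue with pqinsert, for example "pqinsert -f EXP -p SOME_PRODUCT_ID strokes.txt", where the feedtype and product ID are placeholders that must match whatever pattern the EDEX distribution files expect for text lightning.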