This archive contains answers to questions sent to Unidata support through mid-2025. Note that the archive is no longer being updated. We provide the archive for reference; many of the answers presented here remain technically correct, even if somewhat outdated. For the most up-to-date information on the use of NSF Unidata software and data services, please consult the Software Documentation first.
Hi Jason,

Please view:

http://my.unidata.ucar.edu/cgi-bin/rtstats/iddstats_nc?NNEXRAD+drizzle.hupas.howard.edu

It should look something like:

http://my.unidata.ucar.edu/cgi-bin/rtstats/iddstats_nc?NNEXRAD+thelma.ucar.edu

The LDM is very time sensitive: it uses product time stamps to determine which products to request. We suggest running NTP (Network Time Protocol). Since you will be talking to the IT folks, you should mention this as well (see the clock-check sketch after the quoted message below).

Cheers,
Jeff

---------------------------------------------------------------------
Jeff Weber    address@hidden   : Unidata Program Center
PH:303-497-8676                : University Corp for Atmospheric Research
3300 Mitchell Ln               : http://www.unidata.ucar.edu/staff/jweber
Boulder, CO 80307-3000         :
---------------------------------------------------------------------

On Wed, 4 Aug 2004, Jason T. Smith wrote:

> Hello Jeff,
> How are things going on your end? Here at Howard we had our local meeting
> and a few questions came up.
>
> 1) Right now we are getting the NNEXRAD feed. Is there a default feed that
> gives more than just NNEXRAD? Also, is there a list of all of the available
> feeds? We have looked throughout the Unidata site and couldn't find it.
>
> 2) Is there anything similar to scour that can save data between specified
> dates? For example, if there was a storm today, is there a script that can
> save that data so it won't be deleted when scour runs?
>
> I think that was it, at least that is what my notes show. They were talking
> about getting the decoders working; is this the decoders package on
> Unidata's site? Thanks in advance.
>
> Jason Smith
>
> -------------------------------------------------
> This mail sent through IMP: http://horde.org/imp/
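Because the LDM relies on product time stamps, it helps to confirm that the LDM host's clock is actually being kept in sync. The shell commands below are a minimal sketch of such a check, assuming a Unix host running the classic ntp daemon with its ntpq utility installed (chrony-based systems use different commands); they are illustrative, not part of the original exchange.

    # List the NTP peers the daemon is using; a '*' in the first column marks
    # the peer currently selected for synchronization, and the 'offset' column
    # shows the clock error in milliseconds.
    ntpq -p

    # Print the system clock in UTC for a quick sanity check against a
    # known-good time source.
    date -u

If ntpq reports no selected peer or a large offset, that is worth raising with the IT folks along with the request to run NTP on the LDM machine.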