This archive contains answers to questions sent to Unidata support through mid-2025. Note that the archive is no longer being updated. We provide the archive for reference; many of the answers presented here remain technically correct, even if somewhat outdated. For the most up-to-date information on the use of NSF Unidata software and data services, please consult the Software Documentation first.
Hi Elie,

re:
> Thanks, Tom.
> I've examined our ldm installation and we should be receiving the data in
> $LDMHOME/data
> but unfortunately there is nothing in there. You're right we are not
> currently processing
> the data, are you saying that ldm automatically gets rid of it, if so how
> often?

No, the end user (UNCC) is responsible for setting up scouring of the data
that is ingested/processed. This is typically done through cron-initiated
scripts. We provide a 'scour' utility with the LDM that many sites use for
their data scouring purposes. The cron entry for 'scour' could look
something like:

# Scour data directory
0 * * * * bin/ldmadmin scour > /dev/null 2>&1

It is unlikely, however, that all of the data would be scoured off, as the
concept embodied in the 'scour' utility is for the user to choose the number
of days of data to keep.

re:
> our scour file is set to remove and/or overwrite the data every 3 days.

I assume that if you are using 'scour', it is set up to scour data that is
more than three days old.

re:
> The service is up and running, that I was able to verify:
>
> [ldm@meteo ~]$ netstat -al | grep ldm
> tcp   0   0 meteo.uncc.edu:54988   conan.unidata.u:unidata-ldm   TIME_WAIT
> tcp   0   0 meteo.uncc.edu:46321   idd.unidata.uca:unidata-ldm   ESTABLISHED
> tcp   0   0 meteo.uncc.edu:46320   idd.unidata.uca:unidata-ldm   ESTABLISHED
> [ldm@meteo ~]$

This agrees with the real-time statistics I talked about in my last email.

re:
> [ldm@meteo ~]$ ps -aef | grep ldm
> root  18555 16613  0 14:31 pts/0  00:00:00 su - ldm
> ldm   18556 18555  0 14:31 pts/0  00:00:00 -bash
> ldm   18742     1  0 14:36 ?      00:00:00 rpc.ldmd -I 0.0.0.0 -P 388 -M 256 -m 3600 -o 3600 -q /usr/local/ldm/data/ldm.pq /usr/local/ldm/etc/ldmd.conf
> ldm   18744 18742  0 14:36 ?      00:00:27 rtstats -f ANY -h rtstats.unidata.ucar.edu
> ldm   18748 18742  1 14:36 ?      00:00:54 rpc.ldmd -I 0.0.0.0 -P 388 -M 256 -m 3600 -o 3600 -q /usr/local/ldm/data/ldm.pq /usr/local/ldm/etc/ldmd.conf
> ldm   18749 18742  0 14:36 ?      00:00:01 rpc.ldmd -I 0.0.0.0 -P 388 -M 256 -m 3600 -o 3600 -q /usr/local/ldm/data/ldm.pq /usr/local/ldm/etc/ldmd.conf
>
> any suggestions on how to diagnose this problem further are greatly
> appreciated.

OK, I see a problem right off: there is no invocation of 'pqact' in your
listing of 'ldm' processes. This means that either you are not trying to run
processing actions, or there is some problem with the actions you are trying
to run.

The first thing to check is to make sure that you have an invocation of at
least one instance of 'pqact' in your LDM configuration file,
~ldm/etc/ldmd.conf. The invocation will look something like:

EXEC "pqact"

It can look a little different if you specify which datastreams that
invocation of 'pqact' is to act on. Here is a different example:

EXEC "pqact -f WMO|FSL2|GPS|NLDN|PCWS|EXP /local/ldm/etc/pqact.gempak_decoders"

NB: 'exec' can be upper or lower case. All that is important is that the
EXEC line not be commented out.

Can you send us your ~ldm/etc/ldmd.conf file so we can take a look?

The other thing that we could do is log in to your system and take a look at
your setup. If you would like us to do this, please send me (address@hidden)
an email with the password for your 'ldm' user. Please do _not_ specify the
user the password is for or the machine on which it is valid.

Cheers,

Tom
--
****************************************************************************
Unidata User Support                                    UCAR Unidata Program
(303) 497-8642                                                 P.O. Box 3000
address@hidden                                             Boulder, CO 80307
----------------------------------------------------------------------------
Unidata HomePage                          http://www.unidata.ucar.edu
****************************************************************************
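The reply above refers to two LDM configuration files whose contents are not
shown in this ticket: the scour configuration and a pqact pattern-action
file. The following is a minimal sketch of what such entries might look
like; the directory paths, feed type, product pattern, and output file name
are hypothetical and would need to be adapted to the local installation.

A ~ldm/etc/scour.conf entry keeping three days of data (fields are a
directory, a retention period in days, and an optional filename pattern):

# directory              days-old
/usr/local/ldm/data      3

A ~ldm/etc/pqact.conf entry that files surface bulletins from the IDS|DDPLUS
feed by issuing station, captured from the WMO header (fields and the
continuation line must be separated by tab characters):

# feedtype    pattern
#       action    arguments
IDS|DDPLUS    ^SAUS.. (....) ......
        FILE    data/surface/\1.txt

After an EXEC "pqact" line has been added to ~ldm/etc/ldmd.conf, the syntax
of the pattern-action file can be checked and the LDM restarted so that
'pqact' starts running:

ldmadmin pqactcheck
ldmadmin restart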
Ticket Details
===================
Ticket ID: ZIU-540402
Department: Support IDD
Priority: Normal
Status: Closed