This archive contains answers to questions sent to Unidata support through mid-2025. Note that the archive is no longer being updated. We provide the archive for reference; many of the answers presented here remain technically correct, even if somewhat outdated. For the most up-to-date information on the use of NSF Unidata software and data services, please consult the Software Documentation first.
Hi Chris,

re:
> I have updated the LDM, and everything seemed to go very smoothly. It's
> back up and running, watch shows products coming in, however tail is
> starting to show lots more of the gempak decoder problems from earlier.

OK.

> If you would like to access my machine to take a look around, you may.

Thanks for the login info. I am on your machine right now, and the first thing I noticed is that the size of your LDM queue is too small for the amount of data you are receiving _and_ for your setup of using only one instance of pqact to process all of that data.

I strongly recommend that you revisit the ldmd.conf and pqact.conf setup on your machine. In particular, I suggest that you do the following:

<login as 'gempak'>

-- run the script that creates the pqact.conf actions:

   $NAWIPS/ldm/etc/gen_pqact.csh

   When the script asks whether you want to create a single output file or
   multiple output files (I can't remember the exact wording), indicate that
   you want multiple files. The script will create multiple pqact.conf files
   AND print a sequence of 'exec' lines that you should add to your
   ~ldm/etc/ldmd.conf file.

The only reason I did not do this for you is that you have what look like locally developed pqact.conf actions that I did not want to touch, since I don't know exactly what you are trying to accomplish.

One thing that was happening was that your single 'pqact' invocation was either having a hard time keeping up with all of the processing scheduled for it by the actions in ~ldm/etc/pqact.conf, or was failing to keep up entirely, so that data being ingested was not getting processed before new data overwrote it in your LDM queue. To mitigate this situation somewhat, I did the following:

<as 'ldm'>

-- edited ~ldm/etc/ldmadmin-pl.conf and changed the default LDM queue size
   from 400M (400 MB) to 1G (1 GB), then remade the queue:

   ldmadmin stop
   ldmadmin delqueue
   ldmadmin mkqueue
   ldmadmin start

You are now using a 1 GB LDM product queue, which will give your single 'pqact' invocation more time to process the data in your queue. This is _not_ sufficient on its own, however, since the amount of data you are requesting is enough to warrant having multiple 'pqact' invocations. Redoing your processing using the multiple pqact.conf files produced by the GEMPAK gen_pqact.csh script is the way to address this.

The other thing I noticed was that your latencies for all feeds were horrible. Take a look at the 'latency' links for each feed in:

http://www.unidata.ucar.edu/software/idd/rtstats/siteindex.php?thunder.storm.uni.edu

The latencies were bad enough that you were only receiving a small portion of the data you were requesting! After I deleted and remade your LDM queue, I noted a rapid drop in latencies down to near zero. This tells me that there was likely something wrong with your LDM queue; deleting and remaking it started things from ground zero. The problem that the queue evidently had may well have been the cause of your processing problems. Since redoing your queue, I have seen _no_ processing errors in ~ldm/logs/ldmd.log. This might not mean much, as I have not been watching your system for that long...

> Thanks again for your help

No worries. Please seriously consider redoing your pqact.conf processing as I advise above. This will help your system run more efficiently and decode products faster.

Cheers,

Tom
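[Editor's note: as a rough illustration of the advice above, the 'exec' lines that gen_pqact.csh suggests adding to ~ldm/etc/ldmd.conf typically look like the sketch below. The feed types and pqact.conf file names here are assumptions for illustration only, not taken from the machine discussed in this ticket; use the exact lines the script itself prints.]

   # One pqact instance per group of actions, each restricted to the
   # feed types that its pqact.conf file actually handles
   exec    "pqact -f IDS|DDPLUS  etc/pqact.gempak_decoders"
   exec    "pqact -f HDS         etc/pqact.gempak_models"
   exec    "pqact -f NNEXRAD     etc/pqact.gempak_nexrad"
   exec    "pqact -f NIMAGE      etc/pqact.gempak_images"

Splitting the work this way means each pqact only scans the patterns relevant to its feed, so slow decoders for one data stream no longer delay processing of the others.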
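[Editor's note: the queue-size change described above amounts to a one-line edit, sketched here assuming an LDM version whose ldmadmin settings live in ~ldm/etc/ldmadmin-pl.conf; the variable name is an assumption and newer LDM releases keep this setting in the LDM registry instead.]

   # ~ldm/etc/ldmadmin-pl.conf: grow the product queue from 400 MB to 1 GB
   $pq_size = "1G";

The new size only takes effect when the queue is recreated, which is why the stop/delqueue/mkqueue/start sequence above is needed.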
****************************************************************************
Unidata User Support                                    UCAR Unidata Program
(303) 497-8642                                                 P.O. Box 3000
address@hidden                                             Boulder, CO 80307
----------------------------------------------------------------------------
Unidata HomePage                              http://www.unidata.ucar.edu
****************************************************************************

Ticket Details
===================
Ticket ID: VVR-781619
Department: Support LDM
Priority: Normal
Status: Closed