- Subject: 20040415: Vietnam and GEMPAK GRIB decoding (cont.)
- Date: Thu, 15 Apr 2004 17:50:50 -0600
>From: Mai Nguyen <address@hidden>
>Organization: National Center for Hydro-Meteorological Forecasting of Vietnam
>Keywords: 200312020023.hB20N4p2027742 IDD LDM Linux GEMPAK
Mai,
Here is the latest thinking on the easiest way for you to incorporate
your GTS observations into decoded GEMPAK synoptic files.
First the issues related to running the dclsfc decoder separately from
the invocation in ~ldm/etc/pqact.gempak:
- only one decoder should be writing to an output file
at a time
- there is no guarantee of exactly when there will be no
synoptic data being ingested and processed by the LDM
- you could decode your GTS observations into a separate
file, but then you would need to perform two steps to
plot the data being decoded by the LDM and the data being
decoded by another process. In order to have the two
different datasets available to plot in NMAP2, you
would have to modify an NMAP2 configuration file
Given the above, it is probably better to reformat the data file and
then insert it into the LDM queue and let the single instance of dclsfc
decode it into the same GEMPAK file as all of the synoptic data
received by the LDM. The issues with this are:
- you have to create a combined station table that contains
the station table in the GEMPAK distribution, and your
own additions.
- you will have to manually modify the synoptic decoding entry
in the ~ldm/etc/pqact.gempak file each time you get a new
GEMPAK distribution and use the script $NAWIPS/ldm/etc/gen_pqact.csh
to generate a new pqact.gempak set of actions.
The first item is the road you were already headed down when I
suggested that you run the decoder separately. After talking with Chiz
about this strategy this morning, I changed my mind about the "best"
way to accomplish the task at hand.
I did the following to set things up:
- modified the data file cleanup script to inject the product into
the LDM queue
- moved the cleanup script from ~gempak/util to ~ldm/util
since it will run from the 'ldm' account
- created a combined station database from the Vietnam station file
/home/gempak/etc/lsystns.vn and from the station file distributed
with the GEMPAK distribution, $GEMTBL/stns/lsfstns.tbl. I
created the combined file in the ~ldm/etc directory as follows:
<as 'ldm'>
<add 'source /home/gempak/NAWIPS/Gemenviron' to ~ldm/.cshrc>
cd etc
cat /home/gempak/etc/lsystns.vn $GEMTBL/stns/lsfstns.tbl > lsystns.combined
You can easily recreate this file each time you install a new
GEMPAK release.
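Since you will repeat that 'cat' after every GEMPAK install, it could
be wrapped in a small helper. This is only a sketch; the function name
is made up, and the paths are passed in as arguments so you can point
it at the real files:

```shell
# Hypothetical helper: rebuild the combined station table.  The local
# (Vietnam) table is placed FIRST so that, when dclsfc finds duplicate
# station IDs, your definitions override the GEMPAK-distributed ones.
rebuild_stns() {
    local_tbl="$1"   # e.g. /home/gempak/etc/lsystns.vn
    dist_tbl="$2"    # e.g. $GEMTBL/stns/lsfstns.tbl
    out="$3"         # e.g. /home/ldm/etc/lsystns.combined
    cat "$local_tbl" "$dist_tbl" > "$out"
}
```

Running `rebuild_stns /home/gempak/etc/lsystns.vn $GEMTBL/stns/lsfstns.tbl
etc/lsystns.combined` from ~ldm would reproduce the command above.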
- modified the ~ldm/etc/pqact.gempak synoptic decoding entry
to use the new, combined station table file:
change:
#
# Synoptic land reports
#
WMO (^S[IM]V[IGNS])|(^SNV[INS])|(^S[IMN](W[KZ]|[^VW]))
PIPE decoders/dclsfc -v 2 -s lsystns.upc
-d data/gempak/logs/dclsfc.log
-e GEMTBL=/home/gempak/GEMPAK5.7.1/gempak/tables
data/gempak/syn/YYYYMMDD_syn.gem
to:
#
# Synoptic land reports
#
WMO|EXP (^S[IM]V[IGNS])|(^SNV[INS])|(^S[IMN](W[KZ]|[^VW]))
PIPE decoders/dclsfc -v 2 -s etc/lsystns.combined
-d data/gempak/logs/dclsfc.log
-e GEMTBL=/home/gempak/GEMPAK5.7.1/gempak/tables
data/gempak/syn/YYYYMMDD_syn.gem
>A day is so short to do all I want to.
I agree completely. I wasn't able to spend much time getting things
working on your system yesterday since a number of other things
were demanding attention.
>I saw the script and ran it manually. Its final
>product was the file ready for dclsfc, but not the
>decoded in gempak format. I guess, it's because the
>script was not completed, it only completed the 1st
>steps (reformatting). I've done the rest two steps
>manually and it worked fine.
You are correct. That version of the script was not complete. Also,
since I changed my mind about the best way for you to process the data
you want decoded, it has been further modified today.
re: latitudes and longitudes format in your station file
>Yes it was in lat/lon in degrees and minutes. But I've
>recalculated them. So lsfstns.vn is in correct format
>and values. (So you wouldn't need to do anything with
>the numbers.)
Excellent, thanks!
>I've noticed that there is some differences in the
>coordinates of our stations in the previous
>lsystns.tbl and our lsfstns.vn. The numbers normally
>differ about 0.01-0.02 degrees. But the extreme case
>is station 48920 (lon=11192 in lsystns.tbl, but
>lon=11145 in vnstns.prn which is the corresponding to
>our degree+minute file VN_synstns.txt). Is this
>difference negligible? (I'm not sure whether it's a
>political or technical problem!!)
The longitude value in lsystns.tbl represents 111.92 degrees. The
value in vnstns.prn represents 111.68 degrees. So, I would say that
the difference _is_ important, and I made sure that 48920 is in
/home/gempak/etc/lsystns.vn.
As far as the problem being political or technical, I would say that it
is simply a technical one. We routinely find that station location
information is incorrect, so we correct it in the tables we send out
with our packages.
You may have noticed that the file:
$GEMTBL/stns/lsfstns.tbl
is actually a link to:
$GEMTBL/stns/lsystns.upc
lsystns.upc is the file that Chiz sends out with the Unidata GEMPAK
distribution. It contains additions to the table distributed by NCEP
($GEMTBL/stns/lsfstns.ncep) and modifications to things like station
locations.
By creating a combined station table consisting of your station
definitions followed by the ones from the GEMPAK distribution, you will
automatically use your definitions in place of the ones in the GEMPAK
distribution. Given this, I kept the definition for 48920 in
/home/gempak/etc/lsystns.vn. So, you can modify lsystns.vn as you
like, either to add new stations or to correct values you find are
wrong in $GEMTBL/stns/lsystns.upc. Just remember that each time you
update /home/gempak/etc/lsystns.vn you should create a new combined
station table file using:
<as 'ldm'>
cd etc
cat /home/gempak/etc/lsystns.vn $GEMTBL/stns/lsfstns.tbl > lsystns.combined
>So, I didn't change the coordinates of the Vietnam
>stations which are present in lsystns.tbl.
Good. This will make your job easier in the future.
re: non-Vietnam stations in the AAXXnn data files
>Yes, they are correct stations! The 59046 is a
>Chinese one, and 98446 is the Phillipines one. The
>AAXX00 is all we've got from GTS, and it includes both
>our national stations and international ones.
OK.
So, here is the procedure I envision working on your system:
1) you get GTS data from somewhere (?) and bring it over to met_research3.
I suggest putting this data into a directory that will be scoured by
the 'bin/ldmadmin scour' action that is running out of 'ldm's
crontab. In a previous email, I suggested putting the file(s) in:
/home/nadata/ldm/raw/syn
There is already a file there, so the directory structure exists
and things can begin working.
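For reference, 'ldmadmin scour' deletes old files according to entries
in ~ldm/etc/scour.conf: one directory per line, followed by the number
of days to keep files (and an optional filename pattern). An entry for
this directory might look like the sketch below; the 7-day retention
is only an example value, not something already configured:

```
# ~ldm/etc/scour.conf entry (sketch): directory, days to keep
/home/nadata/ldm/raw/syn    7
```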
2) after you have a new data file, you can run the cvtVNsyn.tcl
script in /home/ldm/util. All you have to specify is the fully
qualified name of the input file. Here is an example:
<as 'ldm'>
cd /home/ldm
util/cvtVNsyn.tcl /home/nadata/ldm/raw/syn/AAXX00
cvtVNsyn.tcl reads the input file and creates an output file named
AAXX00.cvrt in /tmp (it takes the name portion of the input file and
adds the suffix .cvrt). It then uses the LDM application 'pqing' to
ingest the product into the LDM queue under the feed type EXP. The
product will then be decoded by the action that was modified
in /home/ldm/etc/pqact.gempak.
3) there is one additional parameter you can specify for cvtVNsyn.tcl:
an indication that you want to log the output of the conversion
process. If you specify anything except 'no' or "NO", the log file
will be created in whatever directory you were in when the script is
run. It is unlikely that you will want to create the log file. I
added the ability to create the log file so I could monitor how the
script was proceeding.
The process now contains a simple demonstration of how one goes about
inserting products into the LDM queue using 'pqing' so they will be
decoded. The nice thing about 'pqing' is that it uses the WMO header
of each product as the LDM/IDD product ID, which is then matched
against the regular expressions in the actions in
~ldm/etc/pqact.conf.
A procedure similar to this could be used to insert your model output
GRIB messages, but it is easier to use a different LDM utility,
pqinsert, to do this. The key when using pqinsert is to create a
product ID that will match the dcgrib2 decoding action in
/home/ldm/etc/pqact.gempak. There are
a number of dcgrib2 actions in pqact.gempak, but none exactly fit
your model data, so I suggest creating a new action that matches
your model data.
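Such an action might look like the following sketch. The pattern,
verbosity, and log-file name are assumptions (the actual entry added
to pqact.conf is not reproduced here), and note that pqact requires
the fields of an action to be separated by TAB characters:

```
# Sketch: decode products inserted with product ID "VSSS /mHRM"
EXP	^VSSS /mHRM
	PIPE	decoders/dcgrib2 -v 1
	-d data/gempak/logs/dcgrib2_hrm.log
	-e GEMTBL=/home/gempak/GEMPAK5.7.1/gempak/tables
```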
I added a new action to /home/ldm/etc/pqact.conf to decode your
HRM data. I added this action to pqact.conf instead of pqact.gempak
so that you don't have to worry about it when you generate
a new pqact.gempak after a new GEMPAK install.
I also uncommented the running of pqact that uses pqact.conf in
/home/ldm/ldmd.conf, and then stopped and restarted the LDM
(after making changes in ldmd.conf, you have to stop and restart
for the changes to take effect).
Now, to insert one of your HRM GRIB files into the queue and have
the data decoded by dcgrib2, you would do the following:
<as 'ldm'>
pqinsert -f EXP -p "VSSS /mHRM" grib_pathname
Following the example for the synoptic data, I suggest putting
your HRM model output in /home/nadata/ldm/raw/GRIB. The pqinsert
invocation would then look something like:
pqinsert -f EXP -p "VSSS /mHRM" /home/nadata/ldm/raw/GRIB/hrm_2003111600f000
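If you end up with several GRIB files per run, a small loop can build
the pqinsert command for each one. This is only a sketch (the function
name is made up), and it deliberately just prints the commands so you
can check them first; remove the 'echo' to actually insert the
products:

```shell
# Hypothetical wrapper: print (not run) a pqinsert command for every
# HRM GRIB file in a directory, using the product ID that the dcgrib2
# action matches.  Remove 'echo' to insert the products for real.
insert_hrm() {
    dir="$1"
    for f in "$dir"/hrm_*; do
        [ -f "$f" ] || continue
        echo pqinsert -f EXP -p "VSSS /mHRM" "$f"
    done
}
```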
Now, on to some other questions you have asked:
>From address@hidden Thu Apr 8 03:15:23 2004
>4) I am surprised that there is no symbol for cloud
>types in the surface observation plots. It seems like
>US forecasters don't need to see that information?
There are cloud symbols available in surface observation plots.
They don't come up by default in NMAP2, but you can plot them.
For more information, take a look at the help provided in the
GEMPAK online documentation:
Unidata GEMPAK HomePage
http://my.unidata.ucar.edu/content/software/gempak/index.html
GEMPAK Manual Help Pages
http://my.unidata.ucar.edu/content/software/gempak/help_and_documentation/manual/index.html
Chapter 4: GEMPAK Programs
http://my.unidata.ucar.edu/content/software/gempak/help_and_documentation/manual/chap4/index.php
sfmap
http://my.unidata.ucar.edu/content/software/gempak/help_and_documentation/manual/chap4/chap4.php?prog=sfmap
On the last page, click on the 'SFPARM' link. This will give you
all of the parameters you can plot. In particular, look at the section
labeled 'CLOUD'. Down in the list, you will see:
CSYL - Low cloud type symbol
CSYM - Middle cloud type symbol
CSYH - High cloud type symbol
CSYT - Cloud type symbol on first reported level
These are the parameter names for cloud symbol plots.
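As a concrete illustration, a parameter listing for sfmap that adds
the cloud-type symbols might look like the sketch below. The file
name is the synoptic file template from above; the exact mix of
parameters, and any position/size qualifiers you may want on them,
are up to you:

```
SFPARM = skyc;tmpc;csyl;csym;csyh
DATTIM = last
SFFILE = data/gempak/syn/YYYYMMDD_syn.gem
AREA   = dset
```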
>From address@hidden Tue Apr 13 09:28:52 2004
>3) MY PREVIOUS QUESTIONS:
> + Degrib our model data automatically by ldm?
> + Meta files (how to create them)
> + Cloud symbols for synop observations
I believe that the information above answers the first and last
questions here, but what do you mean by "Meta files (how to create them)"?
>4) AND SOME NEW QUESTIONS:
> + How can I set ldm to automatically delete the old
> files (older than a certain number of days)?
> + Is that possible to draw isolines based on the
> synoptic observations?
> + It's funny with the QUICKSCAT info. When I load
> it, NMAP2 shows that the information is available, but
> it shows nothing. The same with ATCF. NMAP2 gives no
> error for it. But as I looked in the datatype.tbl of
> gempak, the variables defining directory of the data
> is not defined. How NMAP2 can see that it's available?
> Is that a bug of NMAP2?
The first and second questions were answered in previous emails.
The third question is beyond my meager knowledge of GEMPAK. I will
ask Chiz to respond to this one.
>From address@hidden Wed Apr 14 09:16:23 2004
>There is another aspect that may need to be
>considered. We have another 200 hydrological rain
>gauge stations which give us rainfall. Therefore,
>there won't be enough free space in WMO table to put
>them in.
Chiz's distribution of GEMPAK allows for a lot more stations
than the NCEP distribution. It may be the case that there
is enough room for your rain gauge stations.
> + Are the rainfall observations shown in the same
> manner as meteorological surface observations?
I think so, yes.
> Or they
> are another kind of data? And there are different
> tables for rainfall stations?
I don't think so.
> + Is there another way to put surface observations in
> gempak format? Not from standard WMO text bulletins
>but from decoded data? This question arose because we
>have done some work on correcting the data defects
> (which are very common in our system due to our
> not-so-advanced infrastructure!) and it might be nice
> to utilize the corrected observations. I am not sure
>how dclsfc handles the bad data (just throw them
> away?).
Chiz will have to answer this one.
>Thank you as always.
No worries.
Cheers,
Tom
--
NOTE: All email exchanges with Unidata User Support are recorded in the
Unidata inquiry tracking system and then made publicly available
through the web. If you do not want to have your interactions made
available in this way, you must let us know in each email you send to us.