20040420: 20040319: Gempak - Solaris 8 - decoding drops with dcuair
- Subject: 20040420: 20040319: Gempak - Solaris 8 - decoding drops with dcuair
- Date: Tue, 20 Apr 2004 15:19:17 -0600
Megan,
You are decoding each of the drops100x.txt files into the same test.gem file,
which only allows 24 times. The confusion is that I meant for you to decode
into three separate files and then merge their contents (using the SNMOD
program) into a new file, created with SNCFIL, that allows more than 24 times.
You should be able to do the following:
dcuair -c 021004/0000 -b 2000 -m 24 -a 2000 test1.gem < drops1001.txt
dcuair -c 021004/0000 -b 2000 -m 24 -a 2000 test2.gem < drops1002.txt
dcuair -c 021004/0000 -b 2000 -m 24 -a 2000 test3.gem < drops1003.txt
You can omit the "-p snmerg.pack" option, since it is the default.
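If you'd rather not type the three decode commands by hand, they can be run
from a small Bourne shell loop; this is just a sketch of the same three
commands above, and it assumes dcuair is on your PATH and the drops100x.txt
files are in the current directory:

i=1
for f in drops1001.txt drops1002.txt drops1003.txt
do
    # same dcuair options as above; output goes to test1.gem, test2.gem, test3.gem
    dcuair -c 021004/0000 -b 2000 -m 24 -a 2000 test${i}.gem < $f
    i=`expr $i + 1`
done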
Then, create a new sounding file with SNCFIL:
SNOUTF = testmerg.gem
SNPRMF =
STNFIL =
MRGDAT = no
TIMSTN = 200/2000
GEMPAK-SNCFIL>r
SNCFIL PARAMETERS:
New sounding file: testmerg.gem
Station file:
Number of stations in STNFIL: 0
Number of additional stations: 2000
Total number of stations: 2000
Total number of times: 200
This file will be an unmerged sounding file containing
mandatory and significant data below and above 100 mb.
Enter <cr> to accept parameters or type EXIT:
Parameters requested: SNOUTF,SNPRMF,STNFIL,MRGDAT,TIMSTN.
GEMPAK-SNCFIL>
Again, SNPRMF is left blank here, so the snmerg.pack packing file will be used.
Note that these are unmerged files from the decoder.
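If you would rather create the file from a script instead of answering the
prompts, the same SNCFIL run can be driven from standard input with a shell
here-document; this is only a sketch, and it assumes your GEMPAK environment
(e.g. Gemenviron) is already set up so that sncfil is on your PATH:

sncfil << EOF
SNOUTF = testmerg.gem
SNPRMF =
STNFIL =
MRGDAT = no
TIMSTN = 200/2000
r

exit
EOF

The blank line after the "r" answers the "Enter <cr> to accept parameters"
prompt shown above.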
--------------------------------------------------------------
Now use SNMOD to copy each existing file into the new testmerg.gem file:
SNFILE = test1.gem
SNOUTF = testmerg.gem
SNPARM = dset
AREA = dset
DATTIM = all
LEVELS = all
VCOORD = pres
TIMSTN = 200/2000
MRGDAT = no
IDNTYP = stnm
GEMPAK-SNMOD>r
<stuff>
SNFILE = test2.gem
GEMPAK-SNMOD>r
<stuff>
SNFILE = test3.gem
GEMPAK-SNMOD>r
<stuff>
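The three SNMOD runs can be scripted the same way with a here-document if you
prefer; again this is just a sketch using the parameter settings shown above
(a blank line after each "r" accepts the parameters):

snmod << EOF
SNOUTF = testmerg.gem
SNPARM = dset
AREA   = dset
DATTIM = all
LEVELS = all
VCOORD = pres
TIMSTN = 200/2000
MRGDAT = no
IDNTYP = stnm
SNFILE = test1.gem
r

SNFILE = test2.gem
r

SNFILE = test3.gem
r

exit
EOF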
At this point, you should be able to use your combined
file as merged or unmerged in SNLIST, etc.
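To double check the result, you can list the times in the merged file with
SNLIST; for example, set SNFILE = testmerg.gem and DATTIM = list (the other
parameters can stay as they are) and run it:

SNFILE = testmerg.gem
DATTIM = list
GEMPAK-SNLIST>r

SNLIST will then show the available times, so you can confirm that all three
days made it into testmerg.gem.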
Steve Chiswell
>From: Megan Gentry <address@hidden>
>Organization: UCAR/Unidata
>Keywords: 200404202059.i3KKxgCT006461
>Thanks for your help!
>I do have a question about using SNMERG though. I've been using the following
>commands to try and combine soundings from three different days into one file.
> ..
>drops1001.txt
>drops1002.txt
>drops1003.txt
>The total number of dropsondes over all three days is 65. So, my -a parameter of
>2000 should be more than sufficient. However, when I look at the soundings in
>SNLIST, only the first 49 soundings processed are showing up.
>I can decode all three days into individual files, so I know that the problem
>isn't with the sounding data. Does snmerg only allow a certain number of
>soundings to be in one file? Or is it that one of the flags in my dcuair command
>isn't set correctly?
>Thanks,
>Megan Gentry
>
>
>Unidata Support wrote:
>
>> Megan,
>>
>> This sounds like you either don't have enough stations allowed in your
>> created file, or you have run out of time slots. My guess is that your
>> drops10.txt has more than 24 hours' worth of data....for example,
>> in SNLIST, using DATTIM=list, you may see times from the previous day.
>>
>> The best way to work with this is to use the YYYYMMDD template
>> in your output file name, instead of test.gem. This way, each file will
>> have one day or 24 hours in the file. If you need to combine the data into
>> a single file later, then you can use SNMERG to copy multiple files
>> into a single file, but at least with the decoder, in order to accomplish
>> the hourly bins with -m 24, the initial step should be daily files.
>>
>> The other possibility would be that you are allowing space for 1000 additional
>> stations (the -a flag) in your data set....and since you have dropsondes,
>> they won't be in the snstns.tbl file. Depending on the number of dropsondes
>> you have, you may need to increase that.....but that wouldn't be likely for
>> broadcast data....but would be more likely if you had field experiment data.
>>
>> Steve Chiswell
>>
>> >From: "Megan Gentry" <address@hidden>
>> >Organization: UCAR/Unidata
>> >Keywords: 200403192136.i2JLasQB029999
>>
>> >Institution: NC State Forecasting Lab
>> >Package Version: 5.6
>> >Operating System: Solaris 8
>> >Hardware Information: Sun workstation
>> >Inquiry: I'm trying to decode several dropsondes using dcuair. I'm having
>> >trouble getting them all decoded with hourly entries. I used the following
>> >command and all of the dropsondes were decoded...
>> >dcuair -c 021004/0000 -b 1000 -m 150 -a 1000 test.gem < drops10.txt
>> >However, this only produced entries every 3 hours. I read up on the -m option
>> >and realized that I should have set it to 24 to have entries hourly. So I
>> >used the same command, only with the -m option changed to 24. This time I got
>> >hourly entries. However, when I used snlist to look at the dropsondes, the
>> >ones near the end of the dataset were missing. The dcuair.log file had the
>> >same number of bulletins read and processed with both commands.
>> >Is there another option that I need to change when I reset -m to 24?
>> >Thanks,
>> >Megan
>> >
>> >
>> >
>> --
>> ****************************************************************************
>> Unidata User Support UCAR Unidata Program
>> (303)497-8643 P.O. Box 3000
>> address@hidden Boulder, CO 80307
>> ----------------------------------------------------------------------------
>> Unidata WWW Service http://my.unidata.ucar.edu/content/support
>> ----------------------------------------------------------------------------
>> NOTE: All email exchanges with Unidata User Support are recorded in the
>> Unidata inquiry tracking system and then made publicly available
>> through the web. If you do not want to have your interactions made
>> available in this way, you must let us know in each email you send to us.
>
--
NOTE: All email exchanges with Unidata User Support are recorded in the
Unidata inquiry tracking system and then made publicly available
through the web. If you do not want to have your interactions made
available in this way, you must let us know in each email you send to us.