20020619: unable to display NEXRCOMP data in McIDAS
- Date: Wed, 19 Jun 2002 12:58:25 -0600
>From: Gilbert Sebenste <address@hidden>
>Organization: NIU
>Keywords: 200206182234.g5IMYN621495 McIDAS-X 7.8 NEXRCOMP ADDE
Gilbert,
>I still am not getting the data to save on disk (weather2.admin). I am now
>requesting ALL of FNEXRAD, and the LDM-mcidas log shows it's supposedly
>saving the data fine. I can't find it, though. I went through a bunch of
>stuff to see if anything was missing/wrong, but I couldn't find anything.
I logged onto weather2 and looked in your pqact.conf file for the action
that saves the NEXRAD composites. It is:
#NEXRAD Level III radar composites in PNG-compressed GINI format
FNEXRAD	^rad/NEXRCOMP/(...)/(...)_(........)_(....)
	PIPE	-close pngg2gini -vl logs/ldm-mcidas.log
	data/nexrad/NEXRCOMP/\2/\2_\3_\4
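To see how the pattern builds the output path, here is a quick sketch
using sed with the same extended regular expression pqact uses (the
sample product ID is hypothetical, but it has the shape the pattern
expects: a 3-character resolution directory in \1, the product in \2,
the date in \3, and the time in \4):

```shell
# Hypothetical product ID shaped like the FNEXRAD NEXRCOMP IDs
id='rad/NEXRCOMP/1km/n0r_20020619_1539'

# Apply the same ERE pqact matches; \2, \3, \4 form the output path
# (\1, the resolution directory, goes unused in this action)
path=$(printf '%s\n' "$id" |
  sed -E 's|^rad/NEXRCOMP/(...)/(...)_(........)_(....)|data/nexrad/NEXRCOMP/\2/\2_\3_\4|')

echo "$path"   # -> data/nexrad/NEXRCOMP/n0r/n0r_20020619_1539
```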
This tells me that you need to look in the ~ldm/data/nexrad/NEXRCOMP directory
for the files. When I did just that, I see them:
<as 'ldm'>
cd ~/data/nexrad/NEXRCOMP
weather2-niu ldm-2> ls
n0r/ n1p/ ntp/
weather2-niu ldm-4> cd n0r
weather2-niu ldm-4> ls
n0r_20020619_1539 n0r_20020619_1620 n0r_20020619_1700 n0r_20020619_1741
n0r_20020619_1549 n0r_20020619_1630 n0r_20020619_1710 n0r_20020619_1751
n0r_20020619_1559 n0r_20020619_1640 n0r_20020619_1720 n0r_20020619_1802
n0r_20020619_1609 n0r_20020619_1650 n0r_20020619_1730 n0r_20020619_1812
So, the files are being received AND decoded by pngg2gini.
Now, as 'mcidas':
The next step toward being able to view the files in McIDAS is to
create an ADDE dataset for them. The template BATCH file for creating
ADDE datasets for locally held data in McIDAS is
~mcidas/data/DSSERVE.BAT. It sets up datasets for a variety of data
that McIDAS sites can get/decode/save, but it is designed to be
copied to a local file which is then edited so that site-specific
information is used (this is all covered in the online documentation,
by the way).
Since you already have datasets set up for the types of data that you
want to serve through your ADDE server, the simplest thing to do is to
create a separate BATCH file with just the NEXRCOMP entries in it. I
created this for you by copying the NEXRCOMP entries out of
DSSERVE.BAT into ~mcidas/data/NEXRCOMP.BAT. Take a look.
The next step is to make a local copy of this file (since I will be
including a file called NEXRCOMP.BAT in the next and succeeding McIDAS
releases, and you don't want your work to be overwritten on a
subsequent installation), and then edit the local copy to set the file
location information to match your setup. I did this by creating
~mcidas/data/NIUNEXRC.BAT. Take a look at it.
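For reference, the entries you are editing are DSSERVE commands
roughly of the following shape (the descriptor and comment here are
taken from the dataset listing below; the DIRFILE= path is just an
example matching your pqact.conf output location, and DSSERVE.BAT on
your system is the authoritative source for the exact keywords):

```text
DSSERVE ADD NEXRCOMP/1KN0R-NAT GINI DIRFILE=/home/ldm/data/nexrad/NEXRCOMP/n0r/n0r_* "1 km N0R US Base Reflectivity Composite
```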
After doing the editing, the final step in setting up the ADDE dataset
is to run the BATCH command on the BATCH file you just created:
<again, as 'mcidas'>
cd ~/workdata
batch.k NIUNEXRC.BAT
Then, you test the dataset definition using other McIDAS commands:
weather2-niu Mci-25> dataloc.k ADD NEXRCOMP weather2.admin.niu.edu
Group Name Server IP Address
-------------------- ----------------------------------------
NEXRCOMP WEATHER2.ADMIN.NIU.EDU
<LOCAL-DATA> indicates that data will be accessed from the local data directory.
DATALOC -- done
weather2-niu Mci-26> dsinfo.k I NEXRCOMP
Dataset Names of Type: IMAGE in Group: NEXRCOMP
Name NumPos Content
------------ ------ --------------------------------------
1KN0R-NAT 99999 1 km N0R US Base Reflectivity Composite
2KN1P-NAT 99999 2 km N1P US 1-hr Precip. Composite
4KNTP-NAT 99999 4 km NTP US Storm Total Precip. Composite
DSINFO -- done
weather2-niu Mci-32> imglist.k NEXRCOMP/1KN0R-NAT.ALL
Image file directory listing for:NEXRCOMP/1KN0R-NAT
Pos Satellite/ Date Time Center Band(s)
sensor Lat Lon
--- ------------- ------------ -------- ---- ---- ------------
1 RADAR 19 JUN 02170 15:39:00 TWX 27
2 RADAR 19 JUN 02170 15:49:00 TWX 27
3 RADAR 19 JUN 02170 15:59:00 TWX 27
4 RADAR 19 JUN 02170 16:09:00 TWX 27
5 RADAR 19 JUN 02170 16:20:00 TWX 27
6 RADAR 19 JUN 02170 16:30:00 TWX 27
7 RADAR 19 JUN 02170 16:40:00 TWX 27
8 RADAR 19 JUN 02170 16:50:00 TWX 27
9 RADAR 19 JUN 02170 17:00:00 TWX 27
10 RADAR 19 JUN 02170 17:10:00 TWX 27
11 RADAR 19 JUN 02170 17:20:00 TWX 27
12 RADAR 19 JUN 02170 17:30:00 TWX 27
13 RADAR 19 JUN 02170 17:41:00 TWX 27
14 RADAR 19 JUN 02170 17:51:00 TWX 27
15 RADAR 19 JUN 02170 18:02:00 TWX 27
16 RADAR 19 JUN 02170 18:12:00 TWX 27
17 RADAR 19 JUN 02170 18:22:00 TWX 27
imglist.k: done
After doing this, I verified that weather2 would serve the data to
machines not in the niu.edu domain by pointing my McIDAS-X session at
weather2 for the NEXRCOMP dataset. All worked as expected.
The next thing you will NEED to do is set up data scouring for the
composites. Since the files are large, you will need to attend to this
without much delay.
(As noted in a previous email) the way to set up data scouring is to
FTP the C shell script prune_gini.csh from the pub/ldm5/scour
directory of our anonymous FTP server, ftp.unidata.ucar.edu. Put this
script in a directory in the PATH of 'ldm' on weather2, and set its
permissions so that it is executable:
chmod +x prune_gini.csh
Then, edit prune_gini.csh and set the three variables that are needed
in the scour:
PATH
KEEP
areadir
PATH defines the search PATH for recursive invocations of prune_gini.csh.
For instance, if you put prune_gini.csh in the ~ldm/decoders directory,
PATH would become:
setenv PATH /home/ldm/decoders:${PATH}
You set KEEP to be the number of data files you want to keep on disk
at any time. The default is set to 12.
You set 'areadir' to the location of a directory under which lie the
subdirectories that contain the images you want to scour. Since
the images are being written into the /home/ldm/data/nexrad/NEXRCOMP/...
directories, I would set 'areadir' to /home/ldm/data/nexrad/NEXRCOMP:
set areadir=/home/ldm/data/nexrad/NEXRCOMP
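If it helps to see what the script boils down to, here is a minimal
Bourne shell sketch of the same idea (this is NOT prune_gini.csh
itself, which is written in csh and handles recursion; it just shows
the keep-the-N-newest logic using the KEEP and areadir settings
above):

```shell
#!/bin/sh
# Sketch of the scouring logic: keep the KEEP newest files in each
# product subdirectory (n0r, n1p, ntp) under areadir; delete the rest.
KEEP=12
areadir=${AREADIR:-/home/ldm/data/nexrad/NEXRCOMP}

scour() {
    for dir in "$1"/*/; do
        [ -d "$dir" ] || continue
        # 'ls -t' lists newest first; everything past line KEEP goes.
        (cd "$dir" && ls -t | tail -n +$((KEEP + 1)) | xargs rm -f)
    done
}

scour "$areadir"
```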
Finally, you need to set up a crontab entry to run the script. Since
the images come in more-or-less every 10 minutes, you need to run the
scour frequently, say every 15 to 30 minutes. The reason for this is
that ten 1 km N0R national products will consume 140 MB per hour!
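As a starting point, a crontab entry for 'ldm' along these lines
would do (the script path assumes you put prune_gini.csh in
~ldm/decoders, as in the PATH example above; adjust to your setup):

```crontab
# Scour the NEXRCOMP composites every 20 minutes
0,20,40 * * * * /home/ldm/decoders/prune_gini.csh
```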
I leave the scouring setup to you.
Tom