20010806: cleanup script for xcd data
- Subject: 20010806: cleanup script for xcd data
- Date: Mon, 06 Aug 2001 17:42:54 -0600
>From: Wayne Bresky <address@hidden>
>Organization: Cornell
>Keywords: 200108061802.f76I2w118924 McIDAS-XCD mcsour.sh
Wayne,
>Is there a cleanup script specific to the xcd data? I modified the
>mcscour script to include the xcd directory in its search path,
mcscour.sh _is_ the script that will scour the XCD data. How much data
ends up staying in the directory depends on how much data you decide
to keep. Modifying mcscour.sh to include the XCD directory in its search
path is not necessary as long as the other environment variables are
set up correctly.
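For reference, the top of mcscour.sh is ordinary Bourne shell that sets
those environment variables before any scouring is done. A minimal sketch,
assuming a typical ~mcidas installation (the paths below are examples only,
not your actual configuration):

  #!/bin/sh
  # sketch of the environment setup at the top of mcscour.sh; paths are examples
  MCDATA=/home/mcidas/workdata
  MCPATH=${MCDATA}:/data/xcd:/home/mcidas/data:/home/mcidas/help
  PATH=/home/mcidas/bin:$PATH
  export MCDATA MCPATH PATH
  cd $MCDATA

As long as the XCD data directory shows up in MCPATH, the scouring commands
will find the XCD files without any other change to the script.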
>but I
>still see over 3 GB of data in that directory. Would you say this
>amount is typical to be saving? Thanks.
It really depends on what you are decoding and what you are trying to
save. Here are the settings I recommend for a site that wants to keep
one day's worth of GRID data and three days' worth of MD data (a sketch
of how these are run from mcscour.sh follows the commands):
qrtmdg.k GRID 5001 6000 1
doqtl.k 1 70 3
doqtl.k 71 80 3
doqtl.k 81 90 3
doqtl.k 91 100 3
delwxt.k 1 10
igu.k DEL 132
lwu.k DEL VIRT9001
lwu.k DEL VIRT9002
lwu.k DEL ROUTEPP.LOG
exit
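In mcscour.sh those commands are not typed one by one at the shell; they are
fed to a single McIDAS session started with mcenv. A rough sketch of that
part of the script, assuming the environment setup shown above:

  # sketch: the scour commands above are piped into one mcenv session
  mcenv << EOF
  qrtmdg.k GRID 5001 6000 1
  # ... the rest of the commands listed above ...
  exit
  EOF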
The 'delwxt' entry in this list is the thing that scours the XCD index
and other files like *.XCD. The problem is that each .XCD file contains
the full set of textual data for an entire day, so each one of these files
is about 300 MB in size. The other big disk user is typically GRID data
files (model output decoded into McIDAS GRID format). If one decides to
keep more than one day of all GRID files, then more than 3 GB of disk
can easily be used.
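If you want to see exactly where the space is going on your system, a quick
look at the data directory usually makes it obvious. A simple check, with the
example path below replaced by your actual XCD data directory:

  cd /data/xcd          # example path; use your XCD data directory
  ls -l *.XCD           # the daily text files, roughly 300 MB apiece
  du -sk * | sort -n    # everything in the directory, largest consumers last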
To specifically answer your question, the machine we have set up to use
as a data server for the McIDAS workshop that is currently underway has
about 5 GB of data in it. This number will drop to about 3 GB after
mcscour.sh is run (by cron) at 21:00.
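For completeness, the cron entry is nothing special; it just runs the script
once a day. Something along these lines, with the path adjusted to wherever
your copy of mcscour.sh lives:

  # example crontab entry: run the McIDAS scour script every day at 21:00
  00 21 * * * /home/mcidas/bin/mcscour.sh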
Tom
>From address@hidden Tue Aug 7 07:13:22 2001
Thanks again, Tom.
Wayne