[Date Prev][Date Next][Thread Prev][Thread Next][Date Index][Thread Index]
19990810: FSL2 profiler pattern
- Subject: 19990810: FSL2 profiler pattern
- Date: Tue, 10 Aug 1999 12:52:58 -0600
Since this list was included in the original trouble report
on pattern matching the FSL2 profiler products yesterday,
I am sending the resolution of the problem as well, so
that lingering questions about whether the problem
was OS-specific or a pattern/action problem can be answered.
**************************************************************
From address@hidden Tue Aug 10 10:07:01 1999
by unidata.ucar.edu (8.8.8/8.8.8) with SMTP id KAA27980;
Tue, 10 Aug 1999 10:07:01 -0600 (MDT)
Message-Id: <address@hidden>
To: Gilbert Sebenste <address@hidden>
cc: Robb Kambic <address@hidden>,
General Support <address@hidden>
Subject: 19990810: Beware, beware!
In-reply-to: Your message of "Tue, 10 Aug 1999 10:49:24 CDT."
Date: Tue, 10 Aug 1999 10:07:01 -0600
From: Unidata Support <address@hidden>
>
>
>But, here are my entries I use to capture it. Standard!
>
>#
># F S L P R O F I L E R S E C T I O N
>#
>#
># This is the stuff I want.
>#
>#FSL2 ^FSL\.NetCDF\.NOAAnet\.windprofiler\.01hr\.(.*)\..*
># FILE profiler/\1%m%d\3.hr
>#
>#FSL2 ^FSL\.NetCDF\.NOAAnet\.windprofiler\.06min\.(.*)\..*
># FILE profiler/\1%m%d\3.six
>
Gilbert,
Looking at the pattern/action pairs shown above, note that each
pattern contains only one group enclosed in parentheses, e.g. (.*),
yet the corresponding FILE action references both \1 and \3.
The problem you are seeing is that \3 is undefined, so the
filename is being filled with garbage.
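The failure mode can be illustrated outside of pqact with ordinary
regular expressions. The sketch below is a minimal Python analogue, not
LDM code: the product identifier string is hypothetical, and Python's
re module raises an error for an out-of-range backreference where pqact
instead substitutes garbage, but the underlying rule is the same: an
action may only reference as many groups as the pattern captures.

```python
import re

# Hypothetical product identifier resembling the FSL2 profiler feed
product = "FSL.NetCDF.NOAAnet.windprofiler.01hr.19990810.nc"

# One parenthesized group, so an action may reference \1 only.
pattern = re.compile(r"FSL\.NetCDF\.NOAAnet\.windprofiler\.01hr\.(.*)\..*")

m = pattern.match(product)
print(m.expand(r"profiler/\1"))   # OK: one group, one reference

try:
    m.expand(r"profiler/\1\3")    # \3 has no matching group
except re.error as exc:
    print("invalid reference:", exc)
```

In pqact the analogous mistake is silent, which is why the resulting
filenames contain uninitialized data rather than an error message.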
Steve Chiswell
Unidata User Support
************************************************************************
At present, it is left up to the person creating the pqact.conf
entry to ensure that the backreferences used in the action do not
exceed the number of parenthesized groups in the pattern. We will
look into ways of catching these errors more gracefully.
One other note concerns the products arriving at "double" the
expected size. In general, the LDM product queue avoids duplicate
products by checking whether a product is already in the queue.
However, if you delete the product queue and recreate it, you
will receive previously obtained data again if it is still available
from the upstream feed site, since you no longer have the data in
your queue to check against. As a result, if you are merely using
FILE actions, the additional data products will simply be appended
to the already existing file. If the "-overwrite" option is used,
then each FILE action will instead overwrite the previously
existing file.
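The append-versus-overwrite distinction above is what doubles the
file sizes. The following is a toy Python model of that behavior, not
LDM code; the function name and paths are made up for illustration.
It mimics a pqact FILE action, which appends by default and truncates
first when the "-overwrite" flag is given:

```python
import os
import tempfile

def file_product(path: str, data: bytes, overwrite: bool = False) -> None:
    """Toy model of a pqact FILE action: append by default,
    truncate first when the -overwrite flag is given."""
    mode = "wb" if overwrite else "ab"
    with open(path, mode) as f:
        f.write(data)

path = os.path.join(tempfile.mkdtemp(), "0810.hr")
product = b"profiler-data"

# Receiving the same product twice doubles the file size...
file_product(path, product)
file_product(path, product)
print(os.path.getsize(path))            # 2 * len(product)

# ...while -overwrite leaves exactly one copy.
file_product(path, product, overwrite=True)
print(os.path.getsize(path))            # len(product)
```

This is why, after recreating the queue, duplicate products FILE'd
without -overwrite show up as files twice the expected size.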
I apologize in advance to those who did not wish to receive this
information, but in deference to those who might otherwise be led
to believe that no support solution was available, we hope that
this explains the previously reported problem.
Steve Chiswell