
Re: [conduit] Missing GFS files on conduit this morning



Hi Users,

The major data center outage is unfortunately still ongoing. Admins have been working nonstop with Level 3 support to isolate the issue and restore services. We apologize for the impact this is having on your operations. Please let me know if you have any questions.

Carissa Klemmer

On Tuesday, February 20, 2018, Ryan Hickman <address@hidden> wrote:
I, too, have the GFS 12 UTC GRIB2 files in a publicly available location, should anyone wish to access them.

https://storage.googleapis.com/ah-maps-raw/GFS_20180220120000_20180220120000.grib2 (F001)
https://storage.googleapis.com/ah-maps-raw/GFS_20180220120000_20180220130000.grib2 (F002)
etc. etc. through 120 hours.
https://storage.googleapis.com/ah-maps-raw/GFS_20180220120000_20180225120000.grib2 (F120)
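
For anyone who wants to grab the whole run by script, here is a minimal sketch. It assumes the second timestamp in each filename is the valid time and that files exist at hourly steps out to +120 h, which is inferred from the listing above rather than documented anywhere:

from datetime import datetime, timedelta
import urllib.request

BASE = "https://storage.googleapis.com/ah-maps-raw"
run = datetime(2018, 2, 20, 12)  # the 12 UTC run from the listing

for hour in range(0, 121):
    # Build the valid time, format both timestamps as YYYYMMDDHHMMSS,
    # and fetch the file into the current directory.
    valid = run + timedelta(hours=hour)
    name = "GFS_{:%Y%m%d%H%M%S}_{:%Y%m%d%H%M%S}.grib2".format(run, valid)
    urllib.request.urlretrieve("{}/{}".format(BASE, name), name)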

On Tue, Feb 20, 2018 at 11:45 AM, Patrick L. Francis <address@hidden> wrote:

somebody just asked where the files are when they looked in this directory :-)

my fault... I didn't explain very well... sorry!

since GEMPAK is being phased out, I am writing a custom ingest package for our internal use, filing the products as each source sends them over NOAAPort

for example, the GFS, which was brought up today, comes over the dish as:

 71550 20180220035934.747355  NGRID 19629065  YRPK40 KWBC 200000 !grib2/ncep/GFS/#003/201802200000F072/RELH/400 hPa PRES

where:
'ncep' is the source
'GFS' is the model
'#003' is the grid
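
A quick sketch of pulling those fields out of the product ID programmatically. The field layout (!grib2/source/model/#grid/runtimeFfff/param/level) is inferred from the single example line above, so treat the pattern as an assumption rather than a spec:

import re

# Field layout inferred from the one example header; real feeds may vary.
PRODUCT_ID = re.compile(
    r"!grib2/(?P<source>\w+)/(?P<model>\w+)/#(?P<grid>\d+)/"
    r"(?P<runtime>\d{12})F(?P<fhour>\d{3})/(?P<param>\w+)/(?P<level>.+)"
)

line = ("YRPK40 KWBC 200000 !grib2/ncep/GFS/#003/"
        "201802200000F072/RELH/400 hPa PRES")
m = PRODUCT_ID.search(line)
if m:
    print(m.group("source"), m.group("model"), m.group("grid"))
    # -> ncep GFS 003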

so I chose to file things as
../root/www/dir/noaaport/dish/'source'/'model'/'grid'/

and then, instead of a crazy filename, simply pipe everything into
YYYYMMDD.HHHH.{F}HHH.grib

so all variables for each product are filed by model type under the corresponding date, hour, and forecast hour, as in the sketch below.
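
In code, that filing scheme might look like this. The helper name is hypothetical, and the exact timestamp widths (HHHH read as the four-digit run hour/minute, FHHH as a zero-padded forecast hour) are my assumptions from the description above:

import os
from datetime import datetime

def grib_path(root, source, model, grid, runtime, fhour):
    """Destination path for one GRIB2 product under the scheme above."""
    fname = "{:%Y%m%d}.{:%H%M}.F{:03d}.grib".format(runtime, runtime, fhour)
    return os.path.join(root, source, model, grid, fname)

# e.g. the RELH product from the header above (root shortened here):
print(grib_path("noaaport/dish", "ncep", "GFS", "003",
                datetime(2018, 2, 20, 0), 72))
# -> noaaport/dish/ncep/GFS/003/20180220.0000.F072.grib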

then with NIDS data, for example, I file all products by:
../root/www/dir/noaaport/dish/nids/'source'/'radome'/'prod'/
YYYYMMDD.HHMM.nid
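
Same idea for the NIDS layout; 'radome' and 'prod' are placeholders straight from the description, and the helper name is again hypothetical:

import os

def nids_path(root, source, radome, prod, obtime):
    """Destination path for one NIDS product under the scheme above."""
    return os.path.join(root, "nids", source, radome, prod,
                        "{:%Y%m%d}.{:%H%M}.nid".format(obtime, obtime))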

so to answer that unspoken question: while this is a work in progress for me internally, of course I don't mind sharing with the Unidata community if you should need to grab something... Unidata is good people :-)

Eventually I hope to file every single product that comes over NOAAPort. Instead of copying from pqact.gempak, I'm writing everything manually by parsing the notifyme output of every feed, which just takes a while. But this way, products will be usable natively with MetPy and other new apps being developed :-)
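
Once products are filed this way, they can indeed be picked up directly by newer tools. A hedged example of one route (the cfgrib engine is an assumption; pygrib or any other GRIB-capable reader would do), loading a filed product into xarray for use with MetPy's accessors:

import xarray as xr

# Requires the cfgrib package; the path follows the filing scheme above.
ds = xr.open_dataset("noaaport/dish/ncep/GFS/003/20180220.0000.F072.grib",
                     engine="cfgrib")
print(ds)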


cheers,

--patrick

-----------------------
Patrick L. Francis
AerisWeather.com




--
Carissa Klemmer
NCEP Central Operations
Dataflow Team Lead
301-683-3835

_______________________________________________
NOTE: All exchanges posted to Unidata maintained email lists are
recorded in the Unidata inquiry tracking system and made publicly
available through the web.  Users who post to any of the lists we
maintain are reminded to remove any personal information that they
do not want to be made public.


conduit mailing list
address@hidden
For list information or to unsubscribe, visit: http://www.unidata.ucar.edu/mailing_lists/