This archive contains answers to questions sent to Unidata support through mid-2025. Note that the archive is no longer being updated. We provide the archive for reference; many of the answers presented here remain technically correct, even if somewhat outdated. For the most up-to-date information on the use of NSF Unidata software and data services, please consult the Software Documentation first.
> Actually what I am seeing is the multiple opens/closes causing the
> problems, which is why the multiple files per forecast time cause
> problems. The solution is to up the number of files that can be open
> simultaneously by dcgrib2.

Harry,

I'd be happy to up the number in the distribution. The value that comes from
NCEP is 3, and I upped it to 5 years ago, when that was the number of ETA
projections coming in and the LDM had a limit of 32 open streams, but there
is no reason not to up it now.

Steve Chiswell
Unidata User Support

> Unidata GEMPAK Support wrote:
> >> Did you get my message about the ens002 files?
> >
> > Yes, these are typically the legacy "mrf" products. I believe that NWS
> > isn't going to transition them to GRIB2, but they are still part of the
> > CONDUIT stream at the moment.
> >
> > The thinned grids are the most time consuming to stitch together, since
> > doing so requires the ability to read back the grid and insert the new
> > bytes. I keep the grid in memory to avoid that if the pieces are in
> > succession, but otherwise, if they are not in order, there would be an
> > I/O hit. Given the number of GEFS ensembles in CONDUIT, you may want to
> > comment out the thinned grid ensembles, e.g.:
> >
> > # Global Ensemble grids also on grids [I-P]
> > #HDS ^H..... KWBK ([0-3][0-9])([0-2][0-9])
> > #       PIPE decoders/dcgrib2 -d data/gempak/logs/dcgrib2_GFSensg.log
> > #       -e GEMTBL=/home/gempak/NAWIPS/gempak/tables
> >
> >> Also, why are these split up by forecast hour instead of put into one
> >> file?
> >>
> >> CONDUIT data/nccf.*/(ge[cp][0-9][0-9])\.t[0-2][0-9]z\.pgrb2.* !(.*)!
> >>     PIPE gdecoders/dcgrib2 -v 1 -d
> >>         data/gempak/logs/dcgrib2_CONDUITens.log
> >>         -e GEMTBL=/usr/local/ldm/NAWIPS-5.10.3i/gempak/tables
> >>         data/gempak/model/ens/\1_YYYYMMDDHHfFFF.gem
> >> #
> >> # For now, don't mix TIGGE files with other ensembles since not all
> >> # parameters are available for all members
> >> # data2/TIGGE/gep07.t06z.pgrb2cf00
> >> CONDUIT data2/TIGGE/(ge[cp][0-9][0-9])\.t[0-2][0-9]z\.pgrb2.* !(.*)!
> >>     PIPE gdecoders/dcgrib2 -v 1 -d
> >>         data/gempak/logs/dcgrib2_CONDUITens.log
> >>         -e GEMTBL=/usr/local/ldm/NAWIPS-5.10.3i/gempak/tables
> >>         data/gempak/model/tigge_gefs/\1_YYYYMMDDHHfFFF.gem
> >
> > I used NCEP's file structure, where the forecast hours are in individual
> > files, due to the expanding number of parameters and levels that are
> > being produced; it is also sort of a remnant of when we stuck all members
> > into a single file and tacked on the Cxxx, Pxxx and Nxxx extensions.
> >
> > The size of all hours in a single file would only be about 100 MB at the
> > moment, which isn't bad, and it would make the disk directory easier to
> > glean. The primary concern is to have the individual members in separate
> > files for compatibility with the ens_ functions, so that may be the best
> > solution.
> >
> > I kept the TIGGE parameters separate since the number of members and
> > levels was less than the rest, which caused some processing problems,
> > but otherwise it is not an issue.
> >
> > Steve Chiswell
> > Unidata User Support
> >
> >> Unidata User Support wrote:
> >>
> >>>> I am running into an NFS problem with the dcgrib2 decoder and the
> >>>> GFSens files. I believe the problem is the constant opening and
> >>>> closing of grid files, since only 5 (MMFILE in gemprm.h) are allowed
> >>>> to be open at a time.
> >>>> Can I just change the value of MAXGOPN in the main routine
> >>>> (dcgrib.c), or do I need to change MMFILE in gemprm.h and rebuild
> >>>> GEMPAK?
> >>>
> >>> Harry,
> >>>
> >>> The value MMFILE would have to be changed in gemprm.h and GEMPRM.PRM,
> >>> and the entire distribution rebuilt.
> >>>
> >>> I noticed that NWS added grid 38 to the thinned ensembles, where
> >>> previously only grids 39-40 existed. I modified the gribkey.tbl to:
> >>>
> >>> ! Ensemble spectral AVN/MRF grids
> >>> 007 2 ??? 037 data/gempak/model/ens/YYYYMMDDHH_ensthin.gem 20000
> >>> 007 2 ??? 038 data/gempak/model/ens/YYYYMMDDHH_ensthin.gem 20000
> >>> 007 2 ??? 039 data/gempak/model/ens/YYYYMMDDHH_ensthin.gem 20000
> >>> 007 2 ??? 040 data/gempak/model/ens/YYYYMMDDHH_ensthin.gem 20000
> >>> 007 2 ??? 041 data/gempak/model/ens/YYYYMMDDHH_ensthin.gem 20000
> >>> 007 2 ??? 042 data/gempak/model/ens/YYYYMMDDHH_ensthin.gem 20000
> >>>
> >>> That should help in only having to open one ensthin file instead of
> >>> multiple YYYYMMDDHHfFFF_ens@@@.gem files.
> >>>
> >>> Steve Chiswell
> >>> Unidata User Support
> >>>
> >>> Ticket Details
> >>> ===================
> >>> Ticket ID: UDO-760622
> >>> Department: Support GEMPAK
> >>> Priority: Normal
> >>> Status: Closed
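
---

[Archive editor's note] For anyone adapting the CONDUIT pqact entries shown
above, the member-capturing part of the pattern can be sanity-checked outside
the LDM; this particular pattern happens to be valid both as an LDM extended
regular expression and in Python's `re` module. A minimal check, using the
example product path from the comment in the TIGGE entry:

```python
import re

# Pattern from the pqact.conf entry quoted in the thread above
pattern = r"data2/TIGGE/(ge[cp][0-9][0-9])\.t[0-2][0-9]z\.pgrb2.*"

# Example product path taken from the comment in that entry
product = "data2/TIGGE/gep07.t06z.pgrb2cf00"

m = re.search(pattern, product)
assert m is not None
# \1 in the PIPE action becomes the ensemble member name:
print(m.group(1))  # -> gep07
```

The captured group is what `\1` expands to in the PIPE action's output
filename (e.g. `data/gempak/model/tigge_gefs/gep07_YYYYMMDDHHfFFF.gem`).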
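[Archive editor's note] The open/close churn discussed in this thread is the
classic thrashing that occurs when more distinct output files are active than
the open-file limit (MMFILE) allows. The toy sketch below, with hypothetical
names and not GEMPAK code, illustrates it: six GEFS member files written
round-robin through a 5-slot LRU cache of handles means every single write
forces a close and reopen.

```python
from collections import OrderedDict

MMFILE = 5  # GEMPAK's open-file limit from gemprm.h; value from the thread


class FileCache:
    """Toy LRU cache of open file handles (hypothetical, for illustration)."""

    def __init__(self, limit=MMFILE):
        self.limit = limit
        self.open_files = OrderedDict()  # path -> handle (dummy objects here)
        self.opens = 0                   # count of (re)open operations

    def get(self, path):
        if path in self.open_files:
            self.open_files.move_to_end(path)   # mark most recently used
            return self.open_files[path]
        if len(self.open_files) >= self.limit:
            self.open_files.popitem(last=False)  # "close" least recently used
        self.opens += 1
        handle = object()  # stand-in for a real open file
        self.open_files[path] = handle
        return handle


# Six member files cycled through 5 slots: every access misses the cache.
cache = FileCache(limit=5)
files = [f"gep{n:02d}_2007010100f012.gem" for n in range(6)]
for _ in range(3):        # three passes over the members
    for f in files:
        cache.get(f)
print(cache.opens)        # -> 18: all 18 accesses reopened a file
```

Raising the limit to 6 in this sketch drops the count to 6 opens total, which
is the effect Steve describes of upping MMFILE (or the decoder's own limit)
when more ensemble member files are in play.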