This archive contains answers to questions sent to Unidata support through mid-2025. Note that the archive is no longer being updated. We provide the archive for reference; many of the answers presented here remain technically correct, even if somewhat outdated. For the most up-to-date information on the use of NSF Unidata software and data services, please consult the Software Documentation first.
>From: Owen Cooper <address@hidden>
>Organization: Aeronomy Laboratory/NOAA
>Keywords: 200406080008.i5808gtK000586 McIDAS SCHEMA Flexpart

Hi Owen,

>Thank you for the quick reply. Before I try writing a new SCHEMA I need
>to get a better understanding of how the SCHEMA format actually works.

OK.

>I have attached a text file that my current schema converts to an MD
>file. The first few lines of the text file are:
>
>LON LAT CO
>UNIT=*-1 X X
> -141.5 9.5 1.55309999
> -140.5 9.5 0.997270942
> -139.5 9.5 0.696673512
> -138.5 9.5 1.1179049
> -137.5 9.5 0.6285339
> -136.5 9.5 0.888016224
> -135.5 9.5 2.95988703
> -134.5 9.5 4.80974627
> -133.5 9.5 7.33905411
> -132.5 9.5 6.97319412
> -131.5 9.5 4.3494668
> -130.5 9.5 3.28649211
>
>The SCHEMA file is:
>
>" NAME VSN DATE ID TEXTID
>" ---- --- ---- -- ------
>SCHEMA FLXP 1 01168 0 "FLEXPART CO CONCENTRATIONS
>
>"KEY SCALE UNITS DESCRIPTION
>"--- ----- ----- -----------
>
>ROWS 90 "90 ROWS
>
>LAT 4 DEG "LATITUDE (DECIMAL DEG)
>
>COLUMNS 360 "DEFAULT # OF COLS
>
>LON 4 DEG "LONGITUDE (DECIMAL DEG)
>
>DATA "START OF DATA SECTION
>
>CO 2 CON "CO CONCENTRATION
>
>ENDSCHEMA

>What I don't get is this:
>The SCHEMA thinks the latitude is a row with 90 entries, and the
>longitude is a column with 360 entries.

I would rephrase this to say that the schema dictates that the latitude
will be represented in the ROWS of a 3D matrix; longitude will be written
into the COLUMNS of a 3D matrix; and the CO values will be written into
the DATA dimension of a 3D matrix. The schema specifies where each of
these items is to be read/written.

>And then the CO values follow
>as data. But the lat lon and CO are all written to the text file as
>side-by-side columns, each with 13,998 entries. So I guess I just
>don't understand how it knows how to treat each column.

The first line in the data file defines what each column of data is. In
this case, column 1 values represent longitude (negative west); column 2
values represent latitude (positive north); and column 3 values represent
the data at that Lat,Lon point. The first line is read by the conversion
routine, TXT2MD, so that it knows what the values in each column
represent.

The second line -- which is optional -- specifies how to transform the
values in the various columns. In this case, the longitudes are
multiplied by '-1' to convert them to the west-positive convention. The
LAT and CO values are left untouched (the 'X' says take the default
action, which is to do nothing).

TXT2MD also reads the schema that is registered in SCHEMA and sets up
write routines that know where the LAT, LON, and CO values should be
written in the MD file.
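To make that layout concrete, here is a rough sketch (in Python, purely
for illustration; the output file name, the variable names, and the three
sample points are my own assumptions, not anything TXT2MD requires) of
writing a text file in this form: a first line of schema key names, an
optional UNIT line of per-column transforms, and then one line of values
per grid point.

# Illustrative only: write a whitespace-delimited text file whose first
# line names the schema keys and whose optional second line gives the
# per-column transforms (*-1 flips LON to west positive; X means "leave
# the value as is").
sample_points = [
    (-141.5, 9.5, 1.55309999),
    (-140.5, 9.5, 0.997270942),
    (-139.5, 9.5, 0.696673512),
]

with open("flexpart_co.txt", "w") as out:
    out.write("LON       LAT    CO\n")      # line 1: key names
    out.write("UNIT=*-1  X      X\n")       # line 2 (optional): transforms
    for lon, lat, co in sample_points:
        # one grid point per line; column spacing carries no meaning
        out.write(f"{lon:>9.1f} {lat:>6.1f} {co:>14}\n")

The column positions themselves carry no special meaning; TXT2MD learns
what each column holds from that first line.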
re: how often the output file needs to be created

>(but I need to make it for just
>one time per file...this is because I make these files in real-time as
>soon as the data are available, every 3 hours)

You don't need to make the MD files so that there is only one time per
file. TXT2MD can append to an existing file, and I am suggesting that you
use DAY and TIME in the ROW headers. Each time you have a new TIME in the
same DAY, you will simply be adding a new ROW to the output MD file.

>With the new SCHEMA that you suggested would I enter the date
>info. first, and then tack on the lat lon and CO columns? Like this?
>
>Day Time
>2004160 12:00
>LON LAT CO
>UNIT=*-1 X X
> -141.5 9.5 1.55309999
> -140.5 9.5 0.997270942
> -139.5 9.5 0.696673512
> -138.5 9.5 1.1179049
> -137.5 9.5 0.6285339
> -136.5 9.5 0.888016224
> -135.5 9.5 2.95988703
> -134.5 9.5 4.80974627
> -133.5 9.5 7.33905411
> -132.5 9.5 6.97319412
> -131.5 9.5 4.3494668
> -130.5 9.5 3.28649211

No, you will have to include the DAY and TIME in each line, like:

LON       LAT   CO            DAY      TIME
UNIT=*-1  X     X             X        X
-141.5    9.5   1.55309999    2004160  12:00
-140.5    9.5   0.997270942   2004160  12:00
-139.5    9.5   0.696673512   2004160  12:00
-138.5    9.5   1.1179049     2004160  12:00
-137.5    9.5   0.6285339     2004160  12:00
-136.5    9.5   0.888016224   2004160  12:00
-135.5    9.5   2.95988703    2004160  12:00
-134.5    9.5   4.80974627    2004160  12:00
-133.5    9.5   7.33905411    2004160  12:00
-132.5    9.5   6.97319412    2004160  12:00
-131.5    9.5   4.3494668     2004160  12:00
-130.5    9.5   3.28649211    2004160  12:00

Or, equivalently, as:

DAY      TIME   LON     LAT   CO
UNIT=X   X      *-1     X     X
2004160  12:00  -141.5  9.5   1.55309999
2004160  12:00  -140.5  9.5   0.997270942
2004160  12:00  -139.5  9.5   0.696673512
2004160  12:00  -138.5  9.5   1.1179049
2004160  12:00  -137.5  9.5   0.6285339
2004160  12:00  -136.5  9.5   0.888016224
2004160  12:00  -135.5  9.5   2.95988703
2004160  12:00  -134.5  9.5   4.80974627
2004160  12:00  -133.5  9.5   7.33905411
2004160  12:00  -132.5  9.5   6.97319412
2004160  12:00  -131.5  9.5   4.3494668
2004160  12:00  -130.5  9.5   3.28649211

It actually doesn't matter which column you put any particular parameter
value in, as long as the first line correctly specifies the ordering.
Also, the spacing between the key names in the first line means nothing,
so the following are all equivalent:

DAY TIME LON LAT CO
DAY  TIME  LON  LAT  CO
DAY      TIME     LON      LAT      CO

The only reason I put multiple spaces between the key names is so that
they line up with the columns of values and are easier to read. One thing
that is NOT optional, however, is the length of the key names: each must
be between 1 and 4 characters long.

>Thanks

No worries.

Cheers,

Tom

--
NOTE: All email exchanges with Unidata User Support are recorded in the
Unidata inquiry tracking system and then made publicly available through
the web. If you do not want to have your interactions made available in
this way, you must let us know in each email you send to us.