This archive contains answers to questions sent to Unidata support through mid-2025. Note that the archive is no longer being updated. We provide the archive for reference; many of the answers presented here remain technically correct, even if somewhat outdated. For the most up-to-date information on the use of NSF Unidata software and data services, please consult the Software Documentation first.
I have also been able to uncompress the ARX NIDS data, but I needed to do some tricks to get it to work. This is what I did... it's pretty ugly! It turns out that NWS has split the NIDS file into 4000-byte blocks and then compressed each block using the zlib compress utility. The compressed blocks are then sent out sequentially as part of the whole product, which means you have to separate the blocks in order to decode them.

The problem is that you don't know the size of the compressed blocks, so the uncompress utility doesn't work very effectively. The block sizes are buried in the SBN header, which is masked off by the LDM, so you have to guess what the block sizes are. The uncompress utility will return the size of the uncompressed data but won't reveal the size of the compressed data. What I've done is advance through the data 1000 bytes at a time and then try to uncompress it. If the function returns an error, I step through the data a byte at a time until I get a valid return from the uncompression. This is just plain ugly... what I'm looking for is a simpler solution! If there were another utility that gave me the compressed block sizes, it would make my job much easier!

BTW, I read that the uncompress utility will uncompress all the data if the file is memory mapped??? Otherwise, it only works a block at a time. So Pete, can you forward the Perl script so I can check what functions are used?

Dan... from cloudy Chicago!
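To make the workaround concrete, here is a minimal sketch in C of the byte-by-byte scan described above. It assumes zlib's uncompress() interface, a whole-product buffer read from a file named on the command line, and an invented MAX_UNCOMP output bound; it is only an illustration of the scanning idea, not the actual decoder.

    #include <stdio.h>
    #include <stdlib.h>
    #include <zlib.h>

    /* Assumed upper bound on one block's uncompressed size (illustrative only). */
    #define MAX_UNCOMP (128 * 1024)

    /*
     * Starting at 'start', try each successive offset in 'prod' until zlib's
     * uncompress() accepts the bytes there as a complete stream.  Returns the
     * offset where a valid block begins (with *outlen set to its uncompressed
     * size), or -1 if no block is found.
     */
    static long find_block(const unsigned char *prod, long prodlen, long start,
                           unsigned char *out, uLongf *outlen)
    {
        for (long off = start; off < prodlen; off++) {
            *outlen = MAX_UNCOMP;
            if (uncompress(out, outlen, prod + off, (uLong)(prodlen - off)) == Z_OK)
                return off;          /* a valid zlib stream begins here */
        }
        return -1;
    }

    int main(int argc, char **argv)
    {
        if (argc != 2) {
            fprintf(stderr, "usage: %s nids_product_file\n", argv[0]);
            return 1;
        }

        FILE *fp = fopen(argv[1], "rb");
        if (fp == NULL) { perror("fopen"); return 1; }

        /* Slurp the whole product into memory. */
        fseek(fp, 0, SEEK_END);
        long prodlen = ftell(fp);
        rewind(fp);
        unsigned char *prod = malloc((size_t)prodlen);
        unsigned char *out  = malloc(MAX_UNCOMP);
        if (prod == NULL || out == NULL ||
            fread(prod, 1, (size_t)prodlen, fp) != (size_t)prodlen) {
            fprintf(stderr, "read failed\n");
            return 1;
        }
        fclose(fp);

        /* Walk the product, pulling out one compressed block at a time. */
        long   off = 0;
        int    nblocks = 0;
        uLongf outlen;
        while ((off = find_block(prod, prodlen, off, out, &outlen)) != -1) {
            printf("block %d at offset %ld, %lu uncompressed bytes\n",
                   ++nblocks, off, (unsigned long)outlen);
            /* The compressed size still isn't known, so just step past the
             * start of this block and keep scanning.  (The message above
             * jumps ahead ~1000 bytes between attempts to save time.) */
            off += 1;
        }

        free(prod);
        free(out);
        return 0;
    }

Note that uncompress() reports only the uncompressed length, not how many input bytes it consumed, which is exactly why the scan has to step through the data looking for the start of the next block.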