[netCDF #DQV-341880]: Compressing large sets of NetCDF data
- Subject: [netCDF #DQV-341880]: Compressing large sets of NetCDF data
- Date: Sun, 25 Feb 2018 13:30:56 -0700
I am not sure what can be done. You will probably need to
find a different compression algorithm. The HDF Group
provides access to a variety of third-party filters that you
might check out. The list is here:
https://support.hdfgroup.org/services/contributions.html#filters
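
Since netCDF-4 files are HDF5 files underneath, one way to apply such a
filter is h5repack on a copy of the file, once the filter's plugin is
installed. A rough sketch only, using the registered BZIP2 filter (ID 307);
the plugin path is illustrative, and the UD= parameter syntax differs
between h5repack releases, so check h5repack --help on your system first:

    # Point HDF5 at the directory holding the compiled filter plugin
    # (path is illustrative).
    export HDF5_PLUGIN_PATH=/path/to/hdf5/plugins
    # Re-filter every dataset: filter ID 307 (BZIP2), one parameter
    # (block size 9). Check how your h5repack release spells UD=.
    h5repack -f UD=307,1,9 input.nc output.nc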
>
> I have sets of model inputs/outputs as netCDF totaling about 8 TB. I have
> already compressed them with nccopy -d 2, but the result still isn't small
> enough.
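
One thing to try before switching algorithms: deflate alone often does
poorly on floating-point model output, and enabling the shuffle filter
alongside it (nccopy's -s option) usually improves the ratio at little
extra cost. A quick sketch (file names are illustrative):

    # Shuffle reorders the bytes of each value across a chunk so
    # deflate sees longer runs of similar bytes; often a real win
    # for floating-point fields.
    nccopy -d 2 -s input.nc output.nc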
>
> I had to delete the uncompressed netCDF files as I ran the model to avoid
> running out of space. I tried nccopy -d 9 on the already compressed
> output, and it did not reduce the size significantly, or at all in some
> cases.
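
That result is expected: nccopy decompresses the data as it reads and
re-deflates it on write, so -d 9 is the same DEFLATE algorithm at a
slower setting, and levels above roughly 5 usually gain only a percent
or two. You can confirm what a file already carries from the hidden
attributes that ncdump -s exposes:

    # -h: header only; -s: also show virtual attributes such as
    # _DeflateLevel, _Shuffle, and _ChunkSizes for each variable.
    ncdump -h -s output.nc | grep -E '_DeflateLevel|_Shuffle|_ChunkSizes'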
>
> I also tried tarring the netCDF directory, but that is impractical: it
> took ~1.5 weeks to compress about 2 TB of data.
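
For the tar step, the bottleneck is almost certainly the compressor, not
tar itself, which only concatenates; gzip via tar -z runs on a single
core. If a parallel compressor such as pigz is available (an assumption;
it may need to be built locally), wall time drops roughly with the
number of cores. Directory and file names below are illustrative:

    # tar streams the archive to stdout; pigz compresses it on 16 threads.
    tar -cf - model_output/ | pigz -p 16 > model_output.tar.gz

Note too that gzip-compressing files whose contents are already deflated
gains almost nothing, so a plain uncompressed archive (tar -cf) may be
nearly as small and far faster to produce.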
>
> What am I doing wrong, or is there an easier way to compress these files
> so that they can be transferred to and stored on a different server away
> from Cheyenne?
>
> Thank you,
>
> --
> *Danielle Tijerina*
> Colorado School of Mines
> MS Hydrologic Science and Engineering
> address@hidden
>
>
=Dennis Heimbigner
Unidata
Ticket Details
===================
Ticket ID: DQV-341880
Department: Support netCDF
Priority: Critical
Status: Closed
===================
NOTE: All email exchanges with Unidata User Support are recorded in the Unidata
inquiry tracking system and then made publicly available through the web. If
you do not want to have your interactions made available in this way, you must
let us know in each email you send to us.