This archive contains answers to questions sent to Unidata support through mid-2025. Note that the archive is no longer being updated. We provide the archive for reference; many of the answers presented here remain technically correct, even if somewhat outdated. For the most up-to-date information on the use of NSF Unidata software and data services, please consult the Software Documentation first.
Hi Rose,

> I'm using netCDF 3.6.3 and ifort v11 to compile and run my program. It
> compiles and runs just fine, but I'm having a problem with the size of
> the files it generates.
>
> I'm writing files which contain approx 40 variables, each of which is
> 350*150*40 doubles. The files *should* be around 250MB. But the file
> appears to take up 4TB:
>
>   du -h MYFILE.nc
>   254M    MYFILE.nc
>
>   ll -h MYFILE.nc
>   4.0T    MYFILE.nc

What is the "ll" command?  It may be an alias to ls with a custom set of
options, but I don't have such a command on my Linux or Solaris systems.

> What is causing such a discrepancy in the file size?
>
> Copying a 4TB file (let alone a few hundred of them) is very complicated.
>
> Any suggestions would be greatly appreciated.

Could you send the output of "ncdump -h MYFILE.nc" so we can see the
schema for the file and verify that it should only be 254MB?

Thanks.

--Russ

Russ Rew                                UCAR Unidata Program
address@hidden                          http://www.unidata.ucar.edu

Ticket Details
===================
Ticket ID: YFQ-701665
Department: Support netCDF
Priority: Normal
Status: Closed
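A note on the two size readings above: on most Unix systems "ll" is an alias for "ls -l", which reports a file's apparent (logical) byte length, while "du" reports the disk blocks actually allocated, so the two can differ greatly for a sparsely written file. A minimal sketch of how one might compare the two measurements, assuming GNU coreutils and using MYFILE.nc from the question:

  # Apparent (logical) size recorded in the file's metadata
  ls -lh MYFILE.nc

  # Apparent size vs. blocks actually allocated on disk
  du -h --apparent-size MYFILE.nc
  du -h MYFILE.nc

  # Header/schema only, to check the declared dimensions and variables
  ncdump -h MYFILE.nc

If the apparent size is far larger than the allocated size, the file is likely sparse; the ncdump header output requested above would confirm whether the declared schema really implies only ~254MB of data.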