Re: 951212: netcdf filesize on Alpha
- Subject: Re: 951212: netcdf filesize on Alpha
- Date: Tue, 12 Dec 1995 12:34:14 -0700
> Subject: 19951211 File too large on DEC/Alpha
> Organization: .
> Keywords: 199512120224.AA07405
Hi Arlindo,
> For the first time I experienced the following error from netCDF
> (latest version I got from Unidata's anon ftp site a couple of weeks
> ago):
>
> ncvarput: xdr_NC_fill
> ncvarput: NCcoordck fill, var taux, rec 1175: File too large
>
> (I am running this under
> DEC OSF/1 V3.2 Worksystem Software (Rev. 214)
> DEC OSF/1 V3.2A (Rev. 17); Fri Apr 21 17:17:00 EDT 1995 )
>
> To be sure, the file is rather large at this point:
>
> -rw-r--r--  1 dasilva  system  262144000 Dec 11 02:43 nbot_x.nc
>
> but I have run the same model, creating similar files from my SGI Indy
> without any problems. My guess is that this problem is related to
> the particular XDR implementation, right?
>
> My question: any simple fix?
I can't reproduce this problem. The limit on the size of netCDF files is
supposed to be determined by the size of byte offsets for locations in the
file. These are currently signed 32-bit integers, so that should permit a 2
Gbyte netCDF file. I've just created a 600 Mbyte netCDF file on a DEC Alpha
running Digital Unix 3.2C. The file was on a remotely mounted file system,
but I don't think that should make any difference.
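Just to spell out where the 2 Gbyte figure comes from, here is a minimal
sketch in plain C (not code from the netCDF library, just the arithmetic):

#include <stdio.h>

int main(void)
{
    /* netCDF file locations are stored as signed 32-bit byte offsets,
     * so the largest representable offset is 2^31 - 1 bytes. */
    long max_offset = 2147483647L;   /* 2^31 - 1, about 2 Gbyte */
    long reported   = 262144000L;    /* size of nbot_x.nc reported above */

    printf("offset limit: %ld bytes\n", max_offset);
    printf("nbot_x.nc:    %ld bytes (%.0f%% of the limit)\n",
           reported, 100.0 * reported / max_offset);
    return 0;
}

That puts the 250 Mbyte file above at only about 12% of the offset limit.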
I created the 600 Mbyte file by using "ncgen -b big.cdl" on the following
file, named "big.cdl":
netcdf big {
dimensions:
    p = 600;    // number of Mbytes
    m = 100;
    n = 1000;
variables:
    byte  x0(p,m,n);
    float y0(p,m,n);
    byte  x1(p,m,n);
    float y1(p,m,n);
    int   z;
data:
    z = 1;
}
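(For what it's worth, assuming the usual 1-byte netCDF byte and 4-byte
float, each unit of "p" accounts for (1 + 4 + 1 + 4) bytes x 100 x 1000 =
1,000,000 bytes spread across the four big variables, which is why "p"
counts Mbytes.)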
By increasing the size of the "p" dimension, I expect this would work for
"p" as big as 2000 for a 2 Gbyte file, but I don't currently have the disk
space to test it.
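(If each unit of "p" is 1,000,000 bytes as above, "p" = 2000 comes to about
2,000,000,000 bytes of variable data, which still fits under the
2^31 - 1 = 2,147,483,647 byte offset limit, though not by much.)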
______________________________________________________________________________
Russ Rew UCAR Unidata Program
address@hidden http://www.unidata.ucar.edu