Re: 19990315: "file too big" error message: IRIX64 & Fortran 90 7.2
- Subject: Re: 19990315: "file too big" error message: IRIX64 & Fortran 90 7.2
- Date: Mon, 15 Mar 1999 08:57:52 -0700
Dear Sophie,
>Date: Mon, 15 Mar 1999 07:59:52 +0100 (MET)
>From: Sophie Valcke <address@hidden>
>Organization: France/CERFACS
>To: Steve Emmerson <address@hidden>
>Subject: Re: 19990310: "file too big" error message: IRIX64 & Fortran 90 7.2
>Keywords: 199903100908.CAA04412
In the above message, you wrote:
> > Are the files necessary in order to diagnose your problem (i.e. does
> > the program read from unit 14 before exhibiting the "file too big"
> > error)? If it does, then would you please modify your program so that
> > it exhibits the "file too big" problem without requiring 330 MB of
> > existing files. Note that if the "netCDF" problem only occurs in the
> > presence of 330 MB of existing files, then the problem probably isn't in
> > the netCDF layer -- but rather with the rest of your system.
> As I told you in my first message, the problem only arises when I try to
> write a field in a netCDF file such that the resulting netCDF file would
> be bigger than about 105 Mbytes:
> "It looks like the maximum netCDF file that I can create is of the
> order of 105 Mbytes. I do not understand as there
> is no file size limit on my system and I read that the netCDF file
> size limit is 2Gbytes."
> For smaller fields and smaller netCDF files everything works OK. As you
> said, the problem is probably with the rest of my system, but I really do
> not understand what part!
We've tested the netCDF library with very large files (e.g. 2Gbytes).
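If it would help, a small stand-alone program along the following lines
(just a sketch: the file name, variable name, and dimension sizes are
arbitrary, and most of the error checking is omitted) should exhibit the
problem without requiring the 330 Mbytes of existing files, since
6000 by 6000 reals comes to roughly 144 Mbytes:

      program bigtst
      implicit none
      include 'netcdf.inc'
!     6000 x 6000 reals is about 144 Mbytes: past the reported
!     105 Mbyte failure point but well under the 2 Gbyte limit.
      integer nx, ny
      parameter (nx = 6000, ny = 6000)
      real field(nx, ny)
      integer status, ncid, dimids(2), varid
      field = 1.0
      status = nf_create('big.nc', nf_clobber, ncid)
      if (status .ne. nf_noerr) print *, nf_strerror(status)
      status = nf_def_dim(ncid, 'x', nx, dimids(1))
      status = nf_def_dim(ncid, 'y', ny, dimids(2))
      status = nf_def_var(ncid, 'field', nf_float, 2, dimids, varid)
      status = nf_enddef(ncid)
!     the single large write is where the failure should show up
      status = nf_put_var_real(ncid, varid, field)
      if (status .ne. nf_noerr) print *, nf_strerror(status)
      status = nf_close(ncid)
      end

If a program like that fails with the same message, then the problem can
be reproduced (and investigated) independently of the rest of your
application.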
Does your disk have enough room for more than 105 Mbytes? You can
check the available free space on your disk with the "df" utility:
$ df -k /tmp
Filesystem Type kbytes use avail %use Mounted on
/dev/root xfs 4307148 3300104 1007044 77 /
The above filesystem has 1007044 kbytes (about 1.007 Gbytes) of free space available.
If I tried to create a 1.5 Gbyte file on it, I would expect to get a
"file too big" error.
--------
Steve Emmerson <http://www.unidata.ucar.edu>