- Subject: [netCDF #OHV-990519]: large file support problems in 4.0.1-beta3?
- Date: Tue, 24 Mar 2009 13:12:39 -0600
Mary,
> I'm using hdf5-1.8.2. Does it need to be built with special options?
Not that I know of, but Ed builds the HDF5 library I use, and he won't be
back for a couple of days. You could check the file lib/libhdf5.settings
in your HDF5 installation to make sure the setting
Linux Large File Support (LFS): yes
says "yes" rather than "no". If it says "no", I would look for a
configure option to fix that.
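For example, assuming your HDF5 is installed under /usr/local/hdf5
(adjust the path to wherever it actually lives), something like
grep -i "large file" /usr/local/hdf5/lib/libhdf5.settings
should print that line so you can check the setting.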
> I have about 67 Gb available, but maybe there's a quota limit?
You should be able to tell whether you're being limited using the
ulimit command. The options to use depend on which shell you're running,
since it's a shell built-in.
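For example, in an sh-derived shell such as bash,
ulimit -a
lists all the current limits, and
ulimit -f
prints just the maximum size of files you're allowed to write
("unlimited" if there's no limit). In csh or tcsh, the equivalent
built-in is
limit filesize
(If it's an actual disk quota rather than a shell limit, the quota
command should show it.)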
> Is there a way to tell if the tests are failing because I don't have a
> large file enabled NetCDF, versus not having enough free disk space?
The fact that your "make check" succeeded on the "quick_large_file" tests,
which create large files with "holes" (unwritten data blocks), means the
netCDF library is working OK with 64-bit file offsets. The first test that
fails is trying to write all the values in a variable with more than
2**32 values, so it actually needs more than 2 GiBytes of free disk space.
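To see how much space is actually free where the tests run, something
like
df -k .
(or "df -h ." for human-readable sizes) in the test directory should
tell you.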
You might try running the following, just to make sure you can write a
large file independently of netCDF:
dd if=/dev/zero bs=1000000 count=3000 of=./largefile
ls -l largefile
rm largefile
which should write a 3 GByte file named "largefile" in the current directory,
verify its size, and remove it.
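If free disk space turns out to be the limit, you can still exercise
64-bit offsets without using much actual space by writing a sparse file
with a hole in it, for example:
dd if=/dev/zero bs=1 count=1 seek=3000000000 of=./sparsefile
ls -l sparsefile
rm sparsefile
which seeks well past the 2 GiByte boundary and writes a single byte, so
ls should report a size just over 3 GBytes even though almost no disk
blocks are actually allocated.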
--Russ
Russ Rew UCAR Unidata Program
address@hidden http://www.unidata.ucar.edu
Ticket Details
===================
Ticket ID: OHV-990519
Department: Support netCDF
Priority: Normal
Status: Closed