This archive contains answers to questions sent to Unidata support through mid-2025. Note that the archive is no longer being updated. We provide the archive for reference; many of the answers presented here remain technically correct, even if somewhat outdated. For the most up-to-date information on the use of NSF Unidata software and data services, please consult the Software Documentation first.
> Institution: Vrije Universiteit Amsterdam
> Package Version: 3.6.2
> Operating System: linux
> Hardware Information: solaris
> Inquiry: I have a question: I just installed the netCDF 3.6.2 library and I
> am trying to read a file larger than 2 GB. It doesn't work; I get an error:
> "value too large for defined data type". When I searched for this error,
> I found that I may have to do something differently during the installation
> of the library. Is that correct? Could you please tell me what extra steps
> I should take in order to be able to read and write large files?
>
> Thank you very much,
> Sincerely,
> Karin Rebel

What program gives you that error? It is not a netCDF error message. On your platform, netCDF 3.6.2 should be able to handle large files without further effort on your part. Check this by rerunning netCDF's configure script with the --enable-large-file-tests option, then rerunning make check. This will cause the netCDF makefiles to test that large files work on your machine. The ability to handle large files is turned on by default; the only thing this configure option does is test it.

Please let me know if this works for you.

Thanks,

Ed

Ticket Details
===================
Ticket ID: FQR-194289
Department: Support netCDF
Priority: Normal
Status: Open