This archive contains answers to questions sent to Unidata support through mid-2025. Note that the archive is no longer being updated. We provide the archive for reference; many of the answers presented here remain technically correct, even if somewhat outdated. For the most up-to-date information on the use of NSF Unidata software and data services, please consult the Software Documentation first.
Hello Sean,

It seems like the failure is happening at runtime; the executables are using the wrong shared library. I can think of two possible solutions for this.

1. You may be able to use 'LD_LIBRARY_PATH' to specify the correct directory. I'm not sure if this will work, but it is worth a try:

   $ LD_LIBRARY_PATH=/path/to/lib make check

2. Recompile netcdf statically. If you pass '--enable-static --disable-shared' at configure time, you will build static libraries, so you will not have to worry about linking against a different netcdf library at runtime (an example configure invocation is sketched at the end of this entry).

Let me know if neither of these helps, or if I can answer any other questions for you!

-Ward

> Hi!
>
> I've installed both zlib-1.2.8 and hdf5-1.8.12 successfully ("make
> check" passes all tests). However, when I go to install netcdf-4.3.2
> I consistently get a failure in the same 2 checks, tst_h_scalar &
> tst_h.scalar.sh. When I examine the test-suite.log file it appears as
> though it is trying to link to an older HDF5 installation in /usr/local,
> yet I am using the CPPFLAGS and LDFLAGS to explicitly state where it
> should look for HDF5-1.8.12. Is there any way to ensure that it does
> this? Setting CPPFLAGS and LDFLAGS doesn't seem to do the trick.
>
> Sean Hartery

Ticket Details
===================
Ticket ID: VVS-381242
Department: Support netCDF
Priority: Normal
Status: Closed
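For reference, a minimal sketch of the static-build suggestion above, combined with the CPPFLAGS/LDFLAGS settings mentioned in the question. The paths '/path/to/hdf5' and '/path/to/netcdf-install' are placeholders for wherever HDF5 1.8.12 is actually installed and where netCDF should be installed; adjust them for your system.

   $ CPPFLAGS=-I/path/to/hdf5/include LDFLAGS=-L/path/to/hdf5/lib \
       ./configure --enable-static --disable-shared --prefix=/path/to/netcdf-install
   $ make check
   $ make install

If you keep the default shared build instead, 'ldd' (on Linux) can show which HDF5 shared library a test executable actually resolves at runtime, which helps confirm whether the older /usr/local copy is being picked up. For example, from the directory containing the failing test (the exact path to the built executable may differ):

   $ ldd ./.libs/tst_h_scalar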