This archive contains answers to questions sent to Unidata support through mid-2025. Note that the archive is no longer being updated. We provide the archive for reference; many of the answers presented here remain technically correct, even if somewhat outdated. For the most up-to-date information on the use of NSF Unidata software and data services, please consult the Software Documentation first.
> Dear Russell,
>
> I've been trying to compile Trilinos with NetCDF support, but it
> always fails with:
>
> ================= *snip* =================
> [...]
> >> [ 41%] Built target Ioss
> >> [ 41%] Building CXX object
> >> packages/Trios/ioss/src/exodusII/CMakeFiles/Ioex.dir/Ioex_Internals.C.o
> >> /apps/antwerpen/turing/harpertown/software/impi/3.2.1.009/include64/mpi.h(122):
> >> error: invalid combination of type specifiers
> >> typedef int MPI_Comm;
> >>             ^
> >>
> >> /apps/antwerpen/turing/harpertown/software/impi/3.2.1.009/include64/mpi.h(305):
> >> error: invalid combination of type specifiers
> >> typedef int MPI_Info;
> >>             ^
> >>
> >> compilation aborted for
> >> /user/antwerpen/200/vsc20017/data/software/trilinos/dev/master/source/packages/Trios/ioss/src/exodusII/Ioex_Internals.C
> >> (code 2)
> >> make[2]: ***
> >> [packages/Trios/ioss/src/exodusII/CMakeFiles/Ioex.dir/Ioex_Internals.C.o]
> >> Error 2
> >> make[1]: ***
> >> [packages/Trios/ioss/src/exodusII/CMakeFiles/Ioex.dir/all] Error 2
> >> make: *** [all] Error 2
> ================= *snap* =================
>
> One of the Trilinos developers suggested this may be related to
> <http://www.unidata.ucar.edu/software/netcdf/docs/known_problems.html#include_mpi_order>.
>
> Is there a known workaround yet? Would this bug still be present in
> the latest beta version of NetCDF too?
>
> Cheers,
> Nico

Howdy Nico!

Sorry it has taken us so long to respond to your request. We have been
swamped with support these last few months.

I would first check to make sure that the netCDF you are using has been
built for parallel access. That requires that the underlying HDF5
library be built with --enable-parallel, and that both HDF5 and netCDF
are built with the same MPI compiler wrapper you are using for your
parallel I/O program.

I would strongly suggest you try the latest beta release, or the daily
snapshot. Much has changed with parallel I/O and these problems may
disappear.
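The linked known-problems entry concerns header ordering: in affected netCDF-4 versions, if the real mpi.h has not been seen yet, placeholder typedefs for the MPI handle types can end up in effect, and a later #include <mpi.h> then clashes with them, producing exactly the "invalid combination of type specifiers" errors shown above. A minimal sketch of the workaround, assuming a parallel netCDF-4 build (not a guaranteed fix for every affected version):

```c
/* Sketch of the include-order workaround: include the real mpi.h
 * BEFORE netcdf.h, so the genuine MPI_Comm/MPI_Info definitions are
 * in place and no conflicting placeholder typedefs are introduced. */
#include <mpi.h>     /* real MPI_Comm and MPI_Info come from here */
#include <netcdf.h>  /* must come after mpi.h in affected versions */
```

Applying the same ordering in the Trilinos/Ioss sources that include both headers (such as Ioex_Internals.C above) is the usual way this was worked around.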
ftp://ftp.unidata.ucar.edu/pub/netcdf/snapshot/netcdf-4-daily.tar.gz

Please let me know if this doesn't work for you.

Thanks,
Ed

Ticket Details
===================
Ticket ID: RTK-145327
Department: Support netCDF
Priority: Critical
Status: Closed
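The build Ed describes can be sketched as a shell sequence. This is an illustrative outline only: the source directories, install prefixes, and the mpicc wrapper name are hypothetical and should be replaced with the ones on your system.

```shell
# Sketch: build HDF5 and netCDF for parallel I/O using the SAME MPI
# compiler wrapper as the application (paths/names are hypothetical).
set -e
export CC=mpicc            # the MPI wrapper your program is built with

# 1. HDF5 must itself be built with parallel support.
cd hdf5-src
./configure --enable-parallel --prefix=/opt/hdf5-parallel
make && make install

# 2. netCDF is then configured against that parallel HDF5.
cd ../netcdf-src
CPPFLAGS=-I/opt/hdf5-parallel/include \
LDFLAGS=-L/opt/hdf5-parallel/lib \
./configure --prefix=/opt/netcdf-parallel
make && make install
```

The key point is consistency: HDF5, netCDF, and the application must all see the same MPI implementation, or the handle-type mismatches above can reappear at compile or link time.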