This archive contains answers to questions sent to Unidata support through mid-2025. Note that the archive is no longer being updated. We provide the archive for reference; many of the answers presented here remain technically correct, even if somewhat outdated. For the most up-to-date information on the use of NSF Unidata software and data services, please consult the Software Documentation first.
Hi Peter,

Back on June 2, I wrote a somewhat garbled answer to your question:

> > I was naively trying to write an MPI program that called netCDF where
> > each processor wrote to the same netCDF file.  Each processor was
> > trying to write to the appropriate spot in the file.  I was trying to
> > mimic what happens with a direct access file in Fortran.  This is the
> > usual way I write things out with MPI and Fortran.  Can this be done?
> > My first attempts seemed to be okay, but then I made time unlimited
> > and got junk out.
>
> The current implementation of netCDF is not designed to support
> multiple processors writing to the same file concurrently.  So the
> simple answer is that it cannot be done using the current netCDF
> library.  You can do parallel reads, but there can only be one writer.
>
> We are beginning the development of a parallel version of netCDF based
> on HDF5 and its use of MPI-IO.  There are two other groups of
> researchers developing parallel netCDF I/O using MPI-IO directly
> (Northwestern/Argonne and NERSC).  No one has something ready to use
> yet, so unfortunately you'll have to serialize your netCDF output or
> use a different scientific data access interface, such as HDF5, until
> something is ready.  Our project with NCSA is funded by NASA, and the
> reference implementation is currently planned for delivery in Summer
> 2005.

There has been significant progress in both the Northwestern/Argonne
effort:

  http://www-unix.mcs.anl.gov/parallel-netcdf/

and the NERSC version:

  http://hpcf.nersc.gov/software/libs/io/netcdf/sp/pusage.html

It is possible to try out both of these now, with code that can be
downloaded from their respective sites.  The NERSC version has a
Fortran interface.  If you try either of these out, we'd be interested
in the results.

--Russ

_____________________________________________________________________
Russ Rew                                         UCAR Unidata Program
address@hidden                           http://my.unidata.ucar.edu
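[Editor's note] The advice above to "serialize your netCDF output" amounts to making sure only one MPI process has the classic netCDF file open for writing at any moment.  Below is a minimal sketch of one such scheme, assuming the netCDF Fortran 90 interface: rank 0 creates and defines the file, and a "write token" is then passed from rank to rank so each process opens the file, writes its own slab, and closes it in turn.  The file name (out.nc), variable name (data), array sizes, fixed (non-record) dimension, and token-passing arrangement are illustrative assumptions, not part of the original exchange.

  ! Sketch: serialized netCDF output from an MPI program via token passing.
  ! (Status checks on the netCDF calls are omitted for brevity.)
  program serial_netcdf_write
    use netcdf
    use mpi
    implicit none

    integer, parameter :: nlocal = 100            ! values owned by each rank
    character(len=*), parameter :: fname = "out.nc"
    integer :: rank, nprocs, ierr, token, status
    integer :: ncid, dimid, varid
    integer :: stat(MPI_STATUS_SIZE)
    real    :: local(nlocal)

    call MPI_Init(ierr)
    call MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)
    call MPI_Comm_size(MPI_COMM_WORLD, nprocs, ierr)

    token = 0
    local = real(rank)                            ! stand-in for real data

    ! Rank 0 creates the file and defines one fixed-size dimension/variable.
    if (rank == 0) then
       status = nf90_create(fname, NF90_CLOBBER, ncid)
       status = nf90_def_dim(ncid, "x", nlocal * nprocs, dimid)
       status = nf90_def_var(ncid, "data", NF90_FLOAT, (/ dimid /), varid)
       status = nf90_enddef(ncid)
       status = nf90_close(ncid)
    end if

    ! Wait for the token from the previous rank so writes never overlap.
    if (rank > 0) then
       call MPI_Recv(token, 1, MPI_INTEGER, rank - 1, 0, MPI_COMM_WORLD, &
                     stat, ierr)
    end if

    ! Each rank opens the file, writes its own slab, and closes it again.
    status = nf90_open(fname, NF90_WRITE, ncid)
    status = nf90_inq_varid(ncid, "data", varid)
    status = nf90_put_var(ncid, varid, local, &
                          start = (/ rank * nlocal + 1 /), &
                          count = (/ nlocal /))
    status = nf90_close(ncid)

    ! Pass the token on to the next rank.
    if (rank < nprocs - 1) then
       call MPI_Send(token, 1, MPI_INTEGER, rank + 1, 0, MPI_COMM_WORLD, ierr)
    end if

    call MPI_Finalize(ierr)
  end program serial_netcdf_write

This is slower than true parallel I/O, since the writes happen one rank at a time, but it avoids the file corruption that results when several processes write to the same classic netCDF file concurrently.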