This archive contains answers to questions sent to Unidata support through mid-2025. Note that the archive is no longer being updated. We provide the archive for reference; many of the answers presented here remain technically correct, even if somewhat outdated. For the most up-to-date information on the use of NSF Unidata software and data services, please consult the Software Documentation first.
Erik,

> To: address@hidden
> cc: address@hidden
> From: Erik Gilje <address@hidden>
> Subject: time dependency
> Organization: ?
> Keywords: 200207022053.g62Krlu01763 netCDF F90

The above message contained the following:

> Hi, I am running a time-dependent piece of code, and I am trying to
> write to a netCDF dataset inside a loop. I had previously just saved
> the data in an array and then had that read into a netCDF dataset;
> however, we are planning to use this in a much more complicated
> program in which it will be necessary to write to the dataset in a
> loop. However, when I write to the dataset in a loop the program
> takes much longer (I estimate over 1,000 times longer). So I was
> wondering what the problem might be. Here is the code:

I don't believe there is a problem. Writing 1000 values one value at a
time takes longer than writing all 1000 values at once.

> Here's where I declare the netCDF dataset:
>
>   status = NF90_CREATE("newoutput.nc", NF90_SHARE, cid)
>   if (status /= nf90_noerr) call handle_err(status)
>   status = NF90_DEF_DIM(cid, "y", (ns*nt), y)
>   if (status /= nf90_noerr) call handle_err(status)
>   status = NF90_DEF_VAR(cid, "ex", NF90_INT, y, a)
>   if (status /= nf90_noerr) call handle_err(status)
>   status = NF90_DEF_VAR(cid, "elvn", NF90_INT, y, b)
>   if (status /= nf90_noerr) call handle_err(status)
>   status = NF90_DEF_VAR(cid, "veln", NF90_INT, y, c)
>   if (status /= nf90_noerr) call handle_err(status)
>
>   status = NF90_ENDDEF(cid)
>   if (status /= nf90_noerr) call handle_err(status)
>
>   status = NF90_CLOSE(cid)
>
> Loop code:
>
>   DO i = 1, ns
>     status = nf90_open("newoutput.nc", nf90_write, cid)
>     if (status /= nf90_noerr) call handle_err(status)
>     status = NF90_PUT_VAR(cid, a, x(i), (/i+(k-1)*ns/))
>     if (status /= nf90_noerr) call handle_err(status)
>     status = NF90_PUT_VAR(cid, b, elvn(i), (/i+(k-1)*ns/))
>     if (status /= nf90_noerr) call handle_err(status)
>     status = NF90_PUT_VAR(cid, c, veln(i), (/i+(k-1)*ns/))
>     if (status /= nf90_noerr) call handle_err(status)
>     status = NF90_CLOSE(cid)
>     if (status /= nf90_noerr) call handle_err(status)
>   END DO
>
> I hope you can offer me some guidance, thank you.
>
> erik

From the above code, I don't see why you don't write all of the "x",
"elvn", and "veln" arrays at one time. There is also no reason to close
and then reopen the dataset. And, unless you're planning on sharing the
dataset with another program in real time, use of the NF90_SHARE flag
is not necessary and may slow things down. If you know that you're
going to write every value, then you should turn off prefilling in
order to save time.

Regards,
Steve Emmerson   <http://www.unidata.ucar.edu>
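A rough sketch of the restructuring suggested in the answer (not code from the
original exchange): it assumes that x, elvn, and veln are complete INTEGER
arrays of length ns*nt by the time the file is written, and that handle_err is
the same error routine used in the original code. Each variable is written
with a single nf90_put_var call, the file is created and closed only once,
NF90_SHARE is replaced by NF90_CLOBBER, and prefilling is disabled with
nf90_set_fill.

    use netcdf
    implicit none

    integer :: status, cid, y, a, b, c, old_fill
    integer :: ns, nt
    integer, allocatable :: x(:), elvn(:), veln(:)

    ! ... ns and nt set, and x, elvn, veln filled, elsewhere ...

    ! Create the dataset once; NF90_CLOBBER overwrites any existing file.
    status = nf90_create("newoutput.nc", NF90_CLOBBER, cid)
    if (status /= nf90_noerr) call handle_err(status)

    ! Every value will be written, so disable prefilling to avoid
    ! writing each value twice (once as fill, once as data).
    status = nf90_set_fill(cid, NF90_NOFILL, old_fill)
    if (status /= nf90_noerr) call handle_err(status)

    status = nf90_def_dim(cid, "y", ns*nt, y)
    if (status /= nf90_noerr) call handle_err(status)
    status = nf90_def_var(cid, "ex", NF90_INT, y, a)
    if (status /= nf90_noerr) call handle_err(status)
    status = nf90_def_var(cid, "elvn", NF90_INT, y, b)
    if (status /= nf90_noerr) call handle_err(status)
    status = nf90_def_var(cid, "veln", NF90_INT, y, c)
    if (status /= nf90_noerr) call handle_err(status)
    status = nf90_enddef(cid)
    if (status /= nf90_noerr) call handle_err(status)

    ! Write each whole array in one call instead of one value at a time.
    status = nf90_put_var(cid, a, x)
    if (status /= nf90_noerr) call handle_err(status)
    status = nf90_put_var(cid, b, elvn)
    if (status /= nf90_noerr) call handle_err(status)
    status = nf90_put_var(cid, c, veln)
    if (status /= nf90_noerr) call handle_err(status)

    ! Close once, after all writes.
    status = nf90_close(cid)
    if (status /= nf90_noerr) call handle_err(status)

If the data really must be written incrementally (for example, one time step
at a time), most of the savings still apply: keep the file open across the
loop and write one contiguous slice per iteration, rather than opening,
writing a single value, and closing the file on every pass.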