This archive contains answers to questions sent to Unidata support through mid-2025. Note that the archive is no longer being updated. We provide the archive for reference; many of the answers presented here remain technically correct, even if somewhat outdated. For the most up-to-date information on the use of NSF Unidata software and data services, please consult the Software Documentation first.
Wei,

I'd like to download the file and check it here. I have space for a 35 GB file, but you'd need to make it available for FTP or HTTP access, as I don't currently have easy access to files on glade. Is that possible?

--Russ

On Fri, Sep 26, 2014 at 11:37 AM, Wei Huang <address@hidden> wrote:

New Client Reply: Write a netcdf4classic file

Joe,

Here is NCAR's Yellowstone system info:

uname -a
Linux geyser05 2.6.32-358.el6.x86_64 #1 SMP Tue Jan 29 11:47:41 EST 2013 x86_64 x86_64 x86_64 GNU/Linux

Barbara and Russ,

Attached is a C program with which I can reproduce this problem on NCAR's Yellowstone by reading the NCL-generated netCDF file.

Thank you all for checking on this issue.

Wei

================================================
1850 Table Mesa Dr.
Boulder, CO 80307
Phone: 303-497-8924

On Fri, Sep 26, 2014 at 11:06 AM, <address@hidden> wrote:

Hi Wei,

I probably should have figured that out! Joe Lee will look at this and we'll see what the next step is. I'll bring it up in the HDF5 meeting on Monday, as well.

-Barbara

On Fri, 26 Sep 2014, Wei Huang wrote:

Barbara,

I am an NCL developer on the NCL team. As I mentioned in my last email, the problem seems to show up when we read a netCDF-4 classic file several million times, and the same script worked with HDF5-1.8.12. Because of the way the user coded it in NCL, the read ends up fetching only one point at a time. We know that is not an efficient way to read data, but our code should not fail, and HDF5 should not fail either.

If you need it, I can FTP the data file to you, and you can try to read the data point by point with HDF5, which may reproduce the problem. If Russ wants to investigate further, I can FTP the data to him as well.

Thanks for looking at this issue.

Wei

================================================
1850 Table Mesa Dr.
Boulder, CO 80307
Phone: 303-497-8924

On Fri, Sep 26, 2014 at 7:57 AM, <address@hidden> wrote:

Hi Wei,

I'm not sure we'll be able to determine the issue with your NCL scripts. However, I did find someone here who has used NCL before. I will forward your scripts to him, in case he can help.

He suggested you contact the ncl-talk mailing list, as they may be familiar with this issue. It looks like you have to subscribe to ncl-talk. See:

https://www.ncl.ucar.edu/Support/email_lists.shtml

In the meantime, we will see if we can determine anything from the scripts.

-Barbara
address@hidden

[Wei Huang wrote:]

I have been able to reproduce this issue with a simple NCL script.

Attached are two NCL scripts: wrt.ncl creates a data file, and read.ncl then reads a subset of the data in a loop.

The read script demonstrates that the HDF5 errors start to appear at loop index 414 (the first 414 reads are normal).

I understand that it may not seem fair to send you NCL scripts, but that is how we encountered the problem, and what is a bit more frustrating is that the same code works fine on my Mac but fails on NCAR's Yellowstone.

I hope I can give you more info, but that is what I have so far.

Thanks,

Wei

================================================
1850 Table Mesa Dr.
Boulder, CO 80307
Phone: 303-497-8924
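For context, the failing access pattern described above (pulling one value at a time out of a large netCDF-4 classic file, millions of reads in a single run) can be sketched with the netCDF C API roughly as follows. This is a minimal illustration only: the file name, variable name, dimension sizes, and loop count are placeholders, not details of Wei's 35 GB dataset or of the attached C program.

#include <stdio.h>
#include <stdlib.h>
#include <netcdf.h>

/* Abort with a message if a netCDF call fails. */
#define CHECK(e) do { int _s = (e); if (_s != NC_NOERR) { \
    fprintf(stderr, "netCDF error: %s\n", nc_strerror(_s)); exit(1); } } while (0)

int main(void)
{
    int ncid, varid;
    size_t index[3];
    float value;

    /* "data.nc" and variable "T" are hypothetical names for illustration. */
    CHECK(nc_open("data.nc", NC_NOWRITE, &ncid));
    CHECK(nc_inq_varid(ncid, "T", &varid));

    /* Read a single value per call, a few million times, the way the
       user's NCL script effectively did. */
    for (size_t i = 0; i < 3000000; i++) {
        index[0] = 0;          /* e.g. time               */
        index[1] = i % 192;    /* e.g. lat (placeholder)  */
        index[2] = i % 288;    /* e.g. lon (placeholder)  */
        CHECK(nc_get_var1_float(ncid, varid, index, &value));
    }

    CHECK(nc_close(ncid));
    return 0;
}

Whether a loop like this reproduces the reported failure depends on the library versions involved; the thread indicates the errors appeared with HDF5 1.8.13 but not with 1.8.12.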
On Thu, Sep 25, 2014 at 8:16 AM, <address@hidden> wrote:

Hi Wei,

Thanks for the update. I'll wait to hear from you.

-Barbara
address@hidden

On Wed, 24 Sep 2014, Wei Huang wrote:

Barbara,

I am able to repeat this issue with a very short NCL script. We'll do more debugging and will let you know if we still have a problem.

Thanks,

Wei

================================================
1850 Table Mesa Dr.
Boulder, CO 80307
Phone: 303-497-8924

On Wed, Sep 24, 2014 at 3:01 PM, Wei Huang <address@hidden> wrote:

Barbara,

The problem came from a user running NCL. We got the data (two netCDF files, 35 GB in total) and could reproduce the issue when we compiled NCL with HDF5-1.8.13 and netCDF-4.3.2, but there is no problem when we use HDF5-1.8.12 and netCDF-4.3.2. We then tried to reproduce the problem without reading those two large files, but NCL seems to work with HDF5-1.8.13 and netCDF-4.3.2 for simple code.

This leaves us somewhat stuck: we do not know where or how to debug this problem, which is why I sent the question to netcdf-support. I do not know whether you want me to send you the data and NCL script to debug this issue. If you feel the data is too much, and the problem is from NCL, could you please check with your developers to see if they have any ideas?

Thanks,

Wei

================================================
1850 Table Mesa Dr.
Boulder, CO 80307
Phone: 303-497-8924

On Wed, Sep 24, 2014 at 12:39 PM, <address@hidden> wrote:

Hi Wei,

Yes, please do send us a way to reproduce the issue. (I don't know what the problem is off-hand.)

Thanks!
-Barbara

======================
Barbara Jones
The HDF Group Helpdesk
address@hidden
======================

[Russ Rew wrote:]

Hi Wei,

I will forward this question to address@hidden, because the symptoms are something that doesn't work with HDF5-1.8.13 but did work with HDF5-1.8.12.

They may ask you for a way to duplicate the problem, unless they can tell what the problem is from the HDF5 error messages you have supplied.

--Russ

[Wei Huang's original question:]

We try to create a netCDF file with write mode netcdf4classic.

It works with a test program, but it failed with a real case, with the error message below.

We use HDF5-1.8.13 with netCDF-4.3.2. If compiled with HDF5-1.8.12 and netCDF-4.3.2, the real case worked.

We have checked the real-case code a few times but did not find anything wrong.

Thanks for any help on this issue.

Wei

--------------------------------

HDF5-DIAG: Error detected in HDF5 (1.8.13) thread 0:
  #000: H5P.c line 303 in H5Pcreate(): unable to create property list
    major: Property lists
    minor: Unable to create file
  #001: H5Pint.c line 1747 in H5P_create_id(): unable to atomize property list
    major: Property lists
    minor: Unable to register new atom
  #002: H5I.c line 895 in H5I_register(): can't insert ID node into skip list
    major: Object atom
    minor: Unable to insert object
  #003: H5SL.c line 995 in H5SL_insert(): can't create new skip list node
    major: Skip Lists
    minor: Unable to insert object
  #004: H5SL.c line 687 in H5SL_insert_common(): can't insert duplicate key
    major: Skip Lists
    minor: Unable to insert object

================================================
1850 Table Mesa Dr.
Boulder, CO 80307
Phone: 303-497-8924

Russ Rew
UCAR Unidata Program
address@hidden
http://www.unidata.ucar.edu

Ticket Details
===================
Ticket ID: MRA-585540
Department: Support netCDF
Priority: Normal
Status: Closed
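For readers unfamiliar with the "netcdf4classic" write mode referred to in the original question: in the netCDF C library it corresponds to creating an HDF5-based file with the NC_NETCDF4 | NC_CLASSIC_MODEL flags, which restricts the file to the classic netCDF data model. A minimal sketch follows; the file, dimension, and variable names are hypothetical and are not taken from Wei's real case or test program.

#include <stdio.h>
#include <stdlib.h>
#include <netcdf.h>

/* Abort with a message if a netCDF call fails. */
#define CHECK(e) do { int _s = (e); if (_s != NC_NOERR) { \
    fprintf(stderr, "netCDF error: %s\n", nc_strerror(_s)); exit(1); } } while (0)

int main(void)
{
    int ncid, dimid, varid;
    float data[10] = {0};

    /* NC_CLASSIC_MODEL on top of NC_NETCDF4 produces an HDF5-format file
       that allows only classic-model features (no groups, strings, or
       user-defined types). */
    CHECK(nc_create("out.nc", NC_NETCDF4 | NC_CLASSIC_MODEL, &ncid));

    /* "x" and "v" are placeholder names. */
    CHECK(nc_def_dim(ncid, "x", 10, &dimid));
    CHECK(nc_def_var(ncid, "v", NC_FLOAT, 1, &dimid, &varid));
    CHECK(nc_enddef(ncid));

    CHECK(nc_put_var_float(ncid, varid, data));
    CHECK(nc_close(ncid));
    return 0;
}

Note that the HDF5 trace quoted above fails inside H5Pcreate(), an internal property-list call, which is consistent with Wei's observation that the error only appears after a very large number of prior library calls rather than because of anything unusual in this write pattern itself.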
Ticket Details
===================
Ticket ID: MRA-585540
Department: Support netCDF
Priority: Normal
Status: Open
Link: https://www.unidata.ucar.edu/esupport/staff/index.php?_m=tickets&_a=viewticket&ticketid=24503