- Subject: [netCDF #YFQ-701665]: netcdf file size error
- Date: Tue, 15 Mar 2011 13:58:19 -0600
Rose,
> Thanks so much for your reply. Yes, ll is an alias for ls -l, sorry
> about that.
>
> I've attached the ncdump file.
When I store your CDL in test.cdl and use ncgen to create the corresponding
netCDF file (with a single record, set by adding "TIME = 1;" to the
data: section of the CDL), I get:
$ ncgen -b test.cdl; ls -l test.nc
-rw-rw-r-- 1 russ ustaff 259195500 Mar 15 13:27 test.nc
which is 259 MB, about what you would expect. If I make a file with
2 records, with "TIME = 1, 2;" in the CDL data section, I get a file
about twice as big:
$ ncgen -b test2.cdl; ls -l test2.nc
-rw-rw-r-- 1 russ ustaff 511506428 Mar 15 13:28 test2.nc
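For reference, the CDL I used has this general shape; the dimension and
variable names below are illustrative placeholders, not the ones from your
ncdump output:

    netcdf test {
    dimensions:
        TIME = UNLIMITED ;   // the record dimension
        X = 350 ;
        Y = 150 ;
        Z = 40 ;
    variables:
        double TIME(TIME) ;
        double var01(TIME, X, Y, Z) ;
        // ... about 40 record variables like this one ...
    data:
        TIME = 1 ;           // one record; "TIME = 1, 2 ;" gives two
    }

Because every variable is defined over the TIME record dimension, the number
of TIME values supplied in the data: section determines how many records
ncgen writes, and hence the file size.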
Each additional record adds 511506428 - 259195500, or about 252 MB, so to
reach the 4 TB your ll output reports, the file would have to contain
roughly 15,850 records.
This is with the current ncgen linked against the netCDF 4.1.2-beta3
library; I repeated the test with the ncgen from the netCDF 3.6.3
distribution, linked against that version of the library, with the same
results.
It sounds like there may be an error in your Fortran program, or you are
writing many more records than you think. Note that if you write just
record 15853, the netCDF library will create a very large file with only
that record actually written. On Unix filesystems such a file is typically
sparse, which explains the mismatch you saw: ls (your ll alias) reports the
logical length, while du reports only the disk blocks actually allocated.
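To make that failure mode concrete, here is a minimal Fortran 90 sketch of
the kind of bug I mean. The file, dimension, and variable names are
hypothetical, and it defines just one record variable instead of your 40,
but the effect is the same: writing only a high-numbered record yields a
file whose logical size covers all the earlier, never-written records.

    program sparse_demo
      use netcdf            ! Fortran 90 interface shipped with netCDF 3.6.3
      implicit none
      integer :: ncid, x_dim, y_dim, z_dim, t_dim, varid
      real(kind=8) :: field(350, 150, 40)  ! one record of one variable

      field = 0.0d0
      call check( nf90_create("sparse.nc", NF90_CLOBBER, ncid) )
      ! In the Fortran interface the record (unlimited) dimension comes last.
      call check( nf90_def_dim(ncid, "X", 350, x_dim) )
      call check( nf90_def_dim(ncid, "Y", 150, y_dim) )
      call check( nf90_def_dim(ncid, "Z",  40, z_dim) )
      call check( nf90_def_dim(ncid, "TIME", NF90_UNLIMITED, t_dim) )
      call check( nf90_def_var(ncid, "var01", NF90_DOUBLE, &
                               (/ x_dim, y_dim, z_dim, t_dim /), varid) )
      call check( nf90_enddef(ncid) )
      ! Write only record 15853: records 1..15852 are never written, but
      ! the logical file size now covers them (about 16.8 MB/record here,
      ! so ~266 GB; at your file's ~252 MB/record, ~15,850 records is the
      ! ~4 TB you are seeing).  du still reports only the blocks written.
      call check( nf90_put_var(ncid, varid, field,         &
                               start=(/ 1, 1, 1, 15853 /), &
                               count=(/ 350, 150, 40, 1 /)) )
      call check( nf90_close(ncid) )

    contains
      subroutine check(status)
        integer, intent(in) :: status
        if (status /= nf90_noerr) then
          print *, trim(nf90_strerror(status))
          stop 1
        end if
      end subroutine check
    end program sparse_demo

So an off-by-thousands time index (an uninitialized or mis-scaled loop
counter, for example) is enough to produce exactly the du/ls mismatch you
reported.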
--Russ
> ----- Message from address@hidden ---------
> Date: Tue, 15 Mar 2011 12:24:23 -0600
> From: Unidata netCDF Support <address@hidden>
> Reply-To: address@hidden
> Subject: [netCDF #YFQ-701665]: netcdf file size error
> To: address@hidden
> Cc: address@hidden
>
>
> > Hi Rose,
> >
> >> I'm using netcdf 3.6.3 and ifort v11 to compile and run my program. It
> >> compiles and runs just fine, but I'm having a problem with the size of
> >> the files it generates.
> >>
> >> I'm writing files that contain approximately 40 variables, each of which
> >> is 350*150*40 doubles. The files *should* be around 250 MB, but each file
> >> appears to take up 4 TB:
> >>
> >> du -h MYFILE.nc
> >> 254M MYFILE.nc
> >>
> >> ll -h MYFILE.nc
> >> 4.0T MYFILE.nc
> >
> > What is the "ll" command? It may be an alias to ls with a custom set of
> > options, but I don't have such a command on my Linux or Solaris systems.
> >
> >> What is causing such a discrepancy in the file size?
> >>
> >> Copying a 4 TB file (let alone a few hundred of them) is impractical.
> >>
> >> Any suggestions would be greatly appreciated.
> >
> > Could you send the output of "ncdump -h MYFILE.nc" so we can see the schema
> > for the file and verify that it should only be 254 MB? Thanks.
> >
> > --Russ
> >
> > Russ Rew UCAR Unidata Program
> > address@hidden http://www.unidata.ucar.edu
>
>
> ----- End of message from address@hidden -----
>
>
>
> --
>
> Rose CAMPBELL
> Université de la Méditerranée (Aix-Marseille II)
> http://annuaire.univmed.fr/showuser.php?uid=campbell
>
Russ Rew UCAR Unidata Program
address@hidden http://www.unidata.ucar.edu
Ticket Details
===================
Ticket ID: YFQ-701665
Department: Support netCDF
Priority: Normal
Status: Closed