[petsc-users] Error in creating compressed data using HDF5

Danyang Su danyang.su at gmail.com
Tue Sep 3 14:22:43 CDT 2019


Hi Barry and Matt,

It turns out to be an error on my side in testing the hdf5 chunk size. 
Different chunk sizes were being passed, which is not allowed. After 
setting the same chunk size, the code now works.
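
For reference, the working pattern looks roughly like the sketch below 
(variable names follow the code quoted earlier in this thread; the 
chunk dimensions in hdf5_csize are an assumed example value — the key 
point is that every process passes the identical chunk size to 
h5pset_chunk_f before enabling compression):

     !c every rank must use the same chunk dimensions
     !c (64 per dimension is only an illustrative choice)
     hdf5_csize(1:hdf5_ndim) = 64

     !c chunking must be set on the dataset creation property
     !c list before compression is enabled
     call h5pcreate_f(H5P_DATASET_CREATE_F, chunk_id, hdf5_ierr)
     call h5pset_chunk_f(chunk_id, hdf5_ndim, hdf5_csize, hdf5_ierr)

     !c zlib / deflate compression at level 6
     call h5pset_deflate_f(chunk_id, 6, hdf5_ierr)

     !c create the dataset with this property list
     call h5dcreate_f(group_id, dataname, H5T_NATIVE_INTEGER,    &
                      filespace, dset_id, hdf5_ierr,             &
                      dcpl_id=chunk_id)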

Thanks,

Danyang

On 2019-09-03 1:58 a.m., Matthew Knepley wrote:
> On Tue, Sep 3, 2019 at 1:57 AM Danyang Su via petsc-users 
> <petsc-users at mcs.anl.gov> wrote:
>
>     Unfortunately, the master branch with hdf5-1.10.5 returns a similar
>     error.
>
>
> It looks like it's complaining about the group_id. Are you sure it's 
> correct?
>
>    Matt
>
>     Danyang
>
>     On 2019-09-02 5:31 p.m., Danyang Su wrote:
>     > Hi Barry,
>     >
>     > Yes, I have already included zlib and szlib during the
>     configuration.
>     > I will try the dev version to see if it works.
>     >
>     > Thanks,
>     >
>     > Danyang
>     >
>     > On 2019-09-02 12:45 p.m., Smith, Barry F. wrote:
>     >>    You could try the master branch of PETSc, which uses a much
>     >> more recent version of hdf5.
>     >>
>     >>    When you did --download-hdf5, did you also do --download-zlib
>     >> and --download-szlib? (Though I would hope hdf5 would give you a
>     >> useful error message saying that they need to be installed,
>     >> instead of the vague error message it does provide.)
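>     >>
>     >>    For example, a configure invocation along these lines (other
>     >> options depend on your setup):
>     >>
>     >>      ./configure --download-hdf5 --download-zlib --download-szlib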
>     >>
>     >>     Barry
>     >>
>     >>
>     >>> On Sep 2, 2019, at 2:24 PM, Danyang Su via petsc-users
>     >>> <petsc-users at mcs.anl.gov> wrote:
>     >>>
>     >>> Dear All,
>     >>>
>     >>> I am not sure if this is the right place to ask an hdf5 question.
>     >>> I installed hdf5 through the PETSc configuration option
>     >>> --download-hdf5=yes. The code runs without problem except for the
>     >>> calls that create compressed data (shown below).
>     >>>
>     >>>      !c create local memory space and hyperslab
>     >>>      call h5screate_simple_f(hdf5_ndim, hdf5_dsize, memspace,    &
>     >>>                              hdf5_ierr)
>     >>>      call h5sselect_hyperslab_f(memspace, H5S_SELECT_SET_F,      &
>     >>>                                 hdf5_offset, hdf5_count,         &
>     >>>                                 hdf5_ierr, hdf5_stride, hdf5_block)
>     >>>
>     >>>      !c create the global file space and hyperslab
>     >>>      call h5screate_simple_f(hdf5_ndim, hdf5_gdsize, filespace,  &
>     >>>                              hdf5_ierr)
>     >>>      call h5sselect_hyperslab_f(filespace, H5S_SELECT_SET_F,     &
>     >>>                                 hdf5_goffset, hdf5_count,        &
>     >>>                                 hdf5_ierr, hdf5_stride, hdf5_block)
>     >>>
>     >>>      !c create a data chunking property
>     >>>      call h5pcreate_f(H5P_DATASET_CREATE_F, chunk_id, hdf5_ierr)
>     >>>      call h5pset_chunk_f(chunk_id, hdf5_ndim, hdf5_csize, hdf5_ierr)
>     >>>
>     >>>      !c create compressed data; the dataset must be chunked for
>     >>>      !c compression. The following causes a crash in the hdf5
>     >>>      !c library; check when a new hdf5 version is available.
>     >>>
>     >>>      ! Set ZLIB / DEFLATE compression using compression level 6.
>     >>>      ! To use SZIP compression comment out these lines.
>     >>>      !call h5pset_deflate_f(chunk_id, 6, hdf5_ierr)
>     >>>
>     >>>      ! Uncomment these lines to set SZIP compression
>     >>>      !szip_options_mask = H5_SZIP_NN_OM_F
>     >>>      !szip_pixels_per_block = 16
>     >>>      !call H5Pset_szip_f(chunk_id, szip_options_mask,            &
>     >>>      !                   szip_pixels_per_block, hdf5_ierr)
>     >>>
>     >>>      !c create the dataset id
>     >>>      call h5dcreate_f(group_id, dataname, H5T_NATIVE_INTEGER,    &
>     >>>                       filespace, dset_id, hdf5_ierr,             &
>     >>>                       dcpl_id=chunk_id)
>     >>>
>     >>>      !c create a data transfer property
>     >>>      call h5pcreate_f(H5P_DATASET_XFER_F, xlist_id, hdf5_ierr)
>     >>>      call h5pset_dxpl_mpio_f(xlist_id, H5FD_MPIO_COLLECTIVE_F,   &
>     >>>                              hdf5_ierr)
>     >>>
>     >>>      !c write the dataset collectively
>     >>>      call h5dwrite_f(dset_id, H5T_NATIVE_INTEGER, dataset,       &
>     >>>                      hdf5_dsize, hdf5_ierr,                      &
>     >>>                      file_space_id=filespace,                    &
>     >>>                      mem_space_id=memspace, xfer_prp=xlist_id)
>     >>>
>     >>>      call h5dclose_f(dset_id, hdf5_ierr)
>     >>>
>     >>>      !c close resources
>     >>>      call h5sclose_f(filespace, hdf5_ierr)
>     >>>      call h5sclose_f(memspace, hdf5_ierr)
>     >>>      call h5pclose_f(chunk_id, hdf5_ierr)
>     >>>      call h5pclose_f(xlist_id, hdf5_ierr)
>     >>>
>     >>>
>     >>>
>     >>> Both h5pset_deflate_f and H5Pset_szip_f crash the code with the
>     >>> error shown below. If I comment out h5pset_deflate_f and
>     >>> H5Pset_szip_f, then everything works fine.
>     >>>
>     >>> HDF5-DIAG: Error detected in HDF5 (1.8.18) MPI-process 0:
>     >>>    #000: H5D.c line 194 in H5Dcreate2(): unable to create dataset
>     >>>      major: Dataset
>     >>>      minor: Unable to initialize object
>     >>>    #001: H5Dint.c line 455 in H5D__create_named(): unable to create and link to dataset
>     >>>      major: Dataset
>     >>>      minor: Unable to initialize object
>     >>>    #002: H5L.c line 1638 in H5L_link_object(): unable to create new link to object
>     >>>      major: Links
>     >>>      minor: Unable to initialize object
>     >>>    #003: H5L.c line 1882 in H5L_create_real(): can't insert link
>     >>>      major: Symbol table
>     >>>      minor: Unable to insert object
>     >>>    #004: H5Gtraverse.c line 861 in H5G_traverse(): internal path traversal failed
>     >>>      major: Symbol table
>     >>>      minor: Object not found
>     >>>
>     >>> Has anyone encountered this kind of error before?
>     >>>
>     >>> Kind regards,
>     >>>
>     >>> Danyang
>     >>>
>
>
>
> -- 
> What most experimenters take for granted before they begin their 
> experiments is infinitely more interesting than any results to which 
> their experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/