[petsc-dev] MPI Vec and HDF5 output

Brad Aagaard baagaard at usgs.gov
Fri Jan 28 10:23:12 CST 2011


We are trying to get HDF5 output working in PyLith using the PETSc HDF5 
viewer. I get an HDF5 error

H5Screate_simple(): zero sized dimension for non-unlimited dimension,
VecView_MPI_HDF5() line 771 in petsc-dev/src/vec/vec/impls/mpi/pdvec.c

when I have a Vec with a zero local size on one processor (the global 
size is nonzero). The Vec layout is intentional: we expect some 
processors to have a local size of zero, because the field covers only 
a portion of the domain.

The collective write of the Vec creates the filespace using the global 
size and the memspace using the local size; the error occurs when 
creating the memspace. I tried adjusting pdvec.c to create a null 
memspace when the local size is zero, but then the write fails with a 
dimension mismatch (the number of dimensions in the filespace and 
memspace don't agree). I was unable to find any information on this 
use case in the HDF5 documentation.

I have attached a toy example that illustrates the problem.
mpiexec -n 1 test_view [creates the expected test.h5 file]
mpiexec -n 2 test_view [generates the above error]

Thanks,
Brad
-------------- next part --------------
A non-text attachment was scrubbed...
Name: test_view.tgz
Type: application/x-gtar
Size: 661 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-dev/attachments/20110128/5a5bc5fa/attachment.gtar>
