[petsc-dev] MPI Vec and HDF5 output

Barry Smith bsmith at mcs.anl.gov
Fri Jan 28 17:08:29 CST 2011


  Is there any way we can get this into the petsc-dev source so it is fixed for everyone in the future?

   Thanks

   Barry

On Jan 28, 2011, at 2:54 PM, Brad Aagaard wrote:

> Barry-
> 
> It turns out the workaround was explained deep in some of the HDF5 documentation. I pushed the fix. If the local Vec size is zero, the fix creates a null memspace and resets the filespace selection to none.
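> 
> In outline the change looks like this (a sketch only, with illustrative names dset_id, plist_id, array, n, and low, not the literal pdvec.c code):
> 
>   hid_t   filespace, memspace;
>   hsize_t ldim[1]   = {(hsize_t) n};    /* local size on this rank */
>   hsize_t offset[1] = {(hsize_t) low};  /* global index of the first local entry */
> 
>   filespace = H5Dget_space(dset_id);
>   if (n > 0) {
>     memspace = H5Screate_simple(1, ldim, NULL);
>     H5Sselect_hyperslab(filespace, H5S_SELECT_SET, offset, NULL, ldim, NULL);
>   } else {
>     memspace = H5Screate(H5S_NULL);  /* a simple dataspace with a zero-sized dimension is not allowed */
>     H5Sselect_none(filespace);       /* this rank contributes no elements to the collective write */
>   }
>   H5Dwrite(dset_id, H5T_NATIVE_DOUBLE, memspace, filespace, plist_id, array);
>   H5Sclose(memspace);
>   H5Sclose(filespace);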
> 
> Brad
> 
> 
> On 01/28/2011 10:00 AM, Barry Smith wrote:
>> 
>>   Sounds like a bug report/inquiry to HDF5 folks.
>> 
>>    Barry
>> 
>> 
>> On Jan 28, 2011, at 10:23 AM, Brad Aagaard wrote:
>> 
>>> We are trying to get HDF5 output working in PyLith using the PETSc HDF5 viewer. I get an HDF5 error
>>> 
>>> H5Screate_simple(): zero sized dimension for non-unlimited dimension,
>>> VecView_MPI_HDF5() line 771 in petsc-dev/src/vec/vec/impls/mpi/pdvec.c
>>> 
>>> when I have a Vec that has a zero local size on a processor (the global size is nonzero). The Vec layout is correct in that we expect some processors to have a local size of zero (the field is defined over only a portion of the domain).
>>> 
>>> The collective write of the Vec creates the filespace using the global size and the memspace using the local size. The error occurs when creating the memspace. I tried adjusting pdvec.c to create a null memspace when the local size is zero, but then the write fails with an error about the filespace and memspace having different numbers of dimensions. I was unable to find any information on this use case in the HDF5 documentation.
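>>> 
>>> Roughly, the failing call has this shape (a sketch with an illustrative variable n for the local size, not the literal pdvec.c code):
>>> 
>>>   hsize_t ldim[1] = {(hsize_t) n};  /* n == 0 on some ranks */
>>>   hid_t memspace = H5Screate_simple(1, ldim, NULL);
>>>   /* H5Screate_simple fails when n == 0: zero sized dimension for non-unlimited dimension */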
>>> 
>>> I have attached a toy example that illustrates the problem.
>>> mpiexec -n 1 test_view [creates the expected test.h5 file]
>>> mpiexec -n 2 test_view [generates the above error]
>>> 
>>> Thanks,
>>> Brad
>>> <test_view.tgz>
>> 
>> 
> 



