[petsc-users] Parallel writing in HDF5-1.12.0 when some processors have no data to write

Danyang Su danyang.su at gmail.com
Thu Jun 11 21:58:13 CDT 2020


Hi Barry,

The HDF5 calls fail. I reconfigured PETSc with HDF5 1.10.5 and it works fine on different platforms, so it is most likely a bug in the latest HDF5 version.
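
(For anyone hitting the same issue: the downgrade can be done at PETSc configure time by pointing --download-hdf5 at a 1.10.5 tarball, along the lines of

  ./configure --download-hdf5=/path/to/hdf5-1.10.5.tar.gz

where the tarball path is just a placeholder for a locally downloaded HDF5 1.10.5 archive, plus whatever other configure options are normally used.)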

Thanks.

All the best,

Danyang
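
For reference, the pattern described in the HDF Group FAQ quoted below is: ranks that have nothing to write select none on BOTH the memory and file dataspaces, but still take part in the collective H5Dwrite. A minimal, self-contained C sketch of that pattern follows; the file name, dataset layout, and the choice of which rank is "empty" are illustrative only, not the application code in question.

/*
 * Minimal sketch of the "some ranks have no data" pattern from the HDF5
 * FAQ quoted below.  All names (out.h5, "data", which rank is empty) are
 * illustrative.  Build with a parallel HDF5, e.g. "h5pcc sketch.c", and
 * run under mpiexec.
 */
#include <hdf5.h>
#include <mpi.h>

int main(int argc, char **argv)
{
    int rank, nprocs;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

    /* Open the file collectively with the MPI-IO driver. */
    hid_t fapl = H5Pcreate(H5P_FILE_ACCESS);
    H5Pset_fapl_mpio(fapl, MPI_COMM_WORLD, MPI_INFO_NULL);
    hid_t file = H5Fcreate("out.h5", H5F_ACC_TRUNC, H5P_DEFAULT, fapl);

    /* One integer per rank in the file; rank 0 pretends it has nothing. */
    hsize_t dims[1] = {(hsize_t)nprocs};
    hid_t filespace = H5Screate_simple(1, dims, NULL);
    hid_t dset = H5Dcreate2(file, "data", H5T_NATIVE_INT, filespace,
                            H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);

    int value = rank;
    hsize_t one = 1, start = (hsize_t)rank;
    hid_t memspace = H5Screate_simple(1, &one, NULL);

    if (rank == 0) {
        /* Nothing to write: empty selection in BOTH dataspaces, but this
         * rank still has to participate in the collective H5Dwrite. */
        H5Sselect_none(filespace);
        H5Sselect_none(memspace);
    } else {
        H5Sselect_hyperslab(filespace, H5S_SELECT_SET, &start, NULL,
                            &one, NULL);
    }

    /* Collective transfer: every rank calls H5Dwrite. */
    hid_t dxpl = H5Pcreate(H5P_DATASET_XFER);
    H5Pset_dxpl_mpio(dxpl, H5FD_MPIO_COLLECTIVE);
    H5Dwrite(dset, H5T_NATIVE_INT, memspace, filespace, dxpl, &value);

    H5Pclose(dxpl);
    H5Sclose(memspace);
    H5Sclose(filespace);
    H5Dclose(dset);
    H5Pclose(fapl);
    H5Fclose(file);
    MPI_Finalize();
    return 0;
}

The key point is that every rank still calls H5Dwrite with a collective transfer property list; ranks with nothing to write simply carry empty selections.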



On June 11, 2020 5:58:28 a.m. PDT, Barry Smith <bsmith at petsc.dev> wrote:
>
>Are you making HDF5 calls that fail, or is it PETSc routines calling
>HDF5 that fail?
>
>Regardless, it sounds like the easiest fix is to switch back to the
>previous HDF5 and wait for HDF5 to fix what sounds like a bug.
>
>   Barry
>
>
>> On Jun 11, 2020, at 1:05 AM, Danyang Su <danyang.su at gmail.com> wrote:
>> 
>> Hi All,
>>  
>> Sorry for accidentally sending the previous, incomplete email.
>>  
>> After updating to HDF5-1.12.0, I get a problem when some processors
>> have no data to write (or do not need to write). Since parallel
>> writing is collective, I cannot simply exclude those processors from
>> the write. With the old version there was no such problem. So far,
>> the problem only occurs on Linux with the GNU compiler; the same code
>> works fine with the Intel compiler, or with the latest GNU compiler on macOS.
>>  
>> I have already included h5sselect_none in the code for the processors
>> without data, but it does not take effect. The issue is documented in
>> the following FAQ entry ("How do you write data when one process
>> doesn't have or need to write data?"):
>> https://support.hdfgroup.org/HDF5/hdf5-quest.html#par-nodata
>>  
>> A similar problem has also been reported by others on the HDF Forum:
>> https://forum.hdfgroup.org/t/bug-on-hdf5-1-12-0-fortran-parallel/6864
>>  
>> Any suggestions on how to handle this?
>>  
>> Thanks,
>>  
>> Danyang
