[petsc-users] DMView and DMLoad

Berend van Wachem berend.vanwachem at ovgu.de
Wed Nov 17 09:16:46 CST 2021

Dear Vaclav, Lawrence, Koki,

Thanks for your help! Following your advice and your example, we are able 
to save and load the DM with a wrapped Vec in HDF5 format 
(PETSC_VIEWER_HDF5_PETSC) successfully.

For saving, we use something similar to:

     DMPlexTopologyView(dm, viewer);
     DMClone(dm, &sdm);
     DMPlexSectionView(dm, viewer, sdm);
     DMGetLocalVector(sdm, &vec);
     DMPlexLocalVectorView(dm, viewer, sdm, vec);
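For completeness, here is a fuller sketch of our save path. The viewer creation, the format push, and the coordinate/label view calls are our additions based on the PETSc HDF5 checkpointing API in main (the function name SaveCheckpoint is ours), so please treat this as a sketch rather than a verbatim excerpt of our code:

```c
#include <petscdmplex.h>
#include <petscviewerhdf5.h>

/* Sketch: write topology, coordinates, labels, the section layout, and a
   local vector into one HDF5 file in PETSC_VIEWER_HDF5_PETSC format. */
PetscErrorCode SaveCheckpoint(DM dm, const char *fname)
{
  PetscViewer    viewer;
  DM             sdm;
  Vec            vec;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = PetscViewerHDF5Open(PetscObjectComm((PetscObject)dm), fname,
                             FILE_MODE_WRITE, &viewer);CHKERRQ(ierr);
  ierr = PetscViewerPushFormat(viewer, PETSC_VIEWER_HDF5_PETSC);CHKERRQ(ierr);
  ierr = DMPlexTopologyView(dm, viewer);CHKERRQ(ierr);
  ierr = DMPlexCoordinatesView(dm, viewer);CHKERRQ(ierr);
  ierr = DMPlexLabelsView(dm, viewer);CHKERRQ(ierr);
  ierr = DMClone(dm, &sdm);CHKERRQ(ierr);      /* sdm carries the section */
  ierr = DMPlexSectionView(dm, viewer, sdm);CHKERRQ(ierr);
  ierr = DMGetLocalVector(sdm, &vec);CHKERRQ(ierr);
  /* ... fill vec with the field data to be checkpointed ... */
  ierr = DMPlexLocalVectorView(dm, viewer, sdm, vec);CHKERRQ(ierr);
  ierr = DMRestoreLocalVector(sdm, &vec);CHKERRQ(ierr);
  ierr = DMDestroy(&sdm);CHKERRQ(ierr);
  ierr = PetscViewerPopFormat(viewer);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}
```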

and for loading:

     DMCreate(PETSC_COMM_WORLD, &dm);
     DMSetType(dm, DMPLEX);
     PetscViewerPushFormat(viewer, PETSC_VIEWER_HDF5_PETSC);
     DMPlexTopologyLoad(dm, viewer, &sfO);
     DMPlexLabelsLoad(dm, viewer);
     DMPlexCoordinatesLoad(dm, viewer);
     PetscSFCompose(sfO, sfDist, &sf);
     DMClone(dm, &sdm);
     DMPlexSectionLoad(dm, viewer, sdm, sf, &globalDataSF, &localDataSF);
     DMGetLocalVector(sdm, &vec);
     DMPlexLocalVectorLoad(dm, viewer, sdm, localDataSF, vec);
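For readers wondering where sfO, sfDist, and sf come from: sfO is returned by DMPlexTopologyLoad (mapping stored points to loaded points) and sfDist comes from redistributing the loaded mesh. A hedged sketch of that middle step, where the overlap value of 0 is our assumption:

```c
/* sfO    : points as stored in the file -> points of the freshly loaded dm
   sfDist : loaded points                -> points of the redistributed dm
   sf     : their composition, handed to DMPlexSectionLoad                */
DM      dmDist;
PetscSF sfDist, sf;

ierr = DMPlexDistribute(dm, 0, &sfDist, &dmDist);CHKERRQ(ierr);
if (dmDist) {                      /* NULL on a process with nothing to move */
  ierr = DMDestroy(&dm);CHKERRQ(ierr);
  dm = dmDist;
}
ierr = PetscSFCompose(sfO, sfDist, &sf);CHKERRQ(ierr);
```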

This works fine for non-periodic DMs, but for periodic cases the line:

     DMPlexCoordinatesLoad(dm, viewer);

fails with an "invalid argument" error, and the number of loaded 
coordinates does not match the number of vertices.

Is this a known shortcoming, or have we forgotten something needed to 
load periodic DMs?

Best regards,


On 9/22/21 20:59, Hapla Vaclav wrote:
> To avoid confusion here, Berend seems to be specifically asking for XDMF 
> (PETSC_VIEWER_HDF5_XDMF). The stuff we are now working on is parallel 
> checkpointing in our own HDF5 format (PETSC_VIEWER_HDF5_PETSC), I will 
> make a series of MRs on this topic in the following days.
> For XDMF, we are specifically missing the ability to write/load DMLabels 
> properly. XDMF uses specific cell-local numbering for faces for 
> specification of face sets, and face-local numbering for specification 
> of edge sets, which is not great wrt DMPlex design. And ParaView doesn't 
> show any of these properly so it's hard to debug. Matt, we should talk 
> about this soon.
> Berend, for now, could you just load the mesh initially from XDMF and 
> then use our PETSC_VIEWER_HDF5_PETSC format for subsequent saving/loading?
> Thanks,
> Vaclav
>> On 17 Sep 2021, at 15:46, Lawrence Mitchell <wence at gmx.li> wrote:
>> Hi Berend,
>>> On 14 Sep 2021, at 12:23, Matthew Knepley <knepley at gmail.com> wrote:
>>> On Tue, Sep 14, 2021 at 5:15 AM Berend van Wachem 
>>> <berend.vanwachem at ovgu.de> wrote:
>>> Dear PETSc-team,
>>> We are trying to save and load a distributed DMPlex and its associated
>>> physical fields (created with DMCreateGlobalVector) (UVelocity,
>>> VVelocity, ...) in HDF5_XDMF format. To achieve this, we do the 
>>> following:
>>> 1) save in the same xdmf.h5 file:
>>> DMView( DM         , H5_XDMF_Viewer );
>>> VecView( UVelocity, H5_XDMF_Viewer );
>>> 2) load the dm:
>>> DMPlexCreateFromFile(PETSC_COMM_WORLD, Filename, PETSC_TRUE, &DM);
>>> 3) load the physical field:
>>> VecLoad( UVelocity, H5_XDMF_Viewer );
>>> There are no errors during execution, but the loaded DM is distributed
>>> differently from the original one, which results in the incorrect
>>> placement of the values of the physical fields (UVelocity etc.) in the
>>> domain.
>>> This approach is used to restart the simulation with the last saved DM.
>>> Is there something we are missing, or are there alternative routes to
>>> this goal? Can we somehow get the IS of the redistribution, so we can
>>> re-distribute the vector data as well?
>>> Many thanks, best regards,
>>> Hi Berend,
>>> We are in the midst of rewriting this. We want to support saving 
>>> multiple meshes, with fields attached to each,
>>> and preserving the discretization (section) information, and allowing 
>>> us to load up on a different number of
>>> processes. We plan to be done by October. Vaclav and I are doing this 
>>> in collaboration with Koki Sagiyama,
>>> David Ham, and Lawrence Mitchell from the Firedrake team.
>> The core load/save cycle functionality is now in PETSc main. So if 
>> you're using main rather than a release, you can get access to it now. 
>> This section of the manual shows an example of how to do things: 
>> https://petsc.org/main/docs/manual/dmplex/#saving-and-loading-data-with-hdf5
>> Let us know if things aren't clear!
>> Thanks,
>> Lawrence
