[petsc-users] DMView and DMLoad

Lawrence Mitchell wence at gmx.li
Fri Sep 17 08:46:58 CDT 2021


Hi Berend,

> On 14 Sep 2021, at 12:23, Matthew Knepley <knepley at gmail.com> wrote:
> 
> On Tue, Sep 14, 2021 at 5:15 AM Berend van Wachem <berend.vanwachem at ovgu.de> wrote:
> Dear PETSc-team,
> 
> We are trying to save and load a distributed DMPlex and its associated
> physical fields (UVelocity, VVelocity, ...), created with
> DMCreateGlobalVector, in HDF5/XDMF format. To achieve this, we do the following:
> 
> 1) save in the same xdmf.h5 file:
> DMView( DM, H5_XDMF_Viewer );
> VecView( UVelocity, H5_XDMF_Viewer );
> 
> 2) load the dm:
> DMPlexCreateFromFile(PETSC_COMM_WORLD, Filename, PETSC_TRUE, &DM);
> 
> 3) load the physical field:
> VecLoad( UVelocity, H5_XDMF_Viewer );
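> 
> In full, the cycle looks roughly like this (a minimal sketch, assuming the
> viewer is opened with PetscViewerHDF5Open and the PETSC_VIEWER_HDF5_XDMF
> format is pushed; error checking omitted):
> 
>    PetscViewer viewer;
>    DM          dm;
>    Vec         UVelocity;
> 
>    /* 1) save DM and field into the same xdmf.h5 file */
>    PetscViewerHDF5Open(PETSC_COMM_WORLD, "xdmf.h5", FILE_MODE_WRITE, &viewer);
>    PetscViewerPushFormat(viewer, PETSC_VIEWER_HDF5_XDMF);
>    DMView(dm, viewer);
>    VecView(UVelocity, viewer);
>    PetscViewerPopFormat(viewer);
>    PetscViewerDestroy(&viewer);
> 
>    /* 2) load the DM */
>    DMPlexCreateFromFile(PETSC_COMM_WORLD, "xdmf.h5", PETSC_TRUE, &dm);
> 
>    /* 3) load the physical field; the Vec is created from the DM and
>       named to match the dataset saved in step 1 */
>    PetscViewerHDF5Open(PETSC_COMM_WORLD, "xdmf.h5", FILE_MODE_READ, &viewer);
>    PetscViewerPushFormat(viewer, PETSC_VIEWER_HDF5_XDMF);
>    DMCreateGlobalVector(dm, &UVelocity);
>    PetscObjectSetName((PetscObject)UVelocity, "UVelocity");
>    VecLoad(UVelocity, viewer);
>    PetscViewerPopFormat(viewer);
>    PetscViewerDestroy(&viewer);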
> 
> There are no errors during execution, but the loaded DM is distributed
> differently from the original one, which results in the incorrect
> placement of the values of the physical fields (UVelocity, etc.) in the
> domain.
> 
> We use this approach to restart the simulation from the last saved DM.
> Is there something we are missing, or is there an alternative route to
> this goal? Can we somehow get the IS of the redistribution, so that we
> can redistribute the vector data as well?
> 
> Many thanks, best regards,
> 
> Hi Berend,
> 
> We are in the midst of rewriting this. We want to support saving multiple meshes with fields attached to each,
> preserving the discretization (section) information, and allowing a load on a different number of
> processes. We plan to be done by October. Vaclav and I are doing this in collaboration with Koki Sagiyama,
> David Ham, and Lawrence Mitchell from the Firedrake team.

The core load/save cycle functionality is now in PETSc main, so if you're using main rather than a release, you can get access to it now. This section of the manual shows an example of how to do things: https://petsc.org/main/docs/manual/dmplex/#saving-and-loading-data-with-hdf5
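
In outline, the cycle from that manual section looks like the sketch
below (names like "plexA", "dmA", and "vecA" are illustrative, error
checking is omitted, and the signatures are those in main at the time
of writing, so do check the linked docs):

    PetscViewer viewer;
    DM          dm, sdm;
    Vec         vec;
    PetscSF     sfO, gsf, lsf;

    /* Save topology, labels, coordinates, and one field */
    PetscViewerHDF5Open(PETSC_COMM_WORLD, "dump.h5", FILE_MODE_WRITE, &viewer);
    PetscObjectSetName((PetscObject)dm, "plexA");
    PetscViewerPushFormat(viewer, PETSC_VIEWER_HDF5_PETSC);
    DMPlexTopologyView(dm, viewer);
    DMPlexLabelsView(dm, viewer);
    DMPlexCoordinatesView(dm, viewer);
    PetscViewerPopFormat(viewer);
    /* sdm is a clone of dm carrying the PetscSection that describes the
       field layout (section setup not shown) */
    DMClone(dm, &sdm);
    PetscObjectSetName((PetscObject)sdm, "dmA");
    PetscObjectSetName((PetscObject)vec, "vecA");
    DMPlexSectionView(dm, viewer, sdm);
    DMPlexGlobalVectorView(dm, viewer, sdm, vec);
    PetscViewerDestroy(&viewer);

    /* Load, possibly on a different number of processes */
    PetscViewerHDF5Open(PETSC_COMM_WORLD, "dump.h5", FILE_MODE_READ, &viewer);
    DMCreate(PETSC_COMM_WORLD, &dm);
    DMSetType(dm, DMPLEX);
    PetscObjectSetName((PetscObject)dm, "plexA");
    PetscViewerPushFormat(viewer, PETSC_VIEWER_HDF5_PETSC);
    DMPlexTopologyLoad(dm, viewer, &sfO);   /* sfO: saved points -> loaded points */
    DMPlexLabelsLoad(dm, viewer);
    DMPlexCoordinatesLoad(dm, viewer);
    PetscViewerPopFormat(viewer);
    DMClone(dm, &sdm);
    PetscObjectSetName((PetscObject)sdm, "dmA");
    DMPlexSectionLoad(dm, viewer, sdm, sfO, &gsf, &lsf);
    DMGetGlobalVector(sdm, &vec);
    PetscObjectSetName((PetscObject)vec, "vecA");
    DMPlexGlobalVectorLoad(dm, viewer, sdm, gsf, vec);
    PetscViewerDestroy(&viewer);

The PetscSF from DMPlexTopologyLoad plays the role of the IS you asked
about: it relates the saved ordering to the new in-memory ordering, and
DMPlexSectionLoad/DMPlexGlobalVectorLoad use it to place the vector
data correctly on the new distribution.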

Let us know if things aren't clear!

Thanks,

Lawrence

