[petsc-users] DMView and DMLoad

Sagiyama, Koki k.sagiyama at imperial.ac.uk
Thu Nov 18 17:26:14 CST 2021


Hi Berend,

I was not able to reproduce the issue you are having, but the following 1D example (and similar 2D examples) worked fine for me using the latest PETSc. Please note that DMPlexCoordinatesLoad() now takes a PetscSF object as the third argument, but the default behavior is unchanged.

/* test_periodic_io.c */

#include <petscdmplex.h>
#include <petscsf.h>
#include <petsclayouthdf5.h>

int main(int argc, char **argv)
{
  DM                 dm;
  Vec                coordinates;
  PetscViewer        viewer;
  PetscViewerFormat  format = PETSC_VIEWER_HDF5_PETSC;
  PetscSF            sfO;
  PetscErrorCode     ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  /* Save */
  ierr = PetscViewerHDF5Open(PETSC_COMM_WORLD, "periodic_example.h5", FILE_MODE_WRITE, &viewer);CHKERRQ(ierr);
  {
    DM              pdm;
    PetscInt        dim = 1;
    const PetscInt  faces[1] = {4};
    DMBoundaryType  periodicity[] = {DM_BOUNDARY_PERIODIC};
    PetscInt        overlap = 1;

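    /* Create a 1D periodic mesh with 4 cells, then distribute it with overlap 1 */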
    ierr = DMPlexCreateBoxMesh(PETSC_COMM_WORLD, dim, PETSC_FALSE, faces, NULL, NULL, periodicity, PETSC_TRUE, &dm);CHKERRQ(ierr);
    ierr = DMPlexDistribute(dm, overlap, NULL, &pdm);CHKERRQ(ierr);
    ierr = DMDestroy(&dm);CHKERRQ(ierr);
    dm = pdm;
    ierr = PetscObjectSetName((PetscObject)dm, "periodicDM");CHKERRQ(ierr);
  }
  ierr = DMGetCoordinates(dm, &coordinates);CHKERRQ(ierr);
  ierr = PetscPrintf(PETSC_COMM_WORLD, "Coordinates before saving:\n");CHKERRQ(ierr);
  ierr = VecView(coordinates, NULL);CHKERRQ(ierr);
  ierr = PetscViewerPushFormat(viewer, format);CHKERRQ(ierr);
  ierr = DMPlexTopologyView(dm, viewer);CHKERRQ(ierr);
  ierr = DMPlexCoordinatesView(dm, viewer);CHKERRQ(ierr);
  ierr = PetscViewerPopFormat(viewer);CHKERRQ(ierr);
  ierr = DMDestroy(&dm);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
  /* Load */
  ierr = PetscViewerHDF5Open(PETSC_COMM_WORLD, "periodic_example.h5", FILE_MODE_READ, &viewer);CHKERRQ(ierr);
  ierr = DMCreate(PETSC_COMM_WORLD, &dm);CHKERRQ(ierr);
  ierr = DMSetType(dm, DMPLEX);CHKERRQ(ierr);
  ierr = PetscObjectSetName((PetscObject)dm, "periodicDM");CHKERRQ(ierr);
  ierr = PetscViewerPushFormat(viewer, format);CHKERRQ(ierr);
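  /* DMPlexTopologyLoad() returns sfO, which maps points in the on-file numbering to points of the loaded dm */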
  ierr = DMPlexTopologyLoad(dm, viewer, &sfO);CHKERRQ(ierr);
  ierr = DMPlexCoordinatesLoad(dm, viewer, sfO);CHKERRQ(ierr);
  ierr = PetscViewerPopFormat(viewer);CHKERRQ(ierr);
  ierr = DMGetCoordinates(dm, &coordinates);CHKERRQ(ierr);
  ierr = PetscPrintf(PETSC_COMM_WORLD, "Coordinates after loading:\n");CHKERRQ(ierr);
  ierr = VecView(coordinates, NULL);CHKERRQ(ierr);
  ierr = PetscSFDestroy(&sfO);CHKERRQ(ierr);
  ierr = DMDestroy(&dm);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

mpiexec -n 2 ./test_periodic_io

Coordinates before saving:
Vec Object: coordinates 2 MPI processes
  type: mpi
Process [0]
0.
Process [1]
0.25
0.5
0.75
Coordinates after loading:
Vec Object: vertices 2 MPI processes
  type: mpi
Process [0]
0.
0.25
0.5
0.75
Process [1]

I would also like to note that, with the latest update, we can optionally load coordinates directly on the distributed dm as follows (using your notation):

  /* Distribute dm */
  ...
  PetscSFCompose(sfO, sfDist, &sf);
  DMPlexCoordinatesLoad(dm, viewer, sf);

To use this feature, we need to pass the "-dm_plex_view_hdf5_storage_version 2.0.0" option when saving the topology/coordinates.
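
For reference, here is a fuller sketch of where sfO and sfDist would come from in that sequence (a sketch only, with error checking omitted; it assumes the mesh actually gets distributed, and the overlap and variable names are illustrative):

  DM       pdm;
  PetscSF  sfO, sfDist, sf;

  DMPlexTopologyLoad(dm, viewer, &sfO);     /* sfO: on-file points -> loaded points */
  DMPlexDistribute(dm, 0, &sfDist, &pdm);   /* sfDist: loaded points -> distributed points */
  if (pdm) {
    DMDestroy(&dm);
    dm = pdm;
  }
  PetscSFCompose(sfO, sfDist, &sf);         /* sf: on-file points -> distributed points */
  DMPlexCoordinatesLoad(dm, viewer, sf);
  PetscSFDestroy(&sf);
  PetscSFDestroy(&sfDist);
  PetscSFDestroy(&sfO);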


Thanks,
Koki
________________________________
From: Berend van Wachem <berend.vanwachem at ovgu.de>
Sent: Wednesday, November 17, 2021 3:16 PM
To: Hapla Vaclav <vaclav.hapla at erdw.ethz.ch>; PETSc users list <petsc-users at mcs.anl.gov>; Lawrence Mitchell <wence at gmx.li>; Sagiyama, Koki <k.sagiyama at imperial.ac.uk>
Subject: Re: [petsc-users] DMView and DMLoad


Dear Vaclav, Lawrence, Koki,

Thanks for your help! Following your advice and your example
(https://petsc.org/main/docs/manual/dmplex/#saving-and-loading-data-with-hdf5),
we are able to save and load the DM with a wrapped Vector in h5 format
(PETSC_VIEWER_HDF5_PETSC) successfully.

For saving, we use something similar to:

     DMPlexTopologyView(dm, viewer);
     DMClone(dm, &sdm);
     ...
     DMPlexSectionView(dm, viewer, sdm);
     DMGetLocalVector(sdm, &vec);
     ...
     DMPlexLocalVectorView(dm, viewer, sdm, vec);

and for loading:

     DMCreate(PETSC_COMM_WORLD, &dm);
     DMSetType(dm, DMPLEX);
     ...
     PetscViewerPushFormat(viewer, PETSC_VIEWER_HDF5_PETSC);
     DMPlexTopologyLoad(dm, viewer, &sfO);
     DMPlexLabelsLoad(dm, viewer);
     DMPlexCoordinatesLoad(dm, viewer);
     PetscViewerPopFormat(viewer);
     ...
     PetscSFCompose(sfO, sfDist, &sf);
     ...
     DMClone(dm, &sdm);
     DMPlexSectionLoad(dm, viewer, sdm, sf, &globalDataSF, &localDataSF);
     DMGetLocalVector(sdm, &vec);
     ...
     DMPlexLocalVectorLoad(dm, viewer, sdm, localDataSF, vec);
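
The loaded local data can then be moved into a global vector in the usual way, e.g. (just a sketch; gvec is an illustrative name):

     Vec gvec;

     DMGetGlobalVector(sdm, &gvec);
     DMLocalToGlobalBegin(sdm, vec, INSERT_VALUES, gvec);
     DMLocalToGlobalEnd(sdm, vec, INSERT_VALUES, gvec);
     /* ... use gvec ... */
     DMRestoreGlobalVector(sdm, &gvec);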


This works fine for non-periodic DMs, but for periodic cases the line:

     DMPlexCoordinatesLoad(dm, H5Viewer);

delivers an "invalid argument" error message, and the number of loaded
coordinates does not match the number of vertices.

Is this a known shortcoming, or have we forgotten something needed to load
periodic DMs?

Best regards,

Berend.



On 9/22/21 20:59, Hapla Vaclav wrote:
> To avoid confusions here, Berend seems to be specifically demanding XDMF
> (PETSC_VIEWER_HDF5_XDMF). The stuff we are now working on is parallel
> checkpointing in our own HDF5 format (PETSC_VIEWER_HDF5_PETSC), I will
> make a series of MRs on this topic in the following days.
>
> For XDMF, we are specifically missing the ability to write/load DMLabels
> properly. XDMF uses specific cell-local numbering for faces for
> specification of face sets, and face-local numbering for specification
> of edge sets, which is not great wrt DMPlex design. And ParaView doesn't
> show any of these properly so it's hard to debug. Matt, we should talk
> about this soon.
>
> Berend, for now, could you just load the mesh initially from XDMF and
> then use our PETSC_VIEWER_HDF5_PETSC format for subsequent saving/loading?
>
> Thanks,
>
> Vaclav
>
>> On 17 Sep 2021, at 15:46, Lawrence Mitchell <wence at gmx.li> wrote:
>>
>> Hi Berend,
>>
>>> On 14 Sep 2021, at 12:23, Matthew Knepley <knepley at gmail.com> wrote:
>>>
>>> On Tue, Sep 14, 2021 at 5:15 AM Berend van Wachem
>>> <berend.vanwachem at ovgu.de> wrote:
>>> Dear PETSc-team,
>>>
>>> We are trying to save and load distributed DMPlex and its associated
>>> physical fields (created with DMCreateGlobalVector) (UVelocity,
>>> VVelocity, ...) in HDF5_XDMF format. To achieve this, we do the
>>> following:
>>>
>>> 1) save in the same xdmf.h5 file:
>>> DMView(DM, H5_XDMF_Viewer);
>>> VecView(UVelocity, H5_XDMF_Viewer);
>>>
>>> 2) load the dm:
>>> DMPlexCreateFromFile(PETSC_COMM_WORLD, Filename, PETSC_TRUE, DM);
>>>
>>> 3) load the physical field:
>>> VecLoad(UVelocity, H5_XDMF_Viewer);
>>>
>>> There are no errors in the execution, but the loaded DM is distributed
>>> differently from the original one, which results in the incorrect
>>> placement of the values of the physical fields (UVelocity etc.) in the
>>> domain.
>>>
>>> This approach is used to restart the simulation with the last saved DM.
>>> Is there something we are missing, or are there alternative routes to
>>> this goal? Can we somehow get the IS of the redistribution, so we can
>>> re-distribute the vector data as well?
>>>
>>> Many thanks, best regards,
>>>
>>> Hi Berend,
>>>
>>> We are in the midst of rewriting this. We want to support saving
>>> multiple meshes, with fields attached to each,
>>> and preserving the discretization (section) information, and allowing
>>> us to load up on a different number of
>>> processes. We plan to be done by October. Vaclav and I are doing this
>>> in collaboration with Koki Sagiyama,
>>> David Ham, and Lawrence Mitchell from the Firedrake team.
>>
>> The core load/save cycle functionality is now in PETSc main. So if
>> you're using main rather than a release, you can get access to it now.
>> This section of the manual shows an example of how to do
>> thingshttps://petsc.org/main/docs/manual/dmplex/#saving-and-loading-data-with-hdf5
>> <https://petsc.org/main/docs/manual/dmplex/#saving-and-loading-data-with-hdf5>
>>
>> Let us know if things aren't clear!
>>
>> Thanks,
>>
>> Lawrence
>