[petsc-users] DMView and DMLoad
Sagiyama, Koki
k.sagiyama at imperial.ac.uk
Tue Dec 7 09:50:09 CST 2021
Hi Berend,
I made some small changes to your code to get it to compile and defined a periodic dm using DMPlexCreateBoxMesh(); otherwise your code worked fine for me.
I think we would like to see a complete minimal failing example. Can you make the working example that I pasted in an earlier email fail just by modifying the dm (i.e., by using the periodic mesh you are actually using)?
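For reference, below is a rough sketch of how I defined the periodic dm (just a sketch, not your actual setup: the dimension, cell counts, and overlap are placeholders I chose, so please adapt them):

#include <petscdmplex.h>

/* Create a box mesh that is periodic in all directions and distribute it. */
static PetscErrorCode CreatePeriodicDM(MPI_Comm comm, DM *dm)
{
  DM             pdm;
  const PetscInt faces[3] = {20, 20, 20};  /* cells per direction (placeholder) */
  DMBoundaryType periodicity[3] = {DM_BOUNDARY_PERIODIC, DM_BOUNDARY_PERIODIC, DM_BOUNDARY_PERIODIC};
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = DMPlexCreateBoxMesh(comm, 3, PETSC_FALSE, faces, NULL, NULL, periodicity, PETSC_TRUE, dm);CHKERRQ(ierr);
  ierr = DMPlexDistribute(*dm, 1, NULL, &pdm);CHKERRQ(ierr);
  if (pdm) {
    ierr = DMDestroy(dm);CHKERRQ(ierr);
    *dm  = pdm;
  }
  PetscFunctionReturn(0);
}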
Thanks,
Koki
________________________________
From: Berend van Wachem <berend.vanwachem at ovgu.de>
Sent: Monday, December 6, 2021 3:39 PM
To: Sagiyama, Koki <k.sagiyama at imperial.ac.uk>; Hapla Vaclav <vaclav.hapla at erdw.ethz.ch>; PETSc users list <petsc-users at mcs.anl.gov>; Lawrence Mitchell <wence at gmx.li>
Subject: Re: [petsc-users] DMView and DMLoad
Dear Koki,
Thanks for your email. In the example from your last email,
DMPlexCoordinatesLoad() takes sfO (a PetscSF) as its third argument. In our
code this modification does not fix the error when loading a periodic
dm. Are we doing something wrong? I've included an example code at the
bottom of this email, including the error output.
Thanks and best regards,
Berend
/**** Write DM + Vec restart ****/
PetscViewerHDF5Open(PETSC_COMM_WORLD, "result", FILE_MODE_WRITE, &H5Viewer);
PetscObjectSetName((PetscObject)dm, "plexA");
PetscViewerPushFormat(H5Viewer, PETSC_VIEWER_HDF5_PETSC);
DMPlexTopologyView(dm, H5Viewer);
DMPlexLabelsView(dm, H5Viewer);
DMPlexCoordinatesView(dm, H5Viewer);
PetscViewerPopFormat(H5Viewer);
DM sdm;
PetscSection s;
DMClone(dm, &sdm);
PetscObjectSetName((PetscObject)sdm, "dmA");
DMGetGlobalSection(dm, &s);
DMSetGlobalSection(sdm, s);
DMPlexSectionView(dm, H5Viewer, sdm);
Vec vec, vecOld;
PetscScalar *array, *arrayOld, *xVecArray, *xVecArrayOld;
PetscInt i, numPoints;
DMGetGlobalVector(sdm, &vec);
DMGetGlobalVector(sdm, &vecOld);
/*** Fill the vectors vec and vecOld ***/
VecGetArray(vec, &array);
VecGetArray(vecOld, &arrayOld);
VecGetLocalSize(xGlobalVector, &numPoints);
VecGetArray(xGlobalVector, &xVecArray);
VecGetArray(xOldGlobalVector, &xVecArrayOld);
for (i = 0; i < numPoints; i++) { /* Loop over all internal mesh points */
  array[i]    = xVecArray[i];
  arrayOld[i] = xVecArrayOld[i];
}
VecRestoreArray(vec, &array);
VecRestoreArray(vecOld, &arrayOld);
VecRestoreArray(xGlobalVector, &xVecArray);
VecRestoreArray(xOldGlobalVector, &xVecArrayOld);
PetscObjectSetName((PetscObject)vec, "vecA");
PetscObjectSetName((PetscObject)vecOld, "vecB");
DMPlexGlobalVectorView(dm, H5Viewer, sdm, vec);
DMPlexGlobalVectorView(dm, H5Viewer, sdm, vecOld);
PetscViewerDestroy(&H5Viewer);
/*** end of writing ****/
/*** Load ***/
PetscViewerHDF5Open(PETSC_COMM_WORLD, "result", FILE_MODE_READ, &H5Viewer);
DMCreate(PETSC_COMM_WORLD, &dm);
DMSetType(dm, DMPLEX);
PetscObjectSetName((PetscObject)dm, "plexA");
PetscViewerPushFormat(H5Viewer, PETSC_VIEWER_HDF5_PETSC);
DMPlexTopologyLoad(dm, H5Viewer, &sfO);
DMPlexLabelsLoad(dm, H5Viewer);
DMPlexCoordinatesLoad(dm, H5Viewer, sfO);
PetscViewerPopFormat(H5Viewer);
DMPlexDistribute(dm, Options->Mesh.overlap, &sfDist, &distributedDM);
if (distributedDM) {
DMDestroy(&dm);
dm = distributedDM;
PetscObjectSetName((PetscObject)dm, "plexA");
}
PetscSFCompose(sfO, sfDist, &sf);
PetscSFDestroy(&sfO);
PetscSFDestroy(&sfDist);
DMClone(dm, &sdm);
PetscObjectSetName((PetscObject)sdm, "dmA");
DMPlexSectionLoad(dm, H5Viewer, sdm, sf, &globalDataSF, &localDataSF);
/** Load the Vectors **/
DMGetGlobalVector(sdm, &Restart_xGlobalVector);
VecSet(Restart_xGlobalVector,0.0);
PetscObjectSetName((PetscObject)Restart_xGlobalVector, "vecA");
DMPlexGlobalVectorLoad(dm, H5Viewer, sdm, globalDataSF, Restart_xGlobalVector);
DMGetGlobalVector(sdm, &Restart_xOldGlobalVector);
VecSet(Restart_xOldGlobalVector,0.0);
PetscObjectSetName((PetscObject)Restart_xOldGlobalVector, "vecB");
DMPlexGlobalVectorLoad(dm, H5Viewer, sdm, globalDataSF, Restart_xOldGlobalVector);
PetscViewerDestroy(&H5Viewer);
/**** The error message when loading is the following ************/
Creating and distributing mesh
[0]PETSC ERROR: --------------------- Error Message --------------------------
[0]PETSC ERROR: Invalid argument
[0]PETSC ERROR: Number of coordinates loaded 17128 does not match number of vertices 8000
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Development GIT revision: v3.16.1-435-g007f11b901  GIT Date: 2021-12-01 14:31:21 +0000
[0]PETSC ERROR: ./MF3 on a linux-gcc-openmpi-opt named ivt24.ads.uni-magdeburg.de by berend Mon Dec 6 16:11:21 2021
[0]PETSC ERROR: Configure options --with-p4est=yes --with-partemis --with-metis --with-debugging=no --download-metis=yes --download-parmetis=yes --with-errorchecking=no --download-hdf5 --download-zlib --download-p4est
[0]PETSC ERROR: #1 DMPlexCoordinatesLoad_HDF5_V0_Private() at /home/berend/src/petsc_main/src/dm/impls/plex/plexhdf5.c:1387
[0]PETSC ERROR: #2 DMPlexCoordinatesLoad_HDF5_Internal() at /home/berend/src/petsc_main/src/dm/impls/plex/plexhdf5.c:1419
[0]PETSC ERROR: #3 DMPlexCoordinatesLoad() at /home/berend/src/petsc_main/src/dm/impls/plex/plex.c:2070
[0]PETSC ERROR: #4 RestartMeshDM() at /home/berend/src/eclipseworkspace/multiflow/src/io/restartmesh.c:81
[0]PETSC ERROR: #5 CreateMeshDM() at /home/berend/src/eclipseworkspace/multiflow/src/mesh/createmesh.c:61
[0]PETSC ERROR: #6 main() at /home/berend/src/eclipseworkspace/multiflow/src/general/main.c:132
[0]PETSC ERROR: PETSc Option Table entries:
[0]PETSC ERROR: --download-hdf5
[0]PETSC ERROR: --download-metis=yes
[0]PETSC ERROR: --download-p4est
[0]PETSC ERROR: --download-parmetis=yes
[0]PETSC ERROR: --download-zlib
[0]PETSC ERROR: --with-debugging=no
[0]PETSC ERROR: --with-errorchecking=no
[0]PETSC ERROR: --with-metis
[0]PETSC ERROR: --with-p4est=yes
[0]PETSC ERROR: --with-partemis
[0]PETSC ERROR: -d results
[0]PETSC ERROR: -o run.mf
[0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 62.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
On 11/19/21 00:26, Sagiyama, Koki wrote:
> Hi Berend,
>
> I was not able to reproduce the issue you are having, but the following
> 1D example (and similar 2D examples) worked fine for me using the latest
> PETSc. Please note that DMPlexCoordinatesLoad() now takes a PetscSF
> object as the third argument, but the default behavior is unchanged.
>
> /* test_periodic_io.c */
>
> #include <petscdmplex.h>
> #include <petscsf.h>
> #include <petsclayouthdf5.h>
>
> int main(int argc, char **argv)
> {
> DM dm;
> Vec coordinates;
> PetscViewer viewer;
> PetscViewerFormat format = PETSC_VIEWER_HDF5_PETSC;
> PetscSF sfO;
> PetscErrorCode ierr;
>
> ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
> /* Save */
> ierr = PetscViewerHDF5Open(PETSC_COMM_WORLD, "periodic_example.h5",
> FILE_MODE_WRITE, &viewer);CHKERRQ(ierr);
> {
> DM pdm;
> PetscInt dim = 1;
> const PetscInt faces[1] = {4};
> DMBoundaryType periodicity[] = {DM_BOUNDARY_PERIODIC};
> PetscInt overlap = 1;
>
> ierr = DMPlexCreateBoxMesh(PETSC_COMM_WORLD, dim, PETSC_FALSE,
> faces, NULL, NULL, periodicity, PETSC_TRUE, &dm);CHKERRQ(ierr);
> ierr = DMPlexDistribute(dm, overlap, NULL, &pdm);CHKERRQ(ierr);
> ierr = DMDestroy(&dm);CHKERRQ(ierr);
> dm = pdm;
> ierr = PetscObjectSetName((PetscObject)dm, "periodicDM");CHKERRQ(ierr);
> }
> ierr = DMGetCoordinates(dm, &coordinates);CHKERRQ(ierr);
> ierr = PetscPrintf(PETSC_COMM_WORLD, "Coordinates before
> saving:\n");CHKERRQ(ierr);
> ierr = VecView(coordinates, NULL);CHKERRQ(ierr);
> ierr = PetscViewerPushFormat(viewer, format);CHKERRQ(ierr);
> ierr = DMPlexTopologyView(dm, viewer);CHKERRQ(ierr);
> ierr = DMPlexCoordinatesView(dm, viewer);CHKERRQ(ierr);
> ierr = PetscViewerPopFormat(viewer);CHKERRQ(ierr);
> ierr = DMDestroy(&dm);CHKERRQ(ierr);
> ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
> /* Load */
> ierr = PetscViewerHDF5Open(PETSC_COMM_WORLD, "periodic_example.h5",
> FILE_MODE_READ, &viewer);CHKERRQ(ierr);
> ierr = DMCreate(PETSC_COMM_WORLD, &dm);CHKERRQ(ierr);
> ierr = DMSetType(dm, DMPLEX);CHKERRQ(ierr);
> ierr = PetscObjectSetName((PetscObject)dm, "periodicDM");CHKERRQ(ierr);
> ierr = PetscViewerPushFormat(viewer, format);CHKERRQ(ierr);
> ierr = DMPlexTopologyLoad(dm, viewer, &sfO);CHKERRQ(ierr);
> ierr = DMPlexCoordinatesLoad(dm, viewer, sfO);CHKERRQ(ierr);
> ierr = PetscViewerPopFormat(viewer);CHKERRQ(ierr);
> ierr = DMGetCoordinates(dm, &coordinates);CHKERRQ(ierr);
> ierr = PetscPrintf(PETSC_COMM_WORLD, "Coordinates after
> loading:\n");CHKERRQ(ierr);
> ierr = VecView(coordinates, NULL);CHKERRQ(ierr);
> ierr = PetscSFDestroy(&sfO);CHKERRQ(ierr);
> ierr = DMDestroy(&dm);CHKERRQ(ierr);
> ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
> ierr = PetscFinalize();
> return ierr;
> }
>
> mpiexec -n 2 ./test_periodic_io
>
> Coordinates before saving:
> Vec Object: coordinates 2 MPI processes
> type: mpi
> Process [0]
> 0.
> Process [1]
> 0.25
> 0.5
> 0.75
> Coordinates after loading:
> Vec Object: vertices 2 MPI processes
> type: mpi
> Process [0]
> 0.
> 0.25
> 0.5
> 0.75
> Process [1]
>
> I would also like to note that, with the latest update, we can
> optionally load coordinates directly on the distributed dm as (using
> your notation):
>
> /* Distribute dm */
> ...
> PetscSFCompose(sfO, sfDist, &sf);
> DMPlexCoordinatesLoad(dm, viewer, sf);
>
> To use this feature, we need to pass the "-dm_plex_view_hdf5_storage_version
> 2.0.0" option when saving topology/coordinates.
>
>
> Thanks,
> Koki
> ------------------------------------------------------------------------
> *From:* Berend van Wachem <berend.vanwachem at ovgu.de>
> *Sent:* Wednesday, November 17, 2021 3:16 PM
> *To:* Hapla Vaclav <vaclav.hapla at erdw.ethz.ch>; PETSc users list
> <petsc-users at mcs.anl.gov>; Lawrence Mitchell <wence at gmx.li>; Sagiyama,
> Koki <k.sagiyama at imperial.ac.uk>
> *Subject:* Re: [petsc-users] DMView and DMLoad
>
> Dear Vaclav, Lawrence, Koki,
>
> Thanks for your help! Following your advice and your example
> (https://petsc.org/main/docs/manual/dmplex/#saving-and-loading-data-with-hdf5),
> we are able to save and load the DM with a wrapped Vector in HDF5 format
> (PETSC_VIEWER_HDF5_PETSC) successfully.
>
> For saving, we use something similar to:
>
> DMPlexTopologyView(dm, viewer);
> DMClone(dm, &sdm);
> ...
> DMPlexSectionView(dm, viewer, sdm);
> DMGetLocalVector(sdm, &vec);
> ...
> DMPlexLocalVectorView(dm, viewer, sdm, vec);
>
> and for loading:
>
> DMCreate(PETSC_COMM_WORLD, &dm);
> DMSetType(dm, DMPLEX);
> ...
> PetscViewerPushFormat(viewer, PETSC_VIEWER_HDF5_PETSC);
> DMPlexTopologyLoad(dm, viewer, &sfO);
> DMPlexLabelsLoad(dm, viewer);
> DMPlexCoordinatesLoad(dm, viewer);
> PetscViewerPopFormat(viewer);
> ...
> PetscSFCompose(sfO, sfDist, &sf);
> ...
> DMClone(dm, &sdm);
> DMPlexSectionLoad(dm, viewer, sdm, sf, &globalDataSF, &localDataSF);
> DMGetLocalVector(sdm, &vec);
> ...
> DMPlexLocalVectorLoad(dm, viewer, sdm, localDataSF, vec);
>
>
> This works fine for non-periodic DMs, but for periodic cases the line:
>
> DMPlexCoordinatesLoad(dm, H5Viewer);
>
> delivers the error message "invalid argument": the number of loaded
> coordinates does not match the number of vertices.
>
> Is this a known shortcoming, or have we forgotten something to load
> periodic DMs?
>
> Best regards,
>
> Berend.
>
>
>
> On 9/22/21 20:59, Hapla Vaclav wrote:
>> To avoid confusion here, Berend seems to be specifically asking for XDMF
>> (PETSC_VIEWER_HDF5_XDMF). The stuff we are now working on is parallel
>> checkpointing in our own HDF5 format (PETSC_VIEWER_HDF5_PETSC); I will
>> make a series of MRs on this topic in the following days.
>>
>> For XDMF, we are specifically missing the ability to write/load DMLabels
>> properly. XDMF uses cell-local numbering of faces for the specification of
>> face sets, and face-local numbering for the specification of edge sets,
>> which does not map well onto the DMPlex design. And ParaView doesn't show
>> any of these properly, so it's hard to debug. Matt, we should talk about
>> this soon.
>>
>> Berend, for now, could you just load the mesh initially from XDMF and
>> then use our PETSC_VIEWER_HDF5_PETSC format for subsequent saving/loading?
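>>
>> As a rough sketch of what I mean (file names here are made up and error
>> checking is omitted; the initial XDMF load reuses DMPlexCreateFromFile as
>> in your snippet below):
>>
>> /* initial mesh load from your existing XDMF/HDF5 file */
>> DMPlexCreateFromFile(PETSC_COMM_WORLD, "mesh.xdmf.h5", PETSC_TRUE, &dm);
>>
>> /* subsequent checkpoints through the native PETSc HDF5 format */
>> PetscViewerHDF5Open(PETSC_COMM_WORLD, "checkpoint.h5", FILE_MODE_WRITE, &viewer);
>> PetscViewerPushFormat(viewer, PETSC_VIEWER_HDF5_PETSC);
>> DMPlexTopologyView(dm, viewer);
>> DMPlexCoordinatesView(dm, viewer);
>> PetscViewerPopFormat(viewer);
>> PetscViewerDestroy(&viewer);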
>>
>> Thanks,
>>
>> Vaclav
>>
>>> On 17 Sep 2021, at 15:46, Lawrence Mitchell <wence at gmx.li> wrote:
>>>
>>> Hi Berend,
>>>
>>>> On 14 Sep 2021, at 12:23, Matthew Knepley <knepley at gmail.com> wrote:
>>>>
>>>> On Tue, Sep 14, 2021 at 5:15 AM Berend van Wachem
>>>> <berend.vanwachem at ovgu.de> wrote:
>>>> Dear PETSc-team,
>>>>
>>>> We are trying to save and load a distributed DMPlex and its associated
>>>> physical fields (created with DMCreateGlobalVector) (UVelocity,
>>>> VVelocity, ...) in HDF5_XDMF format. To achieve this, we do the
>>>> following:
>>>>
>>>> 1) save in the same xdmf.h5 file:
>>>> DMView( DM , H5_XDMF_Viewer );
>>>> VecView( UVelocity, H5_XDMF_Viewer );
>>>>
>>>> 2) load the dm:
>>>> DMPlexCreateFromFile(PETSC_COMM_WORLD, Filename, PETSC_TRUE, DM);
>>>>
>>>> 3) load the physical field:
>>>> VecLoad( UVelocity, H5_XDMF_Viewer );
>>>>
>>>> There are no errors in the execution, but the loaded DM is distributed
>>>> differently to the original one, which results in the incorrect
>>>> placement of the values of the physical fields (UVelocity etc.) in the
>>>> domain.
>>>>
>>>> This approach is used to restart the simulation with the last saved DM.
>>>> Is there something we are missing, or are there alternative routes to
>>>> this goal? Can we somehow get the IS of the redistribution, so we can
>>>> re-distribute the vector data as well?
>>>>
>>>> Many thanks, best regards,
>>>>
>>>> Hi Berend,
>>>>
>>>> We are in the midst of rewriting this. We want to support saving
>>>> multiple meshes, with fields attached to each, preserving the
>>>> discretization (section) information, and allowing loading on a
>>>> different number of processes. We plan to be done by October. Vaclav
>>>> and I are doing this in collaboration with Koki Sagiyama, David Ham,
>>>> and Lawrence Mitchell from the Firedrake team.
>>>
>>> The core load/save cycle functionality is now in PETSc main. So if
>>> you're using main rather than a release, you can get access to it now.
>>> This section of the manual shows an example of how to do things:
>>> https://petsc.org/main/docs/manual/dmplex/#saving-and-loading-data-with-hdf5
>>>
>>> Let us know if things aren't clear!
>>>
>>> Thanks,
>>>
>>> Lawrence
>>