<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=us-ascii">
<style type="text/css" style="display:none;"> P {margin-top:0;margin-bottom:0;} </style>
</head>
<body dir="ltr">
<div style="font-family: Calibri, Arial, Helvetica, sans-serif; font-size: 12pt; color: rgb(0, 0, 0);">
Hi Berend,</div>
<div style="font-family: Calibri, Arial, Helvetica, sans-serif; font-size: 12pt; color: rgb(0, 0, 0);">
<br>
</div>
<div style="font-family: Calibri, Arial, Helvetica, sans-serif; font-size: 12pt; color: rgb(0, 0, 0);">
I made some small changes to your code so that it compiles, and defined a periodic dm using DMPlexCreateBoxMesh(); otherwise your code worked fine for me.</div>
<div style="font-family: Calibri, Arial, Helvetica, sans-serif; font-size: 12pt; color: rgb(0, 0, 0);">
We would like to see a complete minimal failing example. Can you make the working example that I pasted in my earlier email fail just by modifying the dm (i.e., by using the periodic mesh you are actually using)? A sketch of the kind of change I mean is below.
<br>
</div>
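<div style="font-family: Calibri, Arial, Helvetica, sans-serif; font-size: 12pt; color: rgb(0, 0, 0);">
For instance (only a sketch; the dimension and face counts here are assumptions, so please adjust them to the mesh you are actually using):<br>
<br>
PetscInt dim = 3;<br>
const PetscInt faces[3] = {20, 20, 20};<br>
DMBoundaryType periodicity[3] = {DM_BOUNDARY_PERIODIC, DM_BOUNDARY_PERIODIC, DM_BOUNDARY_PERIODIC};<br>
<br>
/* Replace the original mesh creation with a fully periodic box mesh */<br>
DMPlexCreateBoxMesh(PETSC_COMM_WORLD, dim, PETSC_FALSE, faces, NULL, NULL, periodicity, PETSC_TRUE, &dm);<br>
</div>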
<div style="font-family: Calibri, Arial, Helvetica, sans-serif; font-size: 12pt; color: rgb(0, 0, 0);">
<br>
</div>
<div style="font-family: Calibri, Arial, Helvetica, sans-serif; font-size: 12pt; color: rgb(0, 0, 0);">
Thanks,</div>
<div style="font-family: Calibri, Arial, Helvetica, sans-serif; font-size: 12pt; color: rgb(0, 0, 0);">
Koki<br>
</div>
<div id="appendonsend"></div>
<hr style="display:inline-block;width:98%" tabindex="-1">
<div id="divRplyFwdMsg" dir="ltr"><font face="Calibri, sans-serif" style="font-size:11pt" color="#000000"><b>From:</b> Berend van Wachem <berend.vanwachem@ovgu.de><br>
<b>Sent:</b> Monday, December 6, 2021 3:39 PM<br>
<b>To:</b> Sagiyama, Koki <k.sagiyama@imperial.ac.uk>; Hapla Vaclav <vaclav.hapla@erdw.ethz.ch>; PETSc users list <petsc-users@mcs.anl.gov>; Lawrence Mitchell <wence@gmx.li><br>
<b>Subject:</b> Re: [petsc-users] DMView and DMLoad</font>
<div> </div>
</div>
<div class="BodyFragment"><font size="2"><span style="font-size:11pt;">
<div class="PlainText">Dear Koki,<br>
<br>
Thanks for your email. In the example in your last email, <br>
DMPlexCoordinatesLoad() takes sfO (a PetscSF) as a third argument. In our <br>
code this modification does not fix the error when loading a periodic <br>
dm. Are we doing something wrong? I've included example code at the <br>
bottom of this email, including the error output.<br>
<br>
Thanks and best regards,<br>
Berend<br>
<br>
<br>
/**** Write DM + Vec restart ****/<br>
PetscViewerHDF5Open(PETSC_COMM_WORLD, "result", FILE_MODE_WRITE, &H5Viewer);<br>
PetscObjectSetName((PetscObject)dm, "plexA");<br>
PetscViewerPushFormat(H5Viewer, PETSC_VIEWER_HDF5_PETSC);<br>
DMPlexTopologyView(dm, H5Viewer);<br>
DMPlexLabelsView(dm, H5Viewer);<br>
DMPlexCoordinatesView(dm, H5Viewer);<br>
PetscViewerPopFormat(H5Viewer);<br>
<br>
DM sdm;<br>
PetscSection s;<br>
<br>
DMClone(dm, &sdm);<br>
PetscObjectSetName((PetscObject)sdm, "dmA");<br>
DMGetGlobalSection(dm, &s);<br>
DMSetGlobalSection(sdm, s);<br>
DMPlexSectionView(dm, H5Viewer, sdm);<br>
<br>
Vec vec, vecOld;<br>
PetscScalar *array, *arrayOld, *xVecArray, *xVecArrayOld;<br>
PetscInt i, numPoints;<br>
<br>
DMGetGlobalVector(sdm, &vec);<br>
DMGetGlobalVector(sdm, &vecOld);<br>
<br>
/*** Fill the vectors vec and vecOld ***/<br>
VecGetArray(vec, &array);<br>
VecGetArray(vecOld, &arrayOld);<br>
VecGetLocalSize(xGlobalVector, &numPoints);<br>
VecGetArray(xGlobalVector, &xVecArray);<br>
VecGetArray(xOldGlobalVector, &xVecArrayOld);<br>
<br>
for (i = 0; i < numPoints; i++) /* Loop over all internal mesh points */<br>
{<br>
array[i] = xVecArray[i];<br>
arrayOld[i] = xVecArrayOld[i];<br>
}<br>
<br>
VecRestoreArray(vec, &array);<br>
VecRestoreArray(vecOld, &arrayOld);<br>
VecRestoreArray(xGlobalVector, &xVecArray);<br>
VecRestoreArray(xOldGlobalVector, &xVecArrayOld);<br>
<br>
PetscObjectSetName((PetscObject)vec, "vecA");<br>
PetscObjectSetName((PetscObject)vecOld, "vecB");<br>
DMPlexGlobalVectorView(dm, H5Viewer, sdm, vec);<br>
DMPlexGlobalVectorView(dm, H5Viewer, sdm, vecOld);<br>
PetscViewerDestroy(&H5Viewer);<br>
/*** end of writing ****/<br>
<br>
/*** Load ***/<br>
PetscViewerHDF5Open(PETSC_COMM_WORLD, "result", FILE_MODE_READ, &H5Viewer);<br>
DMCreate(PETSC_COMM_WORLD, &dm);<br>
DMSetType(dm, DMPLEX);<br>
PetscObjectSetName((PetscObject)dm, "plexA");<br>
PetscViewerPushFormat(H5Viewer, PETSC_VIEWER_HDF5_PETSC);<br>
DMPlexTopologyLoad(dm, H5Viewer, &sfO);<br>
DMPlexLabelsLoad(dm, H5Viewer);<br>
DMPlexCoordinatesLoad(dm, H5Viewer, sfO);<br>
PetscViewerPopFormat(H5Viewer);<br>
<br>
DMPlexDistribute(dm, Options->Mesh.overlap, &sfDist, &distributedDM);<br>
if (distributedDM) {<br>
DMDestroy(&dm);<br>
dm = distributedDM;<br>
PetscObjectSetName((PetscObject)dm, "plexA");<br>
}<br>
<br>
PetscSFCompose(sfO, sfDist, &sf);<br>
PetscSFDestroy(&sfO);<br>
PetscSFDestroy(&sfDist);<br>
<br>
DMClone(dm, &sdm);<br>
PetscObjectSetName((PetscObject)sdm, "dmA");<br>
DMPlexSectionLoad(dm, H5Viewer, sdm, sf, &globalDataSF, &localDataSF);<br>
<br>
/** Load the Vectors **/<br>
DMGetGlobalVector(sdm, &Restart_xGlobalVector);<br>
VecSet(Restart_xGlobalVector, 0.0);<br>
<br>
PetscObjectSetName((PetscObject)Restart_xGlobalVector, "vecA");<br>
DMPlexGlobalVectorLoad(dm, H5Viewer, sdm, globalDataSF, Restart_xGlobalVector);<br>
DMGetGlobalVector(sdm, &Restart_xOldGlobalVector);<br>
VecSet(Restart_xOldGlobalVector, 0.0);<br>
<br>
PetscObjectSetName((PetscObject)Restart_xOldGlobalVector, "vecB");<br>
DMPlexGlobalVectorLoad(dm, H5Viewer, sdm, globalDataSF, Restart_xOldGlobalVector);<br>
<br>
PetscViewerDestroy(&H5Viewer);<br>
<br>
<br>
/**** The error message when loading is the following ************/<br>
<br>
Creating and distributing mesh<br>
[0]PETSC ERROR: --------------------- Error Message <br>
--------------------------<br>
[0]PETSC ERROR: Invalid argument<br>
[0]PETSC ERROR: Number of coordinates loaded 17128 does not match number <br>
of vertices 8000<br>
[0]PETSC ERROR: See <a href="https://petsc.org/release/faq/">https://petsc.org/release/faq/</a> for trouble shooting.<br>
[0]PETSC ERROR: Petsc Development GIT revision: v3.16.1-435-g007f11b901 <br>
GIT Date: 2021-12-01 14:31:21 +0000<br>
[0]PETSC ERROR: ./MF3 on a linux-gcc-openmpi-opt named <br>
ivt24.ads.uni-magdeburg.de by berend Mon Dec 6 16:11:21 2021<br>
[0]PETSC ERROR: Configure options --with-p4est=yes --with-partemis <br>
--with-metis --with-debugging=no --download-metis=yes <br>
--download-parmetis=yes --with-errorchecking=no --download-hdf5 <br>
--download-zlib --download-p4est<br>
[0]PETSC ERROR: #1 DMPlexCoordinatesLoad_HDF5_V0_Private() at <br>
/home/berend/src/petsc_main/src/dm/impls/plex/plexhdf5.c:1387<br>
[0]PETSC ERROR: #2 DMPlexCoordinatesLoad_HDF5_Internal() at <br>
/home/berend/src/petsc_main/src/dm/impls/plex/plexhdf5.c:1419<br>
[0]PETSC ERROR: #3 DMPlexCoordinatesLoad() at <br>
/home/berend/src/petsc_main/src/dm/impls/plex/plex.c:2070<br>
[0]PETSC ERROR: #4 RestartMeshDM() at <br>
/home/berend/src/eclipseworkspace/multiflow/src/io/restartmesh.c:81<br>
[0]PETSC ERROR: #5 CreateMeshDM() at <br>
/home/berend/src/eclipseworkspace/multiflow/src/mesh/createmesh.c:61<br>
[0]PETSC ERROR: #6 main() at <br>
/home/berend/src/eclipseworkspace/multiflow/src/general/main.c:132<br>
[0]PETSC ERROR: PETSc Option Table entries:<br>
[0]PETSC ERROR: --download-hdf5<br>
[0]PETSC ERROR: --download-metis=yes<br>
[0]PETSC ERROR: --download-p4est<br>
[0]PETSC ERROR: --download-parmetis=yes<br>
[0]PETSC ERROR: --download-zlib<br>
[0]PETSC ERROR: --with-debugging=no<br>
[0]PETSC ERROR: --with-errorchecking=no<br>
[0]PETSC ERROR: --with-metis<br>
[0]PETSC ERROR: --with-p4est=yes<br>
[0]PETSC ERROR: --with-partemis<br>
[0]PETSC ERROR: -d results<br>
[0]PETSC ERROR: -o run.mf<br>
[0]PETSC ERROR: ----------------End of Error Message -------send entire <br>
error message to petsc-maint@mcs.anl.gov----------<br>
--------------------------------------------------------------------------<br>
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD<br>
with errorcode 62.<br>
<br>
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.<br>
You may or may not see output from other processes, depending on<br>
exactly when Open MPI kills them.<br>
--------------------------------------------------------------------------<br>
<br>
<br>
<br>
<br>
<br>
On 11/19/21 00:26, Sagiyama, Koki wrote:<br>
> Hi Berend,<br>
> <br>
> I was not able to reproduce the issue you are having, but the following <br>
> 1D example (and similar 2D examples) worked fine for me using the latest <br>
> PETSc. Please note that DMPlexCoordinatesLoad() now takes a PetscSF <br>
> object as the third argument, but the default behavior is unchanged.<br>
> <br>
> /* test_periodic_io.c */<br>
> <br>
> #include <petscdmplex.h><br>
> #include <petscsf.h><br>
> #include <petsclayouthdf5.h><br>
> <br>
> int main(int argc, char **argv)<br>
> {<br>
> DM dm;<br>
> Vec coordinates;<br>
> PetscViewer viewer;<br>
> PetscViewerFormat format = PETSC_VIEWER_HDF5_PETSC;<br>
> PetscSF sfO;<br>
> PetscErrorCode ierr;<br>
> <br>
> ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;<br>
> /* Save */<br>
> ierr = PetscViewerHDF5Open(PETSC_COMM_WORLD, "periodic_example.h5", <br>
> FILE_MODE_WRITE, &viewer);CHKERRQ(ierr);<br>
> {<br>
> DM pdm;<br>
> PetscInt dim = 1;<br>
> const PetscInt faces[1] = {4};<br>
> DMBoundaryType periodicity[] = {DM_BOUNDARY_PERIODIC};<br>
> PetscInt overlap = 1;<br>
> <br>
> ierr = DMPlexCreateBoxMesh(PETSC_COMM_WORLD, dim, PETSC_FALSE, <br>
> faces, NULL, NULL, periodicity, PETSC_TRUE, &dm);CHKERRQ(ierr);<br>
> ierr = DMPlexDistribute(dm, overlap, NULL, &pdm);CHKERRQ(ierr);<br>
> ierr = DMDestroy(&dm);CHKERRQ(ierr);<br>
> dm = pdm;<br>
> ierr = PetscObjectSetName((PetscObject)dm, "periodicDM");CHKERRQ(ierr);<br>
> }<br>
> ierr = DMGetCoordinates(dm, &coordinates);CHKERRQ(ierr);<br>
> ierr = PetscPrintf(PETSC_COMM_WORLD, "Coordinates before saving:\n");CHKERRQ(ierr);<br>
> ierr = VecView(coordinates, NULL);CHKERRQ(ierr);<br>
> ierr = PetscViewerPushFormat(viewer, format);CHKERRQ(ierr);<br>
> ierr = DMPlexTopologyView(dm, viewer);CHKERRQ(ierr);<br>
> ierr = DMPlexCoordinatesView(dm, viewer);CHKERRQ(ierr);<br>
> ierr = PetscViewerPopFormat(viewer);CHKERRQ(ierr);<br>
> ierr = DMDestroy(&dm);CHKERRQ(ierr);<br>
> ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);<br>
> /* Load */<br>
> ierr = PetscViewerHDF5Open(PETSC_COMM_WORLD, "periodic_example.h5", <br>
> FILE_MODE_READ, &viewer);CHKERRQ(ierr);<br>
> ierr = DMCreate(PETSC_COMM_WORLD, &dm);CHKERRQ(ierr);<br>
> ierr = DMSetType(dm, DMPLEX);CHKERRQ(ierr);<br>
> ierr = PetscObjectSetName((PetscObject)dm, "periodicDM");CHKERRQ(ierr);<br>
> ierr = PetscViewerPushFormat(viewer, format);CHKERRQ(ierr);<br>
> ierr = DMPlexTopologyLoad(dm, viewer, &sfO);CHKERRQ(ierr);<br>
> ierr = DMPlexCoordinatesLoad(dm, viewer, sfO);CHKERRQ(ierr);<br>
> ierr = PetscViewerPopFormat(viewer);CHKERRQ(ierr);<br>
> ierr = DMGetCoordinates(dm, &coordinates);CHKERRQ(ierr);<br>
> ierr = PetscPrintf(PETSC_COMM_WORLD, "Coordinates after loading:\n");CHKERRQ(ierr);<br>
> ierr = VecView(coordinates, NULL);CHKERRQ(ierr);<br>
> ierr = PetscSFDestroy(&sfO);CHKERRQ(ierr);<br>
> ierr = DMDestroy(&dm);CHKERRQ(ierr);<br>
> ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);<br>
> ierr = PetscFinalize();<br>
> return ierr;<br>
> }<br>
> <br>
> mpiexec -n 2 ./test_periodic_io<br>
> <br>
> Coordinates before saving:<br>
> Vec Object: coordinates 2 MPI processes<br>
> type: mpi<br>
> Process [0]<br>
> 0.<br>
> Process [1]<br>
> 0.25<br>
> 0.5<br>
> 0.75<br>
> Coordinates after loading:<br>
> Vec Object: vertices 2 MPI processes<br>
> type: mpi<br>
> Process [0]<br>
> 0.<br>
> 0.25<br>
> 0.5<br>
> 0.75<br>
> Process [1]<br>
> <br>
> I would also like to note that, with the latest update, we can <br>
> optionally load coordinates directly on the distributed dm as follows <br>
> (using your notation):<br>
> <br>
> /* Distribute dm */<br>
> ...<br>
> PetscSFCompose(sfO, sfDist, &sf);<br>
> DMPlexCoordinatesLoad(dm, viewer, sf);<br>
> <br>
> To use this feature, we need to pass the "-dm_plex_view_hdf5_storage_version <br>
> 2.0.0" option when saving the topology/coordinates.<br>
> <br>
> <br>
> Thanks,<br>
> Koki<br>
> ------------------------------------------------------------------------<br>
> *From:* Berend van Wachem <berend.vanwachem@ovgu.de><br>
> *Sent:* Wednesday, November 17, 2021 3:16 PM<br>
> *To:* Hapla Vaclav <vaclav.hapla@erdw.ethz.ch>; PETSc users list <br>
> <petsc-users@mcs.anl.gov>; Lawrence Mitchell <wence@gmx.li>; Sagiyama, <br>
> Koki <k.sagiyama@imperial.ac.uk><br>
> *Subject:* Re: [petsc-users] DMView and DMLoad<br>
> <br>
> Dear Vaclav, Lawrence, Koki,<br>
> <br>
> Thanks for your help! Following your advice and your example<br>
> (<a href="https://petsc.org/main/docs/manual/dmplex/#saving-and-loading-data-with-hdf5">https://petsc.org/main/docs/manual/dmplex/#saving-and-loading-data-with-hdf5</a>)<br>
> <br>
> we are able to save and load the DM with a wrapped Vector in h5 format<br>
> (PETSC_VIEWER_HDF5_PETSC) successfully.<br>
> <br>
> For saving, we use something similar to:<br>
> <br>
> DMPlexTopologyView(dm, viewer);<br>
> DMClone(dm, &sdm);<br>
> ...<br>
> DMPlexSectionView(dm, viewer, sdm);<br>
> DMGetLocalVector(sdm, &vec);<br>
> ...<br>
> DMPlexLocalVectorView(dm, viewer, sdm, vec);<br>
> <br>
> and for loading:<br>
> <br>
> DMCreate(PETSC_COMM_WORLD, &dm);<br>
> DMSetType(dm, DMPLEX);<br>
> ...<br>
> PetscViewerPushFormat(viewer, PETSC_VIEWER_HDF5_PETSC);<br>
> DMPlexTopologyLoad(dm, viewer, &sfO);<br>
> DMPlexLabelsLoad(dm, viewer);<br>
> DMPlexCoordinatesLoad(dm, viewer);<br>
> PetscViewerPopFormat(viewer);<br>
> ...<br>
> PetscSFCompose(sfO, sfDist, &sf);<br>
> ...<br>
> DMClone(dm, &sdm);<br>
> DMPlexSectionLoad(dm, viewer, sdm, sf, &globalDataSF, &localDataSF);<br>
> DMGetLocalVector(sdm, &vec);<br>
> ...<br>
> DMPlexLocalVectorLoad(dm, viewer, sdm, localDataSF, vec);<br>
> <br>
> <br>
> This works fine for non-periodic DMs but for periodic cases the line:<br>
> <br>
> DMPlexCoordinatesLoad(dm, H5Viewer);<br>
> <br>
> delivers an "invalid argument" error: the number of loaded<br>
> coordinates does not match the number of vertices.<br>
> <br>
> Is this a known shortcoming, or have we forgotten something to load<br>
> periodic DMs?<br>
> <br>
> Best regards,<br>
> <br>
> Berend.<br>
> <br>
> <br>
> <br>
> On 9/22/21 20:59, Hapla Vaclav wrote:<br>
>> To avoid confusions here, Berend seems to be specifically demanding XDMF <br>
>> (PETSC_VIEWER_HDF5_XDMF). The stuff we are now working on is parallel <br>
>> checkpointing in our own HDF5 format (PETSC_VIEWER_HDF5_PETSC), I will <br>
>> make a series of MRs on this topic in the following days.<br>
>> <br>
>> For XDMF, we are specifically missing the ability to write/load DMLabels <br>
>> properly. XDMF uses specific cell-local numbering for faces for <br>
>> specification of face sets, and face-local numbering for specification <br>
>> of edge sets, which is not great wrt DMPlex design. And ParaView doesn't <br>
>> show any of these properly so it's hard to debug. Matt, we should talk <br>
>> about this soon.<br>
>> <br>
>> Berend, for now, could you just load the mesh initially from XDMF and <br>
>> then use our PETSC_VIEWER_HDF5_PETSC format for subsequent saving/loading?<br>
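>> <br>
>> Something along these lines (only a sketch; the file names are <br>
>> illustrative and assume the XDMF mesh lives in "mesh.h5"):<br>
>> <br>
>> /* Load the mesh once from the XDMF-flavoured HDF5 file */<br>
>> DMPlexCreateFromFile(PETSC_COMM_WORLD, "mesh.h5", PETSC_TRUE, &dm);<br>
>> /* ... then checkpoint in the native format from here on */<br>
>> PetscViewerHDF5Open(PETSC_COMM_WORLD, "checkpoint.h5", FILE_MODE_WRITE, &viewer);<br>
>> PetscViewerPushFormat(viewer, PETSC_VIEWER_HDF5_PETSC);<br>
>> DMView(dm, viewer);<br>
>> PetscViewerPopFormat(viewer);<br>
>> PetscViewerDestroy(&viewer);<br>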
>> <br>
>> Thanks,<br>
>> <br>
>> Vaclav<br>
>> <br>
>>> On 17 Sep 2021, at 15:46, Lawrence Mitchell <<a href="mailto:wence@gmx.li">wence@gmx.li</a>> wrote:<br>
>>><br>
>>> Hi Berend,<br>
>>><br>
>>>> On 14 Sep 2021, at 12:23, Matthew Knepley <<a href="mailto:knepley@gmail.com">knepley@gmail.com</a>> wrote:<br>
>>>><br>
>>>> On Tue, Sep 14, 2021 at 5:15 AM Berend van Wachem <br>
>>>> <<a href="mailto:berend.vanwachem@ovgu.de">berend.vanwachem@ovgu.de</a>> wrote:<br>
>>>> Dear PETSc-team,<br>
>>>><br>
>>>> We are trying to save and load distributed DMPlex and its associated<br>
>>>> physical fields (created with DMCreateGlobalVector) (Uvelocity,<br>
>>>> VVelocity, ...) in HDF5_XDMF format. To achieve this, we do the <br>
>>>> following:<br>
>>>><br>
>>>> 1) save in the same xdmf.h5 file:<br>
>>>> DMView( DM , H5_XDMF_Viewer );<br>
>>>> VecView( UVelocity, H5_XDMF_Viewer );<br>
>>>><br>
>>>> 2) load the dm:<br>
>>>> DMPlexCreateFromFile(PETSC_COMM_WORLD, Filename, PETSC_TRUE, &dm);<br>
>>>><br>
>>>> 3) load the physical field:<br>
>>>> VecLoad( UVelocity, H5_XDMF_Viewer );<br>
>>>><br>
>>>> There are no errors in the execution, but the loaded DM is distributed<br>
>>>> differently to the original one, which results in the incorrect<br>
>>>> placement of the values of the physical fields (UVelocity etc.) in the<br>
>>>> domain.<br>
>>>><br>
>>>> This approach is used to restart the simulation with the last saved DM.<br>
>>>> Is there something we are missing, or do alternative routes to<br>
>>>> this goal exist? Can we somehow get the IS of the redistribution, so we can<br>
>>>> re-distribute the vector data as well?<br>
>>>><br>
>>>> Many thanks, best regards,<br>
>>>><br>
>>>> Hi Berend,<br>
>>>><br>
>>>> We are in the midst of rewriting this. We want to support saving <br>
>>>> multiple meshes, with fields attached to each,<br>
>>>> preserving the discretization (section) information, and allowing <br>
>>>> us to load on a different number of<br>
>>>> processes. We plan to be done by October. Vaclav and I are doing this <br>
>>>> in collaboration with Koki Sagiyama,<br>
>>>> David Ham, and Lawrence Mitchell from the Firedrake team.<br>
>>><br>
>>> The core load/save cycle functionality is now in PETSc main. So if <br>
>>> you're using main rather than a release, you can get access to it now. <br>
>>> This section of the manual shows an example of how to do <br>
>>> things: <a href="https://petsc.org/main/docs/manual/dmplex/#saving-and-loading-data-with-hdf5">https://petsc.org/main/docs/manual/dmplex/#saving-and-loading-data-with-hdf5</a><br>
>>><br>
>>> Let us know if things aren't clear!<br>
>>><br>
>>> Thanks,<br>
>>><br>
>>> Lawrence<br>
>> <br>
</div>
</span></font></div>
</body>
</html>