<div dir="ltr"><div>Hi Matt, <br></div><div><br></div><div>Thank you for working that fast !</div><div>I pulled your branch and tested your solution but unfortunately it crashed (I did not change the piece of code I sent you before for the creation of the DM). I am sending you the full error listing in a private e-mail so you can see what happened exactly.</div><div><br></div><div>Thanks !!<br></div><div><br></div><div><div><div dir="ltr" class="gmail_signature" data-smartmail="gmail_signature"><div dir="ltr"><div><div dir="ltr"><div dir="ltr"><div><div><div><div><div>Thibault</div></div></div></div></div></div></div></div></div></div></div><br></div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">Le jeu. 8 juil. 2021 à 23:06, Matthew Knepley <<a href="mailto:knepley@gmail.com">knepley@gmail.com</a>> a écrit :<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div dir="ltr">On Thu, Jul 8, 2021 at 7:35 AM Thibault Bridel-Bertomeu <<a href="mailto:thibault.bridelbertomeu@gmail.com" target="_blank">thibault.bridelbertomeu@gmail.com</a>> wrote:<br></div><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div>Hi Matthew, <br></div><div><br></div><div>Thank you for your answer ! So I tried to add those steps, and I have the same behavior as the one described in this thread :</div><div><br></div><div><a href="https://lists.mcs.anl.gov/pipermail/petsc-dev/2015-July/017978.html" target="_blank">https://lists.mcs.anl.gov/pipermail/petsc-dev/2015-July/017978.html</a></div><div><br></div><div><i>[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------<br>[0]PETSC ERROR: Object is in wrong state<br>[0]PETSC ERROR: DM global to natural SF was not created.<br>You must call DMSetUseNatural() before DMPlexDistribute().<br><br>[0]PETSC ERROR: See <a href="https://www.mcs.anl.gov/petsc/documentation/faq.html" target="_blank">https://www.mcs.anl.gov/petsc/documentation/faq.html</a> for trouble shooting.<br>[0]PETSC ERROR: Petsc Development GIT revision: v3.14.4-671-g707297fd510 GIT Date: 2021-02-24 22:50:05 +0000<br>[0]PETSC ERROR: /ccc/work/cont001/ocre/bridelbert/EULERIAN2D/bin/eulerian2D on a named inti1401 by bridelbert Thu Jul 8 07:50:24 2021<br>[0]PETSC ERROR: Configure options --with-clean=1 --prefix=/ccc/work/cont001/ocre/bridelbert/04-PETSC/build_uns3D_inti --with-make-np=8 --with-windows-graphics=0 --with-debugging=1 --download-mpich-shared=0 --with-x=0 --with-pthread=0 --with-valgrind=0 --PETSC_ARCH=INTI_UNS3D --with-fc=/ccc/products/openmpi-2.0.4/gcc--8.3.0/default/bin/mpifort --with-cc=/ccc/products/openmpi-2.0.4/gcc--8.3.0/default/bin/mpicc --with-cxx=/ccc/products/openmpi-2.0.4/gcc--8.3.0/default/bin/mpicxx --with-openmp=0 --download-sowing=/ccc/work/cont001/ocre/bridelbert/v1.1.26-p1.tar.gz --download-metis=/ccc/work/cont001/ocre/bridelbert/git.metis.tar.gz --download-parmetis=/ccc/work/cont001/ocre/bridelbert/git.parmetis.tar.gz --download-fblaslapack=/ccc/work/cont001/ocre/bridelbert/git.fblaslapack.tar.gz --with-cmake-dir=/ccc/products/cmake-3.13.3/system/default<br>[0]PETSC ERROR: #1 DMPlexGlobalToNaturalBegin() line 247 in /ccc/work/cont001/ocre/bridelbert/04-PETSC/src/dm/impls/plex/plexnatural.c<br>[0]PETSC ERROR: #2 User provided function() line 0 in User file</i></div><div><br></div><div>The creation of my 
<div><br></div><div><i> ! Read mesh from file name 'meshname'<br> call DMPlexCreateFromFile(PETSC_COMM_WORLD, meshname, PETSC_TRUE, dm, ierr); CHKERRA(ierr)<br><br> ! Keep track of the natural (pre-distribution) ordering<br> call DMSetUseNatural(dm, PETSC_TRUE, ierr) ; CHKERRA(ierr)<br> ! FVM adjacency: use cone, not closure<br> call DMSetBasicAdjacency(dm, PETSC_TRUE, PETSC_FALSE, ierr) ; CHKERRA(ierr)<br><br> ! Distribute on processors<br> call DMPlexDistribute(dm, overlap, PETSC_NULL_SF, dmDist, ierr) ; CHKERRA(ierr)<br><br> ! Security check<br> if (dmDist /= PETSC_NULL_DM) then<br> ! Destroy previous dm<br> call DMDestroy(dm, ierr) ; CHKERRA(ierr)<br> ! Replace with dmDist<br> dm = dmDist<br> end if<br><br> ! Finalize setup of the object<br> call DMSetFromOptions(dm, ierr) ; CHKERRA(ierr)<br><br> ! Boundary condition with ghost cells<br> call DMPlexConstructGhostCells(dm, PETSC_NULL_CHARACTER, PETSC_NULL_INTEGER, dmGhost, ierr); CHKERRA(ierr)<br><br> ! Security check<br> if (dmGhost /= PETSC_NULL_DM) then<br> ! Destroy previous dm<br> call DMDestroy(dm, ierr) ; CHKERRA(ierr)<br> ! Replace with dmGhost<br> dm = dmGhost<br> end if</i><br></div><div><br></div><div>And I write my vector as follows:</div><div><br></div><div><i> call DMCreateGlobalVector(dm, Xnat, ierr); CHKERRA(ierr)<br> call PetscObjectSetName(Xnat, "NaturalSolution", ierr); CHKERRA(ierr)<br> call DMPlexGlobalToNaturalBegin(dm, X, Xnat, ierr); CHKERRA(ierr)<br> call DMPlexGlobalToNaturalEnd(dm, X, Xnat, ierr); CHKERRA(ierr)<br> call DMGetOutputSequenceNumber(dm, save_seqnum, save_seqval, ierr); CHKERRA(ierr)<br> call DMSetOutputSequenceNumber(dm, -1, 0.d0, ierr); CHKERRA(ierr)<br> write(filename,'(A,I8.8,A)') "restart_", stepnum, ".bin"<br> call PetscViewerCreate(PETSC_COMM_WORLD, binViewer, ierr); CHKERRA(ierr)<br> call PetscViewerSetType(binViewer, PETSCVIEWERBINARY, ierr); CHKERRA(ierr)<br> call PetscViewerFileSetMode(binViewer, FILE_MODE_WRITE, ierr); CHKERRA(ierr)<br> call PetscViewerBinarySetUseMPIIO(binViewer, PETSC_TRUE, ierr); CHKERRA(ierr)<br> call PetscViewerFileSetName(binViewer, trim(filename), ierr); CHKERRA(ierr)<br> ! call VecView(X, binViewer, ierr); CHKERRA(ierr)<br> call VecView(Xnat, binViewer, ierr); CHKERRA(ierr)<br> call PetscViewerDestroy(binViewer, ierr); CHKERRA(ierr)<br> call DMSetOutputSequenceNumber(dm, save_seqnum, save_seqval, ierr); CHKERRA(ierr)</i></div><div><br></div><div>Did you find the time to fix the bug you mention in the thread above, regarding passing the natural property along when DMPlexConstructGhostCells is called?</div></blockquote><div><br></div><div>No, thanks for reminding me. I coded up what I think is a fix, but I do not have a test yet. Maybe you can help me make one. Here is the branch:</div><div><br></div><div> <a href="https://gitlab.com/petsc/petsc/-/commits/knepley/fix-plex-natural-fvghost">https://gitlab.com/petsc/petsc/-/commits/knepley/fix-plex-natural-fvghost</a></div><div><br></div><div>Can you run it and see if it fixes that error? Then we can figure out the best way to do your output.</div><div><br></div><div> Thanks,</div><div><br></div><div> Matt</div>
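<div><br></div><div>For reference, a minimal test of such a fix could simply exercise the same call sequence as above on a small mesh and check that DMPlexGlobalToNaturalBegin()/End() run cleanly after DMPlexConstructGhostCells(). The sketch below is untested and only illustrative: the program name, the mesh file 'box.msh', the overlap value, and the elided discretization setup are all placeholders.</div><div><br></div><div><i> program test_natural_fvghost<br>#include "petsc/finclude/petscvec.h"<br>#include "petsc/finclude/petscdmplex.h"<br> use petscvec<br> use petscdmplex<br> implicit none<br> DM dm, dmDist, dmGhost<br> Vec X, Xnat<br> PetscInt overlap<br> PetscErrorCode ierr<br><br> call PetscInitialize(PETSC_NULL_CHARACTER, ierr); CHKERRA(ierr)<br> overlap = 1<br> ! 'box.msh' stands for any small Gmsh mesh<br> call DMPlexCreateFromFile(PETSC_COMM_WORLD, 'box.msh', PETSC_TRUE, dm, ierr); CHKERRA(ierr)<br> ! Natural ordering must be requested before distribution<br> call DMSetUseNatural(dm, PETSC_TRUE, ierr); CHKERRA(ierr)<br> call DMPlexDistribute(dm, overlap, PETSC_NULL_SF, dmDist, ierr); CHKERRA(ierr)<br> if (dmDist /= PETSC_NULL_DM) then<br>   call DMDestroy(dm, ierr); CHKERRA(ierr)<br>   dm = dmDist<br> end if<br> ! The behavior under test: the ghosted DM should keep the global-to-natural mapping<br> call DMPlexConstructGhostCells(dm, PETSC_NULL_CHARACTER, PETSC_NULL_INTEGER, dmGhost, ierr); CHKERRA(ierr)<br> if (dmGhost /= PETSC_NULL_DM) then<br>   call DMDestroy(dm, ierr); CHKERRA(ierr)<br>   dm = dmGhost<br> end if<br> ! ... set up the FV discretization (fields / section) on dm here, as in the application ...<br> call DMCreateGlobalVector(dm, X, ierr); CHKERRA(ierr)<br> call VecDuplicate(X, Xnat, ierr); CHKERRA(ierr)<br> call VecSet(X, 1.d0, ierr); CHKERRA(ierr)<br> ! This is the call that previously aborted with "DM global to natural SF was not created"<br> call DMPlexGlobalToNaturalBegin(dm, X, Xnat, ierr); CHKERRA(ierr)<br> call DMPlexGlobalToNaturalEnd(dm, X, Xnat, ierr); CHKERRA(ierr)<br> call VecDestroy(Xnat, ierr); CHKERRA(ierr)<br> call VecDestroy(X, ierr); CHKERRA(ierr)<br> call DMDestroy(dm, ierr); CHKERRA(ierr)<br> call PetscFinalize(ierr)<br> end program test_natural_fvghost</i></div>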
<div> </div><blockquote class="gmail_quote"><div>Thanks!!<br></div><div><br></div><div class="gmail_signature"><div>Thibault</div></div><br><div class="gmail_quote"><div class="gmail_attr">On Wed, Jul 7, 2021 at 11:46 PM Matthew Knepley <<a href="mailto:knepley@gmail.com">knepley@gmail.com</a>> wrote:<br></div><blockquote class="gmail_quote"><div class="gmail_attr">On Wed, Jul 7, 2021 at 3:49 PM Thibault Bridel-Bertomeu <<a href="mailto:thibault.bridelbertomeu@gmail.com">thibault.bridelbertomeu@gmail.com</a>> wrote:<br></div><div class="gmail_quote"><blockquote class="gmail_quote"><div>Hi Dave,</div><div><br></div><div>Thank you for your fast answer.</div><div><br></div><div>To postprocess the files in Python, I use the PetscBinaryIO package that is provided with PETSc, yes.</div><div><br></div><div>I load the file like this:</div><div><br></div><div><i>import numpy as np<br>import meshio<br>import PetscBinaryIO as pio<br>import matplotlib as mpl<br>import matplotlib.pyplot as plt<br>import matplotlib.cm as cm<br><br>mpl.use('Agg')</i><br></div><div><br></div><div><i>restartname = "restart_00001001.bin"<br>print("Reading {} ...".format(restartname))<br>io = pio.PetscBinaryIO()<br>fh = open(restartname, 'rb')<br>objecttype = io.readObjectType(fh)<br>data = None<br>if objecttype == 'Vec':<br> data = io.readVec(fh)<br>print("Size of data = ", data.size)<br>print("Size of a single variable (4 variables) = ", data.size / 4)<br>assert(np.isclose(data.size / 4.0, np.floor(data.size / 4.0)))</i><br></div><div><br></div><div>Then I load the mesh (it is from Gmsh, so I use the meshio package):</div><div><br></div><div><i>meshname = "ForwardFacing.msh"<br>print("Reading {} ...".format(meshname))<br>mesh = meshio.read(meshname)<br>print("Number of vertices = ", mesh.points.shape[0])<br>print("Number of cells = ", mesh.cells_dict['quad'].shape[0])</i><br></div><div><br></div><div>From the 'data' and the 'mesh' I use tricontourf from matplotlib to plot the figure.</div><div><br></div><div>I removed the call to ...SetUseMPIIO... and it gives the same kind of data, yes (I attached a figure of the data obtained with the binary viewer without MPI I/O).</div><div><br></div><div>Maybe it's just a connectivity issue? Maybe the way the Vec is written by the PETSc viewer somehow does not match the connectivity from the original Gmsh file, but some other connectivity of the partitioned DMPlex?</div></blockquote><div><br></div><div>Yes, when you distribute the mesh, it gets permuted so that each piece is contiguous. This happens on all meshes (DMDA, DMStag, DMPlex, DMForest). When it is written out, it just concatenates that ordering, or what we usually call the "global order", since it is the order of a global vector.</div>
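<div><br></div><div>Concretely, each rank owns one contiguous block of rows of the global vector, and the binary file is just those blocks written back to back in rank order. A small sketch that prints each rank's block (it assumes X is the global solution vector from the output code above and ierr the usual error flag):</div><div><br></div><div><i> PetscInt rstart, rend<br> PetscMPIInt rank<br><br> call MPI_Comm_rank(PETSC_COMM_WORLD, rank, ierr); CHKERRA(ierr)<br> ! Each rank owns the contiguous block [rstart, rend) of the distributed ("global") order;<br> ! VecView writes these blocks one after the other in rank order.<br> call VecGetOwnershipRange(X, rstart, rend, ierr); CHKERRA(ierr)<br> write(*,'(A,I6,A,I12,A,I12)') 'rank ', rank, ' owns entries ', rstart, ' to ', rend-1</i></div>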
<div> </div><blockquote class="gmail_quote"><div>If so, is there a way to get the latter?</div></blockquote><div><br></div><div>If you call</div><div><br></div><div> <a href="https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/DM/DMSetUseNatural.html">https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/DM/DMSetUseNatural.html</a></div><div><br></div><div>before distribution, then a mapping back to the original ordering will be saved. You can use that mapping with a global vector and an original vector</div><div><br></div><div> <a href="https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/DMPLEX/DMPlexGlobalToNaturalBegin.html">https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/DMPLEX/DMPlexGlobalToNaturalBegin.html</a></div><div><br></div><div>to get a vector in the original ordering. However, you would also need to decide how you want that output. The ExodusII viewer uses this by default, since the people who use it (Blaise) generally want that. Most people using HDF5 (me) want the new order, since it is faster. Plex ex15 and ex26 show some manipulations using this mapping.</div><div> </div><blockquote class="gmail_quote"><div>I know the binary viewer does not work on DMPlex,</div></blockquote><div><br></div><div>You can output the Vec in native format, which will use the GlobalToNatural reordering. It will not output the Plex, but you will have the values in the order you expect.</div>
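<div><br></div><div>In the Fortran output routine quoted earlier, that would amount to pushing the native format onto the binary viewer before the VecView of the global vector X, instead of building Xnat explicitly. An untested sketch (it reuses binViewer and filename from that code, and assumes DMSetUseNatural() was called before distribution):</div><div><br></div><div><i> call PetscViewerCreate(PETSC_COMM_WORLD, binViewer, ierr); CHKERRA(ierr)<br> call PetscViewerSetType(binViewer, PETSCVIEWERBINARY, ierr); CHKERRA(ierr)<br> call PetscViewerFileSetMode(binViewer, FILE_MODE_WRITE, ierr); CHKERRA(ierr)<br> call PetscViewerFileSetName(binViewer, trim(filename), ierr); CHKERRA(ierr)<br> ! PETSC_VIEWER_NATIVE asks VecView to write the Vec through the DM's GlobalToNatural mapping<br> call PetscViewerPushFormat(binViewer, PETSC_VIEWER_NATIVE, ierr); CHKERRA(ierr)<br> call VecView(X, binViewer, ierr); CHKERRA(ierr)<br> call PetscViewerPopFormat(binViewer, ierr); CHKERRA(ierr)<br> call PetscViewerDestroy(binViewer, ierr); CHKERRA(ierr)</i></div>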
<div> </div><blockquote class="gmail_quote"><div>the VTK viewer yields a corrupted dataset</div></blockquote><div><br></div><div>VTK is not supported. We support ParaView through the Xdmf extension to HDF5.</div><div> </div><blockquote class="gmail_quote"><div>and I have issues with the HDF5 viewer with MPI (see another recent thread of mine) ...</div></blockquote><div><br></div><div>I have not been able to reproduce this yet.</div><div><br></div><div> Thanks,</div><div><br></div><div> Matt</div><div> </div><blockquote class="gmail_quote"><div>Thanks again for your help!!</div><div><br></div><div class="gmail_signature"><div>Thibault</div></div><br><div class="gmail_quote"><div class="gmail_attr">On Wed, Jul 7, 2021 at 8:54 PM Dave May <<a href="mailto:dave.mayhem23@gmail.com">dave.mayhem23@gmail.com</a>> wrote:<br></div><blockquote class="gmail_quote"><div class="gmail_quote"><div class="gmail_attr">On Wed 7. Jul 2021 at 20:41, Thibault Bridel-Bertomeu <<a href="mailto:thibault.bridelbertomeu@gmail.com">thibault.bridelbertomeu@gmail.com</a>> wrote:<br></div><blockquote class="gmail_quote"><div>Dear all,</div><div><br></div><div>I have been having issues with large Vecs (based on DMPlex) and massive MPI I/O ... it looks like the data that is written by the PETSc binary viewer is gibberish for large meshes split across a high number of processes. For instance, I am using a mesh that has around 50 million cells, split across 1024 processors.</div><div>The computation seems to run fine, and the timestep computed from the data makes sense, so I think internally everything is fine. But when I look at the solution (one example attached) it's noise - at this point it should show a bow shock developing on the left near the step.</div><div>The piece of code I use for the output is below:</div><div><br></div><div> call DMGetOutputSequenceNumber(dm, save_seqnum, save_seqval, ierr); CHKERRA(ierr)<br> call DMSetOutputSequenceNumber(dm, -1, 0.d0, ierr); CHKERRA(ierr)<br> write(filename,'(A,I8.8,A)') "restart_", stepnum, ".bin"<br> call PetscViewerCreate(PETSC_COMM_WORLD, binViewer, ierr); CHKERRA(ierr)<br> call PetscViewerSetType(binViewer, PETSCVIEWERBINARY, ierr); CHKERRA(ierr)<br> call PetscViewerFileSetMode(binViewer, FILE_MODE_WRITE, ierr); CHKERRA(ierr)<br> call PetscViewerBinarySetUseMPIIO(binViewer, PETSC_TRUE, ierr); CHKERRA(ierr)<br> </div></blockquote><div dir="auto"><br></div><div dir="auto">Do you get the correct output if you don’t call the function above (or, equivalently, use PETSC_FALSE)?</div><div dir="auto"><br></div><blockquote class="gmail_quote"><div>call PetscViewerFileSetName(binViewer, trim(filename), ierr); CHKERRA(ierr)<br> call VecView(X, binViewer, ierr); CHKERRA(ierr)<br> call PetscViewerDestroy(binViewer, ierr); CHKERRA(ierr)<br> call DMSetOutputSequenceNumber(dm, save_seqnum, save_seqval, ierr); CHKERRA(ierr)<br></div><div><br></div><div>I do not think there is anything wrong with it, but of course I would be happy to hear your feedback.</div><div>Nonetheless, my question was: how far have you tested the binary MPI I/O of a Vec? Does it make sense that, for a 50 million cell mesh split across 1024 processes, it could somehow fail?</div><div>Or is it my Python plotting method that is completely incapable of handling this dataset? (ParaView displays the same thing, though, so I'm not sure ...)</div></blockquote><div dir="auto"><br></div><div dir="auto">Are you using the Python tools provided with PETSc to load the Vec from the file?</div><div dir="auto"><br></div><div dir="auto">Thanks,</div><div dir="auto">Dave</div><div dir="auto"><br></div><blockquote class="gmail_quote"><div><br></div><div>Thank you very much for your advice and help!!!</div><div><br></div><div class="gmail_signature"><div>Thibault</div></div></blockquote>
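<div><br></div><div>One way to check whether the file itself is intact, independently of the Python reader, is to read it back into a second Vec right after writing it and compare. A rough, untested sketch (it reuses dm, X, filename, binViewer and ierr from the code above; the variables Y and nrm are new):</div><div><br></div><div><i> Vec Y<br> PetscReal nrm<br><br> ! Read the restart file back and compare with the vector that was just written<br> call DMCreateGlobalVector(dm, Y, ierr); CHKERRA(ierr)<br> call PetscViewerBinaryOpen(PETSC_COMM_WORLD, trim(filename), FILE_MODE_READ, binViewer, ierr); CHKERRA(ierr)<br> call VecLoad(Y, binViewer, ierr); CHKERRA(ierr)<br> call PetscViewerDestroy(binViewer, ierr); CHKERRA(ierr)<br> ! Y should match X up to round-off if the write/read path is sound<br> call VecAXPY(Y, -1.d0, X, ierr); CHKERRA(ierr)<br> call VecNorm(Y, NORM_INFINITY, nrm, ierr); CHKERRA(ierr)<br> if (nrm > 1.d-12) then<br>   print *, 'Binary write/read mismatch, max difference = ', nrm<br> end if<br> call VecDestroy(Y, ierr); CHKERRA(ierr)</i></div>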
</div></blockquote>
</div></blockquote>
</blockquote></div><br clear="all"><div><br></div>-- <br><div dir="ltr"><div dir="ltr"><div><div dir="ltr"><div><div dir="ltr"><div>What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>-- Norbert Wiener</div><div><br></div><div><a href="http://www.cse.buffalo.edu/~knepley/" target="_blank">https://www.cse.buffalo.edu/~knepley/</a><br></div></div></div></div></div></div></div></div>
</blockquote></div>
</blockquote></div><br clear="all"><div><br></div>-- <br><div dir="ltr"><div dir="ltr"><div><div dir="ltr"><div><div dir="ltr"><div>What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>-- Norbert Wiener</div><div><br></div><div><a href="http://www.cse.buffalo.edu/~knepley/" target="_blank">https://www.cse.buffalo.edu/~knepley/</a><br></div></div></div></div></div></div></div></div>
</blockquote></div>