[petsc-users] Scaling of the Petsc Binary Viewer
Matthew Knepley
knepley at gmail.com
Wed Jul 7 16:45:52 CDT 2021
On Wed, Jul 7, 2021 at 3:49 PM Thibault Bridel-Bertomeu <
thibault.bridelbertomeu at gmail.com> wrote:
> Hi Dave,
>
> Thank you for your fast answer.
>
> To postprocess the files in python, I use the PetscBinaryIO package that
> is provided with PETSc, yes.
>
> I load the file like this :
>
>
> import numpy as np
> import meshio
> import PetscBinaryIO as pio
> import matplotlib as mpl
> import matplotlib.pyplot as plt
> import matplotlib.cm as cm
>
> mpl.use('Agg')
>
> restartname = "restart_00001001.bin"
> print("Reading {} ...".format(restartname))
> io = pio.PetscBinaryIO()
> fh = open(restartname)
> objecttype = io.readObjectType(fh)
> data = None
> if objecttype == 'Vec':
>     data = io.readVec(fh)
> print("Size of data = ", data.size)
> print("Size of a single variable (4 variables) = ", data.size / 4)
> assert(np.isclose(data.size / 4.0, np.floor(data.size / 4.0)))
>
> Then I load the mesh (it's from Gmsh so I use the meshio package) :
>
> meshname = "ForwardFacing.msh"
> print("Reading {} ...".format(meshname))
> mesh = meshio.read(meshname)
> print("Number of vertices = ", mesh.points.shape[0])
> print("Number of cells = ", mesh.cells_dict['quad'].shape[0])
>
> From the 'data' and the 'mesh' I use tricontourf from matplotlib to plot
> the figure.
>
> I removed the call to ...SetUseMPIIO... and it gives the same kind of data,
> yes (I attached a figure of the data obtained with the binary viewer
> without MPI I/O).
>
> Maybe it's just a connectivity issue? Maybe the way the Vec is written by
> the PETSc viewer somehow does not match the connectivity from the original
> Gmsh file but some other connectivity of the partitioned DMPlex?
>
Yes, when you distribute the mesh, it gets permuted so that each piece is
contiguous. This happens on all meshes (DMDA, DMStag, DMPlex, DMForest).
When it is written out, it just concatenates that ordering, or what we
usually call the "global order" since it is the order of a global vector.
> If so, is there a way to get the latter ?
>
If you call
https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/DM/DMSetUseNatural.html
before distribution, then a mapping back to the original ordering will be
saved. You can then use that mapping, via
https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/DMPLEX/DMPlexGlobalToNaturalBegin.html
to scatter a global vector into a vector in the original (natural) ordering.
However, you would also need to decide how you want that output written.
The ExodusII viewer uses this by default since the people who use it (Blaise)
generally want that. Most people using HDF5 (me) want the new order since it
is faster. Plex ex15 and ex26 show some manipulations using this mapping.
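To make that concrete, here is a rough Fortran sketch (names like dmDist, Xnat,
and migrationSF are just placeholders, and the exact setup of the natural
vector is best checked against those examples):

      ! Sketch only: assumes the usual PETSc Fortran includes/modules,
      ! with dm holding the serial mesh read from Gmsh.
      DM             :: dm, dmDist
      Vec            :: X, Xnat
      PetscSF        :: migrationSF
      PetscInt       :: overlap
      PetscErrorCode :: ierr

      ! Record the natural (pre-distribution) ordering before distributing
      call DMSetUseNatural(dm, PETSC_TRUE, ierr); CHKERRA(ierr)
      overlap = 0
      call DMPlexDistribute(dm, overlap, migrationSF, dmDist, ierr); CHKERRA(ierr)
      call PetscSFDestroy(migrationSF, ierr); CHKERRA(ierr)

      ! ... set up and solve on dmDist, with X a global Vec ...

      ! Scatter the global Vec back into the original (natural) ordering;
      ! here Xnat is assumed to share X's layout, see ex15/ex26 for details
      call VecDuplicate(X, Xnat, ierr); CHKERRA(ierr)
      call DMPlexGlobalToNaturalBegin(dmDist, X, Xnat, ierr); CHKERRA(ierr)
      call DMPlexGlobalToNaturalEnd(dmDist, X, Xnat, ierr); CHKERRA(ierr)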
> I know the binary viewer does not work on DMPlex,
>
You can output the Vec in native format, which will use the GlobalToNatural
reordering. It will not output the Plex,
but you will have the values in the order you expect.
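For instance, something along these lines (a sketch only, reusing the viewer
setup from your output routine; it assumes DMSetUseNatural was called before
distribution, otherwise there is no mapping for the native view to apply):

      call PetscViewerCreate(PETSC_COMM_WORLD, binViewer, ierr); CHKERRA(ierr)
      call PetscViewerSetType(binViewer, PETSCVIEWERBINARY, ierr); CHKERRA(ierr)
      call PetscViewerFileSetMode(binViewer, FILE_MODE_WRITE, ierr); CHKERRA(ierr)
      call PetscViewerFileSetName(binViewer, trim(filename), ierr); CHKERRA(ierr)
      ! Request the native format so VecView uses the GlobalToNatural ordering
      call PetscViewerPushFormat(binViewer, PETSC_VIEWER_NATIVE, ierr); CHKERRA(ierr)
      call VecView(X, binViewer, ierr); CHKERRA(ierr)
      call PetscViewerPopFormat(binViewer, ierr); CHKERRA(ierr)
      call PetscViewerDestroy(binViewer, ierr); CHKERRA(ierr)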
> the VTK viewer yields a corrupted dataset
>
VTK is not supported. We support ParaView through the Xdmf extension to HDF5.
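Roughly, the workflow is (a sketch, assuming PETSc was configured with HDF5;
the viewer and file names below are just placeholders):

      ! Name the Vec so the HDF5 dataset has a recognizable name
      call PetscObjectSetName(X, "solution", ierr); CHKERRA(ierr)
      call PetscViewerHDF5Open(PETSC_COMM_WORLD, "sol.h5", FILE_MODE_WRITE, h5Viewer, ierr); CHKERRA(ierr)
      call DMView(dm, h5Viewer, ierr); CHKERRA(ierr)
      call VecView(X, h5Viewer, ierr); CHKERRA(ierr)
      call PetscViewerDestroy(h5Viewer, ierr); CHKERRA(ierr)

Then running $PETSC_DIR/lib/petsc/bin/petsc_gen_xdmf.py on the .h5 file
produces an .xmf file that ParaView can open.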
> and I have issues with HDF5 viewer with MPI (see another recent thread of
> mine) ...
>
I have not been able to reproduce this yet.
Thanks,
Matt
> Thanks again for your help !!
>
> Thibault
>
> On Wed, Jul 7, 2021 at 8:54 PM Dave May <dave.mayhem23 at gmail.com> wrote:
>
>>
>>
>> On Wed 7. Jul 2021 at 20:41, Thibault Bridel-Bertomeu <
>> thibault.bridelbertomeu at gmail.com> wrote:
>>
>>> Dear all,
>>>
>>> I have been having issues with large Vecs (based on DMPlex) and massive
>>> MPI I/O ... it looks like the data written by the PETSc binary viewer
>>> is gibberish for large meshes split over a high number of processes.
>>> For instance, I am using a mesh that has around 50 million cells, split over
>>> 1024 processors.
>>> The computation seems to run fine, and the timestep computed from the data
>>> makes sense, so I think everything is fine internally. But when I look at
>>> the solution (one example attached) it is noise - at this point it should
>>> show a bow shock developing on the left, near the step.
>>> The piece of code I use for the output is below:
>>>
>>> call DMGetOutputSequenceNumber(dm, save_seqnum, save_seqval, ierr); CHKERRA(ierr)
>>> call DMSetOutputSequenceNumber(dm, -1, 0.d0, ierr); CHKERRA(ierr)
>>> write(filename,'(A,I8.8,A)') "restart_", stepnum, ".bin"
>>> call PetscViewerCreate(PETSC_COMM_WORLD, binViewer, ierr); CHKERRA(ierr)
>>> call PetscViewerSetType(binViewer, PETSCVIEWERBINARY, ierr); CHKERRA(ierr)
>>> call PetscViewerFileSetMode(binViewer, FILE_MODE_WRITE, ierr); CHKERRA(ierr)
>>> call PetscViewerBinarySetUseMPIIO(binViewer, PETSC_TRUE, ierr); CHKERRA(ierr)
>>>
>>>
>>
>> Do you get the correct output if you don’t call the function above (or,
>> equivalently, use PETSC_FALSE)?
>>
>>
>>> call PetscViewerFileSetName(binViewer, trim(filename), ierr); CHKERRA(ierr)
>>> call VecView(X, binViewer, ierr); CHKERRA(ierr)
>>> call PetscViewerDestroy(binViewer, ierr); CHKERRA(ierr)
>>> call DMSetOutputSequenceNumber(dm, save_seqnum, save_seqval, ierr); CHKERRA(ierr)
>>>
>>> I do not think there is anything wrong with it but of course I would be
>>> happy to hear your feedback.
>>> Nonetheless, my question is: how far have you tested the binary MPI I/O
>>> of a Vec? Does it make sense that, for a 50-million-cell mesh split over
>>> 1024 processes, it could somehow fail?
>>> Or is it my Python drawing method that is completely incapable of
>>> handling this dataset? (ParaView displays the same thing, though, so I'm not
>>> sure ...)
>>>
>>
>> Are you using the Python tools provided with PETSc to load the Vec from
>> file?
>>
>>
>> Thanks,
>> Dave
>>
>>
>>
>>> Thank you very much for your advice and help !!!
>>>
>>> Thibault
>>>
>>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/