[petsc-users] Scaling of the Petsc Binary Viewer
Thibault Bridel-Bertomeu
thibault.bridelbertomeu at gmail.com
Thu Jul 8 06:34:48 CDT 2021
Hi Matthew,
Thank you for your answer! So I tried to add those steps, and I get the
same behavior as the one described in this thread:
https://lists.mcs.anl.gov/pipermail/petsc-dev/2015-July/017978.html
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Object is in wrong state
[0]PETSC ERROR: DM global to natural SF was not created. You must call DMSetUseNatural() before DMPlexDistribute().
[0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Development GIT revision: v3.14.4-671-g707297fd510  GIT Date: 2021-02-24 22:50:05 +0000
[0]PETSC ERROR: /ccc/work/cont001/ocre/bridelbert/EULERIAN2D/bin/eulerian2D on a named inti1401 by bridelbert Thu Jul 8 07:50:24 2021
[0]PETSC ERROR: Configure options --with-clean=1 --prefix=/ccc/work/cont001/ocre/bridelbert/04-PETSC/build_uns3D_inti --with-make-np=8 --with-windows-graphics=0 --with-debugging=1 --download-mpich-shared=0 --with-x=0 --with-pthread=0 --with-valgrind=0 --PETSC_ARCH=INTI_UNS3D --with-fc=/ccc/products/openmpi-2.0.4/gcc--8.3.0/default/bin/mpifort --with-cc=/ccc/products/openmpi-2.0.4/gcc--8.3.0/default/bin/mpicc --with-cxx=/ccc/products/openmpi-2.0.4/gcc--8.3.0/default/bin/mpicxx --with-openmp=0 --download-sowing=/ccc/work/cont001/ocre/bridelbert/v1.1.26-p1.tar.gz --download-metis=/ccc/work/cont001/ocre/bridelbert/git.metis.tar.gz --download-parmetis=/ccc/work/cont001/ocre/bridelbert/git.parmetis.tar.gz --download-fblaslapack=/ccc/work/cont001/ocre/bridelbert/git.fblaslapack.tar.gz --with-cmake-dir=/ccc/products/cmake-3.13.3/system/default
[0]PETSC ERROR: #1 DMPlexGlobalToNaturalBegin() line 247 in /ccc/work/cont001/ocre/bridelbert/04-PETSC/src/dm/impls/plex/plexnatural.c
[0]PETSC ERROR: #2 User provided function() line 0 in User file
The creation of my DM is as follows:
    ! Read mesh from file name 'meshname'
    call DMPlexCreateFromFile(PETSC_COMM_WORLD, meshname, PETSC_TRUE, dm, ierr); CHKERRA(ierr)
    ! Save the natural (file) ordering before distribution
    call DMSetUseNatural(dm, PETSC_TRUE, ierr); CHKERRA(ierr)
    ! Start with connectivity
    call DMSetBasicAdjacency(dm, PETSC_TRUE, PETSC_FALSE, ierr); CHKERRA(ierr)
    ! Distribute on processors
    call DMPlexDistribute(dm, overlap, PETSC_NULL_SF, dmDist, ierr); CHKERRA(ierr)
    ! Security check
    if (dmDist /= PETSC_NULL_DM) then
       ! Destroy previous dm
       call DMDestroy(dm, ierr); CHKERRA(ierr)
       ! Replace with dmDist
       dm = dmDist
    end if
    ! Finalize setup of the object
    call DMSetFromOptions(dm, ierr); CHKERRA(ierr)
    ! Boundary condition with ghost cells
    call DMPlexConstructGhostCells(dm, PETSC_NULL_CHARACTER, PETSC_NULL_INTEGER, dmGhost, ierr); CHKERRA(ierr)
    ! Security check
    if (dmGhost /= PETSC_NULL_DM) then
       ! Destroy previous dm
       call DMDestroy(dm, ierr); CHKERRA(ierr)
       ! Replace with dmGhost
       dm = dmGhost
    end if
And I write my vector as follows:
    call DMCreateGlobalVector(dm, Xnat, ierr); CHKERRA(ierr)
    call PetscObjectSetName(Xnat, "NaturalSolution", ierr); CHKERRA(ierr)
    call DMPlexGlobalToNaturalBegin(dm, X, Xnat, ierr); CHKERRA(ierr)
    call DMPlexGlobalToNaturalEnd(dm, X, Xnat, ierr); CHKERRA(ierr)
    call DMGetOutputSequenceNumber(dm, save_seqnum, save_seqval, ierr); CHKERRA(ierr)
    call DMSetOutputSequenceNumber(dm, -1, 0.d0, ierr); CHKERRA(ierr)
    write(filename,'(A,I8.8,A)') "restart_", stepnum, ".bin"
    call PetscViewerCreate(PETSC_COMM_WORLD, binViewer, ierr); CHKERRA(ierr)
    call PetscViewerSetType(binViewer, PETSCVIEWERBINARY, ierr); CHKERRA(ierr)
    call PetscViewerFileSetMode(binViewer, FILE_MODE_WRITE, ierr); CHKERRA(ierr)
    call PetscViewerBinarySetUseMPIIO(binViewer, PETSC_TRUE, ierr); CHKERRA(ierr)
    call PetscViewerFileSetName(binViewer, trim(filename), ierr); CHKERRA(ierr)
    ! call VecView(X, binViewer, ierr); CHKERRA(ierr)
    call VecView(Xnat, binViewer, ierr); CHKERRA(ierr)
    call PetscViewerDestroy(binViewer, ierr); CHKERRA(ierr)
    call DMSetOutputSequenceNumber(dm, save_seqnum, save_seqval, ierr); CHKERRA(ierr)
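For reference, I assume the reverse operation when reading such a restart back would look roughly like the sketch below (untested; FILE_MODE_READ, VecLoad and DMPlexNaturalToGlobalBegin/End are the calls I would expect to need, with the same placeholder names as above):

    call PetscViewerCreate(PETSC_COMM_WORLD, binViewer, ierr); CHKERRA(ierr)
    call PetscViewerSetType(binViewer, PETSCVIEWERBINARY, ierr); CHKERRA(ierr)
    call PetscViewerFileSetMode(binViewer, FILE_MODE_READ, ierr); CHKERRA(ierr)
    call PetscViewerFileSetName(binViewer, trim(filename), ierr); CHKERRA(ierr)
    ! read the vector stored in the natural (file) ordering
    call VecLoad(Xnat, binViewer, ierr); CHKERRA(ierr)
    call PetscViewerDestroy(binViewer, ierr); CHKERRA(ierr)
    ! map it back onto the distributed (global) ordering used by the solver
    call DMPlexNaturalToGlobalBegin(dm, Xnat, X, ierr); CHKERRA(ierr)
    call DMPlexNaturalToGlobalEnd(dm, Xnat, X, ierr); CHKERRA(ierr)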
Did you find the time to fix the bug you mention in the thread above regarding the passing of the natural property when calling DMPlexConstructGhostCells?
Thanks!!
Thibault
On Wed, Jul 7, 2021 at 11:46 PM, Matthew Knepley <knepley at gmail.com> wrote:
> On Wed, Jul 7, 2021 at 3:49 PM Thibault Bridel-Bertomeu <
> thibault.bridelbertomeu at gmail.com> wrote:
>
>> Hi Dave,
>>
>> Thank you for your fast answer.
>>
>> To postprocess the files in python, I use the PetscBinaryIO package that
>> is provided with PETSc, yes.
>>
>> I load the file like this:
>>
>> import numpy as np
>> import meshio
>> import PetscBinaryIO as pio
>> import matplotlib as mpl
>> import matplotlib.pyplot as plt
>> import matplotlib.cm as cm
>> mpl.use('Agg')
>>
>> restartname = "restart_00001001.bin"
>> print("Reading {} ...".format(restartname))
>> io = pio.PetscBinaryIO()
>> fh = open(restartname)
>> objecttype = io.readObjectType(fh)
>> data = None
>> if objecttype == 'Vec':
>>     data = io.readVec(fh)
>> print("Size of data = ", data.size)
>> print("Size of a single variable (4 variables) = ", data.size / 4)
>> assert(np.isclose(data.size / 4.0, np.floor(data.size / 4.0)))
>>
>> Then I load the mesh (it's from Gmsh so I use the meshio package):
>>
>> meshname = "ForwardFacing.msh"
>> print("Reading {} ...".format(meshname))
>> mesh = meshio.read(meshname)
>> print("Number of vertices = ", mesh.points.shape[0])
>> print("Number of cells = ", mesh.cells_dict['quad'].shape[0])
>>
>> From the 'data' and the 'mesh' I use tricontourf from matplotlib to plot
>> the figure.
>>
>> I removed the call to ...SetUseMPIIO... and it gives the same kind of
>> data, yes (I attached a figure of the data obtained with the binary viewer
>> without MPI I/O).
>>
>> Maybe it's just a connectivity issue? Maybe the way the Vec is written
>> by the PETSc viewer somehow does not match the connectivity from the
>> original Gmsh file but some other connectivity of the partitioned DMPlex?
>>
>
> Yes, when you distribute the mesh, it gets permuted so that each piece is
> contiguous. This happens on all meshes (DMDA, DMStag, DMPlex, DMForest).
> When it is written out, it just concatenates that ordering, or what we
> usually call the "global order" since it is the order of a global vector.
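> For example, one way to see that concatenation (just a sketch; rstart and
> rend are placeholders, and VecGetOwnershipRange is the standard call for
> this) is to print each rank's contiguous block of the global vector:
>
>   call VecGetOwnershipRange(X, rstart, rend, ierr); CHKERRA(ierr)
>   ! this rank owns entries rstart..rend-1 of the concatenated "global order"
>   print *, 'this rank owns entries ', rstart, ' to ', rend-1
>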
>
>
>> If so, is there a way to get the latter?
>>
>
> If you call
>
>
> https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/DM/DMSetUseNatural.html
>
> before distribution, then a mapping back to the original ordering will be
> saved. You can use
> that mapping with a global vector and an original vector
>
>
> https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/DMPLEX/DMPlexGlobalToNaturalBegin.html
>
> to get a vector in the original ordering. However, you would also need to
> understand how you want that output.
> The ExodusII viewer uses this by default since people who use it (Blaise)
> generally want that. Most people using
> HDF5 (me) want the new order since it is faster. Plex ex15 and ex26 show
> some manipulations using this mapping.
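> A minimal sketch of that sequence in Fortran (names like dm, dmDist, X,
> Xnat, binViewer and overlap are placeholders taken from your own code)
> could look like:
>
>   ! before distribution, ask Plex to keep the natural (file) ordering
>   call DMSetUseNatural(dm, PETSC_TRUE, ierr); CHKERRA(ierr)
>   call DMPlexDistribute(dm, overlap, PETSC_NULL_SF, dmDist, ierr); CHKERRA(ierr)
>   ...
>   ! map the distributed ("global") vector back to the natural ordering before output
>   call DMCreateGlobalVector(dm, Xnat, ierr); CHKERRA(ierr)
>   call DMPlexGlobalToNaturalBegin(dm, X, Xnat, ierr); CHKERRA(ierr)
>   call DMPlexGlobalToNaturalEnd(dm, X, Xnat, ierr); CHKERRA(ierr)
>   call VecView(Xnat, binViewer, ierr); CHKERRA(ierr)
>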
>
>
>> I know the binary viewer does not work on DMPlex,
>>
>
> You can output the Vec in native format, which will use the
> GlobalToNatural reordering. It will not output the Plex,
> but you will have the values in the order you expect.
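> For instance (a sketch, assuming the DM was distributed with DMSetUseNatural
> set beforehand), pushing the native format on the binary viewer before the
> VecView should trigger that reordering without creating Xnat explicitly:
>
>   call PetscViewerPushFormat(binViewer, PETSC_VIEWER_NATIVE, ierr); CHKERRA(ierr)
>   call VecView(X, binViewer, ierr); CHKERRA(ierr)
>   call PetscViewerPopFormat(binViewer, ierr); CHKERRA(ierr)
>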
>
>
>> the VTK viewer yields a corrupted dataset
>>
>
> VTK is not supported. We support ParaView through the Xdmf extension to
> HDF5.
>
>
>> and I have issues with the HDF5 viewer with MPI (see another recent thread of
>> mine) ...
>>
>
> I have not been able to reproduce this yet.
>
> Thanks,
>
> Matt
>
>
>> Thanks again for your help!!
>>
>> Thibault
>>
>> On Wed, Jul 7, 2021 at 8:54 PM, Dave May <dave.mayhem23 at gmail.com> wrote:
>>
>>>
>>>
>>> On Wed 7. Jul 2021 at 20:41, Thibault Bridel-Bertomeu <
>>> thibault.bridelbertomeu at gmail.com> wrote:
>>>
>>>> Dear all,
>>>>
>>>> I have been having issues with large Vec (based on DMPLex) and massive
>>>> MPI I/O ... it looks like the data that is written by the Petsc Binary
>>>> Viewer is gibberish for large meshes split on a high number of processes.
>>>> For instance, I am using a mesh that has around 50 million cells, split on
>>>> 1024 processors.
>>>> The computation seems to run fine, the timestep computed from the data
>>>> makes sense so I think internally everything is fine. But when I look at
>>>> the solution (one example attached) it's noise - at this point it should
>>>> show a bow shock developing on the left near the step.
>>>> The piece of code I use for the output is below:
>>>>
>>>> call DMGetOutputSequenceNumber(dm, save_seqnum, save_seqval, ierr); CHKERRA(ierr)
>>>> call DMSetOutputSequenceNumber(dm, -1, 0.d0, ierr); CHKERRA(ierr)
>>>> write(filename,'(A,I8.8,A)') "restart_", stepnum, ".bin"
>>>> call PetscViewerCreate(PETSC_COMM_WORLD, binViewer, ierr); CHKERRA(ierr)
>>>> call PetscViewerSetType(binViewer, PETSCVIEWERBINARY, ierr); CHKERRA(ierr)
>>>> call PetscViewerFileSetMode(binViewer, FILE_MODE_WRITE, ierr); CHKERRA(ierr)
>>>> call PetscViewerBinarySetUseMPIIO(binViewer, PETSC_TRUE, ierr); CHKERRA(ierr)
>>>>
>>>>
>>>
>>> Do you get the correct output if you don't call the function above (or
>>> equivalently use PETSC_FALSE)?
>>>
>>>
>>>> call PetscViewerFileSetName(binViewer, trim(filename), ierr); CHKERRA(ierr)
>>>> call VecView(X, binViewer, ierr); CHKERRA(ierr)
>>>> call PetscViewerDestroy(binViewer, ierr); CHKERRA(ierr)
>>>> call DMSetOutputSequenceNumber(dm, save_seqnum, save_seqval, ierr); CHKERRA(ierr)
>>>>
>>>> I do not think there is anything wrong with it but of course I would be
>>>> happy to hear your feedback.
>>>> Nonetheless my question was: how far have you tested the binary MPI
>>>> I/O of a Vec? Does it make some sense that for a 50 million cell mesh
>>>> split on 1024 processes, it could somehow fail?
>>>> Or is it my Python drawing method that is completely incapable of
>>>> handling this dataset? (ParaView displays the same thing though, so I'm not
>>>> sure ...)
>>>>
>>>
>>> Are you using the Python tools provided with PETSc to load the Vec
>>> from file?
>>>
>>>
>>> Thanks,
>>> Dave
>>>
>>>
>>>
>>>> Thank you very much for your advice and help!!!
>>>>
>>>> Thibault
>>>>
>>>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
>