[petsc-users] DMPlex with dof in cells AND at nodes

Thibault Bridel-Bertomeu thibault.bridelbertomeu at gmail.com
Mon Jan 11 05:41:05 CST 2021


Dear all,

I haven't been on this mailing list since December, so first of all, happy
new year to everyone!

I am facing another challenge with the DMPlex structure: although I am
doing cell-centered finite volumes (using PetscFV), I also need to perform
some manipulations on, and store, information that is vertex-based.
The problem appears when I run the code in parallel: when I visualize the
nodal data with a VTK viewer, it shows NaN at all the vertices in close
proximity to the MPI partition line.
Here is how I proceed to work on the vertex-based data:

First comes the creation of the DM that I'll be using for the cell-centered
FVM:

call DMPlexCreateFromFile(comm, name, PETSC_TRUE, dm, ierr)
call DMSetBasicAdjacency(dm, PETSC_TRUE, PETSC_TRUE, ierr)
call DMPlexDistribute(dm, 1, PETSC_NULL_SF, dmDist, ierr)
...
dm = dmDist
...
call DMPlexConstructGhostCells(dm, ..., ..., dmGhost, ierr)
...
dm = dmGhost
...

I set both booleans to PETSC_TRUE in DMSetBasicAdjacency so that, even in
MPI, the neighbouring relationships are preserved for all strata. Is that
true? In other words, does doing this actually make the full transitive
closure (star) of a given vertex available, even if part of it lives in
another MPI block?
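
For reference, here is the kind of sanity check I had in mind to answer that
question myself (a rough sketch of my own, not taken from any example; star,
ip and nfaces are just names of mine, with star declared as a PetscInt
pointer): count, for every local vertex, how many faces its star contains,
and compare a parallel run against a serial one near the partition line.

PetscInt, pointer :: star(:)
...
call DMPlexGetDepthStratum(dm, 0, vstart, vend, ierr); CHKERRA(ierr)   ! vertices
call DMPlexGetDepthStratum(dm, 2, fstart, fend, ierr); CHKERRA(ierr)   ! faces (depth 2 in 3D)
do inode = vstart, vend-1
   call DMPlexGetTransitiveClosure(dm, inode, PETSC_FALSE, star, ierr); CHKERRA(ierr)
   nfaces = 0
   ! the returned array interleaves (point, orientation) pairs
   do ip = 1, size(star), 2
      if ((star(ip) >= fstart) .and. (star(ip) < fend)) nfaces = nfaces + 1
   end do
   ! report nfaces per vertex here, e.g. with PetscSynchronizedPrintf
   call DMPlexRestoreTransitiveClosure(dm, inode, PETSC_FALSE, star, ierr); CHKERRA(ierr)
end do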

Then I need a vertex-based DM, so I figure it's pretty much what's done in
CreateMassMatrix in TS ex11.c.

call DMClone(dm, dmNodal, ierr); CHKERRA(ierr)
call DMGetCoordinateSection(dm, coordSection, ierr); CHKERRA(ierr)
call DMGetCoordinatesLocal(dm, coordinates, ierr); CHKERRA(ierr)
call DMSetCoordinateSection(dmNodal, PETSC_DETERMINE, coordSection, ierr); CHKERRA(ierr)
call PetscSectionCreate(PETSC_COMM_WORLD, sectionNodal, ierr); CHKERRA(ierr)
call DMPlexGetDepthStratum(dm, 0, vstart, vend, ierr); CHKERRA(ierr)
call PetscSectionSetChart(sectionNodal, vstart, vend, ierr); CHKERRA(ierr)
do inode = vstart, vend-1
   ! 3 dof per node (ndim = 3)
   call PetscSectionSetDof(sectionNodal, inode, ndim, ierr); CHKERRA(ierr)
end do
call PetscSectionSetUp(sectionNodal, ierr); CHKERRA(ierr)
call DMSetLocalSection(dmNodal, sectionNodal, ierr); CHKERRA(ierr)
call PetscSectionDestroy(sectionNodal, ierr); CHKERRA(ierr)

From what I understood, this new dmNodal basically has all the
characteristics of the original dm, except that its data layout is
vertex-based, with 3 DOF per vertex.
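
As a quick check of that assumption (again a sketch of my own, nothing from
ex11.c; checkSection, pstart, pend and ndof are just names I made up), I can
read the layout back from dmNodal and verify the chart and the number of DOF
attached to a vertex:

call DMGetLocalSection(dmNodal, checkSection, ierr); CHKERRA(ierr)
call PetscSectionGetChart(checkSection, pstart, pend, ierr); CHKERRA(ierr)
call PetscSectionGetDof(checkSection, pstart, ndof, ierr); CHKERRA(ierr)
! expected on every rank: [pstart, pend) == [vstart, vend) and ndof == ndim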

Then I get a local vector and work on it:

call DMGetLocalVector(dmNodal, nodalvec, ierr)
call VecGetArrayF90(nodalvec, nodalvecarray, ierr)
call DMPlexGetDepthStratum(dm, 0, vstart, vend, ierr)   ! vertices
call DMPlexGetDepthStratum(dm, 2, fstart, fend, ierr)   ! faces (depth 2 in 3D)
! For each node ...
do inode = vstart, vend-1
   A = 0 ! matrix, ndim x ndim
   b = 0 ! vector, ndim
   call DMPlexGetTransitiveClosure(dmNodal, inode, PETSC_FALSE, vertexclosure, ierr)
   ! For each face attached to that node (the closure array interleaves point/orientation pairs)
   do iface = 1, size(vertexclosure), 2
      if ((vertexclosure(iface) < fstart) .or. (vertexclosure(iface) >= fend)) cycle
      A = A + ...
      b = b + ...
   end do
   call DMPlexRestoreTransitiveClosure(dmNodal, inode, PETSC_FALSE, vertexclosure, ierr)
   nodalvecarray((inode-vstart)*ndim+1:(inode-vstart)*ndim+ndim) = matmul(A, b)
   ...
end do
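
Note that [vstart, vend) counts every local vertex, including the ones coming
from the overlap. To see which of those vertices my rank actually owns in
dmNodal (another sketch of mine; gSection and goff are just my names), I can
look at the global section, where points owned by another rank get a negative
offset:

call DMGetGlobalSection(dmNodal, gSection, ierr); CHKERRA(ierr)
do inode = vstart, vend-1
   call PetscSectionGetOffset(gSection, inode, goff, ierr); CHKERRA(ierr)
   if (goff < 0) then
      ! this vertex is not owned here: its global values come from another rank
   end if
end do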

Then I want to visualize it:

call VecRestoreArrayF90(nodalvec, nodalvecarray, ierr)
call DMCreateGlobalVector(dmNodal, globalnodalvec, ierr)
call DMLocalToGlobal(dmNodal, nodalvec, INSERT_VALUES, globalnodalvec, ierr)
call PetscViewerCreate(PETSC_COMM_WORLD, vtkViewer, ierr); CHKERRA(ierr)
call PetscViewerSetType(vtkViewer, PETSCVIEWERVTK, ierr); CHKERRA(ierr)
call PetscViewerFileSetName(vtkViewer, "toto.vtk", ierr); CHKERRA(ierr)
call VecView(globalnodalvec, vtkViewer, ierr); CHKERRA(ierr)
call PetscViewerDestroy(vtkViewer, ierr); CHKERRA(ierr)
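
Incidentally, since the attached file is a .vtu: as far as I understand, the
same PETSCVIEWERVTK viewer picks its output format from the file extension,
so a .vtu name should produce VTK XML unstructured output instead of the
legacy format (the name below is just an example):

call PetscViewerFileSetName(vtkViewer, "toto.vtu", ierr); CHKERRA(ierr)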

In the toto.vtk file, I get vertex-based values as expected, and the whole
shebang works in sequential. Now when I run it in parallel, I get
vertex-based values inside each MPI block, but in the neighbourhood of the
join between the blocks all the vertex-based values are NaN (see attachment).

All in all, I think I do not fully understand the consequences of
DMSetBasicAdjacency with useCone=true & useClosure=true followed by
DMPlexDistribute ... but I cannot figure out where the NaNs are coming from:
is it because the vertices do not actually know their full closure (star)
across the MPI join, or is it some mistake I made in the global/local vector
handling?
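
One debugging idea I may try (just a sketch of mine, assuming PetscScalar is
double precision real): pre-fill the local vector with a recognizable sentinel
before the node loop, so that entries that were simply never written can be
told apart from genuine NaNs produced by matmul(A, b).

call DMGetLocalVector(dmNodal, nodalvec, ierr); CHKERRA(ierr)
call VecSet(nodalvec, -1.0d30, ierr); CHKERRA(ierr)   ! sentinel value
! ... then the node loop and the LocalToGlobal as above ...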

Thank you very much in advance for your support!!

Thibault
-------------- next part --------------
A non-text attachment was scrubbed...
Name: nodal_velocity_00010.vtu
Type: application/octet-stream
Size: 20533 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20210111/82345b0a/attachment-0001.obj>

