[petsc-users] Parallel DMPlex Local to Global Mapping
Ferrand, Jesus A.
FERRANJ2 at my.erau.edu
Wed Jan 12 18:55:06 CST 2022
Dear PETSc Team:
Hi! I'm working on a parallel version of a PETSc script that I wrote in serial using DMPlex. After calling DMPlexDistribute(), each rank is assigned its own DAG where the points are numbered locally. For example, if I split a 100-cell mesh over 4 processors, each process numbers its cells 0-24, as opposed to something like 0-24, 25-49, 50-74, and 75-99 on ranks 0, 1, 2, and 3, respectively. The same happens for face, edge, and vertex points: the local DAGs renumber the IDs starting from 0 instead of using the global numbering.
Is there a way to distribute a mesh such that the global numbering is reflected in the local DAGs? If not, what would be the right way to retrieve the global numbering? I've seen the term "StarForest" in some [petsc-users] threads discussing a similar issue, but I have little clue as to how to use one.
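For reference, this is roughly the sequence of calls I am using now (a trimmed sketch with my own variable names, error checking omitted, not the full script):

#include <petscdmplex.h>

/* ... dm was already created from the mesh file ... */
DM       dmDist = NULL;
PetscInt depth, cStart, cEnd;

DMPlexDistribute(dm, 0, NULL, &dmDist);        /* 0 = no overlap */
if (dmDist) { DMDestroy(&dm); dm = dmDist; }   /* on a single rank dmDist stays NULL */

DMPlexGetDepth(dm, &depth);
DMPlexGetDepthStratum(dm, depth, &cStart, &cEnd); /* cells are the top stratum */
/* With 100 cells on 4 ranks, every rank reports cStart = 0 and cEnd = 25,
   i.e. purely local cell numbers. */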
I've looked at the following functions:
* DMPlexCreatePointNumbering() - Sounds like what I need, but I don't think it will work because I am relying on DMPlexGetDepthStratum(), which returns bounds in local numbering (see the sketch after this list for how I imagine it would be used).
* DMPlexGetCellNumbering() - Only converts Cells
* DMPlexGetVertexNumbering() - Only converts Vertices
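In case it helps pinpoint my confusion, here is how I imagine DMPlexCreatePointNumbering() would have to be combined with the local stratum bounds from above (again just a sketch; I may well be misreading the man page):

IS              globalNum;
const PetscInt *gidx;
PetscInt        c;

DMPlexCreatePointNumbering(dm, &globalNum); /* one entry per local DAG point, I believe */
ISGetIndices(globalNum, &gidx);
for (c = cStart; c < cEnd; ++c) {
  PetscInt g = gidx[c];      /* local point c -> global point number? */
  if (g < 0) g = -(g + 1);   /* my understanding: negative entries mark points not owned by this rank */
  /* ... use g as the global ID of local cell c ... */
}
ISRestoreIndices(globalNum, &gidx);
ISDestroy(&globalNum);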
Basically, what I want is a global matrix that my MPI ranks can call MatSetValues() on (with ADD_VALUES as the insert mode). In my serial code I relied on the global point numbering to build the matrix; without it, I can't do it my way :(. I'm manually assembling a global stiffness matrix out of element stiffness matrices to run FEA.
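To make the intent concrete, the serial assembly loop looks roughly like the sketch below. GetGlobalNodeIDs() and ComputeElementStiffness() are just placeholders for my own routines, and the indices in gnodes are the global point numbers I'm asking about:

Mat         K;              /* global stiffness matrix, created and preallocated elsewhere */
PetscInt    cell, nnodes;
PetscInt    gnodes[8];      /* global IDs of this element's nodes (8 is just an upper bound here) */
PetscScalar Ke[8 * 8];      /* element stiffness matrix */

for (cell = cStart; cell < cEnd; ++cell) {
  GetGlobalNodeIDs(dm, cell, &nnodes, gnodes);   /* placeholder helper */
  ComputeElementStiffness(dm, cell, Ke);         /* placeholder helper */
  MatSetValues(K, nnodes, gnodes, nnodes, gnodes, Ke, ADD_VALUES);
}
MatAssemblyBegin(K, MAT_FINAL_ASSEMBLY);
MatAssemblyEnd(K, MAT_FINAL_ASSEMBLY);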
Any help is much appreciated.
Sincerely:
J.A. Ferrand
Embry-Riddle Aeronautical University - Daytona Beach FL
M.Sc. Aerospace Engineering | May 2022
B.Sc. Aerospace Engineering
B.Sc. Computational Mathematics
Sigma Gamma Tau
Tau Beta Pi
Honors Program
Phone: (386)-843-1829
Email(s): ferranj2 at my.erau.edu
jesus.ferrand at gmail.com