[petsc-users] Node numbering in parallel partitioned mesh

Karthikeyan Chockalingam - STFC UKRI karthikeyan.chockalingam at stfc.ac.uk
Tue May 2 10:03:26 CDT 2023


Thank you Matt.

I will look into finding those shared nodes. Sorry, I didn't quite follow what you meant by "Roots are owned, and leaves are not owned."

My question was specifically about numbering: how do I start numbering in one partition from where the previous partition left off, without double counting, so that the node numbers are globally unique?

Let's say I have a VECMPI distributed among the partitions. When I try to retrieve data using VecGetValues, I often run into problems accessing non-local entries (so, for now, I scatter the vector). When some nodes are shared, won't I always have this problem when accessing those nodes from a partition that does not own them, unless those nodes are ghosted? Maybe I am not thinking about it correctly.
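To make the access problem concrete, here is a toy serial sketch (plain Python, not the PETSc API) of the situation above, assuming a hypothetical two-rank layout of a length-6 vector. The point is that an off-process index is only readable after the owner has pushed its value into a ghost buffer, which is what a scatter (or VecGhostUpdateBegin/End on a ghosted Vec) does in PETSc:

```python
# Simulated 2-rank layout of a global vector of length 6:
# rank 0 owns global indices 0..2, rank 1 owns global indices 3..5.
local = {0: [1.0, 2.0, 3.0], 1: [4.0, 5.0, 6.0]}
ownership = {i: (0 if i < 3 else 1) for i in range(6)}

def get_value(rank, gidx, ghost):
    """Read global index gidx from `rank`: direct if owned, else via ghost copy."""
    owner = ownership[gidx]
    if owner == rank:
        return local[rank][gidx - 3 * rank]
    # Off-process read: only legal if the value was ghosted/scattered beforehand,
    # otherwise there is no local copy to read (the analogue of the VecGetValues error).
    return ghost[rank][gidx]

# "Scatter" step: owners push the values needed by other ranks into ghost buffers.
ghost = {0: {3: local[1][0]},   # rank 0 receives a copy of global index 3
         1: {2: local[0][2]}}   # rank 1 receives a copy of global index 2

assert get_value(0, 3, ghost) == 4.0  # rank 0 reads a value owned by rank 1
assert get_value(1, 2, ghost) == 3.0  # rank 1 reads a value owned by rank 0
```

The rank count, index ranges, and values are made up for illustration; the real communication pattern is set up once from the mesh connectivity and reused for every update.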

Kind regards,
Karthik.


From: Matthew Knepley <knepley at gmail.com>
Date: Tuesday, 2 May 2023 at 13:35
To: Chockalingam, Karthikeyan (STFC,DL,HC) <karthikeyan.chockalingam at stfc.ac.uk>
Cc: petsc-users at mcs.anl.gov <petsc-users at mcs.anl.gov>
Subject: Re: [petsc-users] Node numbering in parallel partitioned mesh
On Tue, May 2, 2023 at 8:25 AM Karthikeyan Chockalingam - STFC UKRI via petsc-users <petsc-users at mcs.anl.gov> wrote:
Hello,

This is not exactly a PETSc question. I have a parallel partitioned finite element mesh. What are the steps involved in producing a contiguous but unique node numbering from one partition to the next? There are nodes which are shared between different partitions. Moreover, this partitioning has to coincide with the parallel partitioning of the PETSc Vec/Mat, which ensures data locality.

If you can post the algorithm or cite a reference, it will prove helpful.

Somehow, you have to know what "nodes" are shared. Once you know this, you can make a rule for numbering, such as "the lowest rank gets the shared nodes". We encapsulate this ownership relation in the PetscSF. Roots are owned, and leaves are not owned. The rule above is not great for load balance, so we have an optimization routine for the simple PetscSF: https://petsc.org/main/manualpages/DMPlex/DMPlexRebalanceSharedPoints/
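The numbering rule above can be sketched in a toy serial simulation (plain Python, not the PETSc API; the two-rank mesh and node names are made up). Each rank's starting offset is an exclusive prefix sum of the owned-node counts (MPI_Exscan in a real parallel code), owners assign contiguous numbers, and non-owners would then fetch the numbers of their shared nodes from the owner (in PETSc, an owner-to-sharer broadcast over the SF such as PetscSFBcastBegin/End):

```python
# Hypothetical 2-rank mesh: nodes "B" and "C" lie on the partition
# boundary and therefore appear in both ranks' local node lists.
rank_nodes = {0: ["A", "B", "C"], 1: ["B", "C", "D", "E"]}

# Ownership rule: the lowest rank sharing a node owns it.
owner = {}
for rank in sorted(rank_nodes):
    for n in rank_nodes[rank]:
        owner.setdefault(n, rank)

# Exclusive prefix sum of owned counts gives each rank's starting offset;
# each rank then numbers only the nodes it owns, so no node is counted twice.
offset, global_num, rank_offsets = 0, {}, {}
for rank in sorted(rank_nodes):
    rank_offsets[rank] = offset
    for n in rank_nodes[rank]:
        if owner[n] == rank:
            global_num[n] = offset
            offset += 1

# Rank 0 owns A, B, C -> 0, 1, 2; rank 1 owns only D, E -> 3, 4.
assert global_num == {"A": 0, "B": 1, "C": 2, "D": 3, "E": 4}
```

As the reply notes, always giving shared nodes to the lowest rank can skew the load balance, which is what DMPlexRebalanceSharedPoints addresses.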

  Thanks,

     Matt

Many thanks.

Kind regards,
Karthik.



--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/
