[petsc-users] Node numbering in parallel partitioned mesh
Matthew Knepley
knepley at gmail.com
Tue May 2 10:25:42 CDT 2023
On Tue, May 2, 2023 at 11:03 AM Karthikeyan Chockalingam - STFC UKRI <
karthikeyan.chockalingam at stfc.ac.uk> wrote:
> Thank you Matt.
>
>
>
> I will look into finding those shared nodes. Sorry, I didn’t understand what
> you meant by “Roots are owned, and leaves are not owned”.
>
That is the nomenclature from PetscSF.
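For concreteness, here is a minimal, untested sketch of how the ownership
information could be pulled out of the PetscSF (assuming sf is the point SF,
e.g. from DMGetPointSF(); the function and array names are just placeholders):

#include <petsc.h>

/* Local points that appear as leaves in the SF are owned by another rank;
   every other local point is a root, i.e. owned here. The caller frees
   owned[] with PetscFree(). */
static PetscErrorCode MarkOwnedPoints(PetscSF sf, PetscInt *nPoints, PetscBool **owned)
{
  PetscInt        nroots, nleaves;
  const PetscInt *ilocal;

  PetscFunctionBeginUser;
  PetscCall(PetscSFGetGraph(sf, &nroots, &nleaves, &ilocal, NULL));
  PetscCall(PetscMalloc1(nroots, owned));
  for (PetscInt p = 0; p < nroots; ++p) (*owned)[p] = PETSC_TRUE;
  for (PetscInt l = 0; l < nleaves; ++l) {
    const PetscInt p = ilocal ? ilocal[l] : l; /* NULL ilocal means leaves are 0..nleaves-1 */
    (*owned)[p] = PETSC_FALSE;                 /* a leaf points at a root on another rank  */
  }
  *nPoints = nroots;
  PetscFunctionReturn(PETSC_SUCCESS);
}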
>
>
> My question was specifically related to numbering – how do I start
> numbering in a partition from where the previous partition left off, without
> double counting, so that the node numbers are unique?
>
1) Determine the local sizes
Run over the local nodes. If any are not owned, do not count them.
2) Get the local offset nStart
    Add up the local sizes to get the offset for each process using MPI_Scan()
3) Number locally
Run over local nodes and number each owned node, starting with nStart
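In PETSc-style C, the three steps might look like this minimal sketch
(nLocalNodes, isOwned[], and globalNum[] are placeholders for however your
mesh stores its local nodes and ownership flags):

#include <petsc.h>

static PetscErrorCode NumberOwnedNodes(MPI_Comm comm, PetscInt nLocalNodes,
                                       const PetscBool isOwned[], PetscInt globalNum[])
{
  PetscInt nOwned = 0, nStart = 0;

  PetscFunctionBeginUser;
  /* 1) Local size: count only the nodes this rank owns */
  for (PetscInt i = 0; i < nLocalNodes; ++i)
    if (isOwned[i]) ++nOwned;
  /* 2) Local offset: MPI_Scan() is an inclusive prefix sum, so subtract this
        rank's own count to get the starting offset nStart */
  PetscCallMPI(MPI_Scan(&nOwned, &nStart, 1, MPIU_INT, MPI_SUM, comm));
  nStart -= nOwned;
  /* 3) Number locally: owned nodes get nStart, nStart+1, ...; shared nodes
        owned elsewhere are left at -1 and must be filled in by communicating
        with their owning rank, e.g. via a PetscSF broadcast */
  for (PetscInt i = 0; i < nLocalNodes; ++i)
    globalNum[i] = isOwned[i] ? nStart++ : -1;
  PetscFunctionReturn(PETSC_SUCCESS);
}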
Thanks,
Matt
>
>
> Let's say I have a VECMPI which is distributed among the partitions. When I
> try to retrieve the data using VecGetValues, I often run into problems
> accessing non-local data (so, for now, I scatter the vector). When some
> nodes are shared, will I not always have this problem accessing those nodes
> from the wrong partition unless those nodes are ghosted? Maybe I am not
> thinking about it correctly.
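If the vector is created as a ghosted vector, the shared nodes you do not own
can be read locally after a ghost update instead of scattering the whole
vector. A minimal, untested sketch (the argument names are placeholders for
your mesh data):

#include <petsc.h>

/* nOwned is the number of owned nodes on this rank; ghosts[] holds the global
   numbers of the shared nodes owned by other ranks. */
static PetscErrorCode ReadSharedNodes(MPI_Comm comm, PetscInt nOwned, PetscInt nGhost,
                                      const PetscInt ghosts[])
{
  Vec                x, xlocal;
  const PetscScalar *a;

  PetscFunctionBeginUser;
  PetscCall(VecCreateGhost(comm, nOwned, PETSC_DETERMINE, nGhost, ghosts, &x));
  /* ... set owned entries with VecSetValues(), then VecAssemblyBegin/End ... */
  PetscCall(VecGhostUpdateBegin(x, INSERT_VALUES, SCATTER_FORWARD));
  PetscCall(VecGhostUpdateEnd(x, INSERT_VALUES, SCATTER_FORWARD));
  PetscCall(VecGhostGetLocalForm(x, &xlocal));   /* nOwned + nGhost local entries */
  PetscCall(VecGetArrayRead(xlocal, &a));
  /* a[0..nOwned-1] are the owned values, a[nOwned..nOwned+nGhost-1] the ghosts */
  PetscCall(VecRestoreArrayRead(xlocal, &a));
  PetscCall(VecGhostRestoreLocalForm(x, &xlocal));
  PetscCall(VecDestroy(&x));
  PetscFunctionReturn(PETSC_SUCCESS);
}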
>
>
>
> Kind regards,
>
> Karthik.
>
>
>
>
>
> From: Matthew Knepley <knepley at gmail.com>
> Date: Tuesday, 2 May 2023 at 13:35
> To: Chockalingam, Karthikeyan (STFC,DL,HC) <
> karthikeyan.chockalingam at stfc.ac.uk>
> Cc: petsc-users at mcs.anl.gov <petsc-users at mcs.anl.gov>
> Subject: Re: [petsc-users] Node numbering in parallel partitioned mesh
>
> On Tue, May 2, 2023 at 8:25 AM Karthikeyan Chockalingam - STFC UKRI via
> petsc-users <petsc-users at mcs.anl.gov> wrote:
>
> Hello,
>
>
>
> This is not exactly a PETSc question. I have a parallel partitioned finite
> element mesh. What are the steps involved in obtaining a contiguous but
> unique node numbering from one partition to the next? There are nodes which
> are shared between different partitions. Moreover, this partitioning has to
> coincide with the parallel partitioning of the PETSc Vec/Mat, which ensures
> data locality.
>
>
>
> If you can post the algorithm or cite a reference, it will prove helpful.
>
>
>
> Somehow, you have to know what "nodes" are shared. Once you know this, you
> can make a rule for numbering, such as "the lowest rank gets the shared
> nodes". We encapsulate this ownership relation in the PetscSF. Roots are
> owned, and leaves are not owned. The rule above is not great for load
> balance, so we have an optimization routine for the simple PetscSF:
> https://petsc.org/main/manualpages/DMPlex/DMPlexRebalanceSharedPoints/
>
>
>
> Thanks,
>
>
>
> Matt
>
>
>
> Many thanks.
>
>
>
> Kind regards,
>
> Karthik.
>
>
>
>
>
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/