[petsc-users] PetscSF clarification
Toby Isaac
toby.isaac at gmail.com
Tue Aug 2 10:54:09 CDT 2022
Hi Nicholas,
What command did you use to view the star forest? What you are
showing looks like the PetscSF that is used to distribute the points
from the root, not the final PetscSF describing the points and their
duplicates once the mesh has been distributed.
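For reference, here is a minimal sketch of the difference (my own
illustration, assuming a recent PETSc with the PetscCall() macros; the
mesh itself is configured at run time with -dm_plex_* options):

  #include <petscdmplex.h>
  #include <petscviewer.h>

  int main(int argc, char **argv)
  {
    DM      dm, dmDist = NULL;
    PetscSF migrationSF, pointSF;

    PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
    /* Build the serial mesh; if your PETSc version distributes
       automatically inside DMSetFromOptions(), pass -dm_distribute 0
       to keep it on one rank here. */
    PetscCall(DMCreate(PETSC_COMM_WORLD, &dm));
    PetscCall(DMSetType(dm, DMPLEX));
    PetscCall(DMSetFromOptions(dm));

    /* The SF returned by DMPlexDistribute() is the migration SF:
       its roots are the original points, which all live on rank 0. */
    PetscCall(DMPlexDistribute(dm, 1, &migrationSF, &dmDist));
    if (dmDist) {
      PetscCall(PetscSFView(migrationSF, PETSC_VIEWER_STDOUT_WORLD));
      PetscCall(PetscSFDestroy(&migrationSF));
      PetscCall(DMDestroy(&dm));
      dm = dmDist;
    }

    /* The point SF of the distributed DM is the one that shows each
       rank's owned points (roots) and ghosts (leaves). */
    PetscCall(DMGetPointSF(dm, &pointSF));
    PetscCall(PetscSFView(pointSF, PETSC_VIEWER_STDOUT_WORLD));

    PetscCall(DMDestroy(&dm));
    PetscCall(PetscFinalize());
    return 0;
  }

Note that DMGetPointSF() returns a borrowed reference, so the point SF
is not destroyed here.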
-- Toby
On Tue, Aug 2, 2022 at 10:28 AM Nicholas Arnold-Medabalimi
<narnoldm at umich.edu> wrote:
>
> Hello
>
> I have been trying to follow how PetscSF works, and I'm observing some behavior I don't quite understand. I have been looking at some past petsc-users discussions involving PetscSF, and my understanding is that each processor has roots and leaves: roots are points "owned" by that processor, and leaves are ghosts of points owned by another processor.
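> To make the roots/leaves picture concrete, here is a minimal hand-built example (an illustration, not my actual code) in which rank 0 owns 4 roots and every other rank holds one leaf ghosting root 2 on rank 0:
>
>   #include <petscsf.h>
>   #include <petscviewer.h>
>
>   int main(int argc, char **argv)
>   {
>     PetscSF     sf;
>     PetscMPIInt rank;
>
>     PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
>     PetscCallMPI(MPI_Comm_rank(PETSC_COMM_WORLD, &rank));
>     PetscCall(PetscSFCreate(PETSC_COMM_WORLD, &sf));
>     if (rank == 0) {
>       /* rank 0 owns 4 roots and has no ghosts (leaves) */
>       PetscCall(PetscSFSetGraph(sf, 4, 0, NULL, PETSC_COPY_VALUES, NULL, PETSC_COPY_VALUES));
>     } else {
>       /* other ranks own nothing; local point 0 ghosts (rank 0, root 2) */
>       PetscSFNode remote = {.rank = 0, .index = 2};
>       PetscCall(PetscSFSetGraph(sf, 0, 1, NULL, PETSC_COPY_VALUES, &remote, PETSC_COPY_VALUES));
>     }
>     PetscCall(PetscSFView(sf, PETSC_VIEWER_STDOUT_WORLD));
>     PetscCall(PetscSFDestroy(&sf));
>     PetscCall(PetscFinalize());
>     return 0;
>   }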
>
> I have a setup where I build the original mesh (6000 points) on a single processor, distribute it using DMPlexDistribute with an overlap, and then view the generated star forest.
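> In sketch form (a paraphrase, not my exact code; I use overlap = 1 here and view the SF that DMPlexDistribute returns), that step looks like:
>
>   /* assumes a serial DMPlex dm built beforehand */
>   static PetscErrorCode DistributeAndView(DM *dm)
>   {
>     PetscSF distSF;
>     DM      dmDist = NULL;
>
>     PetscFunctionBeginUser;
>     PetscCall(DMPlexDistribute(*dm, 1, &distSF, &dmDist));
>     if (dmDist) {
>       PetscCall(PetscSFView(distSF, PETSC_VIEWER_STDOUT_WORLD));
>       PetscCall(PetscSFDestroy(&distSF));
>       PetscCall(DMDestroy(dm));
>       *dm = dmDist;
>     }
>     PetscFunctionReturn(0);
>   }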
>
> My expectation is that roughly 1/p of the points will end up as roots on each processor, with the overlap points being leaves. Instead, the first processor has ~6000 roots and ~1/p of the points as leaves:
> PetscSF Object: 4 MPI processes
> type: basic
> [0] Number of roots=6003, leaves=1587, remote ranks=1
> [0] 0 <- (0,0)
> and each of the following processors has ~1/p of the points as leaves, all pointing at the root processor:
> [1] Number of roots=0, leaves=1497, remote ranks=1
> [1] 0 <- (0,262)
>
> Is this the expected outcome? At least to me, this implies the first processor still "owns" all the points. I imagine I am misunderstanding something here. Thank you for the assistance.
>
> Sincerely
> Nicholas
>
>
> --
> Nicholas Arnold-Medabalimi
>
> Ph.D. Candidate
> Computational Aeroscience Lab
> University of Michigan