[petsc-users] PetscSF clarification

Nicholas Arnold-Medabalimi narnoldm at umich.edu
Tue Aug 2 11:41:11 CDT 2022


I see. I was under the impression that the PetscSF returned by
DMPlexDistribute was the resulting star forest. Getting the SF from the DM
after distribution gives the results I expect. Thank you for the
help.



On Tue, Aug 2, 2022 at 12:12 PM Toby Isaac <toby.isaac at gmail.com> wrote:

> I think you want the PetscSF that can be obtained from calling
> DMGetPointSF() <https://petsc.org/release/docs/manualpages/DM/DMGetPointSF.html>
> on the mesh created by DMPlexDistribute(), not the PetscSF returned by
> DMPlexDistribute() itself.
>
> On Tue, Aug 2, 2022 at 10:54 AM Toby Isaac <toby.isaac at gmail.com> wrote:
> >
> > Hi Nicholas,
> >
> > What command did you use to view the star forest?  What you are
> > showing looks like the PetscSF that is used to distribute the points
> > from the root, not the final PetscSF describing the points and their
> > duplicates once it has been distributed.
> >
> > -- Toby
> >
> > On Tue, Aug 2, 2022 at 10:28 AM Nicholas Arnold-Medabalimi
> > <narnoldm at umich.edu> wrote:
> > >
> > > Hello
> > >
> > > I have been trying to follow how PetscSF works, and I'm observing some
> behavior I don't quite understand. I have been looking at some of the past
> petsc-users discussions involving PetscSF, and my understanding is that each
> processor will have roots and leaves; roots are "owned" points, and leaves
> are ghosts of another processor.
> > >
> > > I have a setup where I've built the original mesh (6000 points) on
> a single processor, then distributed it using DMPlexDistribute with an
> overlap, and then viewed the generated star forest.
> > >
> > > My expectation is that roughly 1/p of the points will end up as
> roots on each processor, with the overlap points being leaves on each
> processor. Instead, the first processor has all ~6000 roots with ~1/p
> leaves:
> > > PetscSF Object: 4 MPI processes
> > >   type: basic
> > >   [0] Number of roots=6003, leaves=1587, remote ranks=1
> > >   [0] 0 <- (0,0)
> > > and then each of the following processors has ~1/p leaves, all
> pointing at the root processor:
> > >   [1] Number of roots=0, leaves=1497, remote ranks=1
> > >   [1] 0 <- (0,262)
> > >
> > > Is this the expected outcome? At least to me, this implies the first
> processor still "owns" all the points. I imagine I misunderstand something
> here. Thank you for the assistance.
> > >
> > > Sincerely
> > > Nicholas
> > >
> > >
> > > --
> > > Nicholas Arnold-Medabalimi
> > >
> > > Ph.D. Candidate
> > > Computational Aeroscience Lab
> > > University of Michigan
>
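
For anyone who finds this later: the output quoted above is what the
migration SF is supposed to look like, since its roots all live on the
rank that held the serial mesh. To check the root/leaf counts of
whichever SF you have in hand, something like this sketch works (sf is a
placeholder for your own PetscSF):

  PetscInt           nroots, nleaves;
  const PetscInt    *ilocal;
  const PetscSFNode *iremote;

  /* nroots: points rooted on this rank (owned); nleaves: points on this
     rank that reference a root, possibly on another rank (ghosts) */
  PetscCall(PetscSFGetGraph(sf, &nroots, &nleaves, &ilocal, &iremote));
  PetscCall(PetscSynchronizedPrintf(PETSC_COMM_WORLD,
            "roots=%" PetscInt_FMT " leaves=%" PetscInt_FMT "\n",
            nroots, nleaves));
  PetscCall(PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT));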


-- 
Nicholas Arnold-Medabalimi

Ph.D. Candidate
Computational Aeroscience Lab
University of Michigan