[petsc-users] Question about PETScSF usage in DMNetwork/DMPlex
Adrian Maldonado
dmaldona at hawk.iit.edu
Fri Sep 16 12:54:02 CDT 2016
Just one addition about something I've noticed.
The section:
PetscSection Object: 2 MPI processes
type not yet set
Process 0:
( 0) dim 1 offset 0
( 1) dim 1 offset 1
( 2) dim 1 offset 2
( 3) dim 1 offset 3
( 4) dim -2 offset -8
( 5) dim -2 offset -9
( 6) dim -2 offset -10
Process 1:
( 0) dim 1 offset 4
( 1) dim 1 offset 5
( 2) dim 1 offset 6
( 3) dim 1 offset 7
( 4) dim 1 offset 8
( 5) dim 1 offset 9
For the ghost points 4, 5, 6: is it encoding the true dof as -(dim + 1)
and the true offset as -(offset + 1)? That is, for point 4, dof =
-(-2 + 1) = 1 and offset = -(-8 + 1) = 7?
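If that is right, a decode along these lines should recover the real
values (a sketch of my reading of the convention; 's' is the global
section above, 'p' a ghost point, error checking omitted):

PetscInt dof, off;
PetscSectionGetDof(s, p, &dof);    /* point 4: dof = -2 */
PetscSectionGetOffset(s, p, &off); /* point 4: off = -8 */
if (dof < 0) {                     /* negative dof would mark an unowned point */
  PetscInt ghostDof = -(dof + 1);  /* -(-2 + 1) = 1 */
  PetscInt ghostOff = -(off + 1);  /* -(-8 + 1) = 7 */
}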
On Fri, Sep 16, 2016 at 11:36 AM, Adrian Maldonado <dmaldona at hawk.iit.edu>
wrote:
> Hi,
>
> I am trying to understand some of the data structures DMPlex/DMNetwork
> creates and the relationship among them.
>
> As an example, I have a small test circuit
> (/src/ksp/ksp/examples/tutorials/network/ex1.c).
>
> This is a graph that consists of 6 edges and 4 vertices, each of which
> has one degree of freedom. When run with two processes, each rank owns
> 3 edges. Rank 0 owns one vertex (plus 3 ghosts) and rank 1 owns 3
> vertices.
>
> These are some of the data structures for this problem, which I am
> printing from inside DMNetworkDistribute
> <http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/DM/DMNetworkDistribute.html>
>
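> Roughly, this is how I get and view them (a sketch; error checking
> omitted, the variable names are mine, and 'dm' is the serial network
> DM before distribution):
>
> DM      dmDist;
> PetscSF sf;
> DMPlexDistribute(dm, 0, &sf, &dmDist);    /* overlap = 0 */
> DMView(dmDist, PETSC_VIEWER_STDOUT_WORLD);
> PetscSFView(sf, PETSC_VIEWER_STDOUT_WORLD);
>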
> DM Object: Parallel Mesh 2 MPI processes
> type: plex
> Parallel Mesh in 1 dimensions:
> 0-cells: 4 3
> 1-cells: 3 3
> Labels:
> depth: 2 strata of sizes (4, 3)
>
> This, as I understand it, is printing a tree with all the vertices and
> edges on each process (owned and ghost).
>
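> If I read the strata right, depth 0 is the vertices and depth 1 the
> edges; a sketch of how I would confirm the local point ranges (names
> are mine, 'dmDist' as above):
>
> PetscInt    vStart, vEnd, eStart, eEnd;
> PetscMPIInt rank;
> MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
> DMPlexGetDepthStratum(dmDist, 0, &vStart, &vEnd); /* vertices */
> DMPlexGetDepthStratum(dmDist, 1, &eStart, &eEnd); /* edges */
> PetscSynchronizedPrintf(PETSC_COMM_WORLD, "[%d] vertices [%D,%D), edges [%D,%D)\n",
>                         rank, vStart, vEnd, eStart, eEnd);
> PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT);
>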
> PetscSection Object: 2 MPI processes
> type not yet set
> Process 0:
> ( 0) dim 1 offset 0
> ( 1) dim 1 offset 1
> ( 2) dim 1 offset 2
> ( 3) dim 1 offset 3
> ( 4) dim -2 offset -8
> ( 5) dim -2 offset -9
> ( 6) dim -2 offset -10
> Process 1:
> ( 0) dim 1 offset 4
> ( 1) dim 1 offset 5
> ( 2) dim 1 offset 6
> ( 3) dim 1 offset 7
> ( 4) dim 1 offset 8
> ( 5) dim 1 offset 9
>
> This is a global PetscSection that gives me the global numbering for
> the owned points and (garbage?) negative values for the ghost points.
>
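> (For completeness, this is how I pull that section out, if I recall
> the call name right for my version:)
>
> PetscSection gsec;
> DMGetDefaultGlobalSection(dmDist, &gsec); /* borrowed, do not destroy */
> PetscSectionView(gsec, PETSC_VIEWER_STDOUT_WORLD);
>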
> Up to here, everything is good. But then I print the PetscSF that is
> created by 'DMPlexDistribute', and this I do not understand:
>
> PetscSF Object: Migration SF 2 MPI processes
> type: basic
> sort=rank-order
> [0] Number of roots=10, leaves=7, remote ranks=1
> [0] 0 <- (0,0)
> [0] 1 <- (0,1)
> [0] 2 <- (0,3)
> [0] 3 <- (0,6)
> [0] 4 <- (0,7)
> [0] 5 <- (0,8)
> [0] 6 <- (0,9)
> [1] Number of roots=0, leaves=6, remote ranks=1
> [1] 0 <- (0,2)
> [1] 1 <- (0,4)
> [1] 2 <- (0,5)
> [1] 3 <- (0,7)
> [1] 4 <- (0,8)
> [1] 5 <- (0,9)
> [0] Roots referenced by my leaves, by rank
> [0] 0: 7 edges
> [0] 0 <- 0
> [0] 1 <- 1
> [0] 2 <- 3
> [0] 3 <- 6
> [0] 4 <- 7
> [0] 5 <- 8
> [0] 6 <- 9
> [1] Roots referenced by my leaves, by rank
> [1] 0: 6 edges
> [1] 0 <- 2
> [1] 1 <- 4
> [1] 2 <- 5
> [1] 3 <- 7
> [1] 4 <- 8
> [1] 5 <- 9
>
> I understand that an SF is a data structure that stores references to
> pieces of data that are not owned by the process
> (https://arxiv.org/pdf/1506.06194v1.pdf, page 4).
>
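> (This is how I would read an SF's graph programmatically, as a sketch;
> it shows nothing that PetscSFView does not already print:)
>
> PetscInt           nroots, nleaves, i;
> const PetscInt    *ilocal;
> const PetscSFNode *iremote;
> PetscSFGetGraph(sf, &nroots, &nleaves, &ilocal, &iremote);
> for (i = 0; i < nleaves; i++) {
>   /* leaf (local point) <- (owner rank, point number on the owner) */
>   PetscInt leaf = ilocal ? ilocal[i] : i;  /* NULL ilocal means identity */
>   PetscSynchronizedPrintf(PETSC_COMM_WORLD, "%D <- (%D,%D)\n",
>                           leaf, iremote[i].rank, iremote[i].index);
> }
> PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT);
>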
> Since the only ghost points appear on rank 0 (three ghost vertices), I
> would expect something like:
> *rank 0:*
> 4 <- (1, 3) (to read: point 4 is owned by rank 1 and is rank 1's point 3)
> etc...
> *rank 1:*
> nothing
>
> Is my intuition correct? If so, what does the star forest that I get
> from DMPlexDistribute mean? Am I printing the wrong thing?
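>
> (In case I am simply looking at the wrong object: I would have
> expected something like the above from the point SF attached to the
> distributed DM, e.g. via this sketch:)
>
> PetscSF pointsf;
> DMGetPointSF(dmDist, &pointsf); /* SF describing shared/ghost points */
> PetscSFView(pointsf, PETSC_VIEWER_STDOUT_WORLD);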
>
> Thank you
>
--
D. Adrian Maldonado, PhD Candidate
Electrical & Computer Engineering Dept.
Illinois Institute of Technology
3301 S. Dearborn Street, Chicago, IL 60616