[petsc-users] Partitioning does not work

Barry Smith bsmith at mcs.anl.gov
Sat Apr 15 16:31:12 CDT 2017


   The DMPlex experts can point you in the correct direction.
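
   In the meantime, a minimal sketch of the usual DMPlex route (this assumes a serial DMPlex, dm, has already been created with one of the DMPlexCreate* routines; the exact creation call and its signature depend on your PETSc version):

#include <petscdmplex.h>

/* Redistribute an already-created serial DMPlex across the communicator.
   The partitioner can typically be chosen at run time with
   -petscpartitioner_type <parmetis|ptscotch|simple>. */
static PetscErrorCode DistributePlex(DM *dm)
{
  DM             dmDist = NULL;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = DMPlexDistribute(*dm,0,NULL,&dmDist);CHKERRQ(ierr);  /* overlap = 0 */
  if (dmDist) {               /* NULL means nothing was moved (e.g. one process) */
    ierr = DMDestroy(dm);CHKERRQ(ierr);
    *dm  = dmDist;            /* continue with the distributed mesh */
  }
  PetscFunctionReturn(0);
}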


> On Apr 15, 2017, at 4:28 PM, Orxan Shibliyev <orxan.shibli at gmail.com> wrote:
> 
> About partitioning of a DMPlex, the manual says:
> 
> In exactly the same way as MatPartitioning and MatOrdering, we encode the results of a partition or order in an IS. However, the graph we are dealing with now is not the adjacency graph of the problem Jacobian, but the mesh itself.
> 
> I could not find a partitioning function that takes a DM object. I know how to partition with an adjacency matrix but not with a DM. Could you point me to an example in which a DMPlex is partitioned, or to the proper function?
> 
> On Sat, Apr 15, 2017 at 11:30 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
> 
> > On Apr 15, 2017, at 2:48 PM, Orxan Shibliyev <orxan.shibli at gmail.com> wrote:
> >
> > Yes I just realized that before your reply. I would like to ask a few more questions.
> >       • MatCreateMPIAdj() takes the "number of local rows" as an argument. That is exactly what I normally do not know prior to partitioning. Shouldn't the number of elements assigned to each partition be determined by PETSc? After all, the partitions do not need to contain equal numbers of elements. I have used METIS before but never had to provide this kind of parameter.
> 
>    It is whatever it is at the moment you provide the information. It need not correspond to a "good" partitioning; in fact, it probably does not, otherwise you would not have to call the partitioner at all. For example, if you are reading the list of cells from a file, you just put an equal number of cells on each process and then run the partitioner to determine where they "should go" to obtain a good partitioning.
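> 
>    A minimal sketch of that initial, arbitrary split (a hypothetical helper, not taken from any PETSc example): each rank simply takes roughly nglobal/size cells so that it owns some rows of the MPIAdj matrix before the partitioner runs.
> 
> #include <petscsys.h>
> 
> /* Arbitrary initial distribution of nglobal cells over size ranks; the
>    partitioner later decides where the cells should really go. */
> static PetscInt InitialLocalCellCount(PetscInt nglobal, PetscMPIInt size, PetscMPIInt rank)
> {
>   PetscInt nlocal = nglobal / size;
>   if (rank < nglobal % size) nlocal++;   /* spread the remainder over the first ranks */
>   return nlocal;
> }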
> 
> 
> >       • Do I need to migrate elements to processes myself? This is normally what I do after using METIS.
> 
>    Yes. This API in PETSc only tells you how to partition; it doesn't move the data to the "correct" processes for you.
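> 
>    A sketch of what you can do with the IS that MatPartitioningApply() returns (the two-process count array matches the example further down the thread; ierr and is are assumed to be declared as there):
> 
>     IS       newNumbering;
>     PetscInt counts[2];                          /* one entry per rank; size == 2 here */
> 
>     ierr = ISPartitioningCount(is,2,counts);CHKERRQ(ierr);            /* how many cells each rank will own */
>     ierr = ISPartitioningToNumbering(is,&newNumbering);CHKERRQ(ierr); /* new global cell numbering          */
>     ierr = ISView(newNumbering,PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);
>     ierr = ISDestroy(&newNumbering);CHKERRQ(ierr);
> 
>    With that information you still have to move the cell data yourself (e.g. with MPI messages or a PetscSF).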
> 
> >       • The examples just stop after partitioning. At that point I only know which processes the elements are assigned to and the orderings (IS objects). How can I get a matrix A to use in Ax=b?
> 
>    It depends. If you are using the finite element method, you compute the element stiffness matrix for each element and call MatSetValues() to put it into the global matrix.
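> 
>    A sketch of one element insertion (the index and stiffness values here are made up; A and ierr are assumed to be declared as usual):
> 
>     PetscInt    idx[3] = {0,1,4};                 /* hypothetical global dof numbers of one triangle */
>     PetscScalar Ke[9]  = { 1.0,-0.5,-0.5,
>                           -0.5, 1.0, 0.0,
>                           -0.5, 0.0, 1.0};        /* made-up 3x3 element stiffness matrix */
> 
>     ierr = MatSetValues(A,3,idx,3,idx,Ke,ADD_VALUES);CHKERRQ(ierr);
>     /* ... repeat for every local element ... */
>     ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
>     ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);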
> 
>   The PETSc DMPlex object manages most of the finite element/finite volume bookkeeping for you, so you should consider using it. It handles the partitioning, moves the element information to the right processes, and does the finite element integrations for you. That is much easier than doing it all yourself.
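> 
>   For instance, once a distributed DM describes the problem layout, PETSc can build the parallel matrix and vectors for Ax = b for you (a sketch; dm is assumed to be a distributed DMPlex with a PetscSection/discretization attached, and ierr declared as usual):
> 
>     Mat A;
>     Vec x,b;
> 
>     ierr = DMCreateMatrix(dm,&A);CHKERRQ(ierr);        /* correctly sized and preallocated */
>     ierr = DMCreateGlobalVector(dm,&b);CHKERRQ(ierr);
>     ierr = VecDuplicate(b,&x);CHKERRQ(ierr);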
> 
> 
>    Barry
> 
> >
> >
> >
> > On Sat, Apr 15, 2017 at 10:33 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
> >
> >    Put a MatView() on the mesh matrix.
> >
> >    The mesh information in your modified version is not a valid list of triangle or quad cells, hence the conversion to the dual doesn't find anything.
> > Each row of the mesh matrix (for a triangular mesh) needs three entries giving the vertices of that triangle (four entries for a quad), and the rows need to be consistent with one another, so that there are no overlapping triangles, etc.
> >
> > In order to call MatMeshToDual(), the Mat has to correspond to a real mesh; it cannot be an arbitrary graph/sparse matrix.
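> >
> >   For comparison, a sketch of a valid triangular mesh matrix (not the data from ex11.c): five vertices (the four corners of a square plus its center) and four triangles, two per rank, so every row has exactly three vertex entries and the triangles do not overlap. The declarations (ierr, rank, ii, jj, mesh) are as in the example further down the thread.
> >
> >     ierr = PetscMalloc1(3,&ii);CHKERRQ(ierr);      /* ncells + 1 row pointers      */
> >     ierr = PetscMalloc1(6,&jj);CHKERRQ(ierr);      /* 3 vertices for each triangle */
> >     if (rank == 0) {
> >         ii[0] = 0; ii[1] = 3; ii[2] = 6;
> >         jj[0] = 0; jj[1] = 1; jj[2] = 4;           /* triangle (0,1,4) */
> >         jj[3] = 1; jj[4] = 3; jj[5] = 4;           /* triangle (1,3,4) */
> >     } else {
> >         ii[0] = 0; ii[1] = 3; ii[2] = 6;
> >         jj[0] = 3; jj[1] = 2; jj[2] = 4;           /* triangle (3,2,4) */
> >         jj[3] = 2; jj[4] = 0; jj[5] = 4;           /* triangle (2,0,4) */
> >     }
> >     /* 2 local cells, 5 global vertices */
> >     ierr = MatCreateMPIAdj(MPI_COMM_WORLD,2,5,ii,jj,NULL,&mesh);CHKERRQ(ierr);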
> >
> >   Barry
> >
> >
> >
> > > On Apr 15, 2017, at 12:40 PM, Orxan Shibliyev <orxan.shibli at gmail.com> wrote:
> > >
> > > I attached two files. One contains the original example with its output at the end of the file; the other contains the modified example (the modified lines are commented) with its output at the end as well.
> > >
> > > On Sat, Apr 15, 2017 at 8:22 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
> > >
> > > > On Apr 15, 2017, at 12:13 PM, Orxan Shibliyev <orxan.shibli at gmail.com> wrote:
> > > >
> > > > I modified ex11.c:
> > >
> > >    How did you modify it, what exactly did you change?
> > >
> > > > It tests MatMeshToDual(); I changed it in order to partition the unstructured grid given on page 71 of the PETSc documentation. The resulting code is as follows, but MatView() does not print the entries of the matrix dual, whereas in the original example it does.
> > >
> > >    What does MatView() show instead? How is the output different? Send as attachments the "modified" example and the output from both.
> > >
> > >
> > >
> > >
> > > > Why?
> > > >
> > > > static char help[] = "Tests MatMeshToDual()\n\n";
> > > >
> > > > /*T
> > > > Concepts: Mat^mesh partitioning
> > > > Processors: n
> > > > T*/
> > > >
> > > > /*
> > > >    Include "petscmat.h" so that we can use matrices.
> > > >    automatically includes:
> > > >    petscsys.h       - base PETSc routines   petscvec.h    - vectors
> > > >    petscmat.h    - matrices
> > > >    petscis.h     - index sets            petscviewer.h - viewers
> > > >    */
> > > > #include <petscmat.h>
> > > >
> > > > #undef __FUNCT__
> > > > #define __FUNCT__ "main"
> > > > int main(int argc,char **args)
> > > > {
> > > >     Mat             mesh,dual;
> > > >     PetscErrorCode  ierr;
> > > >     PetscInt        Nvertices = 4;       /* total number of vertices */
> > > >     PetscInt        ncells    = 2;       /* number cells on this process */
> > > >     PetscInt        *ii,*jj;
> > > >     PetscMPIInt     size,rank;
> > > >     MatPartitioning part;
> > > >     IS              is;
> > > >
> > > >     PetscInitialize(&argc,&args,(char*)0,help);
> > > >     ierr = MPI_Comm_size(MPI_COMM_WORLD,&size);CHKERRQ(ierr);
> > > >     if (size != 2) SETERRQ(PETSC_COMM_WORLD,PETSC_ERR_SUP,"This example is for exactly two processes");
> > > >     ierr = MPI_Comm_rank(MPI_COMM_WORLD,&rank);CHKERRQ(ierr);
> > > >
> > > >     ierr  = PetscMalloc1(3,&ii);CHKERRQ(ierr);
> > > >     ierr  = PetscMalloc1(3,&jj);CHKERRQ(ierr);
> > > >     if (rank == 0) {
> > > >         ii[0] = 0; ii[1] = 2; ii[2] = 3;
> > > >         jj[0] = 2; jj[1] = 3; jj[2] = 3;
> > > >     } else {
> > > >         ii[0] = 0; ii[1] = 1; ii[2] = 3;
> > > >         jj[0] = 0; jj[1] = 0; jj[2] = 1;
> > > >     }
> > > >     ierr = MatCreateMPIAdj(MPI_COMM_WORLD,ncells,Nvertices,ii,jj,NULL,&mesh);CHKERRQ(ierr);
> > > >     ierr = MatMeshToCellGraph(mesh,2,&dual);CHKERRQ(ierr);
> > > >     ierr = MatView(dual,PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);
> > > >
> > > >     ierr = MatPartitioningCreate(MPI_COMM_WORLD,&part);CHKERRQ(ierr);
> > > >     ierr = MatPartitioningSetAdjacency(part,dual);CHKERRQ(ierr);
> > > >     ierr = MatPartitioningSetFromOptions(part);CHKERRQ(ierr);
> > > >     ierr = MatPartitioningApply(part,&is);CHKERRQ(ierr);
> > > >     ierr = ISView(is,PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);
> > > >     ierr = ISDestroy(&is);CHKERRQ(ierr);
> > > >     ierr = MatPartitioningDestroy(&part);CHKERRQ(ierr);
> > > >
> > > >     ierr = MatDestroy(&mesh);CHKERRQ(ierr);
> > > >     ierr = MatDestroy(&dual);CHKERRQ(ierr);
> > > >     ierr = PetscFinalize();
> > > >     return 0;
> > > > }
> > >
> > >
> > > <modified.c><original.c>
> >
> >
> 
> 


