[petsc-users] Partitioning does not work

Barry Smith bsmith at mcs.anl.gov
Sat Apr 15 14:33:14 CDT 2017


   Put a MatView() on the mesh matrix. 
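   For example, in the posted code this check could be placed right after the mesh matrix is created (same error-checking style as the rest of the example):

       ierr = MatCreateMPIAdj(MPI_COMM_WORLD,ncells,Nvertices,ii,jj,NULL,&mesh);CHKERRQ(ierr);
       ierr = MatView(mesh,PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);  /* print the mesh connectivity before building the dual */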

   The mesh information in your modified version is not a list of triangle or quad cells, and hence the conversion to the dual doesn't find anything. 
Each row of the mesh matrix (for a triangular mesh) needs three entries giving the vertices of that triangle (four entries for a quad), and the rows need to be consistent with one another, so that there are no overlapping triangles, etc.

In order to call MatMeshToDual() the Mat has to correspond to a real mesh; it cannot be an arbitrary graph/sparse matrix.
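
   For illustration only (this is not the data from the original ex11.c; the six global vertex numbers below are made up and differ from the Nvertices = 4 in your modified code), a valid two-triangles-per-process layout could look like:

       /* sketch: 2 triangles per rank, 3 vertex indices per row, 6 global vertices */
       ierr  = PetscMalloc1(3,&ii);CHKERRQ(ierr);   /* ncells+1 row pointers */
       ierr  = PetscMalloc1(6,&jj);CHKERRQ(ierr);   /* 3 vertices per triangle * 2 triangles */
       ii[0] = 0; ii[1] = 3; ii[2] = 6;
       if (rank == 0) {   /* triangles (0,1,2) and (1,3,2), sharing edge 1-2 */
         jj[0] = 0; jj[1] = 1; jj[2] = 2; jj[3] = 1; jj[4] = 3; jj[5] = 2;
       } else {           /* triangles (2,3,4) and (3,5,4), sharing edge 3-4 */
         jj[0] = 2; jj[1] = 3; jj[2] = 4; jj[3] = 3; jj[4] = 5; jj[5] = 4;
       }
       ierr = MatCreateMPIAdj(MPI_COMM_WORLD,ncells,6,ii,jj,NULL,&mesh);CHKERRQ(ierr);
       ierr = MatMeshToCellGraph(mesh,2,&dual);CHKERRQ(ierr);   /* 2 common vertices = shared edge for triangles */

   With rows laid out this way each row is one cell, and the conversion to the cell graph can find the edges shared between cells.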

  Barry



> On Apr 15, 2017, at 12:40 PM, Orxan Shibliyev <orxan.shibli at gmail.com> wrote:
> 
> I attached two files. One contains the original example with its output at the end of the file, while the other contains the modified example (the modified lines are commented) along with its output at the end of the file.
> 
> On Sat, Apr 15, 2017 at 8:22 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
> 
> > On Apr 15, 2017, at 12:13 PM, Orxan Shibliyev <orxan.shibli at gmail.com> wrote:
> >
> > I modified ex11.c:
> 
>    How did you modify it? What exactly did you change?
> 
> > It tests MatMeshToDual() in order to partition the unstructured grid provided in the PETSc documentation, page 71. The resulting code is as follows, but MatView() does not print the entries of the matrix dual, whereas in the original example it does.
> 
>    What does MatView() show instead? How is the output different? Send as attachments the "modified" example and the output from both.
> 
> 
> 
> 
> > Why?
> >
> > static char help[] = "Tests MatMeshToDual()\n\n";
> >
> > /*T
> > Concepts: Mat^mesh partitioning
> > Processors: n
> > T*/
> >
> > /*
> >    Include "petscmat.h" so that we can use matrices.
> >    automatically includes:
> >    petscsys.h       - base PETSc routines   petscvec.h    - vectors
> >    petscmat.h    - matrices
> >    petscis.h     - index sets            petscviewer.h - viewers
> >    */
> > #include <petscmat.h>
> >
> > #undef __FUNCT__
> > #define __FUNCT__ "main"
> > int main(int argc,char **args)
> > {
> >     Mat             mesh,dual;
> >     PetscErrorCode  ierr;
> >     PetscInt        Nvertices = 4;       /* total number of vertices */
> >     PetscInt        ncells    = 2;       /* number cells on this process */
> >     PetscInt        *ii,*jj;
> >     PetscMPIInt     size,rank;
> >     MatPartitioning part;
> >     IS              is;
> >
> >     PetscInitialize(&argc,&args,(char*)0,help);
> >     ierr = MPI_Comm_size(MPI_COMM_WORLD,&size);CHKERRQ(ierr);
> >     if (size != 2) SETERRQ(PETSC_COMM_WORLD,PETSC_ERR_SUP,"This example is for exactly two processes");
> >     ierr = MPI_Comm_rank(MPI_COMM_WORLD,&rank);CHKERRQ(ierr);
> >
> >     ierr  = PetscMalloc1(3,&ii);CHKERRQ(ierr);
> >     ierr  = PetscMalloc1(3,&jj);CHKERRQ(ierr);
> >     if (rank == 0) {
> >         ii[0] = 0; ii[1] = 2; ii[2] = 3;
> >         jj[0] = 2; jj[1] = 3; jj[2] = 3;
> >     } else {
> >         ii[0] = 0; ii[1] = 1; ii[2] = 3;
> >         jj[0] = 0; jj[1] = 0; jj[2] = 1;
> >     }
> >     ierr = MatCreateMPIAdj(MPI_COMM_WORLD,ncells,Nvertices,ii,jj,NULL,&mesh);CHKERRQ(ierr);
> >     ierr = MatMeshToCellGraph(mesh,2,&dual);CHKERRQ(ierr);
> >     ierr = MatView(dual,PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);
> >
> >     ierr = MatPartitioningCreate(MPI_COMM_WORLD,&part);CHKERRQ(ierr);
> >     ierr = MatPartitioningSetAdjacency(part,dual);CHKERRQ(ierr);
> >     ierr = MatPartitioningSetFromOptions(part);CHKERRQ(ierr);
> >     ierr = MatPartitioningApply(part,&is);CHKERRQ(ierr);
> >     ierr = ISView(is,PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);
> >     ierr = ISDestroy(&is);CHKERRQ(ierr);
> >     ierr = MatPartitioningDestroy(&part);CHKERRQ(ierr);
> >
> >     ierr = MatDestroy(&mesh);CHKERRQ(ierr);
> >     ierr = MatDestroy(&dual);CHKERRQ(ierr);
> >     ierr = PetscFinalize();
> >     return 0;
> > }
> 
> 
> <modified.c><original.c>


