[petsc-users] Partitioning does not work

Barry Smith bsmith at mcs.anl.gov
Sat Apr 15 12:22:32 CDT 2017


> On Apr 15, 2017, at 12:13 PM, Orxan Shibliyev <orxan.shibli at gmail.com> wrote:
> 
> I modified ex11.c:

   How did you modify it? What exactly did you change?

> The example tests MatMeshToDual() in order to partition the unstructured grid provided in the PETSc documentation, page 71. The resulting code follows, but MatView() does not print the entries of the dual matrix, whereas it does in the original example.

   What does MatView() show instead? How is the output different? Send the "modified" example and the output from both runs as attachments.




> Why?
> 
> static char help[] = "Tests MatMeshToDual()\n\n";
> 
> /*T
> Concepts: Mat^mesh partitioning
> Processors: n
> T*/
> 
> /*
>    Include "petscmat.h" so that we can use matrices.
>    automatically includes:
>    petscsys.h       - base PETSc routines   petscvec.h    - vectors
>    petscmat.h    - matrices
>    petscis.h     - index sets            petscviewer.h - viewers
>    */
> #include <petscmat.h>
> 
> #undef __FUNCT__
> #define __FUNCT__ "main"
> int main(int argc,char **args)
> {
>     Mat             mesh,dual;
>     PetscErrorCode  ierr;
>     PetscInt        Nvertices = 4;       /* total number of vertices */
>     PetscInt        ncells    = 2;       /* number cells on this process */
>     PetscInt        *ii,*jj;
>     PetscMPIInt     size,rank;
>     MatPartitioning part;
>     IS              is;
> 
>     PetscInitialize(&argc,&args,(char*)0,help);
>     ierr = MPI_Comm_size(MPI_COMM_WORLD,&size);CHKERRQ(ierr);
>     if (size != 2) SETERRQ(PETSC_COMM_WORLD,PETSC_ERR_SUP,"This example is for exactly two processes");
>     ierr = MPI_Comm_rank(MPI_COMM_WORLD,&rank);CHKERRQ(ierr);
> 
>     ierr  = PetscMalloc1(3,&ii);CHKERRQ(ierr);
>     ierr  = PetscMalloc1(3,&jj);CHKERRQ(ierr);
>     if (rank == 0) {
>         ii[0] = 0; ii[1] = 2; ii[2] = 3;
>         jj[0] = 2; jj[1] = 3; jj[2] = 3;
>     } else {
>         ii[0] = 0; ii[1] = 1; ii[2] = 3;
>         jj[0] = 0; jj[1] = 0; jj[2] = 1;
>     }
>     ierr = MatCreateMPIAdj(MPI_COMM_WORLD,ncells,Nvertices,ii,jj,NULL,&mesh);CHKERRQ(ierr);
>     ierr = MatMeshToCellGraph(mesh,2,&dual);CHKERRQ(ierr);
>     ierr = MatView(dual,PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);
> 
>     ierr = MatPartitioningCreate(MPI_COMM_WORLD,&part);CHKERRQ(ierr);
>     ierr = MatPartitioningSetAdjacency(part,dual);CHKERRQ(ierr);
>     ierr = MatPartitioningSetFromOptions(part);CHKERRQ(ierr);
>     ierr = MatPartitioningApply(part,&is);CHKERRQ(ierr);
>     ierr = ISView(is,PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);
>     ierr = ISDestroy(&is);CHKERRQ(ierr);
>     ierr = MatPartitioningDestroy(&part);CHKERRQ(ierr);
> 
>     ierr = MatDestroy(&mesh);CHKERRQ(ierr);
>     ierr = MatDestroy(&dual);CHKERRQ(ierr);
>     ierr = PetscFinalize();
>     return 0;
> }
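
   For reference, a minimal sketch of the CSR layout MatCreateMPIAdj() expects on each rank: ii has ncells+1 entries with ii[0] = 0, ii[k+1]-ii[k] is the number of vertices listed for local cell k, and jj holds those ii[ncells] global vertex numbers back to back. The connectivity below is hypothetical (two triangles per rank over six global vertices, not the grid from page 71); it is only meant to show a layout whose dual graph has entries for MatView() to print.

#include <petscmat.h>

int main(int argc,char **args)
{
  Mat            mesh,dual;
  PetscErrorCode ierr;
  PetscInt       Nvertices = 6,ncells = 2,*ii,*jj;
  PetscMPIInt    size,rank;

  ierr = PetscInitialize(&argc,&args,(char*)0,NULL);if (ierr) return ierr;
  ierr = MPI_Comm_size(MPI_COMM_WORLD,&size);CHKERRQ(ierr);
  if (size != 2) SETERRQ(PETSC_COMM_WORLD,PETSC_ERR_SUP,"Run with exactly two processes");
  ierr = MPI_Comm_rank(MPI_COMM_WORLD,&rank);CHKERRQ(ierr);

  ierr = PetscMalloc1(ncells+1,&ii);CHKERRQ(ierr);      /* row pointers: ncells+1 entries */
  ierr = PetscMalloc1(6,&jj);CHKERRQ(ierr);             /* vertex list: ii[ncells] entries */
  ii[0] = 0; ii[1] = 3; ii[2] = 6;                      /* each local cell lists 3 vertices */
  if (!rank) { jj[0] = 0; jj[1] = 1; jj[2] = 2;   jj[3] = 1; jj[4] = 2; jj[5] = 3; }
  else       { jj[0] = 2; jj[1] = 3; jj[2] = 4;   jj[3] = 3; jj[4] = 4; jj[5] = 5; }

  /* ii and jj are taken over by the matrix and freed in MatDestroy() */
  ierr = MatCreateMPIAdj(MPI_COMM_WORLD,ncells,Nvertices,ii,jj,NULL,&mesh);CHKERRQ(ierr);
  /* cells sharing at least 2 vertices become neighbors in the dual (requires ParMETIS) */
  ierr = MatMeshToCellGraph(mesh,2,&dual);CHKERRQ(ierr);
  ierr = MatView(dual,PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);

  ierr = MatDestroy(&mesh);CHKERRQ(ierr);
  ierr = MatDestroy(&dual);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

   Run with mpiexec -n 2; with this connectivity every cell shares two vertices with some neighbor, so each row of the dual should show nonzeros in the MatView() output.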


