[petsc-users] PETSc set off-diagonal matrix pre-allocation

Andrew Ho andrewh0 at uw.edu
Wed Feb 15 09:52:26 CST 2017


I don't understand how the matrices are partitioned then. The online
documentation all shows each MPI rank owning entire rows, with no
partitioning along columns.

I've attached an example where I try to partition a 6x7 matrix across 4 ranks:

rank 0 should own a 3x4 section
rank 1 should own a 3x3 section
rank 2 should own a 3x4 section
rank 3 should own a 3x3 section

However, when I run the example and print out the ownership range and
ownership column range, I get:

rank 0 owns rows/cols: [0,3), [0, 4)
rank 1 owns rows/cols: [3,6), [4, 7)
rank 2 owns rows/cols: [6,9), [7, 11)
rank 3 owns rows/cols: [9,12), [11, 14)

This makes no sense, since these ranges extend beyond the actual size of
the matrix.
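
For reference, ownership ranges like the ones above can be queried with
MatGetOwnershipRange and MatGetOwnershipRangeColumn. Below is a minimal
sketch of that query (the full program is in the attached ex3.c, which may
differ in details; A, rank, and err are assumed to be set up as in the
code further down the thread):

    PetscInt rstart, rend, cstart, cend;
    err = MatGetOwnershipRange(A, &rstart, &rend);
    CHKERRQ(err);
    err = MatGetOwnershipRangeColumn(A, &cstart, &cend);
    CHKERRQ(err);
    // %D is PETSc's format specifier for PetscInt
    err = PetscSynchronizedPrintf(MPI_COMM_WORLD,
            "rank %d owns rows/cols: [%D,%D), [%D, %D)\n",
            rank, rstart, rend, cstart, cend);
    CHKERRQ(err);
    err = PetscSynchronizedFlush(MPI_COMM_WORLD, PETSC_STDOUT);
    CHKERRQ(err);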

On Tue, Feb 14, 2017 at 6:05 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:

>
>    1) on a single process all the columns are in the diagonal block and
> none in the off-diagonal block (note that the column partitioning
> corresponds to a partitioning of the vector x in the product A*x).
>
>    2) on two processes you guessed wrong about how many columns PETSc
> would put on the first process. You guessed it would put the first three
> on the first process and the last three on the second process.
>
>     a) it cannot do that; every column has to be owned by some process
> (in the vector x above), so it cannot be 3 and 3, it has to be 4 and 3
> or 3 and 4.
>
>     b) PETSc puts the "extra" columns on the earlier processes, not the
> later processes.
>
>   For rectangular matrices you really cannot get away with using
> "PETSC_DECIDE" for local columns when using preallocation, since it may
> decide something different than what you assume.
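>
>    As a minimal sketch (assuming the 6x7 two-process example below, and
> giving the local column count explicitly instead of PETSC_DECIDE), the
> sizes and preallocation might look like:
>
>     /* rank 0 gets the extra column: columns [0,4) on rank 0, [4,7) on rank 1 */
>     PetscInt local_cols = (rank == 0) ? 4 : 3;
>     err = MatSetSizes(A, 3, local_cols, 6, 7);
>     CHKERRQ(err);
>     if (rank == 0)
>     {
>       /* per row: 1 nonzero (column 3) in the diagonal block, 3 (columns 4-6) off-diagonal */
>       PetscInt d_nnz[] = {1, 1, 1};
>       PetscInt o_nnz[] = {3, 3, 3};
>       err = MatMPIAIJSetPreallocation(A, 0, d_nnz, 0, o_nnz);
>       CHKERRQ(err);
>     }
>     else
>     {
>       /* per row: 3 nonzeros (columns 4-6) in the diagonal block, 1 (column 3) off-diagonal */
>       PetscInt d_nnz[] = {3, 3, 3};
>       PetscInt o_nnz[] = {1, 1, 1};
>       err = MatMPIAIJSetPreallocation(A, 0, d_nnz, 0, o_nnz);
>       CHKERRQ(err);
>     }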
>
>    I've attached the parallel code that behaves as expected.
>
>    Barry
>
> > On Feb 14, 2017, at 1:06 PM, Andrew Ho <andrewh0 at uw.edu> wrote:
> >
> > The problem isn't only with 1 process; it seems to happen with all
> > non-square matrices?
> >
> > For example, here I have a two-process program which tries to set a
> > 6x7 matrix (each process owns 3 rows) to:
> >
> > 0 0 0 1 1 1 1
> > 0 0 0 1 1 1 1
> > 0 0 0 1 1 1 1
> > 0 0 0 1 1 1 1
> > 0 0 0 1 1 1 1
> > 0 0 0 1 1 1 1
> >
> > #include <petsc.h>
> > #include <mpi.h>
> >
> > int main(int argc, char** argv)
> > {
> >   PetscErrorCode err;
> >   err = PetscInitialize(&argc, &argv, NULL, "help");
> >   CHKERRQ(err);
> >
> >   int rank, size;
> >   MPI_Comm_rank(MPI_COMM_WORLD, &rank);
> >   MPI_Comm_size(MPI_COMM_WORLD, &size);
> >   if(size != 2)
> >   {
> >     printf("must run with 2 processes");
> >     MPI_Abort(MPI_COMM_WORLD, -1);
> >   }
> >
> >   // create a sparse AIJ matrix distributed across MPI
> >   Mat A;
> >   err = MatCreate(MPI_COMM_WORLD, &A);
> >   CHKERRQ(err);
> >   err = MatSetType(A, MATMPIAIJ);
> >   CHKERRQ(err);
> >   // setup pre-allocation for matrix space
> >   {
> >     err = MatSetSizes(A, 3, PETSC_DECIDE, 6, 7);
> >     CHKERRQ(err);
> >     if(rank == 0)
> >     {
> >       PetscInt d_nnz[] = {0, 0, 0};
> >       PetscInt o_nnz[] = {4, 4, 4};
> >       err = MatMPIAIJSetPreallocation(A, 0, d_nnz, 0, o_nnz);
> >       CHKERRQ(err);
> >     }
> >     else
> >     {
> >       PetscInt d_nnz[] = {3, 3, 3};
> >       PetscInt o_nnz[] = {1, 1, 1};
> >       err = MatMPIAIJSetPreallocation(A, 0, d_nnz, 0, o_nnz);
> >       CHKERRQ(err);
> >     }
> >   }
> >   err = MatSetUp(A);
> >   CHKERRQ(err);
> >
> >   // set values inside the matrix
> >   for (PetscInt row = 0; row < 3; ++row)
> >   {
> >     for (PetscInt col = 3; col < 7; ++col)
> >     {
> >       err = MatSetValue(A, 3 * rank + row, col, 1, INSERT_VALUES);
> >       CHKERRQ(err);
> >     }
> >   }
> >
> >   err = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
> >   CHKERRQ(err);
> >   err = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);
> >   CHKERRQ(err);
> >
> >   err = MatView(A, PETSC_VIEWER_STDOUT_WORLD);
> >   CHKERRQ(err);
> >
> >   // free memory
> >   err = MatDestroy(&A);
> >   CHKERRQ(err);
> >
> >   // cleanup any internal PETSc data at end of program
> >   err = PetscFinalize();
> >   CHKERRQ(err);
> > }
> >
> >
> > On Tue, Feb 14, 2017 at 5:26 AM, Matthew Knepley <knepley at gmail.com> wrote:
> > On Tue, Feb 14, 2017 at 6:41 AM, Andrew Ho <andrewh0 at uw.edu> wrote:
> > I have a 3x6 matrix, and I want to set it to (just as an example):
> >
> > 0 0 0 1 1 1
> > 0 0 0 1 1 1
> > 0 0 0 1 1 1
> >
> > From what I can tell, according to the documentation the MPIAIJ sparsity
> > of this matrix is:
> >
> > I believe this is an inconsistency in PETSc for rectangular matrices. We
> > divide matrices into diagonal and off-diagonal parts mainly to separate
> > communication from computation. Thus on 1 proc, we never divide them,
> > even if the matrix is rectangular. Therefore we misinterpret your
> > preallocation.
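> >
> > As a minimal sketch of what that means for this 3x6 example: on a single
> > process every column lands in the diagonal block, so the preallocation
> > that matches what actually gets checked would be
> >
> >   PetscInt d_nnz[] = {3, 3, 3}; /* all 3 nonzeros per row are in the diagonal block */
> >   PetscInt o_nnz[] = {0, 0, 0}; /* nothing is off-process on 1 proc */
> >   err = MatMPIAIJSetPreallocation(A, 0, d_nnz, 0, o_nnz);
> >   CHKERRQ(err);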
> >
> > Barry, do you think we want to try and change this?
> >
> >   Matt
> >
> > d_nnz = [0, 0, 0]
> > o_nnz = [3, 3, 3]
> >
> > However, when I do this, I get the following error:
> >
> > [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> > [0]PETSC ERROR: Argument out of range
> > [0]PETSC ERROR: New nonzero at (0,3) caused a malloc
> > Use MatSetOption(A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE) to turn off this check
> > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> > [0]PETSC ERROR: Petsc Release Version 3.7.5, unknown
> > [0]PETSC ERROR: ./ex3 on a arch-linux2-c-opt Tue Feb 14 04:23:56 2017
> > [0]PETSC ERROR: Configure options --with-debugging=0 --COPTFLAGS="-O3 -march=native" --CXXOPTFLAGS="-O3 -march=native" --FOPTFLAGS="-O3 -march=native"
> > [0]PETSC ERROR: #1 MatSetValues_MPIAIJ() line 582 in petsc/src/mat/impls/aij/mpi/mpiaij.c
> > [0]PETSC ERROR: #2 MatSetValues() line 1190 in petsc/src/mat/interface/matrix.c
> > [0]PETSC ERROR: #3 main() line 36 in ex3.c
> > [0]PETSC ERROR: No PETSc Option Table entries
> > [0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------
> >
> > Here's a working test code:
> >
> > #include <petsc.h>
> > #include <mpi.h>
> > int main(int argc, char** argv)
> > {
> >   PetscErrorCode err;
> >   err = PetscInitialize(&argc, &argv, NULL, "help");
> >   CHKERRQ(err);
> >   // create a sparse AIJ matrix distributed across MPI
> >   PetscInt global_width = 6;
> >   PetscInt global_height = 3;
> >   Mat A;
> >   err = MatCreate(MPI_COMM_WORLD, &A);
> >   CHKERRQ(err);
> >   err = MatSetType(A, MATMPIAIJ);
> >   CHKERRQ(err);
> >   // setup pre-allocation for matrix space
> >   {
> >     err = MatSetSizes(A, global_height, PETSC_DECIDE, global_height, global_width);
> >     CHKERRQ(err);
> >     PetscInt d_nnz[] = {0, 0, 0};
> >     PetscInt o_nnz[] = {3, 3, 3};
> >     err = MatMPIAIJSetPreallocation(A, 0, d_nnz, 0, o_nnz);
> >     CHKERRQ(err);
> >   }
> >   err = MatSetUp(A);
> >   CHKERRQ(err);
> >   // set values inside the matrix
> >   for (PetscInt row = 0; row < global_height; ++row)
> >   {
> >     for (PetscInt col = global_height; col < global_width; ++col)
> >     {
> >       err = MatSetValue(A, row, col, 1, INSERT_VALUES);
> >       CHKERRQ(err);
> >     }
> >   }
> >   err = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
> >   CHKERRQ(err);
> >   err = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);
> >   CHKERRQ(err);
> >   err = MatView(A, PETSC_VIEWER_STDOUT_WORLD);
> >   CHKERRQ(err);
> >   // free memory
> >   err = MatDestroy(&A);
> >   CHKERRQ(err);
> >   // cleanup any internal PETSc data at end of program
> >   err = PetscFinalize();
> >   CHKERRQ(err);
> > }
> >
> > Am I misunderstanding what the d_nnz and o_nnz parameters are supposed
> > to mean?
> > --
> > Andrew Ho
> >
> >
> >
> > --
> > What most experimenters take for granted before they begin their
> > experiments is infinitely more interesting than any results to which
> > their experiments lead.
> > -- Norbert Wiener
> >
> >
> >
> > --
> > Andrew Ho
>
>
>


-- 
Andrew Ho
-------------- next part --------------
A non-text attachment was scrubbed...
Name: ex3.c
Type: text/x-csrc
Size: 2094 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20170215/55447211/attachment.c>

