[petsc-users] Unequal sparse matrix row distribution for MPI MatMult

Matthew Knepley knepley at gmail.com
Sun Mar 29 22:02:49 CDT 2015


On Sun, Mar 29, 2015 at 9:56 PM, Steena M <stm8086 at yahoo.com> wrote:

> Hi Barry,
>
> I am trying to partition a 20 row and 20 col sparse matrix between two
> procs such that proc 0 has 15 rows and 20 cols and proc 1 has 5 rows and 20
> cols. The code snippet:
>
>
>
>         CHKERRQ(MatCreate(PETSC_COMM_WORLD,&A)); // at runtime: -matload_block_size 1
>
>         if (rank == 0)
>         {
>                 CHKERRQ( MatSetSizes(A, 15, 20, 20, 20) ); // rank 0 gets 75% of the rows
>                 CHKERRQ( MatSetType(A, MATMPIBAIJ) );
>                 CHKERRQ( MatLoad(A,fd) );
>         }
>         else
>         {
>                 CHKERRQ( MatSetSizes(A, 5, 20, 20, 20) ); // rank 1 gets 25% of the rows
>                 CHKERRQ( MatSetType(A, MATMPIBAIJ) );
>                 CHKERRQ( MatLoad(A,fd) );
>         }
>
> This throws the following error (probably from psplit.c):
> [1]PETSC ERROR: --------------------- Error Message ------------------------------------
> [1]PETSC ERROR: Nonconforming object sizes!
> [1]PETSC ERROR: Sum of local lengths 40 does not equal global length 20, my local length 20
>   likely a call to VecSetSizes() or MatSetSizes() is wrong.
> See http://www.mcs.anl.gov/petsc/documentation/faq.html#split!
>
> This error printout doesn't quite make sense to me. I'm trying to specify
> a total matrix size of 20x20... I haven't yet figured out where the '40'
> comes from in the error message.
>
> Any thoughts on what might be going wrong?
>

It's the column specification. Just use PETSC_DETERMINE for the local columns, since all our
sparse matrix formats are row divisions anyway. Each rank is currently passing a local column
length of 20, and PETSc checks that the sum of the local lengths (20 + 20 = 40) equals the
global number of columns (20); that is where the 40 in the error message comes from.
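
For concreteness, here is a minimal sketch of that fix, assuming a block size of 1 (matching
-matload_block_size 1), exactly two MPI ranks, and a placeholder binary file name; only the
local row count differs per rank, while the local column count is left to PETSc:

#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat         A;
  PetscViewer fd;
  PetscMPIInt rank;

  CHKERRQ(PetscInitialize(&argc, &argv, NULL, NULL));
  MPI_Comm_rank(PETSC_COMM_WORLD, &rank);

  /* "matrix.dat" is a placeholder for the binary file holding the 20x20 matrix */
  CHKERRQ(PetscViewerBinaryOpen(PETSC_COMM_WORLD, "matrix.dat", FILE_MODE_READ, &fd));

  CHKERRQ(MatCreate(PETSC_COMM_WORLD, &A));
  CHKERRQ(MatSetType(A, MATMPIBAIJ));
  /* Set only the local ROW count (15 on rank 0, 5 on rank 1). The local column
     count describes the layout of the vector the matrix multiplies, so let
     PETSc split it with PETSC_DETERMINE instead of passing 20 on every rank. */
  CHKERRQ(MatSetSizes(A, rank == 0 ? 15 : 5, PETSC_DETERMINE, 20, 20));
  CHKERRQ(MatLoad(A, fd));

  CHKERRQ(MatDestroy(&A));
  CHKERRQ(PetscViewerDestroy(&fd));
  CHKERRQ(PetscFinalize());
  return 0;
}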

  Thanks,

     Matt


> Thanks in advance,
> Steena
>
>
>
> --------------------------------------------
> On Sun, 3/22/15, Barry Smith <bsmith at mcs.anl.gov> wrote:
>
>  Subject: Re: [petsc-users] Unequal sparse matrix row distribution for MPI
> MatMult
>  To: "Steena M" <stm8086 at yahoo.com>
>  Cc: petsc-users at mcs.anl.gov
>  Date: Sunday, March 22, 2015, 3:58 PM
>
>
>
>  Steena,
>
>     I am a little unsure of your question.
>
>     1) You can create a MPIBAIJ matrix with any distribution of block rows per
>  process you want; just set the local row size for each process to be what you
>  like. Use MatCreateVecs() to get correspondingly laid out vectors (see the
>  sketch below).
>
>     or 2) If you have a MPIBAIJ matrix with "equal" row layout and you want a
>  new one with an uneven row layout, you can simply use MatGetSubMatrix() to
>  create that new matrix.
>
>    Barry
>
>  Unless you have another reason to keep an equal number of rows per process, I
>  would just generate the matrix with the layout you want.
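
A minimal sketch of option (1) above, under the same assumptions as the rest of the thread
(a 20x20 matrix, block size 1, exactly two ranks); the preallocation counts and matrix
entries are placeholders:

#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat         A;
  Vec         x, y;
  PetscMPIInt rank;
  PetscInt    mloc;

  CHKERRQ(PetscInitialize(&argc, &argv, NULL, NULL));
  MPI_Comm_rank(PETSC_COMM_WORLD, &rank);

  /* Uneven block-row split: rank 0 owns 15 rows, rank 1 owns 5 (block size 1) */
  mloc = (rank == 0) ? 15 : 5;

  CHKERRQ(MatCreate(PETSC_COMM_WORLD, &A));
  CHKERRQ(MatSetType(A, MATMPIBAIJ));
  CHKERRQ(MatSetSizes(A, mloc, PETSC_DETERMINE, 20, 20));
  /* Placeholder preallocation: up to 5 nonzeros per row in the diagonal block
     and 5 in the off-diagonal block; adjust to the actual sparsity pattern */
  CHKERRQ(MatMPIBAIJSetPreallocation(A, 1, 5, NULL, 5, NULL));
  /* ... MatSetValues() calls for the locally owned rows would go here ... */
  CHKERRQ(MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY));
  CHKERRQ(MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY));

  /* Vectors laid out to match the matrix's column (x) and row (y) distribution */
  CHKERRQ(MatCreateVecs(A, &x, &y));
  CHKERRQ(VecSet(x, 1.0));
  CHKERRQ(MatMult(A, x, y));

  CHKERRQ(VecDestroy(&x));
  CHKERRQ(VecDestroy(&y));
  CHKERRQ(MatDestroy(&A));
  CHKERRQ(PetscFinalize());
  return 0;
}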
>
>
>  > On Mar 22, 2015, at 5:50 PM, Steena M <stm8086 at yahoo.com> wrote:
>  >
>  > Hello,
>  >
>  > I need to distribute a sparse matrix such that each proc owns an unequal
>  > number of blocked rows before I proceed with MPI MatMult. My initial
>  > thoughts on doing this:
>  >
>  > 1) Use MatGetSubMatrices() on the test MATMPIBAIJ matrix to produce a new
>  > matrix where each proc has an unequal number of rows.
>  >
>  > 2) Provide a scatter context for vector X (for MatMult) using IS iscol from
>  > MatGetSubMatrices() while creating the vector X.
>  >
>  > 3) Call MatMult()
>  >
>  > Will MatMult_MPIBAIJ continue to scatter this matrix and vector such that
>  > each proc will own an equal number of matrix rows and corresponding diagonal
>  > vector elements? Should I write my own MPIMatMult function to retain my
>  > redistribution of the matrix and vector?
>  >
>  > Thanks in advance,
>  > Steena
>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener