[petsc-users] PETSc Matrix Partitioning and SPMVM

Matthew Knepley knepley at gmail.com
Mon Feb 20 09:01:10 CST 2012


On Mon, Feb 20, 2012 at 5:14 AM, Bibrak Qamar <bibrakc at gmail.com> wrote:

> Hello all,
>
> The way PETSc MatCreateMPIAIJ distributes an N*N square matrix is
> row-wise, and internally every processor stores its local matrix as two
> submatrices: a diagonal part and an off-diagonal part (more here -->
> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatCreateMPIAIJ.html
> ).
>
>
> So when PETSc MatMult (v = A.x) is called, it basically tries to hide the
> communication of the vector (x) by overlapping it with computation. As far
> as I understand, every process first initiates a kind of non-blocking
> broadcast of the vector (x), continues with the MVM of the diagonal
> submatrix, then waits until the communication is done, and finally does
> the MVM for the off-diagonal submatrix.
>
> My question is this (since I am new): what was the historical reason PETSc
> opted for this diagonal and off-diagonal storage?
>

Overlapping communication and computation.


> And what if I want to change the overlapping strategy of the MVM by, let's
> say, introducing a ring-based communication of the vector (x)? Then I have
> to partition the local matrix not into two submatrices but into P
> submatrices (here P = number of processors). Does PETSc provide this
> facility, or does one have to start from scratch to implement different
> techniques for storing the local matrix?
>

1) It's unclear what that would accomplish.

2) You can get that style of communication by altering the VecScatter that
sends the data, rather than the matrix.

3) You can always implement another matrix type. We have a lot (see
src/mat/impls).

  Thanks,

     Matt


> Thanks
> Bibrak
>



-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

