[petsc-users] PETSc Matrix Partitioning and SPMVM

Bibrak Qamar bibrakc at gmail.com
Mon Feb 20 05:14:27 CST 2012


Hello all,

The way PETSc MatCreateMPIAIJ distributes an N*N square matrix is row-wise.
Internally, every process stores its local rows as two submatrices: one
diagonal part and one off-diagonal part (more here -->
http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatCreateMPIAIJ.html
).
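
For concreteness, a minimal creation call might look like the sketch below
(the global size and the per-row nonzero estimates are purely illustrative;
d_nz and o_nz preallocate the diagonal and off-diagonal blocks separately, as
described on that man page):

    #include <petscmat.h>

    int main(int argc, char **argv)
    {
      Mat      A;
      PetscInt N = 1000;   /* illustrative global size */

      PetscInitialize(&argc, &argv, PETSC_NULL, PETSC_NULL);

      /* PETSc chooses the local row/column split; d_nz and o_nz are the
         estimated nonzeros per row in the diagonal and off-diagonal blocks */
      MatCreateMPIAIJ(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, N, N,
                      5, PETSC_NULL,   /* diagonal block preallocation     */
                      2, PETSC_NULL,   /* off-diagonal block preallocation */
                      &A);

      /* ... MatSetValues / MatAssemblyBegin / MatAssemblyEnd ... */

      MatDestroy(&A);
      PetscFinalize();
      return 0;
    }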


So when PETSc MatMult (y = A*x) is called, it basically tries to hide the
communication of the vector x by overlapping it with computation. As far as I
understand, every process first initiates a kind of non-blocking
broadcast/scatter of the needed entries of x, continues with the MVM of its
diagonal submatrix, then waits until the communication is done, and finally
does the MVM for the off-diagonal submatrix.
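
As I understand it, the pattern is roughly like this sketch (placeholder
names like A_diag, A_offdiag, x_local and x_ghost, not the real PETSc
internals; x_local is the locally owned part of x):

    #include <petscmat.h>

    /* Rough user-level sketch of the overlapped MVM */
    static void overlapped_mvm(Mat A_diag, Mat A_offdiag, VecScatter scatter,
                               Vec x, Vec x_local, Vec x_ghost, Vec y_local)
    {
      /* start moving the off-process entries of x into the ghost vector */
      VecScatterBegin(scatter, x, x_ghost, INSERT_VALUES, SCATTER_FORWARD);
      /* overlap: the diagonal block needs only the locally owned part of x */
      MatMult(A_diag, x_local, y_local);
      /* wait for the ghost entries to arrive */
      VecScatterEnd(scatter, x, x_ghost, INSERT_VALUES, SCATTER_FORWARD);
      /* y_local += A_offdiag * x_ghost */
      MatMultAdd(A_offdiag, x_ghost, y_local, y_local);
    }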

My question is this (since I am new): what was the historical reason that
PETSc opted for this diagonal / off-diagonal storage?

And what if I want to change the overlapping strategy of the MVM by, let's
say, introducing a ring-based communication of the vector x? Then I would
have to partition the local matrix not into two submatrices but into P
submatrices (where P = number of processes); a rough sketch of what I mean is
below. Does PETSc provide this facility, or does one have to implement a
different local matrix storage scheme from scratch?
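
By ring-based I mean something like the following plain MPI sketch (the names
and the dense block storage are only for illustration, not PETSc API): each
process splits its n local rows into P column blocks A_blk[0..P-1] and, at
each step, multiplies with whichever piece of x it currently holds while the
pieces circulate around the ring.

    #include <mpi.h>

    /* Illustrative only: y_local += A_blk * x_piece for one dense n x n block;
       in practice each block would be stored in a sparse format. */
    static void block_multiply_add(const double *A_blk, const double *x_piece,
                                   double *y_local, int n)
    {
      for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
          y_local[i] += A_blk[i*n + j] * x_piece[j];
    }

    /* Ring-based SpMV sketch: blocking shifts for simplicity; the shift could
       be made non-blocking (MPI_Isend/MPI_Irecv) to overlap with the multiply. */
    void ring_spmv(double **A_blk, double *x_piece, double *y_local, int n,
                   MPI_Comm comm)
    {
      int rank, P;
      MPI_Comm_rank(comm, &rank);
      MPI_Comm_size(comm, &P);
      int left  = (rank - 1 + P) % P;
      int right = (rank + 1) % P;
      int owner = rank;                 /* whose piece of x we currently hold */

      for (int i = 0; i < n; i++) y_local[i] = 0.0;

      for (int step = 0; step < P; step++) {
        block_multiply_add(A_blk[owner], x_piece, y_local, n);
        /* pass our current piece of x to the right, receive from the left;
           after P shifts each piece is back at its original owner */
        MPI_Sendrecv_replace(x_piece, n, MPI_DOUBLE, right, 0, left, 0,
                             comm, MPI_STATUS_IGNORE);
        owner = (owner - 1 + P) % P;    /* we now hold the left neighbor's piece */
      }
    }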


Thanks
Bibrak