[petsc-users] Equivalent of all_reduce for sparse matrices

Matthew Knepley knepley at gmail.com
Thu May 8 14:13:18 CDT 2014


On Thu, May 8, 2014 at 2:06 PM, marco restelli <mrestelli at gmail.com> wrote:

> 2014-05-08 18:29 GMT+0200, Matthew Knepley <knepley at gmail.com>:
> > On Thu, May 8, 2014 at 11:25 AM, marco restelli <mrestelli at gmail.com>
> > wrote:
> >
> >> Hi,
> >>    I have a Cartesian communicator and some matrices distributed along
> >> the "x" direction. I would like to compute an all_reduce operation for
> >> these matrices in the y direction, and I wonder whether there is a
> >> PETSc function for this.
> >>
> >>
> >> More precisely:
> >>
> >> a matrix A is distributed among processors 0, 1, 2
> >> another matrix A is distributed among processors 3, 4, 5
> >> another matrix A is distributed among processors 6, 7, 8
> >> ...
> >>
> >> The x direction is 0,1,2, while the y direction is 0,3,6,...
> >>
> >> I would like to compute a matrix  B = "sum of the matrices A"  and a
> >> copy of B should be distributed among processors 0,1,2, another copy
> >> among 3,4,5 and so on.
> >>
> >> One way of doing this is getting the matrix coefficients, broadcasting
> >> them along the y direction and summing them into the matrix B; but maybe
> >> there is already a PETSc function that does this.
> >>
> >
> > There is nothing like this in PETSc. There are many tools for this using
> > dense
> > matrices in Elemental, but I have not seen anything for sparse matrices.
> >
> >    Matt
> >
>
> OK, thank you.
>
> Now, to do it myself, is MatGetRow the best way to get all the local
> nonzero entries of a matrix?


I think MatGetSubMatrices() is probably better.
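
If you do assemble it yourself with the broadcast-and-sum approach you
describe, something like the following untested sketch might work. It
assumes every A has the same global size, row distribution, and nonzero
pattern, that B was created with the same layout, and that ycomm is a
sub-communicator grouping the processes that hold the same row block
(e.g. 0,3,6):

#include <petscmat.h>

/* Sketch only: B = sum of the A's along the y direction.
   Assumes every A has the same layout and nonzero pattern,
   and that B was created with the same layout as A. */
PetscErrorCode SumAlongY(Mat A, Mat B, MPI_Comm ycomm)
{
  PetscErrorCode     ierr;
  PetscInt           rstart, rend, i, ncols, maxcols = 0;
  const PetscInt    *cols;
  const PetscScalar *vals;
  PetscScalar       *rowsum;

  ierr = MatGetOwnershipRange(A, &rstart, &rend);CHKERRQ(ierr);
  /* size a buffer for the longest local row */
  for (i = rstart; i < rend; i++) {
    ierr = MatGetRow(A, i, &ncols, NULL, NULL);CHKERRQ(ierr);
    if (ncols > maxcols) maxcols = ncols;
    ierr = MatRestoreRow(A, i, &ncols, NULL, NULL);CHKERRQ(ierr);
  }
  ierr = PetscMalloc(maxcols*sizeof(PetscScalar), &rowsum);CHKERRQ(ierr);
  for (i = rstart; i < rend; i++) {
    ierr = MatGetRow(A, i, &ncols, &cols, &vals);CHKERRQ(ierr);
    /* matching patterns => same ncols on every process in ycomm */
    ierr = MPI_Allreduce((void*)vals, rowsum, (PetscMPIInt)ncols,
                         MPIU_SCALAR, MPIU_SUM, ycomm);CHKERRQ(ierr);
    ierr = MatSetValues(B, 1, &i, ncols, cols, rowsum, INSERT_VALUES);CHKERRQ(ierr);
    ierr = MatRestoreRow(A, i, &ncols, &cols, &vals);CHKERRQ(ierr);
  }
  ierr = PetscFree(rowsum);CHKERRQ(ierr);
  ierr = MatAssemblyBegin(B, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(B, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  return 0;
}

Since you already have a Cartesian communicator, ycomm would come from
MPI_Cart_sub, e.g.

  int remain_dims[2] = {0, 1};  /* drop the x dimension, keep y */
  MPI_Cart_sub(cart_comm, remain_dims, &ycomm);

assuming dimension 0 is x and dimension 1 is y.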

   Matt


>
> Marco
>



-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener