[petsc-users] about ksp

Matthew Knepley knepley at gmail.com
Mon Aug 29 16:05:28 CDT 2011


On Mon, Aug 29, 2011 at 8:51 PM, Likun Tan <likunt at andrew.cmu.edu> wrote:

> Instead of solving Ax=b with different right-hand sides sequentially, we
> could also form a sparse block-diagonal matrix A and a vector b composed of
> all the elements. Then we could set the values of each section of b
> concurrently and solve the enlarged system in parallel. Would this be an
> efficient approach?
>
> Also, I found that MatCreateMPIBDiag() is used to define a sparse
> block-diagonal matrix. In my case, every block has the same format and
> elements; how could I set the values efficiently?
>

There is no reason to introduce synchronization across these problems. Just
split the communicator, put a KSP in each subcommunicator, and solve.
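A minimal sketch of that approach, assuming a fairly recent PETSc API (the two-argument KSPSetOperators()) and an illustrative choice of four subcommunicators; the matrix assembly is elided:

```c
/* Sketch: split PETSC_COMM_WORLD into subcommunicators, one per
 * independent linear system, and give each its own KSP.  The solves
 * then proceed with no synchronization between subcommunicators. */
#include <petscksp.h>

int main(int argc, char **argv)
{
  PetscInitialize(&argc, &argv, NULL, NULL);

  PetscMPIInt rank;
  MPI_Comm_rank(PETSC_COMM_WORLD, &rank);

  /* Assumption for illustration: four independent solves; "color"
     assigns each rank to one of them. */
  PetscMPIInt color = rank % 4;
  MPI_Comm    subcomm;
  MPI_Comm_split(PETSC_COMM_WORLD, color, rank, &subcomm);

  /* Create the matrix and solver on the SUBcommunicator, not on
     PETSC_COMM_WORLD, so each group solves its system independently. */
  Mat A;
  KSP ksp;
  MatCreate(subcomm, &A);
  /* ... MatSetSizes / MatSetUp / MatSetValues / assembly ... */
  KSPCreate(subcomm, &ksp);
  KSPSetOperators(ksp, A, A);
  KSPSetFromOptions(ksp);
  /* KSPSolve(ksp, b, x);  -- runs concurrently across subcommunicators */

  KSPDestroy(&ksp);
  MatDestroy(&A);
  MPI_Comm_free(&subcomm);
  PetscFinalize();
  return 0;
}
```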

    Matt


> Thanks,
> Likun
>
>
> On Mon, August 29, 2011 11:14 am, Jed Brown wrote:
> > On Mon, Aug 29, 2011 at 10:08, Matthew Knepley <knepley at gmail.com>
> wrote:
> >
> >
> >> As I said, no one knows
> >> how to do this for Krylov methods (and everyone has tried).
> >>
> >
> > There are methods, and even simply running multiple independent Krylov
> > solves concurrently would be good for memory traffic (the matrix entries
> > get reused for multiple vectors). The problem with block Krylov methods
> > (which try to share information between multiple simultaneous solves) is
> > loss of orthogonality between the subspaces generated by each vector.
> > It's not so much that there are no ways to detect and account for this,
> > but robustness is still a problem, making efficient software for it is
> > trickier, and AFAIK it has not been done.
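Jed's memory-traffic point can be illustrated outside PETSc with a dense direct solve (a hypothetical NumPy stand-in, not the Krylov case discussed above): solving A x_i = b_i for several right-hand sides reuses the same matrix data, so the factorization of A is shared across all of them.

```python
import numpy as np

rng = np.random.default_rng(0)
n, nrhs = 50, 8
# Diagonally dominant matrix, so the system is well conditioned.
A = rng.standard_normal((n, n)) + n * np.eye(n)
B = rng.standard_normal((n, nrhs))   # each column is one right-hand side b_i

# One call: A is factored once and that factorization is applied
# to all nrhs columns of B.
X = np.linalg.solve(A, B)

# Same result as nrhs independent solves, column by column, which
# would refactor (re-traverse) A every time.
for i in range(nrhs):
    xi = np.linalg.solve(A, B[:, i])
    assert np.allclose(xi, X[:, i])
```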


-- 
What most experimenters take for granted before they begin their experiments
is infinitely more interesting than any results to which their experiments
lead.
-- Norbert Wiener

