[petsc-users] KSP: domain decomposition and distribution

Jed Brown jed at jedbrown.org
Sun Jan 26 12:01:01 CST 2014

mary sweat <mary.sweat78 at yahoo.it> writes:

> To summarize: I have a parabolic differential equation; through a finite difference scheme I obtain a linear system of equations whose coefficient matrix is a Laplace matrix, i.e. the coefficient matrix is sparse, huge, and structured.
> I then solve this system with GMRES plus Jacobi preconditioning.
> I don't care about the number of processes or the size of the portions of the matrix assigned to the processes; of course the matrix is partitioned in blocks assigned to the processes.
> The problem is that I need to know, just theoretically, how the matrix is split between processes.

MatGetOwnershipRanges(), MatGetOwnershipRangesColumn()

> Moreover, how does this happen on GPUs?

Same, but arrays are mirrored to the GPU and updated lazily.

> Essentially, in which way is the domain split between processes?

If you use a DM (like many examples), the DM provides the decomposition.
Otherwise it follows the numbering you chose.  There are many ways to do
it; see the users manual.

> when do they communicate to synchronize/share/exchange partial
> results?

When algorithmically necessary.

> I need to know also how this all happens on a GPU.

