[petsc-users] KSP: domain decomposition and distribution

Dave May dave.mayhem23 at gmail.com
Wed Jan 8 06:46:25 CST 2014


You asked how the problem was split between processes. In your case, this
is defined by the matrix.

The default solver in PETSc is GMRES preconditioned with block Jacobi,
with ILU(0) applied on each block. The "block" I refer to is the piece of
the matrix locally owned by each process, and it is thus defined by the
matrix layout/partitioning.
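
For concreteness, here is a minimal sketch of how you could confirm this
on your own system, assuming PETSc 3.5 or later and an already assembled
Mat A and Vecs b, x (the helper name solve_and_view is mine, not part of
PETSc):

  #include <petscksp.h>

  PetscErrorCode solve_and_view(Mat A, Vec b, Vec x)
  {
    KSP            ksp;
    PetscErrorCode ierr;

    ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
    ierr = KSPSetOperators(ksp, A, A);CHKERRQ(ierr);
    /* With no command-line options given, this yields GMRES + block
       Jacobi + ILU(0), one block per process; options such as
       -ksp_type and -pc_type override the defaults. */
    ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
    ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);
    /* Print the solver configuration, including the per-process blocks */
    ierr = KSPView(ksp, PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);
    ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
    return 0;
  }

Run on, say, four MPI processes, the KSPView() output lists four block
Jacobi blocks, one per process (equivalently, pass -ksp_view on the
command line).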

On Wednesday, 8 January 2014, mary sweat wrote:

> I do not explicitly check the size, because I use PETSC_DECIDE; instead
> I specify the number of processes. What I really care about is how
> KSPSolve solves the system in parallel across multiple processes.
>
>
>   On Wednesday 8 January 2014 12:34, Dave May <dave.mayhem23 at gmail.com>
> wrote:
>  Please check out the manual page for MatSetSizes()
>
> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatSetSizes.html
>
> Basically you have two choices:
>
> 1/ Define the global size of the matrix and use PETSC_DECIDE for the local
> sizes.
> In this case, PETSc will choose the local row sizes so that each process
> owns approximately the same number of rows.
>
> 2/ Define the local sizes yourself and use PETSC_DETERMINE for the global
> size.
> Then you have full control over the parallel layout. (Both choices, and
> the ownership-range check, are sketched in code at the end of this
> thread.)
>
> The functions described on the following pages
>
>
> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatGetSize.html
>
> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatGetLocalSize.html
>
> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatGetOwnershipRanges.html
>
> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatGetOwnershipRangesColumn.html
>
> might also be useful for double-checking what the matrix decomposition
> looks like.
>
>
> Cheers,
>   Dave
>
>
>
> On 8 January 2014 12:26, mary sweat <mary.sweat78 at yahoo.it> wrote:
>
> My goal is the following: I have a huge linear system with a huge sparse
> matrix, nothing to do with PDEs. How is the system split between
> processes? Does the suggested book contain the answer?
> Thanks again
>
>
>   On Tuesday 7 January 2014 17:34, Jed Brown <jedbrown at mcs.anl.gov>
> wrote:
>   mary sweat <mary.sweat78 at yahoo.it> writes:
>
>
> > Hi all, I need to know how KSP separates and distributes the domain
> > between processes, and how processes share and communicate intermediate
> > results. Is there any good documentation about this?
>
>
> The communication is in Mat and Vec functions.  You can see it
> summarized in -log_summary.  For the underlying theory, see Barry's
> book.
>
> http://www.mcs.anl.gov/~bsmith/ddbook.html
>
>
>
>
>
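
To make the quoted advice concrete: a minimal sketch of the two
MatSetSizes() choices and the ownership-range check, again assuming PETSc
3.5 or later; the helper names (create_auto, create_manual, print_layout)
are mine:

  #include <petscmat.h>

  /* Choice 1: give the global size N and let PETSc split the rows
     approximately evenly across the processes in comm. */
  PetscErrorCode create_auto(MPI_Comm comm, PetscInt N, Mat *A)
  {
    PetscErrorCode ierr;

    ierr = MatCreate(comm, A);CHKERRQ(ierr);
    ierr = MatSetSizes(*A, PETSC_DECIDE, PETSC_DECIDE, N, N);CHKERRQ(ierr);
    ierr = MatSetFromOptions(*A);CHKERRQ(ierr);
    ierr = MatSetUp(*A);CHKERRQ(ierr);
    return 0;
  }

  /* Choice 2: give the local size on each process and let PETSc sum
     the local sizes to obtain the global size. */
  PetscErrorCode create_manual(MPI_Comm comm, PetscInt nlocal, Mat *A)
  {
    PetscErrorCode ierr;

    ierr = MatCreate(comm, A);CHKERRQ(ierr);
    ierr = MatSetSizes(*A, nlocal, nlocal, PETSC_DETERMINE,
                       PETSC_DETERMINE);CHKERRQ(ierr);
    ierr = MatSetFromOptions(*A);CHKERRQ(ierr);
    ierr = MatSetUp(*A);CHKERRQ(ierr);
    return 0;
  }

  /* Either way, each process can report the contiguous row range it
     owns, which is exactly the block Jacobi block mentioned above. */
  PetscErrorCode print_layout(Mat A)
  {
    PetscErrorCode ierr;
    PetscInt       rstart, rend;
    MPI_Comm       comm;

    ierr = PetscObjectGetComm((PetscObject)A, &comm);CHKERRQ(ierr);
    ierr = MatGetOwnershipRange(A, &rstart, &rend);CHKERRQ(ierr);
    ierr = PetscSynchronizedPrintf(comm, "owned rows: [%ld, %ld)\n",
                                   (long)rstart, (long)rend);CHKERRQ(ierr);
    ierr = PetscSynchronizedFlush(comm, PETSC_STDOUT);CHKERRQ(ierr);
    return 0;
  }

Running the resulting program with -log_summary, as Jed suggests above,
then summarizes the time and messages spent in the Mat and Vec
communication routines.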