[petsc-users] Solve of subdomains without connections
Barry Smith
bsmith at mcs.anl.gov
Mon Aug 25 20:19:16 CDT 2014
On Aug 25, 2014, at 5:18 PM, Qin Lu <lu_qin_2000 at yahoo.com> wrote:
> Hello,
>
> I am using the PETSc ksp solver to solve a problem on a physical domain. The domain is split into subdomains in such a way that there are no connections between them, but I still have to solve the whole domain as a single linear system. My questions are:
>
> 1. Does PETSc detect that the matrix is a block diagonal matrix and solve it efficiently?
> 2. In a parallel solve, each subdomain is assigned to a separate process. Does PETSc solve the system efficiently by avoiding all the unnecessary parallel message passing, since there are no connections between processes?
If you use the block Jacobi preconditioner, then there will be no communication during either the matrix-vector product or the preconditioner application. However, the global reductions required by the default Krylov method, GMRES, will still occur. To eliminate the global reductions, use for the solve
-ksp_type preonly -pc_type bjacobi -sub_ksp_type gmres (or whatever Krylov method you want on each process) -sub_pc_type ilu (or whatever preconditioner you want on each process)
Now there will be no communication during the linear solve.
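As a concrete invocation, the options above might be combined on the command line like this (the executable name ./myapp and the process count of 4 are placeholders, not from the original message):

```shell
# Run one subdomain per MPI rank (4 ranks here as an example).
# -ksp_type preonly : the outer KSP only applies the preconditioner once,
#                     so no outer Krylov iterations and no global reductions
# -pc_type bjacobi  : block Jacobi, one block per process, no communication
# -sub_ksp_type / -sub_pc_type : the independent solver used inside each block
mpiexec -n 4 ./myapp \
  -ksp_type preonly \
  -pc_type bjacobi \
  -sub_ksp_type gmres \
  -sub_pc_type ilu
```

Since the subdomains are truly disconnected, each block solve is the exact solve of that subdomain's system, so -ksp_type preonly loses nothing compared to an outer Krylov method.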
Barry
>
> Thanks,
> Qin
>
>