[petsc-users] Solve of subdomains without connections

Qin Lu lu_qin_2000 at yahoo.com
Mon Aug 25 21:33:03 CDT 2014


What if I use -pc_type asm? Will the communication be avoided in this special case?

Thanks,
Qin


________________________________
 From: Barry Smith <bsmith at mcs.anl.gov>
To: Qin Lu <lu_qin_2000 at yahoo.com> 
Cc: Petsc-users <petsc-users at mcs.anl.gov> 
Sent: Monday, August 25, 2014 8:19 PM
Subject: Re: [petsc-users] Solve of subdomains without connections
 


On Aug 25, 2014, at 5:18 PM, Qin Lu <lu_qin_2000 at yahoo.com> wrote:

> Hello,
>  
> I am using the PETSc KSP solver to solve a problem on a physical domain. The domain is split into subdomains in such a way that there are no connections between them, but I still have to solve the whole domain as a single linear system. My questions are:
>  
> 1. Does PETSc detect that the matrix is a block diagonal matrix and solve it efficiently?
> 2. In the parallel solve, each subdomain is assigned to a separate process. Does PETSc solve the system efficiently by avoiding unnecessary parallel message passing, since there are no connections between processes?

   If you use the block Jacobi preconditioner, there will be no communication during the matrix-vector product or the preconditioner application. However, the global reductions required by the default Krylov method, GMRES, will still occur. To eliminate the global reductions, use for the solve

   -ksp_type preonly -pc_type bjacobi -sub_ksp_type gmres (or whatever Krylov method you want on each process) -sub_pc_type ilu (or whatever preconditioner you want on each process).

   Now there will be no communication during the linear solve.

  Barry
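
[Editor's note: for readers browsing the archive, below is a minimal C sketch, not part of the original message, of how the same configuration could be set in application code instead of on the command line. The helper name ConfigureBlockJacobiSolve and the assumption that the operators have already been attached to the KSP are illustrative only.]

#include <petscksp.h>

/* Hypothetical helper: configure ksp so that the entire solve stays local to
   each process, equivalent to
   -ksp_type preonly -pc_type bjacobi -sub_ksp_type gmres -sub_pc_type ilu.
   Assumes KSPSetOperators() has already been called on ksp. */
PetscErrorCode ConfigureBlockJacobiSolve(KSP ksp)
{
  PC             pc;
  KSP            *subksp;
  PetscInt       nlocal, first, i;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = KSPSetType(ksp, KSPPREONLY);CHKERRQ(ierr); /* no outer Krylov iterations, so no global reductions */
  ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
  ierr = PCSetType(pc, PCBJACOBI);CHKERRQ(ierr);    /* one block per process */

  /* The sub-solvers exist only after setup */
  ierr = KSPSetUp(ksp);CHKERRQ(ierr);
  ierr = PCBJacobiGetSubKSP(pc, &nlocal, &first, &subksp);CHKERRQ(ierr);
  for (i = 0; i < nlocal; i++) {
    PC subpc;
    ierr = KSPSetType(subksp[i], KSPGMRES);CHKERRQ(ierr); /* per-process Krylov method */
    ierr = KSPGetPC(subksp[i], &subpc);CHKERRQ(ierr);
    ierr = PCSetType(subpc, PCILU);CHKERRQ(ierr);          /* per-process preconditioner */
  }
  PetscFunctionReturn(0);
}

Run with one MPI rank per subdomain and, as Barry describes, the linear solve then involves no inter-process communication.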


>  
> Thanks,
> Qin
> 
> 

