[petsc-users] Solve of subdomains without connections

Barry Smith bsmith at mcs.anl.gov
Mon Aug 25 21:36:13 CDT 2014


On Aug 25, 2014, at 9:33 PM, Qin Lu <lu_qin_2000 at yahoo.com> wrote:

> What if I use -pc_type asm? Will the communication be avoided in this special case?

   It will do one reduction at the beginning of building the preconditioner, when it determines that there is no overlap between the subdomains. Otherwise it has no more communication than bjacobi. But the resulting solver will be identical to bjacobi, so why bother with ASM?

   Barry
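
   For reference, a minimal sketch of the equivalent programmatic setup for the options quoted below, assuming A, b, and x are an already assembled Mat and Vecs (error checking omitted):

       /* A, b, x: an already assembled Mat and Vecs (assumed to exist) */
       KSP ksp;
       PC  pc;

       KSPCreate(PETSC_COMM_WORLD, &ksp);
       KSPSetOperators(ksp, A, A);
       KSPSetType(ksp, KSPPREONLY);   /* no outer Krylov method, so no global reductions */
       KSPGetPC(ksp, &pc);
       PCSetType(pc, PCBJACOBI);      /* one uncoupled block per process */
       KSPSetFromOptions(ksp);        /* picks up -sub_ksp_type gmres -sub_pc_type ilu, etc. */
       KSPSolve(ksp, b, x);
       KSPDestroy(&ksp);

   The per-process Krylov method and preconditioner can also be set in code through PCBJacobiGetSubKSP() after KSPSetUp(), instead of through the -sub_* options.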

> 
> Thanks,
> Qin
> 
> From: Barry Smith <bsmith at mcs.anl.gov>
> To: Qin Lu <lu_qin_2000 at yahoo.com> 
> Cc: Petsc-users <petsc-users at mcs.anl.gov> 
> Sent: Monday, August 25, 2014 8:19 PM
> Subject: Re: [petsc-users] Solve of subdomains without connections
> 
> 
> On Aug 25, 2014, at 5:18 PM, Qin Lu <lu_qin_2000 at yahoo.com> wrote:
> 
> > Hello,
> >  
> > I am using the PETSc KSP solver to solve a problem on a physical domain. The domain is split into subdomains in such a way that there are no connections between them, but I still have to solve the whole domain as a single linear system. My questions are:
> >  
> > 1. Does PETSc detect that the matrix is block diagonal and solve it efficiently?
> > 2. In a parallel solve, each subdomain is assigned to a separate process. Does PETSc solve the system efficiently by avoiding all unnecessary parallel message passing, since there are no connections between the processes?
> 
>   If you use the block Jacobi preconditioner, there will be no communication during the matrix-vector product or the application of the preconditioner. However, the global reductions for the default Krylov method, GMRES, will still occur. To eliminate the global reductions, use for the solve
> 
>   -ksp_type preonly -pc_type bjacobi  -sub_ksp_type gmres (or whatever Krylov method you want on each process) -sub_pc_type ilu (or whatever preconditioner you want on each process).
> 
>   Now there will be no communication during the linear solve.
> 
>   Barry
> 
> 
> >  
> > Thanks,
> > Qin
> > 
> > 
> 
> 


