[petsc-users] multigrid questions

Barry Smith bsmith at mcs.anl.gov
Wed Feb 15 13:13:29 CST 2017


> On Feb 15, 2017, at 12:19 PM, Matthew Knepley <knepley at gmail.com> wrote:
> 
> On Wed, Feb 15, 2017 at 10:42 AM, Matt Landreman <matt.landreman at gmail.com> wrote:
> Hi, Petsc developers,
> 
> A few basic questions about geometric and algebraic multigrid in petsc:
> 
> Based on the output of -ksp_view, I see that for both -pc_type mg and -pc_type gamg, the smoothing on each multigrid level is implemented using a KSP with default type chebyshev, associated with a PC of default type sor. I also see in section 4.4.5 of the PETSc manual that switching to -mg_levels_ksp_type richardson is sometimes recommended, and I see in the KSP examples makefile that ex28.c uses -mg_levels_ksp_type gmres.
> 
> 1. Does “textbook” Gauss-Seidel smoothing correspond in petsc to  -mg_levels_ksp_type richardson?
> 
> Yes.
>  
> 2. It says in http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/KSP/KSPCHEBYSHEV.html that PETSc’s chebyshev iteration only works for symmetric matrices. So to confirm, for any problem with non-symmetric matrices such as convection-diffusion, setting -mg_levels_ksp_type to some non-default value like richardson or gmres is mandatory then?  (If so, it would be helpful if this were stated in the manual.)
> 
> Yes, this is true.
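
[Archive note: a hypothetical command line putting this advice into practice — the executable name and DMDA grid options are placeholders; the -mg_levels_* options are the real ones discussed above:

```shell
# Non-symmetric problem (e.g. convection-diffusion) with GAMG:
# override the default chebyshev smoother, which assumes symmetry.
mpiexec -n 4 ./ex -da_grid_x 64 -da_grid_y 64 \
    -pc_type gamg \
    -mg_levels_ksp_type gmres \
    -mg_levels_pc_type sor \
    -ksp_view
```

Checking the -ksp_view output confirms which smoother actually ran on each level.]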
>  
> 3. For smoothing with geometric multigrid, is there a way in petsc to do red-black Gauss Seidel? Line relaxation?
> 
> I don't know if Barry has something special, but you could easily do it using PCFIELDSPLIT with the default splitting and
> Jacobi on each block. Line relaxation is similar, with different blocking and Block-Jacobi.

   For an optimal implementation of line relaxation (assuming each process only handles sequential lines) you would write a PCSHELL that loops over the "lines" of your geometry and applies inside it a PC created from the tridiagonal matrix for each line. Line relaxation in parallel opens up all kinds of questions; it is possible, but I suspect it would have poor parallel efficiency.


> 
>   Matt
>  
> Thanks,
> Matt Landreman
> 
> 
> 
> -- 
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener


