[petsc-dev] Should -pc_type mg be a linear preconditioner by default?

Jed Brown jedbrown at mcs.anl.gov
Tue Oct 25 19:13:50 CDT 2011


On Tue, Oct 25, 2011 at 17:48, Mark F. Adams <mark.adams at columbia.edu> wrote:

> As Jed says, Cheb is great for SPD problems, and his estimator stuff is
> excellent and should be made the default for smoothers.  One problem is that
> the lower range should depend on the coarsening rate (i.e., 3D cubic elements
> should be lower than 2D quads, but the Cheb polynomial is at least smooth at
> that end).  I might use a lower factor than 0.1 (that's a 2D quad and at the
> low end of what anyone would do), but it's not a big deal.
>

For the unstructured problem we were playing with a couple weeks ago, there
was a nice broad region where the iteration count was essentially the same.
Playing with other problems, larger values seem to be better.

The following seems to be a pretty hard problem. I'm using grid sequencing
here because something is needed for globalization, but I only care about
the finer level linear solves.

mpiexec -n 4 ./ex50 -lidvelocity 1e3 -prandtl 1e4 -snes_grid_sequence 4 \
  -snes_monitor_short -ksp_monitor -pc_type mg \
  -mg_levels_ksp_type chebychev -mg_levels_ksp_max_it 3 \
  -mg_levels_ksp_chebychev_estimate_eigenvalues 0,0.5,0,1.1 \
  -mg_levels_pc_type sor
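
As a reading aid: the four values a,b,c,d transform the estimated extreme
eigenvalues [emin,emax] into the Chebyshev target interval
[a*emin + b*emax, c*emin + d*emax], so 0,0.5,0,1.1 targets
[0.5*emax, 1.1*emax]. Below is a minimal C sketch of the same smoother
setup, assuming the level KSP comes from PCMGGetSmoother() and that
KSPChebychevSetEstimateEigenvalues() is the routine behind this option:

#include <petscksp.h>

/* Sketch (not from the original mail): configure one multigrid level
   smoother programmatically, mirroring
     -mg_levels_ksp_type chebychev -mg_levels_ksp_max_it 3
     -mg_levels_ksp_chebychev_estimate_eigenvalues 0,0.5,0,1.1
     -mg_levels_pc_type sor */
static PetscErrorCode ConfigureLevelSmoother(PC mgpc,PetscInt level)
{
  KSP            smoother;
  PC             levelpc;
  PetscErrorCode ierr;

  PetscFunctionBegin;
  ierr = PCMGGetSmoother(mgpc,level,&smoother);CHKERRQ(ierr);
  ierr = KSPSetType(smoother,KSPCHEBYCHEV);CHKERRQ(ierr);
  /* at most 3 smoother applications per cycle */
  ierr = KSPSetTolerances(smoother,PETSC_DEFAULT,PETSC_DEFAULT,PETSC_DEFAULT,3);CHKERRQ(ierr);
  /* map the estimated spectrum to [0*emin+0.5*emax, 0*emin+1.1*emax] */
  ierr = KSPChebychevSetEstimateEigenvalues(smoother,0.0,0.5,0.0,1.1);CHKERRQ(ierr);
  ierr = KSPGetPC(smoother,&levelpc);CHKERRQ(ierr);
  ierr = PCSetType(levelpc,PCSOR);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}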

This problem diverges if you bring the lower Cheby estimate down to
0.1. This configuration is also beating anything I've tried that has inner
Krylov iterations (including ASM with direct subdomain solves).
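
For comparison, the inner-Krylov runs were along these lines (standard
options, shown only to indicate the configuration; the subdomain solver is
the direct solve mentioned above):

mpiexec -n 4 ./ex50 -lidvelocity 1e3 -prandtl 1e4 -snes_grid_sequence 4 \
  -ksp_type gmres -pc_type asm -sub_pc_type lu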

Ex48 is a funny 3D problem, but 0,0.1,0,1.1 seems to be a good set of
parameters there as well. (I was considering a reasonably anisotropic
problem where I did not semi-coarsen, preconditioning with ICC to bind the
tightly coupled direction together; I couldn't get PBJacobi or SOR
smoothing to work here.) The following can be solved much faster using
(aggressive) semi-coarsening, but this configuration is not bad for
isotropic coarsening.

mpiexec -n 4 ./ex48 -M 6 -P 4 -thi_hom x -dmmg_grid_sequence \
  -pc_mg_type multiplicative -thi_L 80e3 -thi_nlevels 4 \
  -ksp_monitor -snes_monitor -ksp_type fgmres \
  -mg_levels_ksp_type chebychev -mg_levels_ksp_max_it 3 \
  -mg_levels_ksp_chebychev_estimate_eigenvalues 0,0.1,0,1.1 \
  -mg_levels_sub_pc_type icc


I'd be interested to see your problems where 0.1 is too large. It wouldn't
surprise me with aggressive coarsening in 3D, but 0.1 seems to be a good
default for standard 2^d coarsening, which is our default for geometric
multigrid.

For very unsymmetric problems, that is, problems with hyperbolic terms, you
> want to use Gauss-Seidel, and sometimes you want to damp it.  (I would still
> use Cheb for unsymmetric elliptic operators like what you can get in FV or
> other minor perturbations to an SPD matrix.)  Setting a damping parameter is
> too complex to figure out automatically and is not always needed, and we do
> not have a parallel Gauss-Seidel in PETSc.  I actually have a parallel G-S;
> it is complex, but I could move it over to PETSc if there is demand for it.
>
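
In the meantime, note that PCSOR already does processor-local Gauss-Seidel
in parallel and exposes the damping, so a damped smoother is available from
the options database today, e.g. (the omega value here is purely
illustrative):

-mg_levels_ksp_type richardson -mg_levels_pc_type sor -mg_levels_pc_sor_omega 0.8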

Is it specialized for BAIJ(3)? Maybe we should look at your approach and see
if we can find a way to make it simple enough for the rest of us to
understand.