[petsc-users] Projection preconditioner as PCMG

Jed Brown jedbrown at mcs.anl.gov
Fri Mar 9 09:24:58 CST 2012


On Fri, Mar 9, 2012 at 08:53, Abdul Hanan Sheikh <hanangul12 at yahoo.co.uk>wrote:

>
>
>
>   ------------------------------
> *From:* Jed Brown <jedbrown at mcs.anl.gov>
> *To:* Abdul Hanan Sheikh <hanangul12 at yahoo.co.uk>
> *Cc:* "petsc-users at mcs.anl.gov" <petsc-users at mcs.anl.gov>
> *Sent:* Friday, 9 March 2012, 14:08
>
> *Subject:* Re: [petsc-users] Projection preconditioner as PCMG
>
> On Fri, Mar 9, 2012 at 05:16, Abdul Hanan Sheikh <hanangul12 at yahoo.co.uk>wrote:
>
> Dear,
>
> To my earlier query:
>
> What if I want to approximate all my coarse matrices with a Krylov
> iteration?
>
> reply:
> The methods on each level are independent, you can set them with
> -mg_coarse_ksp_type gmres -mg_levels_1_ksp_type cg -mg_levels_1_ksp_max_it
> 100 -mg_levels_2_ksp_type minres ...
>
> This command-line option changes the ksp_type of both ksp_pre and ksp_post
> (the pre- and post-smoothers), whereas I need to fix the ksp_post type to
> RICHARDSON to get my Prec = C + I - AC;
>
>
> Do you want this on every level or still just the finest?
> I need this on every level.
>
You can set each with code or you can do it all with the following
options. Some notes:
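Setting each level with code looks roughly like the sketch below. This is a sketch, not code from the thread: it assumes `pc` is already configured as a PCMG preconditioner with `nlevels` levels, and it uses the PCMG-prefixed accessors (older PETSc releases used an `MG` prefix). Requesting the down- and up-smoothers separately is what splits them so they can be configured differently.

```c
/* Sketch: GMRES pre-smoother and Richardson post-smoother on every level.
 * Assumes `pc` is already a PCMG with `nlevels` levels. */
PetscErrorCode SetLevelSmoothers(PC pc, PetscInt nlevels)
{
  PetscErrorCode ierr;
  PetscInt       lev;
  KSP            ksp_pre, ksp_post;

  for (lev = 1; lev < nlevels; lev++) {  /* level 0 is the coarse solve */
    ierr = PCMGGetSmootherDown(pc, lev, &ksp_pre);CHKERRQ(ierr);
    ierr = KSPSetType(ksp_pre, KSPGMRES);CHKERRQ(ierr);
    ierr = PCMGGetSmootherUp(pc, lev, &ksp_post);CHKERRQ(ierr);
    ierr = KSPSetType(ksp_post, KSPRICHARDSON);CHKERRQ(ierr);
  }
  return 0;
}
```

As always, verify the assembled solver with -ksp_view before trusting it.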

1. I use Chebychev instead of Richardson because I don't like tuning
Richardson. The option -ksp_richardson_self_scale makes the preconditioner
nonlinear, in which case we'd have to use FGMRES or GCR on the outside.
Chebychev(1) is equivalent to Richardson with some damping factor. I found
the Chebychev bounds (3,9) by running with
-mg_levels_ksp_chebychev_estimate_eigenvalues 0,0.3,0,1.1 which targets the
upper end of the spectrum (from 0.3*maxeig to 1.1*maxeig where maxeig is an
estimate of the largest eigenvalue on each level). I ran with -snes_view
and noticed that those bounds happened to be about the same on all levels,
so I just set them directly. You can translate that to a Richardson
relaxation if you like, but it tends to be more fragile.

2. The unpreconditioned norm gives much more accurate condition number
estimates here.

3. You can also do Kaskade multigrid (which is exactly what you are asking
for) with -pc_mg_smoothdown 0 -pc_mg_smoothup 1.

4. If you have variable coefficients, you will need to use the diagonal in
the Richardson smoother for scaling (-mg_levels_pc_type jacobi). Remember
that this will force you to recompute your Chebychev/Richardson bounds (or
keep doing it automatically with
-mg_levels_ksp_chebychev_estimate_eigenvalues 0,0.3,0,1.1).

$ ./ex5 -da_grid_x 257 -da_grid_y 257 -snes_max_it 1 -ksp_type gmres \
    -ksp_norm_type unpreconditioned -ksp_monitor_singular_value \
    -ksp_compute_eigenvalues -pc_type mg -pc_mg_levels 9 -pc_mg_type kaskade \
    -mg_levels_ksp_type chebychev -mg_levels_ksp_chebychev_eigenvalues 3,9 \
    -mg_levels_pc_type none
    0 KSP Residual norm 1.062542528908e+00 % max 1.000000000000e+00 min 1.000000000000e+00 max/min 1.000000000000e+00
    1 KSP Residual norm 2.376610964454e-01 % max 1.001957907202e+00 min 1.001957907202e+00 max/min 1.000000000000e+00
    2 KSP Residual norm 4.282525183085e-02 % max 1.139043558191e+00 min 8.730762970080e-01 max/min 1.304632323767e+00
    3 KSP Residual norm 4.732778291888e-03 % max 1.148710370654e+00 min 7.632757295550e-01 max/min 1.504974318158e+00
    4 KSP Residual norm 7.292121371221e-04 % max 1.148751387861e+00 min 7.091101704841e-01 max/min 1.619990003919e+00
    5 KSP Residual norm 8.739575412872e-05 % max 1.152691180981e+00 min 6.760500858171e-01 max/min 1.705038140166e+00
    6 KSP Residual norm 1.511456466330e-05 % max 1.154316560884e+00 min 6.659327122249e-01 max/min 1.733383177750e+00
    7 KSP Residual norm 2.020623105869e-06 % max 1.154629497508e+00 min 6.591981750318e-01 max/min 1.751566586865e+00
Iteratively computed eigenvalues
0.659531 + 0i
0.742062 + 0i
0.815314 + 0i
0.96316 + 0.0106016i
0.96316 - 0.0106016i
1.08027 + 0i
1.11294 + 0i

$ ./ex5 -da_grid_x 257 -da_grid_y 257 -snes_max_it 1 -ksp_type gmres \
    -ksp_norm_type unpreconditioned -ksp_monitor_singular_value \
    -ksp_compute_eigenvalues -pc_type mg -pc_mg_levels 9 \
    -pc_mg_type multiplicative -mg_levels_ksp_type chebychev \
    -mg_levels_ksp_chebychev_eigenvalues 3,9 -mg_levels_pc_type none
    0 KSP Residual norm 1.062542528908e+00 % max 1.000000000000e+00 min 1.000000000000e+00 max/min 1.000000000000e+00
    1 KSP Residual norm 3.178344085522e-02 % max 9.704936480137e-01 min 9.704936480137e-01 max/min 1.000000000000e+00
    2 KSP Residual norm 1.968390067011e-03 % max 9.897660772730e-01 min 9.220434626490e-01 max/min 1.073448397356e+00
    3 KSP Residual norm 1.145527676866e-04 % max 9.992147688651e-01 min 8.443712510420e-01 max/min 1.183383218735e+00
    4 KSP Residual norm 5.642680559393e-06 % max 1.001387939323e+00 min 8.242353430425e-01 max/min 1.214929628747e+00
Iteratively computed eigenvalues
0.824026 + 0i
0.918072 + 0i
0.968702 + 0i
1.00085 + 0i


>
> If I adapt the pre-smoother as follows:
>
> ierr = PCMGGetSmootherDown(pcmg, lev, &ksp_pre);CHKERRQ(ierr);
> ierr = KSPSetType(ksp_pre, KSPGMRES);CHKERRQ(ierr);
>
> Does ONLY setting the pre-smoother to GMRES mean that GMRES iterates on
> the coarse matrix at level = lev?
>
>
> Yes, this will only affect the pre-smoother (so it will then go to the
> next coarser grid, come back, and do a post-smoother). You may have to
> increase the number of iterations. I strongly recommend configuring the
> solver using the command line and checking that it is the method you
> intended by reading the output of -ksp_view.
>  Yes, I always check what the multilevel solver does with -ksp_view. Thanks a lot.
>
>

