[petsc-users] Scaling/Preconditioners for Poisson equation

Jed Brown jed at jedbrown.org
Mon Sep 29 14:59:49 CDT 2014


Filippo Leonardi <filippo.leonardi at sam.math.ethz.ch> writes:

> Thank you.
>
> Actually, I had the feeling that it wasn't a problem on my side, but rather inherent to BJacobi and CG.
>
> So I'll stick to MG. The problem with MG is that there are a lot of parameters
> to be tuned, so I leave the defaults (except that I select CG as the Krylov
> method). I post just the results for 64^3 and 128^3. Tell me if I'm missing
> some useful detail. (I get similar results with BoomerAMG.)
>
> Time for one KSP iteration (-ksp_type cg -log_summary -pc_mg_galerkin -pc_type 
> mg): 
> 32^3 and 1 proc: 1.01e-1
> 64^3 and 8 proc: 6.56e-01
> 128^3 and 64 proc: 1.05e+00
> Number of PCSetup per KSPSolve:
> 15
> 39
> 65

Presumably you mean PCApply.  Something is wrong here because this
iteration count is way too high.  Perhaps your boundary conditions are
nonsymmetric or interpolation is not compatible with the discretization.
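A quick way to test the symmetry (just a minimal sketch, not taken from your
code; the helper name is made up and real scalars are assumed): compare
(y, A x) with (x, A y) for random vectors.  For a symmetric operator the two
numbers agree to rounding error; a large discrepancy usually points at the
boundary-condition or interpolation handling.

#include <petscmat.h>

/* Hypothetical helper, a minimal sketch: for a symmetric A the two dot
   products below agree to rounding error (real scalars assumed). */
static PetscErrorCode CheckSymmetry(Mat A)
{
  PetscErrorCode ierr;
  Vec            x,y,Ax,Ay;
  PetscScalar    yAx,xAy;

  PetscFunctionBeginUser;
  ierr = MatCreateVecs(A,&x,&Ax);CHKERRQ(ierr);
  ierr = MatCreateVecs(A,&y,&Ay);CHKERRQ(ierr);
  ierr = VecSetRandom(x,NULL);CHKERRQ(ierr);
  ierr = VecSetRandom(y,NULL);CHKERRQ(ierr);
  ierr = MatMult(A,x,Ax);CHKERRQ(ierr);
  ierr = MatMult(A,y,Ay);CHKERRQ(ierr);
  ierr = VecDot(Ax,y,&yAx);CHKERRQ(ierr);
  ierr = VecDot(Ay,x,&xAy);CHKERRQ(ierr);
  ierr = PetscPrintf(PetscObjectComm((PetscObject)A),"(y,Ax) = %g   (x,Ay) = %g\n",
                     (double)PetscRealPart(yAx),(double)PetscRealPart(xAy));CHKERRQ(ierr);
  ierr = VecDestroy(&x);CHKERRQ(ierr);
  ierr = VecDestroy(&y);CHKERRQ(ierr);
  ierr = VecDestroy(&Ax);CHKERRQ(ierr);
  ierr = VecDestroy(&Ay);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}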

> With BoomerAMG:
> a stable 8 iterations per KSPSolve, but the time per iteration is greater than
> with PETSc MG and still increases:
> 64^3:  3.17e+00
> 128^3: 9.99e+00
>
>
> --> For instance with 64^3 (256 iterations):

In the first pass with geometric multigrid, don't worry about timing and
get the iterations figured out.  Are you using a cell-centered or
vertex-centered discretization?  When you say 128^3, is that counting
the number of elements or the number of vertices?  Note that if you have
a vertex-centered discretization, you will want a 129^3 grid.  With
PCMG, make sure you are getting the number of levels of refinement that
you expect.
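For example (a minimal sketch, assuming the KSP has already been set up with
-pc_type mg; the helper name is made up), you can query how many levels PCMG
actually built -- a vertex-centered 129^3 grid (129 = 2^7 + 1) coarsens evenly
by factors of two.  The -ksp_view output reports the same information.

#include <petscksp.h>

/* Hypothetical helper, a minimal sketch: report how many levels PCMG is
   actually using.  Call it after KSPSetUp(). */
static PetscErrorCode ReportMGLevels(KSP ksp)
{
  PetscErrorCode ierr;
  PC             pc;
  PetscInt       nlevels;

  PetscFunctionBeginUser;
  ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
  ierr = PCMGGetLevels(pc,&nlevels);CHKERRQ(ierr);
  ierr = PetscPrintf(PetscObjectComm((PetscObject)ksp),"PCMG is using %d levels\n",(int)nlevels);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}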

You should see something like the following (this is 193^3).

$ mpiexec -n 4 ./ex45 -da_refine 5 -pc_type mg -ksp_monitor -pc_mg_type full -mg_levels_ksp_type richardson -mg_levels_pc_type sor -ksp_type richardson       
  0 KSP Residual norm 2.653722249919e+03 
  1 KSP Residual norm 1.019366121923e+02 
  2 KSP Residual norm 2.364558296616e-01 
  3 KSP Residual norm 7.438761746501e-04 
Residual norm 1.47939e-06


You can actually do better than this by using higher order FMG
interpolation, by going matrix-free, etc.  For example, HPGMG
(finite-element or finite-volume, see https://hpgmg.org) will solve more
than a million equations/second per core.  Is your application really
solving the constant-coefficient Poisson problem on a Cartesian grid, or
is that just a test?
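To illustrate what "going matrix-free" means (this is only a toy serial sketch,
not your code and not HPGMG; the function name is made up): apply the operator
through a MATSHELL instead of assembling it.  Here is a 1D constant-coefficient
Laplacian with homogeneous Dirichlet conditions; a real 3D version would work
on a DMDA local vector with ghost points.

#include <petscmat.h>

/* Toy sketch: h^2-scaled 1D Laplacian with homogeneous Dirichlet BCs,
   applied without ever storing the matrix (serial only). */
static PetscErrorCode MatMult_Lap1D(Mat A,Vec x,Vec y)
{
  PetscErrorCode    ierr;
  PetscInt          i,n;
  const PetscScalar *xa;
  PetscScalar       *ya;

  PetscFunctionBeginUser;
  ierr = VecGetLocalSize(x,&n);CHKERRQ(ierr);
  ierr = VecGetArrayRead(x,&xa);CHKERRQ(ierr);
  ierr = VecGetArray(y,&ya);CHKERRQ(ierr);
  for (i=0; i<n; i++) {
    PetscScalar left  = (i > 0)   ? xa[i-1] : 0.0;
    PetscScalar right = (i < n-1) ? xa[i+1] : 0.0;
    ya[i] = 2.0*xa[i] - left - right;
  }
  ierr = VecRestoreArray(y,&ya);CHKERRQ(ierr);
  ierr = VecRestoreArrayRead(x,&xa);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

/* Wrap it in a MATSHELL and hand it to any KSP:
     ierr = MatCreateShell(PETSC_COMM_SELF,n,n,n,n,NULL,&A);CHKERRQ(ierr);
     ierr = MatShellSetOperation(A,MATOP_MULT,(void(*)(void))MatMult_Lap1D);CHKERRQ(ierr);
*/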

> Using Petsc Release Version 3.3.0, Patch 3, Wed Aug 29 11:26:24 CDT 2012 

And a reminder to please upgrade to the current version of PETSc.