[petsc-users] Poor multigrid convergence in parallel

Dave May dave.mayhem23 at gmail.com
Mon Jul 21 05:50:08 CDT 2014


Hi Lawrence,

Hmm, this sounds odd. The convergence obtained with chebyshev should be
essentially identical in serial and parallel when using a jacobi
preconditioner.

1) How did you configure the coarse grid solver in the serial and parallel
tests? Are they consistent?
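For reference, here is a sketch of one consistent way to configure the coarse solve via PETSc run-time options (the option names follow the standard PCMG prefix scheme; `./myapp` is a placeholder for your executable):

```shell
# Direct solve on the coarse grid. PCREDUNDANT gathers the coarse
# problem onto each process, so the same LU factorization is applied
# whether you run on 1 or 2 processes.
mpiexec -n 2 ./myapp \
    -pc_type mg \
    -mg_coarse_ksp_type preonly \
    -mg_coarse_pc_type redundant \
    -mg_coarse_redundant_pc_type lu
```

Passing the same options to both runs (and inspecting the result with -ksp_view) is the easiest way to confirm the serial and parallel coarse solvers really are consistent.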

2) Does using one level with PCMG and a chebyshev smoother give the same
answers in serial and parallel? If you precondition with jacobi, the
residuals in serial and parallel should be very similar. If this test
doesn't pass, try the 1 level method again, this time without the
preconditioner (i.e. just apply chebyshev). If the residuals are really
different between these runs, there is likely something wrong/inconsistent
with the definition of the operator on each level, or with the way the
boundary conditions are imposed.
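A sketch of how this test might be run from the command line (`./myapp` is a placeholder; note that depending on your PETSc version, a single-level PCMG may pick up its smoother options under the `-mg_coarse_` prefix rather than `-mg_levels_`):

```shell
# 1-level PCMG with chebyshev+jacobi: compare the monitored residuals
# of a serial and a 2-process run -- they should agree closely.
mpiexec -n 1 ./myapp -pc_type mg -pc_mg_levels 1 \
    -mg_levels_ksp_type chebyshev -mg_levels_pc_type jacobi \
    -ksp_monitor_true_residual
mpiexec -n 2 ./myapp -pc_type mg -pc_mg_levels 1 \
    -mg_levels_ksp_type chebyshev -mg_levels_pc_type jacobi \
    -ksp_monitor_true_residual

# Same experiment with the preconditioner switched off (pure chebyshev):
mpiexec -n 2 ./myapp -pc_type mg -pc_mg_levels 1 \
    -mg_levels_ksp_type chebyshev -mg_levels_pc_type none \
    -ksp_monitor_true_residual
```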

3) Is the code valgrind clean?
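One way to check this under MPI (a sketch; the flags are standard valgrind memcheck options and `./myapp` is again a placeholder):

```shell
# One memcheck log per MPI rank; %p expands to the process id,
# so the two ranks' reports don't interleave.
mpiexec -n 2 valgrind --tool=memcheck -q --num-callers=20 \
    --log-file=valgrind.%p.log ./myapp <your usual options>
```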

Cheers,
  Dave


On 21 July 2014 12:18, Lawrence Mitchell <lawrence.mitchell at imperial.ac.uk>
wrote:

> Hello all,
>
> I'm implementing a multigrid solver using PCMG, starting with a simple
> Poisson equation (with strong boundary conditions) to ensure I'm doing
> things right.  Everything works fine in serial, but when running on two
> processes with the default chebyshev smoother, convergence goes to pot; in
> particular, the preconditioner becomes indefinite.
>
> I've determined that the operators I'm building on each level are indeed
> symmetric; furthermore, if I switch the smoother to jacobi- (or sor-)
> preconditioned richardson iterations, then I get good convergence in both
> serial and parallel runs.  The eigenvalue estimates for the chebyshev
> smoother look plausible in both cases.  Any suggestions as to where to look
> next to figure out what's wrong in my code?  I can try and untangle it such
> that it demonstrates the problem standalone, but that may take a little
> time.
>
> Cheers,
>
> Lawrence
>