[petsc-dev] sor smoothers
Jed Brown
jedbrown at mcs.anl.gov
Fri Aug 16 18:53:52 CDT 2013
"Mark F. Adams" <mfadams at lbl.gov> writes:
> Some hypre papers have shown that cheb/jacobi is faster for some
> problems, but for me robustness trumps this for default solver
> parameters in PETSc.
Richardson/SOR is about the best thing out there in terms of reasonable
local work, low/no setup cost, and reliable convergence. The main thing
Cheby/Jacobi has going for it is trivial fine-grained parallelism, but
it's not clear that buys anything on a CPU.
> Jed's analysis suggests that Eisenstat's method saves almost 50% of
> the work but needs a specialized matrix to get good flop rates.
> Something to think about doing …
My analysis was too sloppy; Barry got it right. Eisenstat's trick is
for Cheby/SOR, however, and doesn't do anything for Richardson.
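
For reference, the identity I mean is the standard Eisenstat trick
(notation mine, w the relaxation parameter): split A = L + D + U and
take the SSOR factors M_1 = D/w + L and M_2 = D/w + U. Since
A = M_1 + M_2 - ((2-w)/w) D, the preconditioned operator applies as

  M_1^{-1} A M_2^{-1} v = M_2^{-1} v + M_1^{-1} (v - ((2-w)/w) D M_2^{-1} v),

i.e., one upper and one lower triangular solve plus diagonal work per
preconditioned multiply, with no separate multiply by A. That is roughly
where the 50% saving comes from for Cheby/SOR, and why there is no
analogous saving for Richardson/SSOR, whose sweeps already touch each
triangle only once without ever forming A x separately.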
To speed up Richardson/SSOR with a new matrix format, I think we have to
cache the action of the lower triangular part during the forward sweep
so that the back-sweep can reuse it, and vice-versa. Each half-sweep
then streams only one triangle of the matrix, i.e., about half a work
unit if a work unit is one full pass over the matrix data. With full
caching and the triangular residual optimization, I think this brings
2 SSOR iterations of the down-smoother plus a residual to 2.5 work units
in the down-smooth (zero initial guess) and 3 work units in the up-smooth
(nonzero initial guess). (This is a strong smoother; frequently one SSOR
would be enough.)
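
To make the caching concrete, here is a minimal sketch in C with dense
storage, just to show the invariant. The naming (ssor_cached and so on)
is illustrative, not a proposed interface; the point of a real
implementation would be a sparse format laid out so that each half-sweep
streams exactly one triangle.

  #include <stdio.h>

  #define N 4

  /* One SSOR iteration on the dense system A x = b (omega = 1).
   * lx[i] caches sum_{j<i} A[i][j]*x[j]; ux[i] caches
   * sum_{j>i} A[i][j]*x[j]. Each half-sweep touches only one triangle
   * of A and reuses the cached action of the other triangle. */
  static void ssor_cached(const double A[N][N], const double b[N],
                          double x[N], double lx[N], double ux[N])
  {
      /* Forward sweep: read the lower triangle, reuse the cached U*x
       * from the previous back sweep (zeros on the first call with a
       * zero initial guess). */
      for (int i = 0; i < N; i++) {
          double s = 0.0;
          for (int j = 0; j < i; j++) s += A[i][j] * x[j];
          lx[i] = s;              /* cache for the coming back sweep */
          x[i] = (b[i] - lx[i] - ux[i]) / A[i][i];
      }
      /* Back sweep: read the upper triangle, reuse lx[] cached above.
       * When row i is processed, x[j] for j < i still hold their
       * forward-sweep values, so lx[i] is exact at its point of use. */
      for (int i = N - 1; i >= 0; i--) {
          double s = 0.0;
          for (int j = i + 1; j < N; j++) s += A[i][j] * x[j];
          ux[i] = s;              /* cache for the next forward sweep */
          x[i] = (b[i] - lx[i] - ux[i]) / A[i][i];
      }
  }

  int main(void)
  {
      const double A[N][N] = {{ 4, -1,  0,  0},
                              {-1,  4, -1,  0},
                              { 0, -1,  4, -1},
                              { 0,  0, -1,  4}};
      const double b[N] = {1, 2, 2, 1};
      double x[N] = {0}, lx[N] = {0}, ux[N] = {0}; /* zero initial guess */
      for (int k = 0; k < 20; k++)
          ssor_cached(A, b, x, lx, ux);        /* Richardson/SSOR */
      for (int i = 0; i < N; i++)
          printf("x[%d] = %g\n", i, x[i]);
      return 0;
  }

Note the caches are only valid while x is untouched between sweeps;
anything else that changes x (a coarse-grid correction, say) invalidates
them, which is consistent with the nonzero-initial-guess up-smooth
costing more above.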