[petsc-dev] How do you get Richardson?

Matthew Knepley knepley at gmail.com
Fri Sep 16 11:58:29 CDT 2011


On Fri, Sep 16, 2011 at 11:53 AM, Barry Smith <bsmith at mcs.anl.gov> wrote:

>
>   This will take more research, the references you cite below are too
> cluttered with other considerations to be definitive.
>
>    In the current "Picard/Richardson" we have two cases
>
>          x^{n+1}   = x^{n}  - lambda F(x^{n})     the default and then
>
>          x^{n+1}   = x^{n}  +  lambda d^{n}    where d^{n} is obtained by
> applying any iterative procedure to SNESSolve(x^{n}) resulting in
> \hat{x}^{n+1} and then computing d^{n} = \hat{x}^{n+1} - x^{n}
>
>    There is an analogy with unpreconditioned Richardson in the linear case,
> where F(x) = Ax - b and hence F(x^{n}) = Ax^{n} - b, and with preconditioned
> Richardson, whose correction B(b - Ax^{n}) plays the role of d^{n} in the
> second case above, so one can easily win an argument that what we have
> implemented is a general "preconditioned" nonlinear Richardson that is a
> direct generalization of preconditioned Richardson for linear systems.
>
>    What concerns me about calling it Picard is that different people use
> Picard to mean different things. For example, Jed calls the iterative
> process for the nonlinear operator div K(u) grad u Picard when he solves
> div K(u^{n}) grad u^{n+1}, and then there is the whole business of Picard
> for ODEs.  Do they all fit into the case above?
>
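
For concreteness, here is a minimal C sketch of the two cases quoted above
(not PETSc code: the names ResidualFn, InnerSolveFn, nonlinear_richardson,
the damping parameter lambda, and the convergence test are illustrative
assumptions). With F(x) = Ax - b and an inner solve returning x + B(b - Ax),
it reduces to damped preconditioned linear Richardson.

    #include <math.h>
    #include <stddef.h>

    /* User-supplied residual: writes F(x) into f (length n). */
    typedef void (*ResidualFn)(const double *x, double *f, size_t n);
    /* Optional inner nonlinear solve: given x^{n}, writes an improved
       iterate \hat{x}^{n+1} into xhat (analogous to an inner SNESSolve). */
    typedef void (*InnerSolveFn)(const double *x, double *xhat, size_t n);

    static double norm2(const double *v, size_t n)
    {
      double s = 0.0;
      for (size_t i = 0; i < n; i++) s += v[i] * v[i];
      return sqrt(s);
    }

    /* Case 1 (default):        x^{n+1} = x^{n} - lambda F(x^{n})
       Case 2 (preconditioned): x^{n+1} = x^{n} + lambda d^{n},
                                d^{n} = \hat{x}^{n+1} - x^{n}.        */
    void nonlinear_richardson(ResidualFn F, InnerSolveFn inner, double *x,
                              double *f, double *xhat, size_t n,
                              double lambda, double tol, int max_it)
    {
      for (int it = 0; it < max_it; it++) {
        F(x, f, n);                         /* F(x^{n}) */
        if (norm2(f, n) < tol) return;      /* residual small enough: done */
        if (inner) {
          inner(x, xhat, n);                /* \hat{x}^{n+1} */
          for (size_t i = 0; i < n; i++) x[i] += lambda * (xhat[i] - x[i]);
        } else {
          for (size_t i = 0; i < n; i++) x[i] -= lambda * f[i];
        }
      }
    }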

Picard for ODEs is exactly what I call Picard, in that the forward ODE
operator is repeatedly applied. This is easy to see because
the convergence analysis just relies on the contraction mapping principle.
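
As a small self-contained illustration of that point (the model problem
y' = y, y(0) = 1, the grid size, and the iteration count are made up for
this sketch), the forward operator y_{k+1}(t) = y(0) + \int_0^t f(s, y_k(s)) ds
can be applied repeatedly on a fixed grid; the iterates contract toward the
exact solution e^t.

    #include <math.h>
    #include <stdio.h>

    #define NPTS 101            /* grid points on [0, T] */
    #define T    1.0

    /* Right-hand side of y' = f(t, y); here f(t, y) = y, exact solution e^t. */
    static double f(double t, double y) { (void)t; return y; }

    int main(void)
    {
      double t[NPTS], y[NPTS], ynew[NPTS];
      double h = T / (NPTS - 1);

      for (int i = 0; i < NPTS; i++) { t[i] = i * h; y[i] = 1.0; }  /* y_0 == y(0) */

      for (int k = 0; k < 25; k++) {
        /* Picard update: y_{k+1}(t_i) = y(0) + integral_0^{t_i} f(s, y_k(s)) ds,
           with the integral approximated by the composite trapezoidal rule.  */
        double integral = 0.0, diff = 0.0;
        ynew[0] = 1.0;
        for (int i = 1; i < NPTS; i++) {
          integral += 0.5 * h * (f(t[i-1], y[i-1]) + f(t[i], y[i]));
          ynew[i]   = 1.0 + integral;
        }
        for (int i = 0; i < NPTS; i++) {
          diff = fmax(diff, fabs(ynew[i] - y[i]));
          y[i] = ynew[i];
        }
        printf("iter %2d  max change %.3e  y(T) = %.8f  (exp(T) = %.8f)\n",
               k, diff, y[NPTS-1], exp(T));
      }
      return 0;
    }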

I have the Louis Rall book at home and will look up all his references when
I get there, as he has an entire chapter on Picard's method
for nonlinear algebraic equations.


>   I admit I've never heard the term Richardson applied for nonlinear
> equations but I like the fact that we will have similar names for the linear
> and nonlinear cases: we will have nonlinear Richardson, nonlinear GMRES,
> nonlinear CG for SNES (all allowing nonlinear preconditioning) to go with
> the linear Richardson, GMRES, and CG (all allowing preconditioning). Neat
> and clean (of course we need to clarify everything well in the manual pages,
> maybe change SNESRICHARDSON to SNESNRICHARDSON).
>

Yes, I agree it has superior continuity with the linear case.


>    I'm willing to put back Picard but only if it makes sense.


I am sure it does (Jed's bastard usage notwithstanding); however, I would be
fine with Richardson as long as the documentation clearly notes that this is
Picard's method.

   Matt


>
>    Barry
>
>
>
> On Sep 16, 2011, at 8:33 AM, Matthew Knepley wrote:
>
> > http://petsc.cs.iit.edu/petsc/petsc-dev/rev/387e672db72a
> >
> > These all cite Picard for this method:
> >
> >  http://math.fullerton.edu/mathews/n2003/picarditerationmod.html
> >  http://cdsweb.cern.ch/record/419580/files/ext-2000-010.pdf
> >
> http://books.google.com/books/about/Computational_solution_of_nonlinear_oper.html?id=dexQAAAAMAAJ
> >
> http://books.google.com/books?id=BVkbypnD0yUC&pg=PA87&lpg=PA87&dq=Terence+tao+picard&source=bl&ots=VQOpQbFsyn&sig=b-KdPdTXtenRi-dBPsLfdhHcPpY&hl=en&ei=1U9zTv2uFungiAKouuWzAg&sa=X&oi=book_result&ct=result&resnum=7&ved=0CD4Q6AEwBg#v=onepage&q&f=false
> >
> > Richardson is specifically for linear equations, I thought.
> >
> >     Matt
> >
>
>


-- 
What most experimenters take for granted before they begin their experiments
is infinitely more interesting than any results to which their experiments
lead.
-- Norbert Wiener