[petsc-dev] Bordered systems and low-rank corrections
Barry Smith
bsmith at mcs.anl.gov
Sun Nov 6 12:09:12 CST 2011
On Nov 6, 2011, at 9:48 AM, Mark F. Adams wrote:
>
> On Nov 5, 2011, at 6:19 PM, Jed Brown wrote:
>
>
> I see you did ask about "-pc_fieldsplit_type schur and Richardson" in your email. I was not trying to imply that (nontrivial) Krylov is never useful. I've had people ask me this at talks: 'why are you using Richardson when everyone knows that CG is cooler'.
Mark,
You can/should always have the numbers and graph right there to show them: "look, for this class of problems CG doesn't buy you smack". The problem with NOT having the numbers to show (I know you have the numbers from your experience, but the questioners haven't seen them) is that it then LOOKS like a philosophical/religious reason to use Richardson. That isn't good; it is much better that people know the choice is based on actual numbers rather than on philosophy/religion.
Barry
> It would be nice to just use CG for social reasons if nothing else, and it's cheap compared to the preconditioner for A anyway. And I think the Stokes Schur complement (kind of like the contact problems that I have experience with) is very well conditioned -- Uzawa always converged at about one digit per iteration for me. That is pretty fast for unpreconditioned Richardson, and I've seen it on many test problems in solid mechanics contact.
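Mark's one-digit-per-iteration observation is easy to check on a toy problem. Below is a minimal numpy sketch (all sizes and matrices are made up for illustration, and the Schur complement is formed densely, which is only viable at toy scale) of unpreconditioned Richardson -- i.e., Uzawa -- on S:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 40, 10  # primal and dual sizes (toy scale only)

# SPD primal block A and full-rank constraint block B
G = rng.standard_normal((n, n))
A = G @ G.T + n * np.eye(n)
B = rng.standard_normal((m, n))

# Dense Schur complement S = B A^{-1} B^T (never formed explicitly at real scale)
S = B @ np.linalg.solve(A, B.T)

# Optimal Richardson damping; here we cheat and compute the full spectrum,
# in practice one would use eigenvalue estimates
lam = np.linalg.eigvalsh(S)
omega = 2.0 / (lam[0] + lam[-1])

b = rng.standard_normal(m)
x = np.zeros(m)
res = [np.linalg.norm(b)]
for _ in range(30):
    x += omega * (b - S @ x)
    res.append(np.linalg.norm(b - S @ x))

rate = (res[-1] / res[0]) ** (1.0 / 30)
print(f"average contraction per iteration: {rate:.3f}")
```

With optimal damping the contraction factor is (kappa(S)-1)/(kappa(S)+1), so "one digit per iteration" corresponds to kappa(S) of roughly 1.2 -- a very well conditioned Schur complement indeed.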
>
>> An alternative is to apply A^{-1} inexactly, perhaps apply only a preconditioner for S (instead of doing inner iterations), and run a Krylov method in the full space.
>
> This seems like a good thing to have in the toolbox. The Bramble paper has this (eq. 2.5) as their ultimate linear algorithm as I recall, and I've never run with just the two preconditioners in the loop (i.e., a very approximate solve for A). In this case the outer iteration would look more like your solve for A -- Krylov is generally cheap compared to the other costs in each iteration, so it's something that you definitely want available.
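For concreteness, the full-space approach Jed describes maps onto PETSc options roughly like this (a hypothetical sketch using current option names, some of which may postdate this thread; the preconditioner choices are placeholders):

```shell
# Flexible outer Krylov over the full (velocity, pressure) space
-ksp_type fgmres
-pc_type fieldsplit
-pc_fieldsplit_type schur
# Drop the lower-triangular factor; keep only the upper-triangular part
-pc_fieldsplit_schur_fact_type upper
# Very approximate solve for A: one AMG application, no inner Krylov
-fieldsplit_0_ksp_type preonly
-fieldsplit_0_pc_type gamg
# Only a preconditioner for S, no inner iterations
-fieldsplit_1_ksp_type preonly
-fieldsplit_1_pc_type jacobi
```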
>
>> If you do this, the lower-triangular part of the factorization can usually be dropped because it does not affect the eigenvalues (all eigenvalues are 1, with minimal polynomial of degree 2). This is the philosophy of the "block preconditioners" crowd. It often wins over Schur complement reduction (SCR) when the fields are well scaled, but SCR is more robust to radically different scales for the primal and dual variables.
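Jed's eigenvalue claim is easy to verify numerically: with exact blocks, the block upper-triangular preconditioner P = [A, B^T; 0, -S] applied to K = [A, B^T; B, 0] gives an operator whose eigenvalues are all 1 and which satisfies (P^{-1}K - I)^2 = 0, so a minimal-residual Krylov method finishes in two iterations. A toy numpy check (sizes arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 30, 8
G = rng.standard_normal((n, n))
A = G @ G.T + n * np.eye(n)          # SPD (1,1) block
B = rng.standard_normal((m, n))      # full-rank constraint block
S = B @ np.linalg.solve(A, B.T)      # exact Schur complement

# Saddle-point matrix and its block upper-triangular preconditioner
K = np.block([[A, B.T], [B, np.zeros((m, m))]])
P = np.block([[A, B.T], [np.zeros((m, n)), -S]])

T = np.linalg.solve(P, K)            # preconditioned operator P^{-1} K
I = np.eye(n + m)

print("max |eig - 1| :", np.max(np.abs(np.linalg.eigvals(T) - 1)))
print("||(T - I)^2|| :", np.linalg.norm((T - I) @ (T - I)))
```

The eigenvalues cluster at 1 only up to roughly the square root of rounding error, because the eigenvalue is defective -- which is exactly why the minimal polynomial, not the spectrum alone, is the right way to state the claim.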
>
> Hmm, I would not like (i.e., hate) a method that is sensitive to scaling. E.g., my ex56 runs the solver multiple times with large rescalings of the problem just to test that the solver is invariant to scaling -- so this property is in my regression tests. I would question whether this sensitivity to scaling (e.g., units) cannot be fixed for "block preconditioners" -- especially if I'm in the "block preconditioners" crowd :)
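An ex56-style invariance check can be sketched outside PETSc too. The script below (a made-up toy, not ex56 itself) runs Jacobi-preconditioned CG on an SPD system, rescales the problem diagonally over six orders of magnitude, and checks that the iteration count is essentially unchanged -- with the preconditioned residual norm as the stopping test, symmetric diagonal scaling is invisible to Jacobi-PCG in exact arithmetic:

```python
import numpy as np

def pcg(K, b, Minv_diag, tol=1e-8, maxit=200):
    """Jacobi-preconditioned CG; returns (solution, iteration count).

    Stops on the preconditioned residual norm sqrt(r^T M^{-1} r),
    which is invariant under K -> D K D, b -> D b.
    """
    x = np.zeros_like(b)
    r = b.copy()
    z = Minv_diag * r
    p = z.copy()
    rz = r @ z
    stop = tol * np.sqrt(rz)
    for it in range(1, maxit + 1):
        Kp = K @ p
        alpha = rz / (p @ Kp)
        x += alpha * p
        r -= alpha * Kp
        z = Minv_diag * r
        rz_new = r @ z
        if np.sqrt(rz_new) <= stop:
            return x, it
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x, maxit

rng = np.random.default_rng(2)
n = 50
G = rng.standard_normal((n, n))
K = G @ G.T + n * np.eye(n)          # SPD test matrix
b = rng.standard_normal(n)

# Rescale "units": d mixes scales over six orders of magnitude
d = 10.0 ** rng.uniform(-3, 3, n)
Ks = (d[:, None] * K) * d[None, :]   # D K D
bs = d * b                           # D b

x1, it_plain = pcg(K, b, 1.0 / np.diag(K))
x2, it_scaled = pcg(Ks, bs, 1.0 / np.diag(Ks))
print(f"iterations: unscaled {it_plain}, rescaled {it_scaled}")
```

In floating point the two counts can drift apart by an iteration or so, which is exactly the kind of tolerance a regression test like this has to allow.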
>