[petsc-users] Ainsworth formula to solve saddle point problems / preconditioner for shell matrices

Jed Brown jed at jedbrown.org
Tue Oct 6 23:49:10 CDT 2020


Barry Smith <bsmith at petsc.dev> writes:

>> On Oct 6, 2020, at 6:57 AM, Olivier Jamond <olivier.jamond at cea.fr> wrote:
>> 
>> 
>> On 03/10/2020 00:23, Barry Smith wrote:
>>>    I think what Jed is saying is that you should just actually build your preconditioner for your Ct*C + Qt*S*Q operator with S. Because Ct is tall and skinny, the eigenstructure of Ct*C + Qt*S*Q is just the eigenstructure of S with a low-rank "modification", and Krylov methods (GMRES) are good at solving problems where the eigenstructure of the preconditioner is only a small-rank modification of the eigenstructure of the operator you are supplying to GMRES. In the best situation each new iteration of GMRES corrects one more of the "rogue" eigendirections. I would first use a direct solver with S just to test how well it works as a preconditioner, and then switch to GAMG or whatever should work efficiently for solving your particular S matrix.
>>> 
>>>   I'd be interested in hearing how well the Ainsworth formula works; it is something that might be worth adding to PCFIELDSPLIT.
>>> 
>>> 
>>>   Barry
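
To make the suggestion above concrete, here is a minimal sketch (not actual application code; ShellCtx, the work vectors, and the sizes are placeholders, and error checking is omitted) of a MATSHELL that applies C'*C + Q'*S*Q, with the assembled S handed to the KSP as the preconditioning matrix:

  #include <petscksp.h>

  /* context for the shell operator y = (C'*C + Q'*S*Q) x */
  typedef struct {
    Mat S, Q, C;
    Vec tn1, tn2;   /* work vectors of full size n       */
    Vec tm;         /* work vector of size m = rows of C */
  } ShellCtx;

  static PetscErrorCode ShellMult(Mat A, Vec x, Vec y)
  {
    ShellCtx *ctx;
    MatShellGetContext(A, &ctx);
    MatMult(ctx->Q, x, ctx->tn1);               /* tn1 = Q x      */
    MatMult(ctx->S, ctx->tn1, ctx->tn2);        /* tn2 = S Q x    */
    MatMultTranspose(ctx->Q, ctx->tn2, y);      /* y   = Q' S Q x */
    MatMult(ctx->C, x, ctx->tm);                /* tm  = C x      */
    MatMultTransposeAdd(ctx->C, ctx->tm, y, y); /* y  += C' C x   */
    return 0;
  }

  /* ... after assembling S, Q, C and creating the work vectors ... */
  MatCreateShell(PETSC_COMM_WORLD, nloc, nloc, N, N, &ctx, &Ashell);
  MatShellSetOperation(Ashell, MATOP_MULT, (void (*)(void))ShellMult);
  KSPCreate(PETSC_COMM_WORLD, &ksp);
  KSPSetOperators(ksp, Ashell, S);   /* Amat = shell operator, Pmat = S     */
  KSPSetType(ksp, KSPGMRES);
  KSPGetPC(ksp, &pc);
  PCSetType(pc, PCLU);               /* direct solver on S first, as a test */
  KSPSetFromOptions(ksp);            /* later, e.g. -pc_type gamg           */
  KSPSolve(ksp, b, x);
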
>> 
>> Hi Barry,
>> 
>> Thanks for these clarifications.
>> 
>> To give some context, the test I am working on is a traction test on an elastoplastic cube in large strain, on which I apply 2% strain at the first loading increment. The cube has 14739 dofs, and the number of rows of the C matrix is 867.
>> 
>> In this simple case, the C matrix just encodes simple Dirichlet conditions. Q is then diagonal, with 1. on dofs without a Dirichlet condition and 0. on dofs with one. Q'*S*Q is like S with zeros in the rows/columns corresponding to the Dirichlet dofs, and C'*C just re-adds a non-null value on the diagonal for those dofs. In the end, I feel that in this case the Ainsworth method does exactly the same thing as the row/column elimination that can be done with MatZeroRowsColumns and the optional x and b vectors provided.
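
For reference, a sketch of the MatZeroRowsColumns elimination mentioned above (the index array is a placeholder; x carries the prescribed Dirichlet values and b is fixed up accordingly):

  Mat       K;               /* assembled, unconstrained stiffness matrix       */
  Vec       x, b;            /* x holds the prescribed values on Dirichlet dofs */
  PetscInt  nd;              /* number of constrained dofs (867 in this test)   */
  PetscInt *dirichlet_rows;  /* global indices of the constrained dofs          */

  /* zero the constrained rows/columns, put 1.0 on their diagonal, and move
     the known boundary contributions into the right-hand side b */
  MatZeroRowsColumns(K, nd, dirichlet_rows, 1.0, x, b);
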
>> 
>> On this test, with '-ksp_rtol 1.e-9' and '-ksp_type gmres', using S as the preconditioning matrix with a direct solver gives 65 GMRES iterations for my first Newton iteration (where S is SPD) and between 170 and 290 for the next ones (S is still symmetric but has negative eigenvalues). If I use '-pc_type gamg', the number of GMRES iterations for the first (SPD) Newton iteration is (14 with Sp / 23 with S), and for the next ones (not SPD) it is (~45 with Sp / ~180 with S).
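
For anyone reproducing these runs, the options line would look something like the following (./my_app is a placeholder for the application binary; the two monitor options are added only for diagnostics):

  ./my_app -ksp_type gmres -ksp_rtol 1.e-9 -pc_type gamg \
           -ksp_monitor -ksp_converged_reason
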
>
>    Given the structure of C it seems you should just explicitly
>    construct Sp and use GAMG (or other preconditioners, even a direct
>    solver) directly on Sp. Trying to avoid explicitly forming Sp will
>    give you a much slower solver, and for what benefit?
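
A sketch of what explicitly forming Sp = Q'*S*Q + C'*C could look like (names are illustrative, fill estimates are left at PETSC_DEFAULT, error checking omitted); once Sp is assembled, GAMG or a direct solver can be applied to it directly:

  Mat S, Q, C;   /* assembled stiffness, diagonal projector, constraint matrix */
  Mat Sp, CtC;

  MatPtAP(S, Q, MAT_INITIAL_MATRIX, PETSC_DEFAULT, &Sp);              /* Q'*S*Q  */
  MatTransposeMatMult(C, C, MAT_INITIAL_MATRIX, PETSC_DEFAULT, &CtC); /* C'*C    */
  MatAXPY(Sp, 1.0, CtC, DIFFERENT_NONZERO_PATTERN);                   /* += C'*C */

  KSPSetOperators(ksp, Sp, Sp);   /* then -pc_type gamg, or a direct solver      */
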

Note that S is singular if it's a pure Neumann stiffness matrix (rigid body modes are in the null space).  I'm with Barry that you should just form Sp, which is much more solver-friendly.  If your constraint matrix C has dense rows (e.g., integral conditions on the boundary), then use FieldSplit for the saddle point problem.
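
If you do keep the full saddle point system [S, C'; C, 0], a minimal FieldSplit sketch (is_u and is_c are placeholder index sets marking the displacement dofs and the constraint rows) would be

  PCSetType(pc, PCFIELDSPLIT);
  PCFieldSplitSetIS(pc, "0", is_u);   /* displacement block           */
  PCFieldSplitSetIS(pc, "1", is_c);   /* constraint/multiplier block  */

together with command-line options along the lines of

  -pc_fieldsplit_type schur -pc_fieldsplit_schur_fact_type full \
  -pc_fieldsplit_schur_precondition selfp -fieldsplit_0_pc_type gamg
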

