[petsc-users] Fieldsplit schur complement with preonly solves

Matthew Knepley knepley at gmail.com
Tue Mar 11 10:06:53 CDT 2014


On Tue, Mar 11, 2014 at 9:56 AM, Luc Berger-Vergiat <luc.berger.vergiat at gmail.com> wrote:

> Hi all,
> I am testing some preconditioners for a FEM problem involving different
> types of fields (displacements, temperature, stresses and plastic strain).
> To make sure that things are working correctly, I first solve this problem
> with:
>
> -ksp_type preonly -pc_type lu, which obviously works fine.
>
>
> Then I move on to do:
>
> -ksp_type gmres -pc_type lu, and I get very good convergence (one gmres
> linear iteration per time step), as I expected.
>
>
> So solving the problem exactly inside the preconditioner for gmres leads to
> optimal results.
> This can be done using a Schur complement, but when I pass the following
> options:
>
> -ksp_type gmres -pc_type fieldsplit -pc_fieldsplit_type schur
> -pc_fieldsplit_schur_factorization_type full
> -pc_fieldsplit_0_fields 2,3 -pc_fieldsplit_1_fields 0,1
> -fieldsplit_0_ksp_type preonly -fieldsplit_0_pc_type lu
> -fieldsplit_1_ksp_type preonly -fieldsplit_1_pc_type lu
>
> My results are terrible: gmres does not converge, and my FEM code reduces
> the size of the time step in order to converge.
> This does not make much sense to me...
>

For any convergence question, we need the output of -ksp_view -ksp_monitor,
and for this case we would also like

  -fieldsplit_0_ksp_monitor -fieldsplit_1_ksp_monitor
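
For example, all on the one failing run (a sketch: these are just your Schur
options from above plus the monitoring options):

  -ksp_type gmres -pc_type fieldsplit -pc_fieldsplit_type schur
  -pc_fieldsplit_schur_factorization_type full
  -pc_fieldsplit_0_fields 2,3 -pc_fieldsplit_1_fields 0,1
  -fieldsplit_0_ksp_type preonly -fieldsplit_0_pc_type lu
  -fieldsplit_1_ksp_type preonly -fieldsplit_1_pc_type lu
  -ksp_view -ksp_monitor -fieldsplit_0_ksp_monitor -fieldsplit_1_ksp_monitor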

Yes, something is wrong. Very often this is caused by submatrices that have
a null space. Of course, that null space should show up in the LU
factorization, unless you are perturbing the factorization.
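
Here is a minimal sketch of how one might check for a constant null space on
the A00 block. It assumes you already have that block as a Mat (e.g. the
operator you pulled out of kspA); the routine name is just for illustration,
and only a constant null space is tested:

  #include <petscmat.h>

  /* Check whether the A00 sub-block annihilates the constant vector. */
  PetscErrorCode CheckConstantNullSpace(Mat A00)
  {
    MatNullSpace   nsp;
    PetscBool      isNull;
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    /* Candidate null space: the constant vector */
    ierr = MatNullSpaceCreate(PetscObjectComm((PetscObject)A00),PETSC_TRUE,0,NULL,&nsp);CHKERRQ(ierr);
    /* Verify that A00 applied to the constant vector is (numerically) zero */
    ierr = MatNullSpaceTest(nsp,A00,&isNull);CHKERRQ(ierr);
    ierr = PetscPrintf(PETSC_COMM_WORLD,"A00 has a constant null space: %s\n",isNull ? "yes" : "no");CHKERRQ(ierr);
    /* If it does, attach it so Krylov solvers project it out:
       ierr = MatSetNullSpace(A00,nsp);CHKERRQ(ierr); */
    ierr = MatNullSpaceDestroy(&nsp);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }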

   Matt


> Curiously, if I use the following options:
>
> -ksp_type gmres -pc_type fieldsplit -pc_fieldsplit_type schur
> -pc_fieldsplit_schur_factorization_type full
> -pc_fieldsplit_0_fields 2,3 -pc_fieldsplit_1_fields 0,1
> -fieldsplit_0_ksp_type gmres -fieldsplit_0_pc_type lu
> -fieldsplit_1_ksp_type gmres -fieldsplit_1_pc_type lu
>
> then the global gmres converges in two iterations.
>
> So using a gmres ksp with an lu pc on the A00 block and on the Schur
> complement works, but using lu directly doesn't.
>
> Because I think that all this is quite strange, I decided to dump some
> matrices out. Namely, I dumped the complete FEM Jacobian, and I also did a
> MatView on jac->B, jac->C and the result of KSPGetOperators on kspA. This
> gives three of the four blocks needed to form the Schur complement. They
> are correct, and I assume that the last block is also correct.
> When I import jac->B, jac->C and the matrix corresponding to kspA into
> MATLAB to compute the inverse of the Schur complement and pass it to gmres
> as a preconditioner, the problem is solved in 1 iteration.
>
> So MATLAB says:
>
> -ksp_type gmres -pc_type fieldsplit -pc_fieldsplit_type schur
> -pc_fieldsplit_schur_factorization_type full
> -pc_fieldsplit_0_fields 2,3 -pc_fieldsplit_1_fields 0,1
> -fieldsplit_0_ksp_type preonly -fieldsplit_0_pc_type lu
> -fieldsplit_1_ksp_type preonly -fieldsplit_1_pc_type lu
>
> should yield only one iteration (maybe two depending on implementation).
>
> Any ideas why PETSc doesn't solve this correctly?
>
> Best,
> Luc
>
>
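
For reference, the block factorization behind the "one iteration" expectation,
written schematically in LaTeX notation (A00 is the block for fields 2,3, A11
the block for fields 0,1, and S the Schur complement):

  A = \begin{pmatrix} A_{00} & A_{01} \\ A_{10} & A_{11} \end{pmatrix}
    = \begin{pmatrix} I & 0 \\ A_{10} A_{00}^{-1} & I \end{pmatrix}
      \begin{pmatrix} A_{00} & 0 \\ 0 & S \end{pmatrix}
      \begin{pmatrix} I & A_{00}^{-1} A_{01} \\ 0 & I \end{pmatrix},
  \qquad S = A_{11} - A_{10} A_{00}^{-1} A_{01}.

If both the A00 solve and the S solve are exact, applying this full
factorization applies A^{-1} exactly, so the outer gmres should indeed need
only one iteration (maybe two, depending on how the residual is checked).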


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener