[petsc-users] understanding the LSC preconditioner

David Nolte dnolte at dim.uchile.cl
Tue Feb 21 10:07:42 CST 2017


Dear all,

I am new to PETSc and am trying to use the LSC preconditioner for a
Stokes problem (discretized with stable finite elements). I use the
Python interface petsc4py.
The "automatic" version of LSC seems to work with the following setup
(adapted, I think, from Matt's tutorial slides); a sketch of how I apply
these options from petsc4py follows the list:

    -ksp_view
    -ksp_converged_reason
    -ksp_monitor_true_residual
    -ksp_type fgmres
    -ksp_rtol 1.0e-8

    -pc_type fieldsplit
    -pc_fieldsplit_detect_saddle_point
    -pc_fieldsplit_type schur
    -pc_fieldsplit_schur_fact_type upper
    -pc_fieldsplit_schur_precondition self

    -fieldsplit_0_ksp_type preonly
    -fieldsplit_0_pc_type ml

    -fieldsplit_1_ksp_type preonly
    -fieldsplit_1_pc_type lsc
    -fieldsplit_1_lsc_pc_type ml
    -fieldsplit_1_lsc_ksp_type preonly
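
This is roughly how the options above are applied from petsc4py (a
minimal sketch, not verbatim from my script: A, b and x stand for the
already assembled petsc4py Mat/Vec objects, and the view/monitoring
options are omitted):

    from petsc4py import PETSc

    # set the solver options programmatically (same values as listed above)
    opts = PETSc.Options()
    opts['ksp_type'] = 'fgmres'
    opts['ksp_rtol'] = 1.0e-8
    opts['pc_type'] = 'fieldsplit'
    opts['pc_fieldsplit_detect_saddle_point'] = None  # flag option, no value
    opts['pc_fieldsplit_type'] = 'schur'
    opts['pc_fieldsplit_schur_fact_type'] = 'upper'
    opts['pc_fieldsplit_schur_precondition'] = 'self'
    opts['fieldsplit_0_ksp_type'] = 'preonly'
    opts['fieldsplit_0_pc_type'] = 'ml'
    opts['fieldsplit_1_ksp_type'] = 'preonly'
    opts['fieldsplit_1_pc_type'] = 'lsc'
    opts['fieldsplit_1_lsc_ksp_type'] = 'preonly'
    opts['fieldsplit_1_lsc_pc_type'] = 'ml'

    # the outer Krylov solver picks the options up in setFromOptions()
    ksp = PETSc.KSP().create()
    ksp.setOperators(A)    # A: assembled Stokes system matrix (assumption)
    ksp.setFromOptions()
    ksp.solve(b, x)        # b: right-hand side, x: solution vector (assumptions)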

In a 3D setup with 250k dofs this converges in 78 iterations. (For
reference, an upper Schur factorization with ML for the uu-block and
Sp = diag(Q), the diagonal of the pressure mass matrix, takes 41
iterations and half the computation time.)
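
A minimal sketch of how the diag(Q) preconditioner is attached in that
reference run (assuming Q is the already assembled pressure mass matrix
on the pressure dofs and ksp is the outer fieldsplit KSP from above):

    # build Sp = diag(Q) as a PETSc matrix
    d = Q.getDiagonal()           # Vec holding the diagonal of Q
    Sp = Q.duplicate(copy=True)   # same layout and nonzero pattern as Q
    Sp.zeroEntries()
    Sp.setDiagonal(d)             # keep only the diagonal entries
    Sp.assemble()

    # use Sp to precondition the Schur complement in PCFIELDSPLIT
    pc = ksp.getPC()
    pc.setFieldSplitSchurPreType(PETSc.PC.SchurPreType.USER, Sp)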

Now I just wanted to check whether I can get the same result by building
the L matrix manually with the following piece of Python code, where is0
and is1 are the index sets corresponding to the velocity and pressure
dofs, and A is the full system matrix:

            Sp = Sp.getSubMatrix(is1, is1)
            pc.setFieldSplitSchurPreType(PETSc.PC.SchurPreType.USER, Sp)
            # Sp.setType(PETSc.Mat.Type.SCHURCOMPLEMENT)   # necessary?
            # extract A10 block
            Bdiv = A.getSubMatrix(is1, is0)
            # extract A01 block
            Bgrad = A.getSubMatrix(is0, is1)
            L = Bdiv
            L.matMult(Bgrad)
            Sp.compose('LSC_L', L)
            Sp.compose('LSC_Lp', L)
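
(For reference, my possibly wrong reading of the PCLSC man page: without
a composed LSC_L it uses L = A10*A01 and approximates inv(S) by
inv(L) A10 A00 A01 inv(L). In petsc4py terms the explicit product would,
as far as I understand, look something like this:)

    # sketch only: matMult returns the product as a new Mat,
    # i.e. L = Bdiv * Bgrad = A10 * A01
    L = Bdiv.matMult(Bgrad)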

To my understanding, this should behave similarly to what the LSC
preconditioner does when LSC_L is not given. However, I get a
segmentation fault during the first iteration:

  0 KSP unpreconditioned resid norm 2.963704216563e+01 true resid norm 2.963704216563e+01 ||r(i)||/||b|| 1.000000000000e+00
[1]    2311 segmentation fault (core dumped)  python StokesPC/stokespc/stokes_bench.py

What am I doing wrong? I would appreciate any hints; thanks a lot in advance!

Regards,
David


PS: The log trace is:
  0 KSP unpreconditioned resid norm 2.963704216563e+01 true resid norm 2.963704216563e+01 ||r(i)||/||b|| 1.000000000000e+00
 [0] 10.0543 Event begin: VecScale
 [0] 10.0545 Event end: VecScale
[0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged
 [0] 10.0545 Event begin: PCApply
   [0] 10.0545 Event begin: VecScatterBegin
   [0] 10.0546 Event end: VecScatterBegin
   [0] 10.0546 Event begin: KSPSolve_FS_Schu
     [0] 10.0546 Event begin: KSPSetUp
       [0] 10.0546 Event end: KSPSetUp
[0] PCSetUp(): Setting up PC for first time
       [0] 10.0546 Event begin: PCSetUp
         [0] 10.0547 Event begin: VecSet
         [0] 10.055 Event end: VecSet
         [0] 10.055 Event begin: VecSet
         [0] 10.0553 Event end: VecSet
         [0] 10.0553 Event begin: VecSet
         [0] 10.0553 Event end: VecSet
       [0] 10.0554 Event end: PCSetUp
       [0] 10.0554 Event begin: VecSet
       [0] 10.0554 Event end: VecSet
[0] PCSetUp(): Leaving PC with identical preconditioner since operator is unchanged
       [0] 10.0554 Event begin: KSPSetUp
           [0] 10.0554 Event end: KSPSetUp
[0] PCSetUp(): Setting up PC for first time
           [0] 10.0554 Event begin: PCSetUp
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
             [0] 10.0555 Event begin: VecSet
             [0] 10.0557 Event end: VecSet
[0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783
             [0] 10.0557 Event begin: VecSet
             [0] 10.0558 Event end: VecSet
             [0] 10.082 Event begin: MatMult
             [0] 10.1273 Event end: MatMult
             [0] 10.1277 Event begin: MatMult
             [0] 10.1739 Event end: MatMult
             [0] 10.1742 Event begin: MatMult
             [0] 10.2195 Event end: MatMult
             [0] 10.2199 Event begin: MatMult
             [0] 10.2653 Event end: MatMult
             [0] 10.2657 Event begin: MatMult
             [0] 10.3113 Event end: MatMult
             [0] 10.3116 Event begin: MatMult
             [0] 10.3571 Event end: MatMult
             [0] 10.3575 Event begin: MatMult
             [0] 10.403 Event end: MatMult
             [0] 10.4033 Event begin: MatMult
             [0] 10.4487 Event end: MatMult
             [0] 10.4491 Event begin: MatMult
             [0] 10.4947 Event end: MatMult
             [0] 10.495 Event begin: MatMult
             [0] 10.5406 Event end: MatMult




