[petsc-users] understanding the LSC preconditioner
Barry Smith
bsmith at mcs.anl.gov
Tue Feb 21 13:09:31 CST 2017
Hmm, it is crashing inside ML while trying to build the ML preconditioner. This is certainly not expected; some data may have been corrupted earlier. I'm not sure how to track it down; maybe run the python under valgrind?
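
For example, something like

    valgrind --track-origins=yes python StokesPC/stokespc/stokes_bench.py

and then look at the first invalid read or write it reports.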
Barry
> On Feb 21, 2017, at 1:02 PM, David Nolte <dnolte at dim.uchile.cl> wrote:
>
> It crashes in ML_matmat_mult():
>
> Program received signal SIGSEGV, Segmentation fault.
> 0x00007ffff5cdaeb0 in ML_matmat_mult () from /usr/local/petsc-32/lib/libpetsc.so.3.7
>
> (gdb) bt
> #0  0x00007ffff5cdaeb0 in ML_matmat_mult () from /usr/local/petsc-32/lib/libpetsc.so.3.7
> #1  0x00007ffff5cdbf76 in ML_2matmult () from /usr/local/petsc-32/lib/libpetsc.so.3.7
> #2  0x00007ffff5ca197a in ML_AGG_Gen_Prolongator () from /usr/local/petsc-32/lib/libpetsc.so.3.7
> #3  0x00007ffff5c9fa71 in ML_Gen_MultiLevelHierarchy () from /usr/local/petsc-32/lib/libpetsc.so.3.7
> #4  0x00007ffff5ca0484 in ML_Gen_MultiLevelHierarchy_UsingAggregation () from /usr/local/petsc-32/lib/libpetsc.so.3.7
> #5  0x00007ffff5780aff in PCSetUp_ML () from /usr/local/petsc-32/lib/libpetsc.so.3.7
> #6  0x00007ffff5667fce in PCSetUp () from /usr/local/petsc-32/lib/libpetsc.so.3.7
> #7  0x00007ffff578d9f8 in KSPSetUp () from /usr/local/petsc-32/lib/libpetsc.so.3.7
> #8  0x00007ffff578e7d8 in KSPSolve () from /usr/local/petsc-32/lib/libpetsc.so.3.7
> #9  0x00007ffff578606b in PCApply_LSC () from /usr/local/petsc-32/lib/libpetsc.so.3.7
> #10 0x00007ffff5668640 in PCApply () from /usr/local/petsc-32/lib/libpetsc.so.3.7
> #11 0x00007ffff57f05a5 in KSPSolve_PREONLY () from /usr/local/petsc-32/lib/libpetsc.so.3.7
> #12 0x00007ffff578ea63 in KSPSolve () from /usr/local/petsc-32/lib/libpetsc.so.3.7
> #13 0x00007ffff5746cf4 in PCApply_FieldSplit_Schur () from /usr/local/petsc-32/lib/libpetsc.so.3.7
> #14 0x00007ffff5668640 in PCApply () from /usr/local/petsc-32/lib/libpetsc.so.3.7
> #15 0x00007ffff580ab2d in KSPFGMRESCycle () from /usr/local/petsc-32/lib/libpetsc.so.3.7
> #16 0x00007ffff580b900 in KSPSolve_FGMRES () from /usr/local/petsc-32/lib/libpetsc.so.3.7
> #17 0x00007ffff578ea63 in KSPSolve () from /usr/local/petsc-32/lib/libpetsc.so.3.7
> #18 0x00007ffff6597fbf in __pyx_pf_8petsc4py_5PETSc_3KSP_98solve (__pyx_v_self=0x7fffe4f59830, __pyx_v_b=<optimized out>, __pyx_v_x=<optimized out>) at src/petsc4py.PETSc.c:153555
> [...]
>
> Is this where it tries to perform the matrix multiplication
> `Bdiv.matMult(Bgrad)`?
>
> When I use LU instead of ML in the inner PC,
>
> -fieldsplit_1_lsc_ksp_type preonly
> -fieldsplit_1_lsc_pc_type lu
>
> I get a "wrong argument" error:
>
> File "StokesPC/stokespc/stokes_bench.py", line 242, in
> solve_petsc
> ksp.solve(b.vec(), x)
> File "PETSc/KSP.pyx", line 384, in petsc4py.PETSc.KSP.solve
> (src/petsc4py.PETSc.c:153555)
> petsc4py.PETSc.Error: error code 62
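>
> (If I read petscerror.h correctly, error code 62 is PETSC_ERR_ARG_WRONG,
> i.e. a "wrong argument" error, but I have not traced which argument it
> complains about.)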
>
>
> Regards,
> David
>
>
> On 02/21/2017 02:20 PM, Barry Smith wrote:
>> You'll have to figure out what is triggering the segmentation violation. If it is the python that is crashing, you can likely run the entire python program in the debugger; when it crashes, you should be able to see where.
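>>
>> For example, something like
>>
>>     gdb --args python StokesPC/stokespc/stokes_bench.py
>>     (gdb) run
>>     (gdb) bt
>>
>> should give a backtrace at the point of the crash.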
>>
>> Barry
>>
>>
>>> On Feb 21, 2017, at 10:07 AM, David Nolte <dnolte at dim.uchile.cl> wrote:
>>>
>>> Dear all,
>>>
>>> New to PETSc, I am trying to use the LSC preconditioner for a Stokes
>>> problem (discretized by means of stable FEM), using the python backend
>>> petsc4py. The "automatic" version of the LSC seems to work with the
>>> following setup (I think adapted from Matt's tutorial slides); a sketch
>>> of setting these options from petsc4py follows the list:
>>>
>>> -ksp_view
>>> -ksp_converged_reason
>>> -ksp_monitor_true_residual
>>> -ksp_type fgmres
>>> -ksp_rtol 1.0e-8
>>>
>>> -pc_type fieldsplit
>>> -pc_fieldsplit_detect_saddle_point
>>> -pc_fieldsplit_type schur
>>> -pc_fieldsplit_schur_fact_type upper
>>> -pc_fieldsplit_schur_precondition self
>>>
>>> -fieldsplit_0_ksp_type preonly
>>> -fieldsplit_0_pc_type ml
>>>
>>> -fieldsplit_1_ksp_type preonly
>>> -fieldsplit_1_pc_type lsc
>>> -fieldsplit_1_lsc_pc_type ml
>>> -fieldsplit_1_lsc_ksp_type preonly
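>>>
>>> For reference, a minimal sketch of how these options can be set from
>>> petsc4py, assuming `ksp` is the KSP object from my script:
>>>
>>>     from petsc4py import PETSc
>>>     opts = PETSc.Options()
>>>     opts['ksp_type'] = 'fgmres'
>>>     opts['pc_type'] = 'fieldsplit'
>>>     opts['pc_fieldsplit_type'] = 'schur'
>>>     opts['pc_fieldsplit_schur_precondition'] = 'self'
>>>     opts['fieldsplit_1_pc_type'] = 'lsc'
>>>     # ... and so on for the remaining options in the list above ...
>>>     ksp.setFromOptions()  # the KSP reads these database entries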
>>>
>>> In a 3D setup with 250k dofs this converges within 78 iterations. (For
>>> reference, upper Schur factorization with ML for the uu-block and Sp =
>>> diag(Q), the diagonal of the pressure mass matrix, takes 41 iterations
>>> and half of the computation time; a rough sketch of that variant
>>> follows.)
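>>>
>>> The diag(Q) variant looks roughly like this, assuming `Qp` is the
>>> assembled pressure mass matrix on the pressure dofs (`Qp` is from my
>>> script; `pc` is the fieldsplit PC):
>>>
>>>     dQ = Qp.getDiagonal()  # Vec holding diag(Q)
>>>     Sp = PETSc.Mat().createAIJ(Qp.getSizes(), nnz=1)
>>>     rstart, rend = Sp.getOwnershipRange()
>>>     for i, v in enumerate(dQ.getArray(), start=rstart):
>>>         Sp[i, i] = v  # insert the diagonal entries
>>>     Sp.assemble()
>>>     pc.setFieldSplitSchurPreType(PETSc.PC.SchurPreType.USER, Sp)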
>>>
>>> Now I just wanted to check if I can get the same result by building the
>>> L-matrix manually with the following piece of python code, where is0 and
>>> is1 are the index sets corresponding to the velocity and pressure dofs,
>>> and A is the full system matrix.
>>>
>>> Sp = Sp.getSubMatrix(is1, is1)
>>> pc.setFieldSplitSchurPreType(PETSc.PC.SchurPreType.USER, Sp)
>>> # Sp.setType(PETSc.Mat.Type.SCHURCOMPLEMENT) # necessary?
>>> # extract A10 block
>>> Bdiv = A.getSubMatrix(is1, is0)
>>> # extract A01 block
>>> Bgrad = A.getSubMatrix(is0, is1)
>>> L = Bdiv
>>> L.matMult(Bgrad)
>>> Sp.compose('LSC_L', L)
>>> Sp.compose('LSC_Lp', L)
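>>>
>>> One thing I am not sure about: if `matMult` returns the product as a
>>> new matrix instead of overwriting `L` in place, the construction would
>>> have to capture the return value, e.g.
>>>
>>>     L = Bdiv.matMult(Bgrad)  # L = A10 * A01 as a new Mat
>>>     Sp.compose('LSC_L', L)
>>>     Sp.compose('LSC_Lp', L)
>>>
>>> but I have not verified which behaviour the bindings have.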
>>>
>>> To my understanding, this should behave similarly to what the LSC
>>> preconditioner does when LSC_L is not given. However, I get a
>>> segmentation fault during the first iteration:
>>>
>>> 0 KSP unpreconditioned resid norm 2.963704216563e+01 true resid norm
>>> 2.963704216563e+01 ||r(i)||/||b|| 1.000000000000e+00
>>> [1] 2311 segmentation fault (core dumped) python
>>> StokesPC/stokespc/stokes_bench.py
>>>
>>> What am I doing wrong? I appreciate any hints, thanks a lot in advance!
>>>
>>> Regards,
>>> David
>>>
>>>
>>> PS: The log trace is:
>>> 0 KSP unpreconditioned resid norm 2.963704216563e+01 true resid norm
>>> 2.963704216563e+01 ||r(i)||/||b|| 1.000000000000e+00
>>> [0] 10.0543 Event begin: VecScale
>>> [0] 10.0545 Event end: VecScale
>>> [0] PCSetUp(): Leaving PC with identical preconditioner since operator
>>> is unchanged
>>> [0] 10.0545 Event begin: PCApply
>>> [0] 10.0545 Event begin: VecScatterBegin
>>> [0] 10.0546 Event end: VecScatterBegin
>>> [0] 10.0546 Event begin: KSPSolve_FS_Schu
>>> [0] 10.0546 Event begin: KSPSetUp
>>> [0] 10.0546 Event end: KSPSetUp
>>> [0] PCSetUp(): Setting up PC for first time
>>> [0] 10.0546 Event begin: PCSetUp
>>> [0] 10.0547 Event begin: VecSet
>>> [0] 10.055 Event end: VecSet
>>> [0] 10.055 Event begin: VecSet
>>> [0] 10.0553 Event end: VecSet
>>> [0] 10.0553 Event begin: VecSet
>>> [0] 10.0553 Event end: VecSet
>>> [0] 10.0554 Event end: PCSetUp
>>> [0] 10.0554 Event begin: VecSet
>>> [0] 10.0554 Event end: VecSet
>>> [0] PCSetUp(): Leaving PC with identical preconditioner since operator
>>> is unchanged
>>> [0] 10.0554 Event begin: KSPSetUp
>>> [0] 10.0554 Event end: KSPSetUp
>>> [0] PCSetUp(): Setting up PC for first time
>>> [0] 10.0554 Event begin: PCSetUp
>>> [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
>>> -2080374783
>>> [0] 10.0555 Event begin: VecSet
>>> [0] 10.0557 Event end: VecSet
>>> [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689
>>> -2080374783
>>> [0] 10.0557 Event begin: VecSet
>>> [0] 10.0558 Event end: VecSet
>>> [0] 10.082 Event begin: MatMult
>>> [0] 10.1273 Event end: MatMult
>>> [0] 10.1277 Event begin: MatMult
>>> [0] 10.1739 Event end: MatMult
>>> [0] 10.1742 Event begin: MatMult
>>> [0] 10.2195 Event end: MatMult
>>> [0] 10.2199 Event begin: MatMult
>>> [0] 10.2653 Event end: MatMult
>>> [0] 10.2657 Event begin: MatMult
>>> [0] 10.3113 Event end: MatMult
>>> [0] 10.3116 Event begin: MatMult
>>> [0] 10.3571 Event end: MatMult
>>> [0] 10.3575 Event begin: MatMult
>>> [0] 10.403 Event end: MatMult
>>> [0] 10.4033 Event begin: MatMult
>>> [0] 10.4487 Event end: MatMult
>>> [0] 10.4491 Event begin: MatMult
>>> [0] 10.4947 Event end: MatMult
>>> [0] 10.495 Event begin: MatMult
>>> [0] 10.5406 Event end: MatMult