[petsc-users] PCFIELDSPLIT on a MATNEST with MATSHELL off-diagonal blocks

Matthew Knepley knepley at gmail.com
Fri Apr 29 09:01:34 CDT 2016


On Fri, Apr 29, 2016 at 8:32 AM, Le Chenadec, Vincent <vlc at illinois.edu>
wrote:

> Dear Matt,
>
> thanks for your prompt response.
>
> You were right (KSPView output below): I was trying to precondition the
> off-diagonal blocks (I messed up the ordering of the blocks, stored as
> ((00), (01), (11, 10)), where I should have simply used row-major ordering).
>
> One question: is PCFieldSplitSetIS the most effective way to go about this?
> Or is there a way to pass the DMDAs directly rather than extracting the ISs
> from A?
>

This is the best way when you are using a bunch of DMs. I have a more
flexible dof specification in DMPlex, but
it has not migrated to DMDA yet.
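For reference, a minimal sketch of supplying the index sets of a MATNEST to
PCFIELDSPLIT (assuming a 3x3 nest matrix A and a KSP named ksp; the variable
names are illustrative and error checking is omitted):

    IS rows[3], cols[3];
    PC pc;

    KSPGetPC(ksp, &pc);
    PCSetType(pc, PCFIELDSPLIT);

    /* Pull one IS per block row out of the MATNEST and register one split
       per diagonal block; the names give the fieldsplit_0_/1_/2_ prefixes. */
    MatNestGetISs(A, rows, cols);
    PCFieldSplitSetIS(pc, "0", rows[0]);
    PCFieldSplitSetIS(pc, "1", rows[1]);
    PCFieldSplitSetIS(pc, "2", rows[2]);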

  Thanks,

    Matt


> Thanks a lot,
> Vincent
>
> KSP Object: 8 MPI processes
>   type: cg
>   maximum iterations=10000, initial guess is zero
>   tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
>   left preconditioning
>   using PRECONDITIONED norm type for convergence test
> PC Object: 8 MPI processes
>   type: fieldsplit
>     FieldSplit with MULTIPLICATIVE composition: total splits = 3
>     Solver info for each split is in the following KSP objects:
>     Split number 0 Defined by IS
>     KSP Object:    (fieldsplit_0_)     8 MPI processes
>       type: cg
>       maximum iterations=10000, initial guess is zero
>       tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
>       left preconditioning
>       using DEFAULT norm type for convergence test
>     PC Object:    (fieldsplit_0_)     8 MPI processes
>       type: jacobi
>       PC has not been set up so information may be incomplete
>       linear system matrix = precond matrix:
>       Mat Object:      (fieldsplit_0_)       8 MPI processes
>         type: mpiaij
>         rows=262144, cols=262144
>         total: nonzeros=7.07789e+06, allocated nonzeros=7.07789e+06
>         total number of mallocs used during MatSetValues calls =0
>     Split number 1 Defined by IS
>     KSP Object:    (fieldsplit_1_)     8 MPI processes
>       type: cg
>       maximum iterations=10000, initial guess is zero
>       tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
>       left preconditioning
>       using DEFAULT norm type for convergence test
>     PC Object:    (fieldsplit_1_)     8 MPI processes
>       type: jacobi
>       PC has not been set up so information may be incomplete
>       linear system matrix = precond matrix:
>       Mat Object:      (fieldsplit_1_)       8 MPI processes
>         type: shell
>         rows=262144, cols=262144
>     Split number 2 Defined by IS
>     KSP Object:    (fieldsplit_2_)     8 MPI processes
>       type: cg
>       maximum iterations=10000, initial guess is zero
>       tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
>       left preconditioning
>       using DEFAULT norm type for convergence test
>     PC Object:    (fieldsplit_2_)     8 MPI processes
>       type: jacobi
>       PC has not been set up so information may be incomplete
>       linear system matrix = precond matrix:
>       Mat Object:      (fieldsplit_2_)       8 MPI processes
>         type: shell
>         rows=262144, cols=262144
>   linear system matrix = precond matrix:
>   Mat Object:   8 MPI processes
>     type: nest
>     rows=786432, cols=786432
>       Matrix object:
>         type=nest, rows=3, cols=3
>         MatNest structure:
>         (0,0) : prefix="fieldsplit_0_", type=mpiaij, rows=262144, cols=262144
>         (0,1) : type=shell, rows=262144, cols=262144
>         (0,2) : type=shell, rows=262144, cols=262144
>         (1,0) : type=mpiaij, rows=262144, cols=262144
>         (1,1) : prefix="fieldsplit_1_", type=shell, rows=262144, cols=262144
>         (1,2) : type=shell, rows=262144, cols=262144
>         (2,0) : type=mpiaij, rows=262144, cols=262144
>         (2,1) : type=shell, rows=262144, cols=262144
>         (2,2) : prefix="fieldsplit_2_", type=shell, rows=262144, cols=262144
>
>
> ------------------------------
> From: Matthew Knepley [knepley at gmail.com]
> Sent: Friday, April 29, 2016 7:42 AM
> To: Le Chenadec, Vincent
> Cc: petsc-users at mcs.anl.gov
> Subject: Re: [petsc-users] PCFIELDSPLIT on a MATNEST with MATSHELL
> off-diagonal blocks
>
> On Fri, Apr 29, 2016 at 7:34 AM, Le Chenadec, Vincent <vlc at illinois.edu>
> wrote:
>
>> Dear all,
>>
>> I'm trying to use PCFIELDSPLIT as a preconditioner on a block matrix A,
>>
>> [A_{00}   A_{01}]
>> [A_{10}   A_{11}]
>>
>> where the diagonal blocks (A_{00} and A_{11}) are diagonally dominant. I
>> would like to use either one of the block preconditioners (not even based
>> on the Schur complement, simply either one of the Jacobi/Gauss-Seidel ones).
>>
>
> For any questions about solvers, we need to see the output of -ksp_view.
>
> From the error output below, it seems clear that either a) one of your
> diagonal blocks is a MatShell, or b) you are trying to invert an
> off-diagonal block.
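For case a), PCJACOBI fails on a MATSHELL because a shell matrix does not
provide MatGetDiagonal by default, which is exactly the MatGetDiagonal error
in the trace below. A minimal sketch of how one could supply it, assuming a
hypothetical user routine MyShellGetDiagonal and a shell matrix A11:

    static PetscErrorCode MyShellGetDiagonal(Mat A, Vec d)
    {
      PetscErrorCode ierr;
      /* Placeholder: fill d with the true diagonal of the shell's operator */
      ierr = VecSet(d, 1.0);CHKERRQ(ierr);
      return 0;
    }

    /* after MatCreateShell(..., &A11): */
    MatShellSetOperation(A11, MATOP_GET_DIAGONAL, (void (*)(void)) MyShellGetDiagonal);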
>
>    Matt
>
>
>> The issue I'm running into concerns the formats I would like to use, so
>> I'll start with that:
>>
>>    1. A_{00} and A_{11} are both in MATMPIAIJ format (created using
>>    DMCreateMatrix on 2D DMDAs of different sizes),
>>    2. A_{10} and A_{01} are both in MATSHELL format (created using
>>    MatCreateShell). The reason I used this format is that it seemed
>>    convenient, since the 2D DMDAs are of different sizes and I assumed I
>>    only had to provide the MATOP_MULT operation (a rough sketch of this
>>    setup follows the list),
>>    3. Finally the blocks are stored in MATNEST format (MatCreateNest).
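A rough sketch of that setup for the 2x2 case, assuming two DMDAs da0 and da1
and user multiplication routines MyMult01/MyMult10 (illustrative names, error
checking omitted):

    Mat      A00, A01, A10, A11, blocks[4], A;
    PetscInt m0, n0, m1, n1;

    /* Diagonal blocks: assembled MPIAIJ matrices from the two DMDAs */
    DMCreateMatrix(da0, &A00);
    DMCreateMatrix(da1, &A11);
    MatGetLocalSize(A00, &m0, &n0);
    MatGetLocalSize(A11, &m1, &n1);

    /* Off-diagonal coupling blocks as shells, providing only MATOP_MULT */
    MatCreateShell(PETSC_COMM_WORLD, m0, n1, PETSC_DETERMINE, PETSC_DETERMINE, NULL, &A01);
    MatShellSetOperation(A01, MATOP_MULT, (void (*)(void)) MyMult01);
    MatCreateShell(PETSC_COMM_WORLD, m1, n0, PETSC_DETERMINE, PETSC_DETERMINE, NULL, &A10);
    MatShellSetOperation(A10, MATOP_MULT, (void (*)(void)) MyMult10);

    /* MatCreateNest expects the blocks in row-major order: (0,0), (0,1), (1,0), (1,1) */
    blocks[0] = A00; blocks[1] = A01; blocks[2] = A10; blocks[3] = A11;
    MatCreateNest(PETSC_COMM_WORLD, 2, NULL, 2, NULL, blocks, &A);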
>>
>> With this choice, matrix-vector multiplications work fine (the DMDAs
>> being packed in a DMComposite), which makes me think that so far the code
>> is ok. Also, when I attempt to invert the matrix without a preconditioner,
>> I get the correct result.
>>
>> Now here is where things do not work: if I try to use a preconditioner,
>> I get the error pasted below (for a simple PCJACOBI)
>>
>> [0]PETSC ERROR: No support for this operation for this object type
>>
>> [0]PETSC ERROR: Mat type shell
>>
>> [0]PETSC ERROR: #1 MatGetDiagonal() line 4306 in
>> petsc/3.6.3_dbg/src/mat/interface/matrix.c
>>
>> [0]PETSC ERROR: #2 PCSetUp_Jacobi() line 180 in
>> petsc/3.6.3_dbg/src/ksp/pc/impls/jacobi/jacobi.c
>>
>> [0]PETSC ERROR: #3 PCSetUp_Jacobi_NonSymmetric() line 261 in
>> petsc/3.6.3_dbg/src/ksp/pc/impls/jacobi/jacobi.c
>>
>> [0]PETSC ERROR: #4 PCApply_Jacobi() line 286 in
>> petsc/3.6.3_dbg/src/ksp/pc/impls/jacobi/jacobi.c
>>
>> [0]PETSC ERROR: #5 PCApply() line 483 in
>> petsc/3.6.3_dbg/src/ksp/pc/interface/precon.c
>>
>> [0]PETSC ERROR: #6 KSP_PCApply() line 242 in
>> petsc/3.6.3_dbg/include/petsc/private/kspimpl.h
>>
>> [0]PETSC ERROR: #7 KSPSolve_CG() line 139 in
>> petsc/3.6.3_dbg/src/ksp/ksp/impls/cg/cg.c
>>
>> [0]PETSC ERROR: #8 KSPSolve() line 604 in
>> petsc/3.6.3_dbg/src/ksp/ksp/interface/itfunc.c
>>
>> [0]PETSC ERROR: #9 PCApply_FieldSplit() line 988 in
>> petsc/3.6.3_dbg/src/ksp/pc/impls/fieldsplit/fieldsplit.c
>>
>> [0]PETSC ERROR: #10 PCApply() line 483 in
>> petsc/3.6.3_dbg/src/ksp/pc/interface/precon.c
>>
>> [0]PETSC ERROR: #11 KSP_PCApply() line 242 in
>> petsc/3.6.3_dbg/include/petsc/private/kspimpl.h
>>
>> [0]PETSC ERROR: #12 KSPSolve_CG() line 139 in
>> petsc/3.6.3_dbg/src/ksp/ksp/impls/cg/cg.c
>>
>> [0]PETSC ERROR: #13 KSPSolve() line 604 in
>> petsc/3.6.3_dbg/src/ksp/ksp/interface/itfunc.c
>>
>> I may not be calling the appropriate routines; here is what I'm
>> basically doing:
>>
>>    1. KSP type set to KSPCG (A is symmetric), PC type to PCFIELDSPLIT,
>>    2. I then use MatNestGetISs on A to supply the ISs to
>>    PCFieldSplitSetIS,
>>    3. Finally I use PCFieldSplitGetSubKSP to set the subKSP for each
>>    diagonal block (a short sketch follows this list).
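A minimal sketch of step 3, assuming the nest matrix A has already been set on
ksp with KSPSetOperators and the splits defined as in step 2 (illustrative,
error checking omitted; the array returned by PCFieldSplitGetSubKSP must be
freed by the caller):

    KSP      *subksp;
    PC        pc, subpc;
    PetscInt  i, nsplit;

    KSPSetType(ksp, KSPCG);
    KSPGetPC(ksp, &pc);
    PCSetType(pc, PCFIELDSPLIT);
    /* ... PCFieldSplitSetIS(pc, ...) with the ISs from MatNestGetISs, as in step 2 ... */

    /* The split sub-KSPs exist only after the preconditioner has been set up */
    KSPSetUp(ksp);
    PCFieldSplitGetSubKSP(pc, &nsplit, &subksp);
    for (i = 0; i < nsplit; i++) {
      KSPSetType(subksp[i], KSPCG);
      KSPGetPC(subksp[i], &subpc);
      PCSetType(subpc, PCJACOBI);
    }
    PetscFree(subksp);  /* caller frees the array from PCFieldSplitGetSubKSP */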
>>
>> As I mentioned, as long as the subKSPs don't have any preconditioning
>> (PCNONE), KSPSolve does not complain; only when I try to use one (even
>> PCJACOBI) do I get similar errors. That makes me suspect that I'm doing
>> something wrong, but I could not find an example that does the same
>> thing...
>>
>> Any pointers would be much appreciated!
>>
>> Thanks,
>> Vincent
>>
>
>
>



-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener