[petsc-users] Nested Fieldsplit for custom index sets

Lawrence Mitchell lawrence.mitchell at imperial.ac.uk
Thu Jul 28 03:35:30 CDT 2016


Dear Artur,

On 28/07/16 02:20, Safin, Artur wrote:
> Barry, Lawrence,
> 
>> I think the SubKSPs (and therefore SubPCs) are not set up until you call KSPSetUp(ksp), which your code does not do explicitly, so it happens inside KSPSolve.
> 
> I added KSPSetUp(), but unfortunately the issue did not go away.
> 
> I have created a MWE that replicates the issue. The program tries to solve a tridiagonal system, where the first fieldsplit partitions the global matrix
> 
> [ P  x ]
> [ x  T ],
> 
> and the nested fieldsplit partitions P into
> 
> [ A  x ]
> [ x  B ].

Two things:

1. Always check the return value of every PETSc call.  This will
normally give you a very useful backtrace when something goes wrong.

That is, annotate all your calls with:

PetscErrorCode ierr;

ierr = SomePetscFunction(...); CHKERRQ(ierr);
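
For instance, in a helper function the pattern looks like this (a toy
sketch, not your code):

#include <petscksp.h>

static PetscErrorCode BuildSolver(MPI_Comm comm, KSP *ksp)
{
  PetscErrorCode ierr;

  ierr = KSPCreate(comm, ksp); CHKERRQ(ierr);      /* if this fails ...   */
  ierr = KSPSetFromOptions(*ksp); CHKERRQ(ierr);   /* ... this never runs */
  return 0;
}

CHKERRQ returns the error up the call stack, which is what produces
the numbered #1, #2, ... trace you see below.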

If I do this, I see that the call to KSPSetUp fails:

[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Petsc has generated inconsistent data
[0]PETSC ERROR: Unhandled case, must have at least two fields, not 1
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Development GIT revision: v3.7.2-931-g1e46b98  GIT Date: 2016-07-06 16:57:50 -0500

...

[0]PETSC ERROR: #1 PCFieldSplitSetDefaults() line 470 in /data/lmitche1/src/deps/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c
[0]PETSC ERROR: #2 PCSetUp_FieldSplit() line 487 in /data/lmitche1/src/deps/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c
[0]PETSC ERROR: #3 PCSetUp() line 968 in /data/lmitche1/src/deps/petsc/src/ksp/pc/interface/precon.c
[0]PETSC ERROR: #4 KSPSetUp() line 393 in /data/lmitche1/src/deps/petsc/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: #5 main() line 65 in /homes/lmitche1/tmp/ex.c

2. The reason is that you need to call KSPSetUp *after* setting the
outermost fieldsplit ISes: until those index sets are provided, the
fieldsplit PC knows about at most one field, which is exactly the
"must have at least two fields, not 1" error above.

If I move the call to KSPSetUp after those PCFieldSplitSetIS calls,
things work.  I've attached the working code.
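
In case the attachment gets scrubbed by the list, here is the call
order in outline.  This is a sketch I reconstructed to match the MWE
(100 rows, outer 50/50 split into P and T, inner 25/25 split of P
into A and B, run on one process); the names and sizes are mine, not
necessarily those in your code:

#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat            K;
  Vec            x, b;
  KSP            ksp, *subksp;
  PC             pc, pcP;
  IS             isP, isT, isA, isB;
  PetscInt       i, n = 100, nsplits;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;

  /* Tridiagonal test matrix. */
  ierr = MatCreate(PETSC_COMM_WORLD, &K); CHKERRQ(ierr);
  ierr = MatSetSizes(K, PETSC_DECIDE, PETSC_DECIDE, n, n); CHKERRQ(ierr);
  ierr = MatSetFromOptions(K); CHKERRQ(ierr);
  ierr = MatSetUp(K); CHKERRQ(ierr);
  for (i = 0; i < n; i++) {
    if (i > 0)   { ierr = MatSetValue(K, i, i-1, -1.0, INSERT_VALUES); CHKERRQ(ierr); }
    if (i < n-1) { ierr = MatSetValue(K, i, i+1, -1.0, INSERT_VALUES); CHKERRQ(ierr); }
    ierr = MatSetValue(K, i, i, 2.0, INSERT_VALUES); CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(K, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
  ierr = MatAssemblyEnd(K, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
  ierr = MatCreateVecs(K, &x, &b); CHKERRQ(ierr);
  ierr = VecSet(b, 1.0); CHKERRQ(ierr);

  ierr = KSPCreate(PETSC_COMM_WORLD, &ksp); CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp, K, K); CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp); CHKERRQ(ierr); /* applies -pc_type fieldsplit, ... */

  /* Outer split: rows [0,50) -> P, rows [50,100) -> T.  (Sequential
     run; in parallel each rank supplies only its local rows.) */
  ierr = KSPGetPC(ksp, &pc); CHKERRQ(ierr);
  ierr = ISCreateStride(PETSC_COMM_WORLD, 50, 0, 1, &isP); CHKERRQ(ierr);
  ierr = ISCreateStride(PETSC_COMM_WORLD, 50, 50, 1, &isT); CHKERRQ(ierr);
  ierr = PCFieldSplitSetIS(pc, "P", isP); CHKERRQ(ierr);
  ierr = PCFieldSplitSetIS(pc, "T", isT); CHKERRQ(ierr);

  /* KSPSetUp must come after the outer ISes are set: it is what
     creates the sub-KSPs and applies the fieldsplit_P_ options that
     turn the inner PC into a fieldsplit as well. */
  ierr = KSPSetUp(ksp); CHKERRQ(ierr);

  /* The inner ISes index into the P submatrix, i.e. run over [0,50). */
  ierr = PCFieldSplitGetSubKSP(pc, &nsplits, &subksp); CHKERRQ(ierr);
  ierr = KSPGetPC(subksp[0], &pcP); CHKERRQ(ierr);
  ierr = ISCreateStride(PETSC_COMM_WORLD, 25, 0, 1, &isA); CHKERRQ(ierr);
  ierr = ISCreateStride(PETSC_COMM_WORLD, 25, 25, 1, &isB); CHKERRQ(ierr);
  ierr = PCFieldSplitSetIS(pcP, "A", isA); CHKERRQ(ierr);
  ierr = PCFieldSplitSetIS(pcP, "B", isB); CHKERRQ(ierr);
  ierr = PetscFree(subksp); CHKERRQ(ierr);

  ierr = KSPSolve(ksp, b, x); CHKERRQ(ierr);

  ierr = ISDestroy(&isP); CHKERRQ(ierr);
  ierr = ISDestroy(&isT); CHKERRQ(ierr);
  ierr = ISDestroy(&isA); CHKERRQ(ierr);
  ierr = ISDestroy(&isB); CHKERRQ(ierr);
  ierr = VecDestroy(&x); CHKERRQ(ierr);
  ierr = VecDestroy(&b); CHKERRQ(ierr);
  ierr = MatDestroy(&K); CHKERRQ(ierr);
  ierr = KSPDestroy(&ksp); CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

Note that the names you pass to PCFieldSplitSetIS ("P", "T", "A", "B")
are what generate the fieldsplit_P_fieldsplit_A_ etc. option prefixes
used in the options file below.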

Cheers,

Lawrence

$ cat options.txt
-pc_type fieldsplit
-pc_fieldsplit_type multiplicative
-fieldsplit_T_ksp_type bcgs
-fieldsplit_P_ksp_type gmres
-fieldsplit_P_pc_type fieldsplit
-fieldsplit_P_pc_fieldsplit_type multiplicative
-fieldsplit_P_fieldsplit_A_ksp_type gmres
-fieldsplit_P_fieldsplit_B_pc_type lu
-fieldsplit_P_fieldsplit_B_ksp_type preonly
-ksp_converged_reason
-ksp_monitor_true_residual
-ksp_view

$ ./ex -options_file options.txt

  0 KSP preconditioned resid norm 5.774607007892e+00 true resid norm 1.414213562373e+00 ||r(i)||/||b|| 1.000000000000e+00
  1 KSP preconditioned resid norm 1.921795888956e-01 true resid norm 4.802975385197e-02 ||r(i)||/||b|| 3.396216464745e-02
  2 KSP preconditioned resid norm 1.436304589027e-12 true resid norm 2.435255920058e-13 ||r(i)||/||b|| 1.721985974998e-13
Linear solve converged due to CONVERGED_RTOL iterations 2
KSP Object: 1 MPI processes
  type: gmres
    GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
    GMRES: happy breakdown tolerance 1e-30
  maximum iterations=10000, initial guess is zero
  tolerances:  relative=1e-05, absolute=1e-50, divergence=10000.
  left preconditioning
  using PRECONDITIONED norm type for convergence test
PC Object: 1 MPI processes
  type: fieldsplit
    FieldSplit with MULTIPLICATIVE composition: total splits = 2
    Solver info for each split is in the following KSP objects:
    Split number 0 Defined by IS
    KSP Object: (fieldsplit_P_) 1 MPI processes
      type: gmres
        GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
        GMRES: happy breakdown tolerance 1e-30
      maximum iterations=10000, initial guess is zero
      tolerances:  relative=1e-05, absolute=1e-50, divergence=10000.
      left preconditioning
      using PRECONDITIONED norm type for convergence test
    PC Object: (fieldsplit_P_) 1 MPI processes
      type: fieldsplit
        FieldSplit with MULTIPLICATIVE composition: total splits = 2
        Solver info for each split is in the following KSP objects:
        Split number 0 Defined by IS
        KSP Object: (fieldsplit_P_fieldsplit_A_) 1 MPI processes
          type: gmres
            GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
            GMRES: happy breakdown tolerance 1e-30
          maximum iterations=10000, initial guess is zero
          tolerances:  relative=1e-05, absolute=1e-50, divergence=10000.
          left preconditioning
          using PRECONDITIONED norm type for convergence test
        PC Object: (fieldsplit_P_fieldsplit_A_) 1 MPI processes
          type: ilu
            ILU: out-of-place factorization
            0 levels of fill
            tolerance for zero pivot 2.22045e-14
            matrix ordering: natural
            factor fill ratio given 1., needed 1.
              Factored matrix follows:
                Mat Object: 1 MPI processes
                  type: seqaij
                  rows=25, cols=25
                  package used to perform factorization: petsc
                  total: nonzeros=73, allocated nonzeros=73
                  total number of mallocs used during MatSetValues calls =0
                    not using I-node routines
          linear system matrix = precond matrix:
          Mat Object: (fieldsplit_P_fieldsplit_A_) 1 MPI processes
            type: seqaij
            rows=25, cols=25
            total: nonzeros=73, allocated nonzeros=73
            total number of mallocs used during MatSetValues calls =0
              not using I-node routines
        Split number 1 Defined by IS
        KSP Object: (fieldsplit_P_fieldsplit_B_) 1 MPI processes
          type: preonly
          maximum iterations=10000, initial guess is zero
          tolerances:  relative=1e-05, absolute=1e-50, divergence=10000.
          left preconditioning
          using NONE norm type for convergence test
        PC Object: (fieldsplit_P_fieldsplit_B_) 1 MPI processes
          type: lu
            LU: out-of-place factorization
            tolerance for zero pivot 2.22045e-14
            matrix ordering: nd
            factor fill ratio given 5., needed 1.43836
              Factored matrix follows:
                Mat Object: 1 MPI processes
                  type: seqaij
                  rows=25, cols=25
                  package used to perform factorization: petsc
                  total: nonzeros=105, allocated nonzeros=105
                  total number of mallocs used during MatSetValues calls =0
                    not using I-node routines
          linear system matrix = precond matrix:
          Mat Object: (fieldsplit_P_fieldsplit_B_) 1 MPI processes
            type: seqaij
            rows=25, cols=25
            total: nonzeros=73, allocated nonzeros=73
            total number of mallocs used during MatSetValues calls =0
              not using I-node routines
      linear system matrix = precond matrix:
      Mat Object: (fieldsplit_P_) 1 MPI processes
        type: seqaij
        rows=50, cols=50
        total: nonzeros=148, allocated nonzeros=148
        total number of mallocs used during MatSetValues calls =0
          not using I-node routines
    Split number 1 Defined by IS
    KSP Object: (fieldsplit_T_) 1 MPI processes
      type: bcgs
      maximum iterations=10000, initial guess is zero
      tolerances:  relative=1e-05, absolute=1e-50, divergence=10000.
      left preconditioning
      using PRECONDITIONED norm type for convergence test
    PC Object: (fieldsplit_T_) 1 MPI processes
      type: ilu
        ILU: out-of-place factorization
        0 levels of fill
        tolerance for zero pivot 2.22045e-14
        matrix ordering: natural
        factor fill ratio given 1., needed 1.
          Factored matrix follows:
            Mat Object: 1 MPI processes
              type: seqaij
              rows=50, cols=50
              package used to perform factorization: petsc
              total: nonzeros=148, allocated nonzeros=148
              total number of mallocs used during MatSetValues calls =0
                not using I-node routines
      linear system matrix = precond matrix:
      Mat Object: (fieldsplit_T_) 1 MPI processes
        type: seqaij
        rows=50, cols=50
        total: nonzeros=148, allocated nonzeros=148
        total number of mallocs used during MatSetValues calls =0
          not using I-node routines
  linear system matrix = precond matrix:
  Mat Object: 1 MPI processes
    type: seqaij
    rows=100, cols=100
    total: nonzeros=298, allocated nonzeros=500
    total number of mallocs used during MatSetValues calls =0
      not using I-node routines


Attachment: ex.c (text/x-csrc, 2885 bytes)
<http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20160728/c8397e86/attachment-0001.c>