[petsc-users] Nullspaces for schur complement PCs
Matthew Knepley
knepley at gmail.com
Mon Nov 10 15:48:27 CST 2014
On Fri, Nov 7, 2014 at 11:04 AM, Lawrence Mitchell <lawrence.mitchell at imperial.ac.uk> wrote:
> Hi petsc-dev,
>
> I'm solving a pure Neumann mixed Poisson-like problem, preconditioning
> with a Schur complement. The pressure space has a nullspace of the
> constant functions, so I attach the appropriate nullspace to the Krylov
> solver and compose the constant nullspace with the IS defining the
> pressure space. My RHS is consistent.
>
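For reference, the setup described above presumably looks something like the
following sketch (untested; the variable names ksp and is_pressure are
illustrative, and "nullspace" is the key queried on the IS in fieldsplit.c):

  MatNullSpace nsp;

  /* Nullspace of the constant functions; no extra basis vectors */
  ierr = MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0, NULL, &nsp);CHKERRQ(ierr);
  /* Attach it to the outer Krylov solver (the RHS is assumed consistent) */
  ierr = KSPSetNullSpace(ksp, nsp);CHKERRQ(ierr);
  /* Compose it with the IS defining the pressure space so that
     PCFIELDSPLIT can find it for that split */
  ierr = PetscObjectCompose((PetscObject)is_pressure, "nullspace", (PetscObject)nsp);CHKERRQ(ierr);
  ierr = MatNullSpaceDestroy(&nsp);CHKERRQ(ierr);
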
That is supposed to work, and I think it does in my tests. The code is here:
https://bitbucket.org/petsc/petsc/src/1f0d623c8336219eb98f7ded6f95c151ca603fe7/src/ksp/pc/impls/fieldsplit/fieldsplit.c?at=master#cl-562
so maybe we can track this down in the debugger.
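In the meantime, a possible workaround (just a sketch, untested) would be to
set up the PC, pull out the assembled Sp, and attach the nullspace to it by
hand; this assumes pc is the fieldsplit PC, nsp is the constant nullspace,
and split 1 is the Schur complement, as in the ksp_view below:

  KSP      *subksp;
  PetscInt  n;
  Mat       S, Sp;

  ierr = PCSetUp(pc);CHKERRQ(ierr);               /* assembles Sp for selfp */
  ierr = PCFieldSplitGetSubKSP(pc, &n, &subksp);CHKERRQ(ierr);
  ierr = KSPGetOperators(subksp[1], &S, &Sp);CHKERRQ(ierr);
  ierr = MatSetNullSpace(Sp, nsp);CHKERRQ(ierr);  /* the transfer that is missing */
  ierr = PetscFree(subksp);CHKERRQ(ierr);
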
Thanks,
Matt
> When I precondition with:
>
> -pc_type fieldsplit -pc_fieldsplit_type schur
> -pc_fieldsplit_schur_fact_type full \
> -pc_fieldsplit_schur_precondition selfp
>
> I notice that the nullspace is not transferred to the preconditioning
> matrix for S. Is this a deliberate choice?
>
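(For context: as I understand it, selfp assembles

  Sp = A11 - A10 inv(diag(A00)) A01,

i.e. inv(A00) is replaced by the inverse of the diagonal of A00, which matches
the "(the lumped) A00's diagonal's inverse" line in the ksp_view below. Since
Sp is a freshly assembled Mat, it starts out with no nullspace attached.)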
> ksp_view output is below; note that the schurcomplement Mat has an attached
> nullspace, but the generated pmat does not.
>
> Cheers,
>
> Lawrence
>
> KSP Object: 1 MPI processes
>   type: gmres
>     GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
>     GMRES: happy breakdown tolerance 1e-30
>   maximum iterations=30, initial guess is zero
>   tolerances: relative=1e-07, absolute=1e-50, divergence=10000
>   left preconditioning
>   has attached null space
>   using PRECONDITIONED norm type for convergence test
> PC Object: 1 MPI processes
>   type: fieldsplit
>     FieldSplit with Schur preconditioner, factorization FULL
>     Preconditioner for the Schur complement formed from Sp, an assembled approximation to S, which uses (the lumped) A00's diagonal's inverse
>     Split info:
>     Split number 0 Defined by IS
>     Split number 1 Defined by IS
>     KSP solver for A00 block
>       KSP Object: (fieldsplit_0_) 1 MPI processes
>         type: gmres
>           GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
>           GMRES: happy breakdown tolerance 1e-30
>         maximum iterations=10000, initial guess is zero
>         tolerances: relative=1e-05, absolute=1e-50, divergence=10000
>         left preconditioning
>         using PRECONDITIONED norm type for convergence test
>       PC Object: (fieldsplit_0_) 1 MPI processes
>         type: ilu
>           ILU: out-of-place factorization
>           0 levels of fill
>           tolerance for zero pivot 2.22045e-14
>           matrix ordering: natural
>           factor fill ratio given 1, needed 1
>           Factored matrix follows:
>             Mat Object: 1 MPI processes
>               type: seqaij
>               rows=72, cols=72
>               package used to perform factorization: petsc
>               total: nonzeros=1080, allocated nonzeros=1080
>               total number of mallocs used during MatSetValues calls =0
>               using I-node routines: found 23 nodes, limit used is 5
>         linear system matrix = precond matrix:
>         Mat Object: (fieldsplit_0_) 1 MPI processes
>           type: seqaij
>           rows=72, cols=72
>           total: nonzeros=1080, allocated nonzeros=0
>           total number of mallocs used during MatSetValues calls =0
>           using I-node routines: found 23 nodes, limit used is 5
>     KSP solver for S = A11 - A10 inv(A00) A01
>       KSP Object: (fieldsplit_1_) 1 MPI processes
>         type: gmres
>           GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
>           GMRES: happy breakdown tolerance 1e-30
>         maximum iterations=10000, initial guess is zero
>         tolerances: relative=1e-05, absolute=1e-50, divergence=10000
>         left preconditioning
>         using PRECONDITIONED norm type for convergence test
>       PC Object: (fieldsplit_1_) 1 MPI processes
>         type: ilu
>           ILU: out-of-place factorization
>           0 levels of fill
>           tolerance for zero pivot 2.22045e-14
>           matrix ordering: natural
>           factor fill ratio given 1, needed 1
>           Factored matrix follows:
>             Mat Object: 1 MPI processes
>               type: seqaij
>               rows=24, cols=24
>               package used to perform factorization: petsc
>               total: nonzeros=216, allocated nonzeros=216
>               total number of mallocs used during MatSetValues calls =0
>               using I-node routines: found 8 nodes, limit used is 5
>         linear system matrix followed by preconditioner matrix:
>         Mat Object: (fieldsplit_1_) 1 MPI processes
>           type: schurcomplement
>           rows=24, cols=24
>           has attached null space
>           Schur complement A11 - A10 inv(A00) A01
>             A11
>               Mat Object: (fieldsplit_1_) 1 MPI processes
>                 type: seqaij
>                 rows=24, cols=24
>                 total: nonzeros=72, allocated nonzeros=0
>                 total number of mallocs used during MatSetValues calls =0
>                 has attached null space
>                 using I-node routines: found 8 nodes, limit used is 5
>             A10
>               Mat Object: 1 MPI processes
>                 type: seqaij
>                 rows=24, cols=72
>                 total: nonzeros=288, allocated nonzeros=0
>                 total number of mallocs used during MatSetValues calls =0
>                 using I-node routines: found 8 nodes, limit used is 5
>             KSP of A00
>               KSP Object: (fieldsplit_0_) 1 MPI processes
>                 type: gmres
>                   GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
>                   GMRES: happy breakdown tolerance 1e-30
>                 maximum iterations=10000, initial guess is zero
>                 tolerances: relative=1e-05, absolute=1e-50, divergence=10000
>                 left preconditioning
>                 using PRECONDITIONED norm type for convergence test
>               PC Object: (fieldsplit_0_) 1 MPI processes
>                 type: ilu
>                   ILU: out-of-place factorization
>                   0 levels of fill
>                   tolerance for zero pivot 2.22045e-14
>                   matrix ordering: natural
>                   factor fill ratio given 1, needed 1
>                   Factored matrix follows:
>                     Mat Object: 1 MPI processes
>                       type: seqaij
>                       rows=72, cols=72
>                       package used to perform factorization: petsc
>                       total: nonzeros=1080, allocated nonzeros=1080
>                       total number of mallocs used during MatSetValues calls =0
>                       using I-node routines: found 23 nodes, limit used is 5
>                 linear system matrix = precond matrix:
>                 Mat Object: (fieldsplit_0_) 1 MPI processes
>                   type: seqaij
>                   rows=72, cols=72
>                   total: nonzeros=1080, allocated nonzeros=0
>                   total number of mallocs used during MatSetValues calls =0
>                   using I-node routines: found 23 nodes, limit used is 5
>             A01
>               Mat Object: 1 MPI processes
>                 type: seqaij
>                 rows=72, cols=24
>                 total: nonzeros=288, allocated nonzeros=0
>                 total number of mallocs used during MatSetValues calls =0
>                 using I-node routines: found 23 nodes, limit used is 5
>         Mat Object: 1 MPI processes
>           type: seqaij
>           rows=24, cols=24
>           total: nonzeros=216, allocated nonzeros=216
>           total number of mallocs used during MatSetValues calls =0
>           using I-node routines: found 8 nodes, limit used is 5
>   linear system matrix = precond matrix:
>   Mat Object: 1 MPI processes
>     type: nest
>     rows=96, cols=96
>     has attached null space
>     Matrix object:
>       type=nest, rows=2, cols=2
>       MatNest structure:
>         (0,0) : prefix="fieldsplit_0_", type=seqaij, rows=72, cols=72
>         (0,1) : type=seqaij, rows=72, cols=24
>         (1,0) : type=seqaij, rows=24, cols=72
>         (1,1) : prefix="fieldsplit_1_", type=seqaij, rows=24, cols=24
>
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener