[petsc-users] shell preconditioner for Schur complement
Elena Travaglia
elena.travaglia at edu.unito.it
Wed Feb 10 15:05:23 CST 2021
Thanks for the link.
We have set a Schur factorization of type FULL, passing it when we run the
code with
-pc_fieldsplit_schur_fact_type full
Here is the output of -ksp_view:
KSP Object: 1 MPI processes
  type: fgmres
    restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
    happy breakdown tolerance 1e-30
  maximum iterations=1, initial guess is zero
  tolerances: relative=1e-08, absolute=1e-50, divergence=10000.
  right preconditioning
  using UNPRECONDITIONED norm type for convergence test
PC Object: 1 MPI processes
  type: fieldsplit
    FieldSplit with Schur preconditioner, factorization FULL
    Preconditioner for the Schur complement formed from A11
    Split info:
    Split number 0 Defined by IS
    Split number 1 Defined by IS
    KSP solver for A00 block
      KSP Object: (fieldsplit_0_) 1 MPI processes
        type: gmres
          restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
          happy breakdown tolerance 1e-30
        maximum iterations=10000, initial guess is zero
        tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
        left preconditioning
        using PRECONDITIONED norm type for convergence test
      PC Object: (fieldsplit_0_) 1 MPI processes
        type: ilu
          out-of-place factorization
          0 levels of fill
          tolerance for zero pivot 2.22045e-14
          matrix ordering: natural
          factor fill ratio given 1., needed 1.
            Factored matrix follows:
              Mat Object: 1 MPI processes
                type: seqaij
                rows=44, cols=44
                package used to perform factorization: petsc
                total: nonzeros=482, allocated nonzeros=482
                total number of mallocs used during MatSetValues calls=0
                  using I-node routines: found 13 nodes, limit used is 5
        linear system matrix = precond matrix:
        Mat Object: (fieldsplit_0_) 1 MPI processes
          type: seqaij
          rows=44, cols=44
          total: nonzeros=482, allocated nonzeros=482
          total number of mallocs used during MatSetValues calls=0
            using I-node routines: found 13 nodes, limit used is 5
    KSP solver for S = A11 - A10 inv(A00) A01
      KSP Object: (fieldsplit_1_) 1 MPI processes
        type: gmres
          restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
          happy breakdown tolerance 1e-30
        maximum iterations=1, initial guess is zero
        tolerances: relative=1e-09, absolute=1e-50, divergence=10000.
        left preconditioning
        using PRECONDITIONED norm type for convergence test
      PC Object: (fieldsplit_1_) 1 MPI processes
        type: shell
          no name
        linear system matrix followed by preconditioner matrix:
        Mat Object: (fieldsplit_1_) 1 MPI processes
          type: schurcomplement
          rows=20, cols=20
            Schur complement A11 - A10 inv(A00) A01
            A11
              Mat Object: (fieldsplit_1_) 1 MPI processes
                type: seqaij
                rows=20, cols=20
                total: nonzeros=112, allocated nonzeros=112
                total number of mallocs used during MatSetValues calls=0
                  using I-node routines: found 10 nodes, limit used is 5
            A10
              Mat Object: 1 MPI processes
                type: seqaij
                rows=20, cols=44
                total: nonzeros=160, allocated nonzeros=160
                total number of mallocs used during MatSetValues calls=0
                  using I-node routines: found 10 nodes, limit used is 5
            KSP of A00
              KSP Object: (fieldsplit_0_) 1 MPI processes
                type: gmres
                  restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
                  happy breakdown tolerance 1e-30
                maximum iterations=10000, initial guess is zero
                tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
                left preconditioning
                using PRECONDITIONED norm type for convergence test
              PC Object: (fieldsplit_0_) 1 MPI processes
                type: ilu
                  out-of-place factorization
                  0 levels of fill
                  tolerance for zero pivot 2.22045e-14
                  matrix ordering: natural
                  factor fill ratio given 1., needed 1.
                    Factored matrix follows:
                      Mat Object: 1 MPI processes
                        type: seqaij
                        rows=44, cols=44
                        package used to perform factorization: petsc
                        total: nonzeros=482, allocated nonzeros=482
                        total number of mallocs used during MatSetValues calls=0
                          using I-node routines: found 13 nodes, limit used is 5
                linear system matrix = precond matrix:
                Mat Object: (fieldsplit_0_) 1 MPI processes
                  type: seqaij
                  rows=44, cols=44
                  total: nonzeros=482, allocated nonzeros=482
                  total number of mallocs used during MatSetValues calls=0
                    using I-node routines: found 13 nodes, limit used is 5
            A01
              Mat Object: 1 MPI processes
                type: seqaij
                rows=44, cols=20
                total: nonzeros=156, allocated nonzeros=156
                total number of mallocs used during MatSetValues calls=0
                  using I-node routines: found 12 nodes, limit used is 5
        Mat Object: (fieldsplit_1_) 1 MPI processes
          type: seqaij
          rows=20, cols=20
          total: nonzeros=112, allocated nonzeros=112
          total number of mallocs used during MatSetValues calls=0
            using I-node routines: found 10 nodes, limit used is 5
  linear system matrix = precond matrix:
  Mat Object: 1 MPI processes
    type: seqaij
    rows=64, cols=64
    total: nonzeros=910, allocated nonzeros=2432
    total number of mallocs used during MatSetValues calls=128
      using I-node routines: found 23 nodes, limit used is 5
We would like to understand why the first r.h.s. passed to our function for
the Schur preconditioner is not
b_1 - A_10*inv(A_00)*b_0,
even though we used the full factorization (without dropping any terms).
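For reference, this is roughly the shape of the shell apply callback we
register with PCShellSetApply(); the body below is only a minimal diagnostic
sketch that reports the norm of the incoming vector (our real PrecondSchur
of course applies the actual preconditioner):

#include <petscksp.h>

/* PCShell apply callback: PETSc calls this as apply(pc, r, z), where r is
   the vector to be preconditioned and z receives the result. Here we only
   print ||r|| and copy r unchanged, as a placeholder for testing. */
PetscErrorCode PrecondSchur(PC pc, Vec r, Vec z)
{
  PetscReal      nrm;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = VecNorm(r, NORM_2, &nrm);CHKERRQ(ierr);
  ierr = PetscPrintf(PETSC_COMM_SELF, "PrecondSchur received vector with norm %g\n", (double)nrm);CHKERRQ(ierr);
  ierr = VecCopy(r, z);CHKERRQ(ierr);  /* identity action, diagnostics only */
  PetscFunctionReturn(0);
}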
Thank you,
Elena
On Wed, Feb 10, 2021 at 6:05 PM Matthew Knepley <knepley at gmail.com>
wrote:
> On Wed, Feb 10, 2021 at 11:51 AM Matteo Semplice <matteo.semplice at uninsubria.it> wrote:
>
>> Dear PETSc users,
>> we are trying to implement a preconditioner for the Schur complement
>> of a Stokes system, but it seems that the r.h.s. for the Schur
>> complement system differs from what we expect by a scale factor, which
>> we don't understand.
>>
>> Our setup has a system matrix A divided into 2x2 blocks for velocity and
>> pressure variables. We have implemented our preconditioner in a routine
>> PrecondSchur, and in the main program we do
>>
>> PC pc;
>> KSPGetPC(kspA,&pc);
>> PCSetFromOptions(pc);
>> KSPSetOperators(kspA, A, A);
>> KSPSetInitialGuessNonzero(kspA,PETSC_FALSE);
>> KSPSetFromOptions(kspA);
>> KSP *subksp;
>> PetscInt nfield;
>> PCSetUp(pc);
>> PCFieldSplitGetSubKSP(pc, &nfield, &subksp);
>> PC pcSchur;
>> KSPGetPC(subksp[1],&pcSchur);
>> PCSetType(pcSchur,PCSHELL);
>> PCShellSetApply(pcSchur,PrecondSchur);
>> KSPSetFromOptions(subksp[1]);
>>
>> and eventually
>>
>> KSPSolve(kspA,b,solution);
>>
>> We run the code with options
>>
>> -ksp_type fgmres \
>> -pc_type fieldsplit -pc_fieldsplit_type schur \
>> -pc_fieldsplit_schur_fact_type full \
>>
>> and, from reading section 2.3.5 of the PETSc manual, we'd expect the
>> first r.h.s. passed to PrecondSchur to be exactly
>> b_1 - A_10*inv(A_00)*b_0
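>>
>> (Spelling this out: with S = A_11 - A_10*inv(A_00)*A_01, the full
>> factorization we have in mind is
>>
>>    A = [ I               0 ] [ A_00  0 ] [ I  inv(A_00)*A_01 ]
>>        [ A_10*inv(A_00)  I ] [ 0     S ] [ 0  I              ]
>>
>> so applying inv(A) to b = (b_0, b_1) should involve a Schur solve with
>> right-hand side b_1 - A_10*inv(A_00)*b_0.)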
>>
>> Instead (from a monitor function attached to the subksp[1] solver), the
>> first r.h.s. appears to be a scalar multiple of the above vector; we are
>> guessing that we should take this multiplicative factor into account in
>> our preconditioner routine, but we cannot understand where it comes from
>> and how its value is determined.
>>
>> Could you explain to us what exactly is going on in the PC_SCHUR, or point
>> us to some working code example?
>>
>
> 1) It is hard to understand solver questions without the output of
> -ksp_view
>
> 2) The RHS will depend on the kind of factorization you are using for the
> system
>
>
> https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCFieldSplitSetSchurFactType.html#PCFieldSplitSetSchurFactType
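>
> (For reference, my reading of how the factorization type changes the
> right-hand side seen by the inner Schur solve:
>
>   full, lower:  b_1 - A_10*inv(A_00)*b_0
>   upper, diag:  b_1
>
> where (b_0, b_1) are the blocks of whatever vector the outer Krylov method
> hands to the preconditioner, which need not be the original b itself.)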
>
> I can see which one from the view output.
>
> Thanks,
>
> Matt
>
>
>> Thanks in advance!
>>
>> Matteo
>>
>>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
>
--
------------------------
Official University of Turin email address for students and graduates