[petsc-users] Question regarding naming of fieldsplit splits
Blauth, Sebastian
sebastian.blauth at itwm.fraunhofer.de
Fri Jun 28 03:05:26 CDT 2024
Hello everyone,
I have a question regarding the naming convention used by PETSc's
PCFieldSplit. I have been following
https://lists.mcs.anl.gov/pipermail/petsc-users/2019-January/037262.html to
create a DMShell with FEniCS in order to customize PCFieldSplit for my
application.
I am using the following options, which work nicely for me:
-ksp_type fgmres
-pc_type fieldsplit
-pc_fieldsplit_0_fields 0, 1
-pc_fieldsplit_1_fields 2
-pc_fieldsplit_type additive
-fieldsplit_0_ksp_type fgmres
-fieldsplit_0_pc_type fieldsplit
-fieldsplit_0_pc_fieldsplit_type schur
-fieldsplit_0_pc_fieldsplit_schur_fact_type full
-fieldsplit_0_pc_fieldsplit_schur_precondition selfp
-fieldsplit_0_fieldsplit_u_ksp_type preonly
-fieldsplit_0_fieldsplit_u_pc_type lu
-fieldsplit_0_fieldsplit_p_ksp_type cg
-fieldsplit_0_fieldsplit_p_ksp_rtol 1e-14
-fieldsplit_0_fieldsplit_p_ksp_atol 1e-30
-fieldsplit_0_fieldsplit_p_pc_type icc
-fieldsplit_0_ksp_rtol 1e-14
-fieldsplit_0_ksp_atol 1e-30
-fieldsplit_0_ksp_monitor_true_residual
-fieldsplit_c_ksp_type preonly
-fieldsplit_c_pc_type lu
-ksp_view
Note that this is just an academic example (sorry for the extremely tight
solver tolerances) to test the approach, consisting of a Stokes equation and a
concentration equation (which is not even coupled to Stokes, just for
testing).
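Just for reference, the same options can also be put into the PETSc options
database from petsc4py instead of the command line. The following is only a
sketch (the ksp object is assumed to exist already, and only a few of the
options from the list above are repeated):

from petsc4py import PETSc

# Sketch: push the options into the PETSc options database programmatically;
# this is equivalent to passing them on the command line.
opts = PETSc.Options()
opts["ksp_type"] = "fgmres"
opts["pc_type"] = "fieldsplit"
opts["pc_fieldsplit_0_fields"] = "0,1"
opts["pc_fieldsplit_1_fields"] = "2"
opts["pc_fieldsplit_type"] = "additive"
opts["fieldsplit_0_pc_type"] = "fieldsplit"
opts["fieldsplit_0_pc_fieldsplit_type"] = "schur"
# ... remaining options as listed above ...
ksp.setFromOptions()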
Completely analogous to
https://lists.mcs.anl.gov/pipermail/petsc-users/2019-January/037262.html, I
translate my ISs to a PetscSection, which is then supplied to a DMShell
and assigned to the KSP. I am not so familiar with the code or with how and
why this works, but it seems to do so perfectly. I name the fields of my
section with petsc4py using
section.setFieldName(0, "u")
section.setFieldName(1, "p")
section.setFieldName(2, "c")
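For context, the setup roughly looks like the following sketch (is_u, is_p,
and is_c are placeholder names for the index sets extracted from the FEniCS
function space, dofs is the total number of degrees of freedom, and depending
on the petsc4py version the call to attach the section may be
setDefaultSection or setLocalSection):

from petsc4py import PETSc

# Sketch of the Section/DMShell setup following the linked thread.
section = PETSc.Section().create()
section.setNumFields(3)
section.setFieldName(0, "u")
section.setFieldName(1, "p")
section.setFieldName(2, "c")
section.setChart(0, dofs)
for i in is_u.getIndices():
    section.setDof(i, 1)
    section.setFieldDof(i, 0, 1)
for i in is_p.getIndices():
    section.setDof(i, 1)
    section.setFieldDof(i, 1, 1)
for i in is_c.getIndices():
    section.setDof(i, 1)
    section.setFieldDof(i, 2, 1)
section.setUp()

dm = PETSc.DMShell().create()
dm.setDefaultSection(section)  # setLocalSection in newer petsc4py versions
dm.setUp()

ksp = PETSc.KSP().create()
ksp.setDM(dm)
ksp.setDMActive(False)  # the DM only supplies the field layout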
These field names are then also reflected in the prefixes I use to access the
fieldsplit options from the command line. My question is: Is there any way of
not using the field names specified in Python, but instead using the index of
the field as defined with -pc_fieldsplit_0_fields 0, 1 and
-pc_fieldsplit_1_fields 2? That is, instead of the prefix
fieldsplit_0_fieldsplit_u I want to write fieldsplit_0_fieldsplit_0, instead
of fieldsplit_0_fieldsplit_p I want to use fieldsplit_0_fieldsplit_1, and
instead of fieldsplit_c I want to use fieldsplit_1. Just changing the names
of the fields to
section.setFieldName(0, "0")
section.setFieldName(1, "1")
section.setFieldName(2, "2")
obviously does not work as expected: it works for velocity and pressure, but
not for the concentration, where the prefix is then fieldsplit_2 and not
fieldsplit_1. In the docs, I have found
https://petsc.org/main/manualpages/PC/PCFieldSplitSetFields/ which seems to
suggest that the field name can potentially be supplied, but I don't see how
to do so from the command line. Also, for the sake of completeness, I attach
the output of the solve with -ksp_view below.
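As a side remark, from petsc4py the split names can apparently be chosen
explicitly when the splits are defined directly from index sets via
PCFieldSplitSetIS. This is just an assumption about an alternative setup, not
what my code currently does (is_stokes and is_c are placeholder index sets
for the combined velocity-pressure block and the concentration block):

# Hypothetical alternative: define and name the outer splits directly
# from index sets instead of going through the DMShell.
pc = ksp.getPC()
pc.setType(PETSc.PC.Type.FIELDSPLIT)
pc.setFieldSplitIS(("0", is_stokes), ("1", is_c))
# The resulting prefixes would then be fieldsplit_0_ and fieldsplit_1_.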
Thanks a lot in advance and best regards,
Sebastian
The output of ksp_view is the following:
KSP Object: 1 MPI processes
type: fgmres
restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
happy breakdown tolerance 1e-30
maximum iterations=10000, initial guess is zero
tolerances: relative=1e-05, absolute=1e-11, divergence=10000.
right preconditioning
using UNPRECONDITIONED norm type for convergence test
PC Object: 1 MPI processes
type: fieldsplit
FieldSplit with ADDITIVE composition: total splits = 2
Solver info for each split is in the following KSP objects:
Split number 0 Defined by IS
KSP Object: (fieldsplit_0_) 1 MPI processes
type: fgmres
restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
happy breakdown tolerance 1e-30
maximum iterations=10000, initial guess is zero
tolerances: relative=1e-14, absolute=1e-30, divergence=10000.
right preconditioning
using UNPRECONDITIONED norm type for convergence test
PC Object: (fieldsplit_0_) 1 MPI processes
type: fieldsplit
FieldSplit with Schur preconditioner, factorization FULL
Preconditioner for the Schur complement formed from Sp, an assembled approximation to S, which uses A00's diagonal's inverse
Split info:
Split number 0 Defined by IS
Split number 1 Defined by IS
KSP solver for A00 block
KSP Object: (fieldsplit_0_fieldsplit_u_) 1 MPI processes
type: preonly
maximum iterations=10000, initial guess is zero
tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
left preconditioning
using NONE norm type for convergence test
PC Object: (fieldsplit_0_fieldsplit_u_) 1 MPI processes
type: lu
out-of-place factorization
tolerance for zero pivot 2.22045e-14
matrix ordering: nd
factor fill ratio given 5., needed 3.92639
Factored matrix follows:
Mat Object: 1 MPI processes
type: seqaij
rows=4290, cols=4290
package used to perform factorization: petsc
total: nonzeros=375944, allocated nonzeros=375944
using I-node routines: found 2548 nodes, limit used is 5
linear system matrix = precond matrix:
Mat Object: (fieldsplit_0_fieldsplit_u_) 1 MPI processes
type: seqaij
rows=4290, cols=4290
total: nonzeros=95748, allocated nonzeros=95748
total number of mallocs used during MatSetValues calls=0
using I-node routines: found 3287 nodes, limit used is 5
KSP solver for S = A11 - A10 inv(A00) A01
KSP Object: (fieldsplit_0_fieldsplit_p_) 1 MPI processes
type: cg
maximum iterations=10000, initial guess is zero
tolerances: relative=1e-14, absolute=1e-30, divergence=10000.
left preconditioning
using PRECONDITIONED norm type for convergence test
PC Object: (fieldsplit_0_fieldsplit_p_) 1 MPI processes
type: icc
out-of-place factorization
0 levels of fill
tolerance for zero pivot 2.22045e-14
using Manteuffel shift [POSITIVE_DEFINITE]
matrix ordering: natural
factor fill ratio given 1., needed 1.
Factored matrix follows:
Mat Object: 1 MPI processes
type: seqsbaij
rows=561, cols=561
package used to perform factorization: petsc
total: nonzeros=5120, allocated nonzeros=5120
block size is 1
linear system matrix followed by preconditioner matrix:
Mat Object: (fieldsplit_0_fieldsplit_p_) 1 MPI processes
type: schurcomplement
rows=561, cols=561
Schur complement A11 - A10 inv(A00) A01
A11
Mat Object: (fieldsplit_0_fieldsplit_p_) 1 MPI processes
type: seqaij
rows=561, cols=561
total: nonzeros=3729, allocated nonzeros=3729
total number of mallocs used during MatSetValues calls=0
not using I-node routines
A10
Mat Object: 1 MPI processes
type: seqaij
rows=561, cols=4290
total: nonzeros=19938, allocated nonzeros=19938
total number of mallocs used during MatSetValues calls=0
not using I-node routines
KSP of A00
KSP Object: (fieldsplit_0_fieldsplit_u_) 1 MPI processes
type: preonly
maximum iterations=10000, initial guess is zero
tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
left preconditioning
using NONE norm type for convergence test
PC Object: (fieldsplit_0_fieldsplit_u_) 1 MPI processes
type: lu
out-of-place factorization
tolerance for zero pivot 2.22045e-14
matrix ordering: nd
factor fill ratio given 5., needed 3.92639
Factored matrix follows:
Mat Object: 1 MPI processes
type: seqaij
rows=4290, cols=4290
package used to perform factorization: petsc
total: nonzeros=375944, allocated nonzeros=375944
using I-node routines: found 2548 nodes, limit used is 5
linear system matrix = precond matrix:
Mat Object: (fieldsplit_0_fieldsplit_u_) 1 MPI processes
type: seqaij
rows=4290, cols=4290
total: nonzeros=95748, allocated nonzeros=95748
total number of mallocs used during MatSetValues calls=0
using I-node routines: found 3287 nodes, limit used is 5
A01
Mat Object: 1 MPI processes
type: seqaij
rows=4290, cols=561
total: nonzeros=19938, allocated nonzeros=19938
total number of mallocs used during MatSetValues calls=0
using I-node routines: found 3287 nodes, limit used is 5
Mat Object: 1 MPI processes
type: seqaij
rows=561, cols=561
total: nonzeros=9679, allocated nonzeros=9679
total number of mallocs used during MatSetValues calls=0
not using I-node routines
linear system matrix = precond matrix:
Mat Object: (fieldsplit_0_) 1 MPI processes
type: seqaij
rows=4851, cols=4851
total: nonzeros=139353, allocated nonzeros=139353
total number of mallocs used during MatSetValues calls=0
using I-node routines: found 3830 nodes, limit used is 5
Split number 1 Defined by IS
KSP Object: (fieldsplit_c_) 1 MPI processes
type: preonly
maximum iterations=10000, initial guess is zero
tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
left preconditioning
using NONE norm type for convergence test
PC Object: (fieldsplit_c_) 1 MPI processes
type: lu
out-of-place factorization
tolerance for zero pivot 2.22045e-14
matrix ordering: nd
factor fill ratio given 5., needed 4.24323
Factored matrix follows:
Mat Object: 1 MPI processes
type: seqaij
rows=561, cols=561
package used to perform factorization: petsc
total: nonzeros=15823, allocated nonzeros=15823
not using I-node routines
linear system matrix = precond matrix:
Mat Object: (fieldsplit_c_) 1 MPI processes
type: seqaij
rows=561, cols=561
total: nonzeros=3729, allocated nonzeros=3729
total number of mallocs used during MatSetValues calls=0
not using I-node routines
linear system matrix = precond matrix:
Mat Object: 1 MPI processes
type: seqaij
rows=5412, cols=5412
total: nonzeros=190416, allocated nonzeros=190416
total number of mallocs used during MatSetValues calls=0
using I-node routines: found 3833 nodes, limit used is 5
--
Dr. Sebastian Blauth
Fraunhofer-Institut für Techno- und Wirtschaftsmathematik ITWM
Department of Transport Processes (Abteilung Transportvorgänge)
Fraunhofer-Platz 1, 67663 Kaiserslautern
Phone: +49 631 31600-4968
sebastian.blauth at itwm.fraunhofer.de
https://www.itwm.fraunhofer.de