[petsc-users] Problem setting Fieldsplit fields
Matthew Knepley
knepley at gmail.com
Fri Feb 3 14:11:51 CST 2023
On Fri, Feb 3, 2023 at 3:03 PM Nicolas Barnafi <nbarnafi at cmm.uchile.cl>
wrote:
> > There are a number of common errors:
> >
> > 1) Your PC has a prefix
> >
> > 2) You have not called KSPSetFromOptions() here
> >
> > Can you send the -ksp_view output?
>
> The PC at least has no prefix. I had to set ksp_rtol to 1 to get through
> the solution process; you will find both the petsc_rc and the ksp_view
> at the bottom of this message.
>
> Options are indeed being set from the options file, but there must be
> something missing at a certain level. Thanks for looking into this.
>
Okay, the next step is to pass
-info
and send the output. This will tell us how the default splits were done. If
that is not conclusive, we will have to use the debugger.
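
For reference, and only as a sketch (the toy diagonal system, the stride
index sets, and the split names "u"/"p" below are illustrative, not taken
from your code), this is the kind of setup that avoids both problems from
my earlier list with a recent PETSc: the outer KSP gets no options prefix,
the splits are named explicitly with PCFieldSplitSetIS(), and
KSPSetFromOptions() is called on the KSP so the options file is actually
read:

#include <petscksp.h>

/* Toy example (run on 1 MPI rank): an 8x8 diagonal system split into two
   named fields so that -fieldsplit_u_* / -fieldsplit_p_* options apply. */
int main(int argc, char **argv)
{
  Mat      A;
  Vec      b, x;
  KSP      ksp;
  PC       pc;
  IS       is_u, is_p;
  PetscInt n = 8, i;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));

  /* Assemble a trivial diagonal matrix just to have something to solve */
  PetscCall(MatCreateAIJ(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, n, n, 1, NULL, 0, NULL, &A));
  for (i = 0; i < n; i++) PetscCall(MatSetValue(A, i, i, (PetscScalar)(i + 1), INSERT_VALUES));
  PetscCall(MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY));
  PetscCall(MatCreateVecs(A, &x, &b));
  PetscCall(VecSet(b, 1.0));

  /* Two fields: first half and second half of the unknowns */
  PetscCall(ISCreateStride(PETSC_COMM_WORLD, n / 2, 0, 1, &is_u));
  PetscCall(ISCreateStride(PETSC_COMM_WORLD, n / 2, n / 2, 1, &is_p));

  PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
  PetscCall(KSPSetOperators(ksp, A, A));
  /* No KSPSetOptionsPrefix() call, so the plain -ksp_xxx / -pc_xxx options apply */
  PetscCall(KSPGetPC(ksp, &pc));
  PetscCall(PCSetType(pc, PCFIELDSPLIT));
  /* The split names become the option prefixes, e.g. -fieldsplit_u_pc_type */
  PetscCall(PCFieldSplitSetIS(pc, "u", is_u));
  PetscCall(PCFieldSplitSetIS(pc, "p", is_p));
  /* Without this call the options file is never applied to this KSP/PC */
  PetscCall(KSPSetFromOptions(ksp));

  PetscCall(KSPSolve(ksp, b, x));

  PetscCall(ISDestroy(&is_u));
  PetscCall(ISDestroy(&is_p));
  PetscCall(VecDestroy(&b));
  PetscCall(VecDestroy(&x));
  PetscCall(MatDestroy(&A));
  PetscCall(KSPDestroy(&ksp));
  PetscCall(PetscFinalize());
  return 0;
}

With split names like these, the per-split solvers are then controlled by
options such as -fieldsplit_u_pc_type in the same options file.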
Thanks,
Matt
> Best
>
> ---- petsc_rc file
>
> -ksp_monitor
> -ksp_type gmres
> -ksp_view
> -mat_type aij
> -ksp_norm_type unpreconditioned
> -ksp_atol 1e-14
> -ksp_rtol 1
> -pc_type fieldsplit
> -pc_fieldsplit_type multiplicative
>
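
As an aside: the -ksp_view below shows the splits ended up with the default
names 0 through 3, so the sub-solvers would be controlled from this same
options file via the prefixes shown there, for example (values purely
illustrative):

-fieldsplit_0_ksp_type preonly
-fieldsplit_0_pc_type lu
-fieldsplit_1_ksp_type gmres
-fieldsplit_1_pc_type jacobi
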
> ---- ksp_view
>
> KSP Object: 1 MPI process
>   type: gmres
>     restart=500, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
>     happy breakdown tolerance 1e-30
>   maximum iterations=10000, nonzero initial guess
>   tolerances: relative=1., absolute=1e-14, divergence=10000.
>   right preconditioning
>   using UNPRECONDITIONED norm type for convergence test
> PC Object: 1 MPI process
>   type: fieldsplit
>     FieldSplit with MULTIPLICATIVE composition: total splits = 4
>     Solver info for each split is in the following KSP objects:
>     Split number 0 Defined by IS
>     KSP Object: (fieldsplit_0_) 1 MPI process
>       type: preonly
>       maximum iterations=10000, initial guess is zero
>       tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
>       left preconditioning
>       using DEFAULT norm type for convergence test
>     PC Object: (fieldsplit_0_) 1 MPI process
>       type: ilu
>       PC has not been set up so information may be incomplete
>         out-of-place factorization
>         0 levels of fill
>         tolerance for zero pivot 2.22045e-14
>         matrix ordering: natural
>         matrix solver type: petsc
>         matrix not yet factored; no additional information available
>       linear system matrix = precond matrix:
>       Mat Object: (fieldsplit_0_) 1 MPI process
>         type: seqaij
>         rows=615, cols=615
>         total: nonzeros=9213, allocated nonzeros=9213
>         total number of mallocs used during MatSetValues calls=0
>           not using I-node routines
>     Split number 1 Defined by IS
>     KSP Object: (fieldsplit_1_) 1 MPI process
>       type: preonly
>       maximum iterations=10000, initial guess is zero
>       tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
>       left preconditioning
>       using DEFAULT norm type for convergence test
>     PC Object: (fieldsplit_1_) 1 MPI process
>       type: ilu
>       PC has not been set up so information may be incomplete
>         out-of-place factorization
>         0 levels of fill
>         tolerance for zero pivot 2.22045e-14
>         matrix ordering: natural
>         matrix solver type: petsc
>         matrix not yet factored; no additional information available
>       linear system matrix = precond matrix:
>       Mat Object: (fieldsplit_1_) 1 MPI process
>         type: seqaij
>         rows=64, cols=64
>         total: nonzeros=0, allocated nonzeros=0
>         total number of mallocs used during MatSetValues calls=0
>           using I-node routines: found 13 nodes, limit used is 5
>     Split number 2 Defined by IS
>     KSP Object: (fieldsplit_2_) 1 MPI process
>       type: preonly
>       maximum iterations=10000, initial guess is zero
>       tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
>       left preconditioning
>       using DEFAULT norm type for convergence test
>     PC Object: (fieldsplit_2_) 1 MPI process
>       type: ilu
>       PC has not been set up so information may be incomplete
>         out-of-place factorization
>         0 levels of fill
>         tolerance for zero pivot 2.22045e-14
>         matrix ordering: natural
>         matrix solver type: petsc
>         matrix not yet factored; no additional information available
>       linear system matrix = precond matrix:
>       Mat Object: (fieldsplit_2_) 1 MPI process
>         type: seqaij
>         rows=240, cols=240
>         total: nonzeros=2140, allocated nonzeros=2140
>         total number of mallocs used during MatSetValues calls=0
>           not using I-node routines
>     Split number 3 Defined by IS
>     KSP Object: (fieldsplit_3_) 1 MPI process
>       type: preonly
>       maximum iterations=10000, initial guess is zero
>       tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
>       left preconditioning
>       using DEFAULT norm type for convergence test
>     PC Object: (fieldsplit_3_) 1 MPI process
>       type: ilu
>       PC has not been set up so information may be incomplete
>         out-of-place factorization
>         0 levels of fill
>         tolerance for zero pivot 2.22045e-14
>         matrix ordering: natural
>         matrix solver type: petsc
>         matrix not yet factored; no additional information available
>       linear system matrix = precond matrix:
>       Mat Object: (fieldsplit_3_) 1 MPI process
>         type: seqaij
>         rows=300, cols=300
>         total: nonzeros=2292, allocated nonzeros=2292
>         total number of mallocs used during MatSetValues calls=0
>           not using I-node routines
>   linear system matrix = precond matrix:
>   Mat Object: 1 MPI process
>     type: seqaij
>     rows=1219, cols=1219
>     total: nonzeros=26443, allocated nonzeros=26443
>     total number of mallocs used during MatSetValues calls=0
>       not using I-node routines
> solving time: 0.00449609
> iterations: 0
> estimated error: 25.4142
>
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/