[petsc-users] Petsc ILU PC Change between 3.6.4 and 3.7.x?
Gaetan Kenway
gaetank at gmail.com
Fri Aug 11 12:14:50 CDT 2017
Hi All
I'm in the process of updating an unstructured CFD code that uses PETSc for
its linear solves. Until recently it was using an ancient version (3.3), but
when I updated to 3.7.6 I ran into issues with one of the KSP solves. The
rest of the code is unchanged.
I've tracked the issue down to a change between version 3.6.4 and version
3.7.0; the same issue is present in the most recent version, 3.7.6.
Specifically, with the 3.7 versions the KSP kicks out on the second iteration
with a converged reason of -11, KSP_DIVERGED_PCSETUP_FAILED
<http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/KSP/KSP_DIVERGED_PCSETUP_FAILED.html#KSP_DIVERGED_PCSETUP_FAILED>.
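For reference, the check in the driver looks roughly like this; a minimal
sketch with illustrative names, not the actual code:

  #include <petscksp.h>

  /* Solve and report the converged reason; KSP_DIVERGED_PCSETUP_FAILED is
     the -11 that shows up in the output below. */
  PetscErrorCode SolveAndCheck(KSP ksp, Vec b, Vec x)
  {
    KSPConvergedReason reason;
    PetscErrorCode     ierr;

    PetscFunctionBeginUser;
    ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);
    ierr = KSPGetConvergedReason(ksp, &reason);CHKERRQ(ierr);
    if (reason == KSP_DIVERGED_PCSETUP_FAILED) {
      ierr = PetscPrintf(PETSC_COMM_WORLD,
                         "PC setup failed inside KSPSolve (reason %d)\n",
                         (int)reason);CHKERRQ(ierr);
    }
    PetscFunctionReturn(0);
  }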
After that point the two runs differ. The KSPView output for each version is
given below; the two are identical up to small formatting changes. There is
still more I can track down, but I thought I would first ask whether someone
knows what might have changed between these two versions, which could save me
a lot of time.
Thanks,
Gaetan
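For context, here is roughly how the solver shown in the views below is
configured; again a minimal sketch, not the actual code (the sub-blocks get
preonly + ILU(0) as the bjacobi defaults, and the Knoll option accounts for
the "preconditioner applied to right hand side for initial guess" line):

  KSP            ksp;
  PC             pc;
  PetscErrorCode ierr;

  ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
  ierr = KSPSetType(ksp, KSPGMRES);CHKERRQ(ierr);
  ierr = KSPGMRESSetRestart(ksp, 3);CHKERRQ(ierr);
  ierr = KSPSetTolerances(ksp, 1e-8, 1e-20, 1e15, 3);CHKERRQ(ierr); /* rtol, atol, dtol, maxits */
  ierr = KSPSetInitialGuessNonzero(ksp, PETSC_TRUE);CHKERRQ(ierr);
  ierr = KSPSetInitialGuessKnoll(ksp, PETSC_TRUE);CHKERRQ(ierr);
  ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
  ierr = PCSetType(pc, PCBJACOBI);CHKERRQ(ierr);
  /* The views themselves come from -ksp_view or
     KSPView(ksp, PETSC_VIEWER_STDOUT_WORLD). */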
PETSc 3.6 KSP View:
KSP Object: 8 MPI processes
  type: gmres
    GMRES: restart=3, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
    GMRES: happy breakdown tolerance 1e-30
  maximum iterations=3
  using preconditioner applied to right hand side for initial guess
  tolerances: relative=1e-08, absolute=1e-20, divergence=1e+15
  left preconditioning
  using nonzero initial guess
  using PRECONDITIONED norm type for convergence test
PC Object: 8 MPI processes
  type: bjacobi
    block Jacobi: number of blocks = 8
    Local solve is same for all blocks, in the following KSP and PC objects:
  KSP Object: (sub_) 1 MPI processes
    type: preonly
    maximum iterations=10000, initial guess is zero
    tolerances: relative=1e-05, absolute=1e-50, divergence=10000
    left preconditioning
    using NONE norm type for convergence test
  PC Object: (sub_) 1 MPI processes
    type: ilu
      ILU: out-of-place factorization
      0 levels of fill
      tolerance for zero pivot 2.22045e-14
      matrix ordering: natural
      factor fill ratio given 1, needed 1
      Factored matrix follows:
        Mat Object: 1 MPI processes
          type: seqaij
          rows=46439, cols=46439
          package used to perform factorization: petsc
          total: nonzeros=502615, allocated nonzeros=502615
          total number of mallocs used during MatSetValues calls =0
            not using I-node routines
    linear system matrix = precond matrix:
    Mat Object: 1 MPI processes
      type: seqaij
      rows=46439, cols=46439
      total: nonzeros=502615, allocated nonzeros=504081
      total number of mallocs used during MatSetValues calls =0
        not using I-node routines
  linear system matrix = precond matrix:
  Mat Object: 8 MPI processes
    type: mpiaij
    rows=368656, cols=368656
    total: nonzeros=4.63682e+06, allocated nonzeros=4.64417e+06
    total number of mallocs used during MatSetValues calls =0
      not using I-node (on process 0) routines
<my output: reason, iterations, rtol, atol>
reason,its: 2 3 0.001 1e-20
PETSc 3.7 KSP View:
KSP Object: 8 MPI processes
  type: gmres
    GMRES: restart=3, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
    GMRES: happy breakdown tolerance 1e-30
  maximum iterations=3
  using preconditioner applied to right hand side for initial guess
  tolerances: relative=1e-08, absolute=1e-20, divergence=1e+15
  left preconditioning
  using nonzero initial guess
  using PRECONDITIONED norm type for convergence test
PC Object: 8 MPI processes
  type: bjacobi
    block Jacobi: number of blocks = 8
    Local solve is same for all blocks, in the following KSP and PC objects:
  KSP Object: (sub_) 1 MPI processes
    type: preonly
    maximum iterations=10000, initial guess is zero
    tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
    left preconditioning
    using NONE norm type for convergence test
  PC Object: (sub_) 1 MPI processes
    type: ilu
      ILU: out-of-place factorization
      0 levels of fill
      tolerance for zero pivot 2.22045e-14
      matrix ordering: natural
      factor fill ratio given 1., needed 1.
      Factored matrix follows:
        Mat Object: 1 MPI processes
          type: seqaij
          rows=46439, cols=46439
          package used to perform factorization: petsc
          total: nonzeros=502615, allocated nonzeros=502615
          total number of mallocs used during MatSetValues calls =0
            not using I-node routines
    linear system matrix = precond matrix:
    Mat Object: 1 MPI processes
      type: seqaij
      rows=46439, cols=46439
      total: nonzeros=502615, allocated nonzeros=504081
      total number of mallocs used during MatSetValues calls =0
        not using I-node routines
  linear system matrix = precond matrix:
  Mat Object: 8 MPI processes
    type: mpiaij
    rows=368656, cols=368656
    total: nonzeros=4636822, allocated nonzeros=4644168
    total number of mallocs used during MatSetValues calls =0
      not using I-node (on process 0) routines
<my output: reason, iterations, rtol, atol>
reason,its: -11 0 0.001 1e-20
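For clarity, the "reason,its" lines above are produced by something like the
following; an illustrative sketch, where rtol/atol stand in for tolerances my
code tracks itself:

  KSPConvergedReason reason;
  PetscInt           its;
  PetscReal          rtol = 0.001, atol = 1e-20; /* illustrative values matching the output above */

  ierr = KSPGetConvergedReason(ksp, &reason);CHKERRQ(ierr);
  ierr = KSPGetIterationNumber(ksp, &its);CHKERRQ(ierr);
  ierr = PetscPrintf(PETSC_COMM_WORLD, "reason,its: %d %d %g %g\n",
                     (int)reason, (int)its, (double)rtol, (double)atol);CHKERRQ(ierr);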