[petsc-users] Petsc ILU PC Change between 3.6.4 and 3.7.x?
Gaetan Kenway
gaetank at gmail.com
Fri Aug 11 15:16:28 CDT 2017
Interesting. The main thing is that it's now sorted out and the solver is
back in production.
Thanks for your help
On Fri, Aug 11, 2017 at 1:05 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>
> > On Aug 11, 2017, at 2:43 PM, Gaetan Kenway <gaetank at gmail.com> wrote:
> >
> > Huh. That's odd then. I was actually bisecting the PETSc releases to
> narrow it down... I knew 3.3 was OK and 3.7 was not. So I tried 3.5, which
> was OK, and then 3.6, which was OK as well, leading me to conclude the
> difference was between 3.6 and 3.7.
>
>    Other people have also reported seeing the problem only with later
> versions. I think it is related to the default tolerances, and it is just
> "pure luck" that it didn't need a shift with the intermediate versions. ILU
> with nearly zero pivots is fragile.
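>
>    For reference, the pivot test in question is itself runtime-tunable
> under the block-Jacobi sub-prefix; the value below is purely illustrative,
> not a recommendation:
>
>        -sub_pc_factor_zero_pivot 1.e-12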
>
> Barry
>
> >
> > On Fri, Aug 11, 2017 at 12:03 PM, Barry Smith <bsmith at mcs.anl.gov>
> wrote:
> >
> > Thanks for confirming this. The change was actually in the 3.4
> release. I have updated the 3.4 changes file to include this change in both
> the maint and master branches.
> >
> > Barry
> >
> > > On Aug 11, 2017, at 12:47 PM, Gaetan Kenway <gaetank at gmail.com> wrote:
> > >
> > > OK, that was certainly it. I vaguely recall reading something about
> this on the mailing list at some point, but couldn't find anything.
> > > I would definitely put something in the 3.7 changes doc, since that is
> where I looked first to see if anything stuck out.
> > >
> > > Thanks!
> > >
> > > On Fri, Aug 11, 2017 at 10:28 AM, Barry Smith <bsmith at mcs.anl.gov>
> wrote:
> > >
> > > Run with the additional option
> > >
> > > -sub_pc_factor_shift_type nonzero
> > >
> > > Does this resolve the problem? We changed the default behavior of ILU
> when it detects a "zero" pivot.
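> > >
> > > For reference, the same shift can also be set from code; below is a
> minimal C sketch (the helper name and setup flow are illustrative, assuming
> "ksp" is the outer bjacobi/ILU solver shown in the KSPView output below):
> > >
> > >   #include <petscksp.h>
> > >
> > >   /* Hypothetical helper: apply MAT_SHIFT_NONZERO to the ILU PC of
> > >      every local block of a bjacobi preconditioner. */
> > >   PetscErrorCode SetBlockILUShift(KSP ksp)
> > >   {
> > >     PC             pc, subpc;
> > >     KSP           *subksp;
> > >     PetscInt       nlocal, first, i;
> > >     PetscErrorCode ierr;
> > >
> > >     PetscFunctionBegin;
> > >     ierr = KSPSetUp(ksp);CHKERRQ(ierr);  /* sub-KSPs exist only after setup */
> > >     ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
> > >     ierr = PCBJacobiGetSubKSP(pc, &nlocal, &first, &subksp);CHKERRQ(ierr);
> > >     for (i = 0; i < nlocal; i++) {       /* one ILU factorization per block */
> > >       ierr = KSPGetPC(subksp[i], &subpc);CHKERRQ(ierr);
> > >       ierr = PCFactorSetShiftType(subpc, MAT_SHIFT_NONZERO);CHKERRQ(ierr);
> > >     }
> > >     PetscFunctionReturn(0);
> > >   }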
> > >
> > > Please let us know if this resolves the problem and we'll update the
> changes file.
> > >
> > > Barry
> > >
> > >
> > >
> > > > On Aug 11, 2017, at 12:14 PM, Gaetan Kenway <gaetank at gmail.com>
> wrote:
> > > >
> > > > Hi All
> > > >
> > > > I'm in the process of updating an unstructured CFD code that uses
> PETSc for its linear solves. Until recently it was using an ancient
> version (3.3). However, when I updated it to 3.7.6 I ran into issues with
> one of the KSP solves. The remainder of the code is identical.
> > > > I've tracked the issue down to a change between version 3.6.4 and
> version 3.7.0. The same issue is present in the most recent version, 3.7.6.
> > > >
> > > > Specifically, on the second iteration the 3.7 version's KSP kicks
> out with a converged reason of -11 (KSP_DIVERGED_PCSETUP_FAILED). After
> that the two runs differ. The KSPView output for each of the two versions
> is given below; they are identical up to small formatting changes. There
> is still more I can track down, but I thought I would ask whether someone
> knows what might have changed between these two versions, which could save
> me a lot of time.
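> > > >
> > > > For context, a -11 can be caught programmatically after the solve; a
> minimal C sketch (variable names are illustrative, not from the actual
> code):
> > > >
> > > >   KSPConvergedReason reason;
> > > >   PetscInt           its;
> > > >   PetscErrorCode     ierr;
> > > >
> > > >   ierr = KSPSolve(ksp, rhs, sol);CHKERRQ(ierr);
> > > >   ierr = KSPGetConvergedReason(ksp, &reason);CHKERRQ(ierr);
> > > >   ierr = KSPGetIterationNumber(ksp, &its);CHKERRQ(ierr);
> > > >   if (reason == KSP_DIVERGED_PCSETUP_FAILED) {
> > > >     /* PCSetUp failed, e.g. ILU hit a (near-)zero pivot */
> > > >   }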
> > > >
> > > > Thanks,
> > > > Gaetan
> > > >
> > > > 3.6 KSP View:
> > > > KSP Object: 8 MPI processes
> > > > type: gmres
> > > > GMRES: restart=3, using Classical (unmodified) Gram-Schmidt
> Orthogonalization with no iterative refinement
> > > > GMRES: happy breakdown tolerance 1e-30
> > > > maximum iterations=3
> > > > using preconditioner applied to right hand side for initial guess
> > > > tolerances: relative=1e-08, absolute=1e-20, divergence=1e+15
> > > > left preconditioning
> > > > using nonzero initial guess
> > > > using PRECONDITIONED norm type for convergence test
> > > > PC Object: 8 MPI processes
> > > > type: bjacobi
> > > > block Jacobi: number of blocks = 8
> > > > Local solve is same for all blocks, in the following KSP and PC
> objects:
> > > > KSP Object: (sub_) 1 MPI processes
> > > > type: preonly
> > > > maximum iterations=10000, initial guess is zero
> > > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000
> > > > left preconditioning
> > > > using NONE norm type for convergence test
> > > > PC Object: (sub_) 1 MPI processes
> > > > type: ilu
> > > > ILU: out-of-place factorization
> > > > 0 levels of fill
> > > > tolerance for zero pivot 2.22045e-14
> > > > matrix ordering: natural
> > > > factor fill ratio given 1, needed 1
> > > > Factored matrix follows:
> > > > Mat Object: 1 MPI processes
> > > > type: seqaij
> > > > rows=46439, cols=46439
> > > > package used to perform factorization: petsc
> > > > total: nonzeros=502615, allocated nonzeros=502615
> > > > total number of mallocs used during MatSetValues calls =0
> > > > not using I-node routines
> > > > linear system matrix = precond matrix:
> > > > Mat Object: 1 MPI processes
> > > > type: seqaij
> > > > rows=46439, cols=46439
> > > > total: nonzeros=502615, allocated nonzeros=504081
> > > > total number of mallocs used during MatSetValues calls =0
> > > > not using I-node routines
> > > > linear system matrix = precond matrix:
> > > > Mat Object: 8 MPI processes
> > > > type: mpiaij
> > > > rows=368656, cols=368656
> > > > total: nonzeros=4.63682e+06, allocated nonzeros=4.64417e+06
> > > > total number of mallocs used during MatSetValues calls =0
> > > > not using I-node (on process 0) routines
> > > > <my output: reason, iterations, rtol, atol>
> > > > reason,its: 2 3 0.001 1e-20
> > > >
> > > >
> > > > Petsc 3.7 KSP View
> > > > KSP Object: 8 MPI processes
> > > > type: gmres
> > > > GMRES: restart=3, using Classical (unmodified) Gram-Schmidt
> Orthogonalization with no iterative refinement
> > > > GMRES: happy breakdown tolerance 1e-30
> > > > maximum iterations=3
> > > > using preconditioner applied to right hand side for initial guess
> > > > tolerances: relative=1e-08, absolute=1e-20, divergence=1e+15
> > > > left preconditioning
> > > > using nonzero initial guess
> > > > using PRECONDITIONED norm type for convergence test
> > > > PC Object: 8 MPI processes
> > > > type: bjacobi
> > > > block Jacobi: number of blocks = 8
> > > > Local solve is same for all blocks, in the following KSP and PC
> objects:
> > > > KSP Object: (sub_) 1 MPI processes
> > > > type: preonly
> > > > maximum iterations=10000, initial guess is zero
> > > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
> > > > left preconditioning
> > > > using NONE norm type for convergence test
> > > > PC Object: (sub_) 1 MPI processes
> > > > type: ilu
> > > > ILU: out-of-place factorization
> > > > 0 levels of fill
> > > > tolerance for zero pivot 2.22045e-14
> > > > matrix ordering: natural
> > > > factor fill ratio given 1., needed 1.
> > > > Factored matrix follows:
> > > > Mat Object: 1 MPI processes
> > > > type: seqaij
> > > > rows=46439, cols=46439
> > > > package used to perform factorization: petsc
> > > > total: nonzeros=502615, allocated nonzeros=502615
> > > > total number of mallocs used during MatSetValues calls =0
> > > > not using I-node routines
> > > > linear system matrix = precond matrix:
> > > > Mat Object: 1 MPI processes
> > > > type: seqaij
> > > > rows=46439, cols=46439
> > > > total: nonzeros=502615, allocated nonzeros=504081
> > > > total number of mallocs used during MatSetValues calls =0
> > > > not using I-node routines
> > > > linear system matrix = precond matrix:
> > > > Mat Object: 8 MPI processes
> > > > type: mpiaij
> > > > rows=368656, cols=368656
> > > > total: nonzeros=4636822, allocated nonzeros=4644168
> > > > total number of mallocs used during MatSetValues calls =0
> > > > not using I-node (on process 0) routines
> > > > <my output: reason, iterations, rtol, atol>
> > > > reason,its: -11 0 0.001 1e-20
> > > >
> > > >
> > >
> > >
> >
> >
>
>