[petsc-users] Condition number of matrix
Florian Lindner
mailinglists at xgm.de
Tue Aug 30 09:22:05 CDT 2016
Thanks everybody. Just to be sure: it's the max/min of the last iteration?
Florian
Am 30.08.2016 um 16:10 schrieb Matthew Knepley:
> On Tue, Aug 30, 2016 at 9:03 AM, Florian Lindner <mailinglists at xgm.de> wrote:
>
> Hi,
>
> Am 30.08.2016 um 14:05 schrieb Barry Smith:
> >
> > The format of .petscrc requires each option to be on its own line
> >
> >> -ksp_view
> >> -pc_type none
> >> -ksp_type gmres
> >> -ksp_monitor_singular_value
> >> -ksp_gmres_restart 1000
>
> Oh man, didn't know that. Sorry! Is using a hash # ok for comments in .petscrc?
>
> I adjusted the options accordingly:
>
> -ksp_view
> -pc_type none
> -ksp_type gmres
> -ksp_monitor_singular_value
> -ksp_gmres_restart 1000
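
On the hash question above: as far as I know the PETSc options-file parser treats lines starting with # as comments, so a commented variant of that file would look roughly like this (a sketch only, with the option values simply copied from this thread):

# estimate the condition number via the GMRES singular value monitor
-ksp_view
-pc_type none
-ksp_type gmres
-ksp_monitor_singular_value
# large restart so the estimate can keep improving over the whole solve
-ksp_gmres_restart 1000
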
>
> petsc outputs a line like:
>
> 550 KSP Residual norm 1.374922291162e-07 % max 1.842011038215e+03 min 6.509297234157e-04 max/min 2.829815526858e+06
>
> for each iteration. Sorry about my mathematical illiteracy, but where can I see the condition number of the matrix?
>
>
> It's max/min, since this means max singular value / min singular value.
>
> Thanks,
>
> Matt
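
For reference, the same estimate can also be read out programmatically with KSPSetComputeSingularValues() and KSPComputeExtremeSingularValues(). Below is a minimal, self-contained sketch; the diagonal test matrix, the vector names and the file name condest.c are placeholders and not part of this thread.

/* condest.c - rough sketch: estimate the condition number of a small test
 * matrix with GMRES, mirroring the options discussed in this thread. */
#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat            A;
  Vec            x, b;
  KSP            ksp;
  PetscReal      emax, emin;
  PetscInt       i, n = 100;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);CHKERRQ(ierr);

  /* stand-in matrix: diagonal with entries 1..n, condition number n */
  ierr = MatCreateSeqAIJ(PETSC_COMM_SELF, n, n, 1, NULL, &A);CHKERRQ(ierr);
  for (i = 0; i < n; i++) {
    PetscScalar v = (PetscScalar)(i + 1);
    ierr = MatSetValues(A, 1, &i, 1, &i, &v, INSERT_VALUES);CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

  ierr = MatCreateVecs(A, &x, &b);CHKERRQ(ierr);
  ierr = VecSet(b, 1.0);CHKERRQ(ierr);

  ierr = KSPCreate(PETSC_COMM_SELF, &ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp, A, A);CHKERRQ(ierr);
  /* must be requested before the solve */
  ierr = KSPSetComputeSingularValues(ksp, PETSC_TRUE);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);  /* picks up -pc_type none etc. */
  ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);

  /* extreme singular value estimates built during the last solve; their
   * ratio is the condition number estimate (the max/min monitor column) */
  ierr = KSPComputeExtremeSingularValues(ksp, &emax, &emin);CHKERRQ(ierr);
  ierr = PetscPrintf(PETSC_COMM_SELF, "estimated condition number: %g\n",
                     (double)(emax / emin));CHKERRQ(ierr);

  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  ierr = VecDestroy(&x);CHKERRQ(ierr);
  ierr = VecDestroy(&b);CHKERRQ(ierr);
  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

When run with the options from this thread (-pc_type none -ksp_type gmres -ksp_gmres_restart 1000), the printed ratio should agree with the max/min column of the last monitor line. Note that the estimates are for the preconditioned operator, which with -pc_type none is the matrix itself.
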
>
>
> Thanks,
> Florian
>
> >
> >
> >
> >> On Aug 30, 2016, at 7:01 AM, Florian Lindner <mailinglists at xgm.de> wrote:
> >>
> >> Hello,
> >>
> >> there is a FAQ and a Stackoverflow article about getting the condition number of a petsc matrix:
> >>
> >> http://www.mcs.anl.gov/petsc/documentation/faq.html#conditionnumber
> >>
> >> http://scicomp.stackexchange.com/questions/34/how-can-i-estimate-the-condition-number-of-a-large-sparse-matrix-using-petsc
> >>
> >> Both tell me to add:
> >>
> >> -pc_type none -ksp_type gmres -ksp_monitor_singular_value -ksp_gmres_restart 1000
> >>
> >> to my options.
> >>
> >> I added the line to .petscrc but nothing happens, no additional output at all. I also added -ksp_view, so my .petscrc
> >> looks like this:
> >>
> >> -ksp_view
> >> -pc_type none -ksp_type gmres -ksp_monitor_singular_value -ksp_gmres_restart 1000
> >>
> >> The complete output is below, but there is something I wonder about:
> >>
> >> GMRES: restart=30, shouldn't that be 1000?
> >>
> >> And where can I read out the condition number approximation?
> >>
> >> Thanks,
> >> Florian
> >>
> >>
> >> KSP Object: 1 MPI processes
> >> type: gmres
> >> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
> >> GMRES: happy breakdown tolerance 1e-30
> >> maximum iterations=10000
> >> tolerances: relative=1e-09, absolute=1e-50, divergence=10000.
> >> left preconditioning
> >> using nonzero initial guess
> >> using PRECONDITIONED norm type for convergence test
> >> PC Object: 1 MPI processes
> >> type: none
> >> linear system matrix = precond matrix:
> >> Mat Object: C 1 MPI processes
> >> type: seqsbaij
> >> rows=14403, cols=14403
> >> total: nonzeros=1044787, allocated nonzeros=1123449
> >> total number of mallocs used during MatSetValues calls =72016
> >> block size is 1
> >> (0) 13:58:35 [precice::impl::SolverInterfaceImpl]:395 in initialize: it 1 of 1 | dt# 1 | t 0 of 1 | dt 1 | max dt 1 |
> >> ongoing yes | dt complete no |
> >> (0) 13:58:35 [precice::impl::SolverInterfaceImpl]:446 in advance: Iteration #1
> >> KSP Object: 1 MPI processes
> >> type: gmres
> >> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
> >> GMRES: happy breakdown tolerance 1e-30
> >> maximum iterations=10000
> >> tolerances: relative=1e-09, absolute=1e-50, divergence=10000.
> >> left preconditioning
> >> using nonzero initial guess
> >> using PRECONDITIONED norm type for convergence test
> >> PC Object: 1 MPI processes
> >> type: none
> >> linear system matrix = precond matrix:
> >> Mat Object: C 1 MPI processes
> >> type: seqsbaij
> >> rows=14403, cols=14403
> >> total: nonzeros=1044787, allocated nonzeros=1123449
> >> total number of mallocs used during MatSetValues calls =72016
> >> block size is 1
> >
>
>
>
>
> --
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any
> results to which their experiments lead.
> -- Norbert Wiener