[petsc-users] understanding gmres output

Smith, Barry F. bsmith at mcs.anl.gov
Thu Apr 26 13:24:47 CDT 2018


*  You may not (and likely won't) get the same convergence as Matlab, not because of GMRES but because of differences between the ILU implementations in Matlab and PETSc. Better to run with Jacobi or no preconditioner if you wish to get consistent results.
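
  For example, with petsc4py the preconditioner can be switched before the solve (a minimal sketch; ksp is the KSP object from the script below, and the command line option -pc_type jacobi does the same thing):

      pc = ksp.getPC()        # the preconditioner attached to the KSP
      pc.setType('jacobi')    # or pc.setType('none') for unpreconditioned GMRES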

*   Check carefully whether Matlab is using left or right preconditioning for GMRES (PETSc uses left by default).
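
  The side can be selected through the options database (a sketch; -ksp_pc_side is a standard PETSc option, and the Options/setFromOptions calls assume the ksp object from the script below):

      opts = PETSc.Options()
      opts['ksp_pc_side'] = 'right'   # 'left' is the PETSc default
      ksp.setFromOptions()            # pick the option up before the solve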

> I see that the residual < norm(b)*rtol for the 4th iteration itself, I am not sure then why GMRES continues for one more iteration.

* Since PETSc is using left preconditioning, it is norm(B*b)*rtol that determines the convergence, not norm(b)*rtol, where B is the application of the preconditioner.
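
  You can compute the quantity the convergence test actually uses (a rough sketch, reusing ksp, b and rtol from the script below; the PC must be set up before it can be applied):

      ksp.setUp()                 # builds the preconditioner
      Bb = b.duplicate()
      ksp.getPC().apply(b, Bb)    # Bb = B*b, the preconditioned right-hand side
      print('norm(B*b)*rtol = %.2e' % (Bb.norm()*rtol))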

* The residual history for GMRES includes everything through any restarts, so if it takes 90 total iterations (with a restart of 30) the residual history will have all 90 values (plus the initial residual).
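
  Note that the history also records the initial residual (iteration 0), which is why your output below shows 6 values for 5 iterations. A quick check (a sketch, using the ksp object from the script below after the solve):

      hist = ksp.getConvergenceHistory()
      print('iterations = %d, history entries = %d' % (ksp.getIterationNumber(), len(hist)))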

> In the above specific example, it took GMRES a total of 5 iterations to converge. Does it mean the convergence was achieved without having to restart?

  In most cases if you need to restart then you need a better preconditioner.
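
  If you do want to experiment with a longer restart, it can also be set from the options database (a sketch; -ksp_gmres_restart is the standard option, applied here before the solve via setFromOptions):

      opts = PETSc.Options()
      opts['ksp_gmres_restart'] = 60   # the default restart length is 30
      ksp.setFromOptions()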


   Barry




> On Apr 26, 2018, at 9:46 AM, Ganesh Diwan <gcdiwan at gmail.com> wrote:
> 
> Dear Petsc developers,
> 
> I am trying to learn to use PETSc GMRES with petsc4py and also trying to follow the codes in petsc/src/ksp/ksp/examples/tutorials. Here is the Python code I use to read a linear system and solve it with GMRES in PETSc (petsc4py)
> 
> # gmrestest.py
> # usual imports
> import numpy as np
> import scipy.io as sio
> import scipy
> from sys import getrefcount
> from petsc4py import PETSc
> 
> # read the matrix data
> mat_contents = sio.loadmat('data.mat')
> mat_A = mat_contents['A']
> vec_b = mat_contents['b']
> n = mat_contents['n']
> vec_x = mat_contents['x']
> mat_nnzA = mat_contents['nnzA']
> 
> # form the petsc matrices
> x = PETSc.Vec().createWithArray(vec_x)
> b = PETSc.Vec().createWithArray(vec_b)
> mat_A = mat_A.tocsr() # loadmat returns a column-major (CSC) matrix, so convert to CSR for PETSc
> p1=mat_A.indptr
> p2=mat_A.indices
> p3=mat_A.data
> A = PETSc.Mat().createAIJ(size=mat_A.shape,csr=(p1,p2,p3))
> A.assemblyBegin()
> A.assemblyEnd()
> 
> # solve
> ksp = PETSc.KSP()
> ksp.create(PETSc.COMM_WORLD)
> ksp.setType('gmres')
> ksp.setOperators(A)
> ksp.setFromOptions()
> rtol = 1e-4
> ksp.setTolerances(rtol, 1e-5, 10000., 10000)
> ksp.view()
> ksp.setConvergenceHistory()
> ksp.solve(b, x)
> 
> # print
> print 'iterations = ', ksp.getIterationNumber()
> print 'residual = ', '{:.2e}'.format(ksp.getResidualNorm())#  %.2E ksp.getResidualNorm()
> print 'norm(b)*rtol =  ', '{:.2e}'.format(PETSc.Vec.norm(b)*rtol)# %.2E PETSc.Vec.norm(b)*rtol
> print 'converge reason# = ', ksp.getConvergedReason()
> print 'residuals at each iter = ', ksp.getConvergenceHistory()
> #
> 
> Here is the output from the above code for a linear system of dimension 100 by 100.
> 
> 
> KSP Object: 1 MPI processes
>   type: gmres
>     restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
>     happy breakdown tolerance 1e-30
>   maximum iterations=10000, initial guess is zero
>   tolerances:  relative=0.0001, absolute=1e-05, divergence=10000.
>   left preconditioning
>   using DEFAULT norm type for convergence test
> PC Object: 1 MPI processes
>   type: ilu
>   PC has not been set up so information may be incomplete
>     out-of-place factorization
>     0 levels of fill
>     tolerance for zero pivot 2.22045e-14
>     matrix ordering: natural
>   linear system matrix = precond matrix:
>   Mat Object: 1 MPI processes
>     type: seqaij
>     rows=100, cols=100
>     total: nonzeros=2704, allocated nonzeros=2704
>     total number of mallocs used during MatSetValues calls =0
>       using I-node routines: found 25 nodes, limit used is 5
> iterations =  5
> residual =  1.38e-03
> norm(b)*rtol =   1.99e-01
> converge reason# =  2
> residuals at each iter =  [  2.05677686e+01   4.97916031e+00   4.82888782e-01   1.16849581e-01
>    8.87159777e-03   1.37992327e-03]
> 
> Sorry if this sounds stupid, but I am trying to understand the output from PETSc by contrasting it with Matlab GMRES.
> I see that the residual < norm(b)*rtol for the 4th iteration itself, I am not sure then why GMRES continues for one more iteration. Secondly, how does one get total number of iterations? For eg. let's say if it takes 3 outer iterations each with the default restart of 30, then one would expect the length of residual vector to be 150. In the above specific example, it took GMRES a total of 5 iterations to converge. Does it mean the convergence was achieved without having to restart?
> 
> Thank you, Ganesh


