[petsc-users] CG with right preconditioning supports NONE norm type only

Barry Smith bsmith at mcs.anl.gov
Wed Mar 8 10:33:55 CST 2017


  Please tell us how you got this output. 

  PETSc CG doesn't even implement right preconditioning. If you ask for it, it should error out. CG supports skipping the norm computation (the NONE norm type) only with left preconditioning.
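
  If the goal is simply to run CG without norm-based convergence testing, a left-preconditioned setup should do it. A minimal sketch of the corresponding run-time options (assuming the default KSP options prefix):

```
-ksp_type cg -ksp_pc_side left -ksp_norm_type none
```

  With -ksp_norm_type none the solver runs to the iteration limit without computing residual norms, which is why a norm monitor prints zeros.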

   Barry

> On Mar 8, 2017, at 10:26 AM, Kong, Fande <fande.kong at inl.gov> wrote:
> 
> Hi All,
> 
> The NONE norm type is supported only when CG is used with a right preconditioner. Any reason for this?
> 
> 
> 
> 0 Nonlinear |R| = 1.732051e+00
>       0 Linear |R| = 0.000000e+00
>       1 Linear |R| = 0.000000e+00
>       2 Linear |R| = 0.000000e+00
>       3 Linear |R| = 0.000000e+00
>       4 Linear |R| = 0.000000e+00
>       5 Linear |R| = 0.000000e+00
>       6 Linear |R| = 0.000000e+00
>  1 Nonlinear |R| = 1.769225e-08
>       0 Linear |R| = 0.000000e+00
>       1 Linear |R| = 0.000000e+00
>       2 Linear |R| = 0.000000e+00
>       3 Linear |R| = 0.000000e+00
>       4 Linear |R| = 0.000000e+00
>       5 Linear |R| = 0.000000e+00
>       6 Linear |R| = 0.000000e+00
>       7 Linear |R| = 0.000000e+00
>       8 Linear |R| = 0.000000e+00
>       9 Linear |R| = 0.000000e+00
>      10 Linear |R| = 0.000000e+00
>  2 Nonlinear |R| = 0.000000e+00
> SNES Object: 1 MPI processes
>   type: newtonls
>   maximum iterations=50, maximum function evaluations=10000
>   tolerances: relative=1e-08, absolute=1e-50, solution=1e-50
>   total number of linear solver iterations=18
>   total number of function evaluations=23
>   norm schedule ALWAYS
>   SNESLineSearch Object:   1 MPI processes
>     type: bt
>       interpolation: cubic
>       alpha=1.000000e-04
>     maxstep=1.000000e+08, minlambda=1.000000e-12
>     tolerances: relative=1.000000e-08, absolute=1.000000e-15, lambda=1.000000e-08
>     maximum iterations=40
>   KSP Object:   1 MPI processes
>     type: cg
>     maximum iterations=10000, initial guess is zero
>     tolerances:  relative=1e-05, absolute=1e-50, divergence=10000.
>     right preconditioning
>     using NONE norm type for convergence test
>   PC Object:   1 MPI processes
>     type: hypre
>       HYPRE BoomerAMG preconditioning
>       HYPRE BoomerAMG: Cycle type V
>       HYPRE BoomerAMG: Maximum number of levels 25
>       HYPRE BoomerAMG: Maximum number of iterations PER hypre call 1
>       HYPRE BoomerAMG: Convergence tolerance PER hypre call 0.
>       HYPRE BoomerAMG: Threshold for strong coupling 0.25
>       HYPRE BoomerAMG: Interpolation truncation factor 0.
>       HYPRE BoomerAMG: Interpolation: max elements per row 0
>       HYPRE BoomerAMG: Number of levels of aggressive coarsening 0
>       HYPRE BoomerAMG: Number of paths for aggressive coarsening 1
>       HYPRE BoomerAMG: Maximum row sums 0.9
>       HYPRE BoomerAMG: Sweeps down         1
>       HYPRE BoomerAMG: Sweeps up           1
>       HYPRE BoomerAMG: Sweeps on coarse    1
>       HYPRE BoomerAMG: Relax down          symmetric-SOR/Jacobi
>       HYPRE BoomerAMG: Relax up            symmetric-SOR/Jacobi
>       HYPRE BoomerAMG: Relax on coarse     Gaussian-elimination
>       HYPRE BoomerAMG: Relax weight  (all)      1.
>       HYPRE BoomerAMG: Outer relax weight (all) 1.
>       HYPRE BoomerAMG: Using CF-relaxation
>       HYPRE BoomerAMG: Not using more complex smoothers.
>       HYPRE BoomerAMG: Measure type        local
>       HYPRE BoomerAMG: Coarsen type        Falgout
>       HYPRE BoomerAMG: Interpolation type  classical
>     linear system matrix followed by preconditioner matrix:
>     Mat Object:     1 MPI processes
>       type: mffd
>       rows=9, cols=9
>         Matrix-free approximation:
>           err=1.49012e-08 (relative error in function evaluation)
>           Using wp compute h routine
>               Does not compute normU
>     Mat Object:    ()     1 MPI processes
>       type: seqaij
>       rows=9, cols=9
>       total: nonzeros=49, allocated nonzeros=49
>       total number of mallocs used during MatSetValues calls =0
>         not using I-node routines
> 
> Fande,
> 
