[petsc-users] how to speed up convergence?

Jed Brown jedbrown at mcs.anl.gov
Thu Nov 10 10:43:14 CST 2011

On Thu, Nov 10, 2011 at 10:25, Konstantinos Kontzialis <
ckontzialis at lycos.com> wrote:

> mpiexec -n 8 ./hoac cylinder -snes_mf_operator -llf_flux -n_out 2
> -end_time 0.4 -implicit -pc_type asm -sub_pc_type ilu
> -sub_pc_factor_mat_ordering_type rcm -sub_pc_factor_reuse_ordering
> -sub_pc_factor_reuse_fill -gll -ksp_type fgmres -sub_pc_factor_levels 0
> -snes_monitor -snes_converged_reason -ksp_converged_reason -ts_view
> -ksp_pc_side right -sub_pc_factor_nonzeros_along_diagonal -dt 1.0e-3
> -ts_type arkimex -ksp_gmres_restart 100 -ksp_max_it 500 -snes_max_fail 100
> -snes_max_linear_solve_fail 100

> Approximation order = 2
> # DOF = 115200
> # nodes in mesh = 1680
> # elements in mesh = 1600
> Navier-Stokes solution
> Using LLF flux

Are you using a limiter?

> Linear solve converged due to CONVERGED_RTOL iterations 1
> Timestep   0: dt = 0.001, T = 0, Res[rho] = 0.966549, Res[rhou] = 6.11366,
> Res[rhov] = 0.507325, Res[E] = 2.44463, CFL = 0.942045
>    0 SNES Function norm 3.203604511352e+03
>    Linear solve did not converge due to DIVERGED_ITS iterations 500
>    1 SNES Function norm 3.440800722147e+02
>    Linear solve did not converge due to DIVERGED_ITS iterations 500
>    2 SNES Function norm 2.008355246473e+02
>    Linear solve did not converge due to DIVERGED_ITS iterations 500
>    3 SNES Function norm 1.177925999321e+02
> As you may see, the step size is quite small for this problem, and I use an
> inexact solve for the linear part of the Newton iterations. Furthermore,
> I compute the Jacobian numerically using coloring.

No need for -snes_mf_operator if you use coloring. What functions did you
use for coloring? See if "-mat_fd_type ds" affects the results (good or bad).

How are you ordering degrees of freedom? I would not rely on RCM with AIJ
to coalesce all blocks so they can be solved together.

I would try -sub_pc_type lu to see if it's the ILU or something else that
is responsible for the lack of convergence.

What units are you using for state variables and residuals? It is important
to choose units so that the system is well-scaled when using implicit methods.

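As a minimal sketch of what "well-scaled" means here (the reference values and state below are made up for illustration, not taken from your case): divide each conservative variable by a reference magnitude so all components are O(1).

```python
# Hypothetical nondimensionalization sketch: scale conservative variables
# (rho, rho*u, E) by reference magnitudes so residual components are
# comparable in size. All numbers below are illustrative only.
rho_ref = 1.0                  # reference density
u_ref = 340.0                  # reference velocity (e.g. a sound speed)
E_ref = rho_ref * u_ref**2     # reference total energy per unit volume

# A dimensional state vector (hypothetical values)
state = {"rho": 1.2, "rhou": 1.2 * 50.0, "E": 2.5e5}
scales = {"rho": rho_ref, "rhou": rho_ref * u_ref, "E": E_ref}

# Nondimensional state: every entry lands near O(1), so the Jacobian
# entries for different fields have comparable magnitude.
nondim = {k: state[k] / scales[k] for k in state}
print(nondim)
```

With badly chosen units (say, energies of order 1e5 next to densities of order 1), the Jacobian mixes wildly different magnitudes and iterative solvers suffer.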
How did you implement boundary conditions?

Is your system written in conservative or primitive variables? You can do a
"low-Mach" preconditioner using PCFieldSplit if the system is in primitive
variables. For conservative variables, the preconditioner needs a change of
variables and we don't have an interface to do it automatically, so you
have to use PCShell. While this might eventually be more efficient than
ASM, you should be able to make ASM work adequately.
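For the primitive-variable case, the PCFieldSplit route is driven entirely from the options database; a hedged sketch (this assumes the fields have already been registered, e.g. through a DM or PCFieldSplitSetIS, and the split/factorization choices here are illustrative, not a recipe for your problem):

```shell
# Hypothetical field-split options for a primitive-variable system;
# only meaningful once velocity/pressure fields are defined to the PC.
-pc_type fieldsplit -pc_fieldsplit_type schur \
  -pc_fieldsplit_schur_fact_type lower
```

For conservative variables, as noted above, the change of variables has to live inside a PCShell you write yourself.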