[petsc-users] Issue using multi-grid as a pre-conditioner with KSP for a Poisson problem

Matthew Knepley knepley at gmail.com
Thu Jun 22 11:23:18 CDT 2017


On Wed, Jun 21, 2017 at 8:12 PM, Jason Lefley <jason.lefley at aclectic.com>
wrote:

> Hello,
>
> We are attempting to use the PETSc KSP solver framework in a fluid
> dynamics simulation we developed. The solution is part of a pressure
> projection and solves a Poisson problem. We use a cell-centered layout with
> a regular grid in 3d. We started with ex34.c from the KSP tutorials since
> it has the correct calls for the 3d DMDA, uses a cell-centered layout, and
> states that it works with multi-grid. We modified the operator construction
> function to match the coefficients and Dirichlet boundary conditions used
> in our problem (we’d also like to support Neumann but left those out for
> now to keep things simple). As a result of the modified boundary
> conditions, our version does not perform a null space removal on the right
> hand side or operator as the original did. We also modified the right hand
> side to contain a sinusoidal pattern for testing. Other than these changes,
> our code is the same as the original ex34.c
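
For illustration, a minimal sketch of this kind of operator assembly (a
cell-centered 7-point Laplacian on the unit cube with homogeneous Dirichlet
values imposed at the cell faces) might look like the following. It is not the
actual modified ex34.c; the function name, the unit-cube spacings hx/hy/hz, and
the finite-volume scaling are assumptions made purely for the sketch.

#include <petscdmda.h>

/* Minimal sketch only -- not the actual modified ex34.c.  Assumes a unit
 * cube, cell-centered unknowns, a constant unit coefficient, and homogeneous
 * Dirichlet values on the cell faces (so nothing is added to the right hand
 * side here). */
static PetscErrorCode ComputeMatrixSketch(DM da, Mat B)
{
  PetscErrorCode ierr;
  PetscInt       i, j, k, d, xs, ys, zs, xm, ym, zm, mx, my, mz, ncols;
  PetscScalar    hx, hy, hz, diag, v[7];
  MatStencil     row, col[7];

  PetscFunctionBeginUser;
  ierr = DMDAGetInfo(da,0,&mx,&my,&mz,0,0,0,0,0,0,0,0,0);CHKERRQ(ierr);
  hx = 1.0/mx; hy = 1.0/my; hz = 1.0/mz;   /* cell widths on the unit cube */
  ierr = DMDAGetCorners(da,&xs,&ys,&zs,&xm,&ym,&zm);CHKERRQ(ierr);
  for (k = zs; k < zs+zm; k++) {
    for (j = ys; j < ys+ym; j++) {
      for (i = xs; i < xs+xm; i++) {
        /* face coefficients for the six neighbours: -x,+x,-y,+y,-z,+z */
        const PetscScalar t[6]  = {hy*hz/hx, hy*hz/hx, hx*hz/hy, hx*hz/hy, hx*hy/hz, hx*hy/hz};
        const PetscInt    di[6] = {-1, 1, 0, 0, 0, 0};
        const PetscInt    dj[6] = { 0, 0,-1, 1, 0, 0};
        const PetscInt    dk[6] = { 0, 0, 0, 0,-1, 1};

        row.i = i; row.j = j; row.k = k;
        ncols = 0; diag = 0.0;
        for (d = 0; d < 6; d++) {
          PetscInt ni = i+di[d], nj = j+dj[d], nk = k+dk[d];
          if (ni < 0 || ni >= mx || nj < 0 || nj >= my || nk < 0 || nk >= mz) {
            /* Dirichlet value sits on the face, half a cell away, so the
               eliminated ghost contributes twice the interior coefficient */
            diag += 2.0*t[d];
          } else {
            diag += t[d];
            col[ncols].i = ni; col[ncols].j = nj; col[ncols].k = nk;
            v[ncols++] = -t[d];
          }
        }
        col[ncols].i = i; col[ncols].j = j; col[ncols].k = k;
        v[ncols++] = diag;
        ierr = MatSetValuesStencil(B,1,&row,ncols,col,v,INSERT_VALUES);CHKERRQ(ierr);
      }
    }
  }
  ierr = MatAssemblyBegin(B,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(B,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}
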
>
> With the default KSP options, as well as with CG using either the default
> pre-conditioner or no pre-conditioner, we see good convergence. However,
> we’d like to accelerate the time to solution further and target larger
> problem sizes (>= 1024^3) if possible. Given these objectives, multi-grid
> as a pre-conditioner interests us. To understand the improvement that
> multi-grid provides, we ran ex45 from the KSP tutorials. ex34 with CG and
> no pre-conditioner appears to converge in a single iteration, so we wanted
> to compare against a problem whose convergence behavior is closer to our
> own. Here are the tests we ran with ex45:
>
> mpirun -n 16 ./ex45 -da_grid_x 129 -da_grid_y 129 -da_grid_z 129
>         time in KSPSolve(): 7.0178e+00
>         solver iterations: 157
>         KSP final norm of residual: 3.16874e-05
>
> mpirun -n 16 ./ex45 -da_grid_x 129 -da_grid_y 129 -da_grid_z 129 -ksp_type
> cg -pc_type none
>         time in KSPSolve(): 4.1072e+00
>         solver iterations: 213
>         KSP final norm of residual: 0.000138866
>
> mpirun -n 16 ./ex45 -da_grid_x 129 -da_grid_y 129 -da_grid_z 129 -ksp_type
> cg
>         time in KSPSolve(): 3.3962e+00
>         solver iterations: 88
>         KSP final norm of residual: 6.46242e-05
>
> mpirun -n 16 ./ex45 -da_grid_x 129 -da_grid_y 129 -da_grid_z 129 -pc_type
> mg -pc_mg_levels 5 -mg_levels_ksp_type richardson -mg_levels_ksp_max_it 1
> -mg_levels_pc_type bjacobi
>         time in KSPSolve(): 1.3201e+00
>         solver iterations: 4
>         KSP final norm of residual: 8.13339e-05
>
> mpirun -n 16 ./ex45 -da_grid_x 129 -da_grid_y 129 -da_grid_z 129 -pc_type
> mg -pc_mg_levels 5 -mg_levels_ksp_type richardson -mg_levels_ksp_max_it 1
> -mg_levels_pc_type bjacobi -ksp_type cg
>         time in KSPSolve(): 1.3008e+00
>         solver iterations: 4
>         KSP final norm of residual: 2.21474e-05
>
> We found the multi-grid pre-conditioner options in the KSP tutorials
> makefile. These results make sense; both the default GMRES and CG solvers
> converge and CG without a pre-conditioner takes more iterations. The
> multi-grid pre-conditioned runs are pretty dramatically accelerated and
> require only a handful of iterations.
>
> We ran our code (modified ex34.c as described above) with the same
> parameters:
>
> mpirun -n 16 ./solver_test -da_grid_x 128 -da_grid_y 128 -da_grid_z 128
>         time in KSPSolve(): 5.3729e+00
>         solver iterations: 123
>         KSP final norm of residual: 0.00595066
>
> mpirun -n 16 ./solver_test -da_grid_x 128 -da_grid_y 128 -da_grid_z 128
> -ksp_type cg -pc_type none
>         time in KSPSolve(): 3.6154e+00
>         solver iterations: 188
>         KSP final norm of residual: 0.00505943
>
> mpirun -n 16 ./solver_test -da_grid_x 128 -da_grid_y 128 -da_grid_z 128
> -ksp_type cg
>         time in KSPSolve(): 3.5661e+00
>         solver iterations: 98
>         KSP final norm of residual: 0.00967462
>
> mpirun -n 16 ./solver_test -da_grid_x 128 -da_grid_y 128 -da_grid_z 128
> -pc_type mg -pc_mg_levels 5 -mg_levels_ksp_type richardson
> -mg_levels_ksp_max_it 1 -mg_levels_pc_type bjacobi
>         time in KSPSolve(): 4.5606e+00
>         solver iterations: 44
>         KSP final norm of residual: 949.553
>

1) Dave is right

2) To see how many iterations to expect, first try algebraic multigrid

  -pc_type gamg

This should work out of the box for a Poisson problem.
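
For example, mirroring one of the runs above (this is only an illustration of
where the option goes, not a tuned setup):

  mpirun -n 16 ./solver_test -da_grid_x 128 -da_grid_y 128 -da_grid_z 128 -ksp_type cg -pc_type gamg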

3) For questions like this, we really need to see

  -ksp_view -ksp_monitor_true_residual
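
For instance, appended to the multigrid run that misbehaved (the grid size and
level count are simply copied from the run above):

  mpirun -n 16 ./solver_test -da_grid_x 128 -da_grid_y 128 -da_grid_z 128 -pc_type mg -pc_mg_levels 5 -mg_levels_ksp_type richardson -mg_levels_ksp_max_it 1 -mg_levels_pc_type bjacobi -ksp_view -ksp_monitor_true_residual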

4) It sounds like your smoother is not strong enough. You could try

  -mg_levels_ksp_type richardson -mg_levels_ksp_richardson_self_scale
-mg_levels_ksp_max_it 5

or maybe GMRES as the level solver until it works.
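
For instance (only a sketch; the iteration count and smoother settings are
illustrative, not tuned):

  mpirun -n 16 ./solver_test -da_grid_x 128 -da_grid_y 128 -da_grid_z 128 -pc_type mg -pc_mg_levels 5 -mg_levels_ksp_type richardson -mg_levels_ksp_richardson_self_scale -mg_levels_ksp_max_it 5 -mg_levels_pc_type bjacobi

or with GMRES on the levels:

  mpirun -n 16 ./solver_test -da_grid_x 128 -da_grid_y 128 -da_grid_z 128 -pc_type mg -pc_mg_levels 5 -mg_levels_ksp_type gmres -mg_levels_ksp_max_it 5 -mg_levels_pc_type bjacobi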

 Thanks,

    Matt


> mpirun -n 16 ./solver_test -da_grid_x 128 -da_grid_y 128 -da_grid_z 128
> -pc_type mg -pc_mg_levels 5 -mg_levels_ksp_type richardson
> -mg_levels_ksp_max_it 1 -mg_levels_pc_type bjacobi -ksp_type cg
>         time in KSPSolve(): 1.5481e+01
>         solver iterations: 198
>         KSP final norm of residual: 0.916558
>
> We performed all tests with petsc-3.7.6.
>
> The trends with CG and GMRES seem consistent with the results from ex45.
> However, with multi-grid, something doesn’t seem right. Convergence seems
> poor and the solves run for many more iterations than ex45 with multi-grid
> as a pre-conditioner. I extensively validated the code that builds the
> matrix and also confirmed that the solution produced by CG, when evaluated
> with the system of equations elsewhere in our simulation, produces the same
> residual as indicated by PETSc. Given that we only made minimal
> modifications to the original example code, it seems likely that the
> operators constructed for the multi-grid levels are ok.
>
> We also tried a variety of other parameters for the multi-grid
> pre-conditioner suggested in related mailing list posts, but we didn’t
> observe any significant improvement over the results above.
>
> Is there anything we can do to check the validity of the coefficient
> matrices built for the different multi-grid levels? Does it look like there
> could be problems there? Or any other suggestions to achieve better results
> with multi-grid? I have the -log_view, -ksp_view, and convergence monitor
> output from the above tests and can post any of it if it would assist.
>
> Thanks
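
On the question above about checking the coefficient matrices built for the
individual multi-grid levels: one option is to pull the operator off each
level's smoother after KSPSetUp() and inspect it. Below is a sketch only; the
helper is hypothetical, not something in ex34.c or PETSc itself.

#include <petscksp.h>

/* Hypothetical helper -- not part of ex34.c.  After KSPSetUp() (or a solve)
 * with -pc_type mg, report the size and Frobenius norm of the operator on
 * each level; on a small grid one could MatView() them instead. */
static PetscErrorCode CheckMGLevelOperators(KSP ksp)
{
  PetscErrorCode ierr;
  PC             pc;
  KSP            smoother;
  Mat            A;
  PetscInt       l, nlevels, m, n;
  PetscReal      nrm;

  PetscFunctionBeginUser;
  ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
  ierr = PCMGGetLevels(pc,&nlevels);CHKERRQ(ierr);
  for (l = 0; l < nlevels; l++) {        /* level 0 is the coarsest */
    ierr = PCMGGetSmoother(pc,l,&smoother);CHKERRQ(ierr);
    ierr = KSPGetOperators(smoother,&A,NULL);CHKERRQ(ierr);
    ierr = MatGetSize(A,&m,&n);CHKERRQ(ierr);
    ierr = MatNorm(A,NORM_FROBENIUS,&nrm);CHKERRQ(ierr);
    ierr = PetscPrintf(PETSC_COMM_WORLD,"level %D: %D x %D  ||A||_F %g\n",l,m,n,(double)nrm);CHKERRQ(ierr);
  }
  PetscFunctionReturn(0);
}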




-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

http://www.caam.rice.edu/~mk51/