[petsc-users] How to speed up geometric multigrid

Matthew Knepley knepley at gmail.com
Wed Oct 2 17:25:02 CDT 2013


On Wed, Oct 2, 2013 at 5:12 PM, Michele Rosso <mrosso at uci.edu> wrote:

>  Barry,
>
> sorry for not replying to your other e-mail earlier.
> The equation I am solving is:
>
> $\nabla\cdot\left(\frac{1}{\rho}\nabla p\right)=\nabla\cdot u^*$
>
> where $p$ is the pressure field, $\rho$ the density
> field, and $u^*$ the velocity field.
> Since I am using finite differences on a staggered grid, the pressure is
> defined at "cell" centers, while the velocity components are defined on
> "cell" faces, even though no cells are actually defined.
>
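For reference, a minimal sketch of the corresponding one-dimensional staggered-grid (MAC) discretization away from the interface, where the Ghost Fluid corrections do not enter: uniform spacing $h$ is assumed, the pressure $p_i$ sits at cell centers, and $\rho$ and $u^*$ are sampled at the faces $i \pm 1/2$ (the indexing is illustrative, not taken from Michele's code):

  $\frac{1}{h}\left(\frac{1}{\rho_{i+1/2}}\,\frac{p_{i+1}-p_i}{h} - \frac{1}{\rho_{i-1/2}}\,\frac{p_i-p_{i-1}}{h}\right) = \frac{u^*_{i+1/2}-u^*_{i-1/2}}{h}$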

If you just prescribe a constant rho, do you get 3-5 iterates? What about a
simple step in rho through the domain?
Starting on the hardest problem does not sound like the way to understand
what is going on.

  Matt
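For reference, with a constant density $\rho = \rho_0$ the operator reduces to a standard Poisson problem,

  $\nabla^2 p = \rho_0\,\nabla\cdot u^*$,

which is the regime where textbook geometric multigrid is expected to converge in the handful of iterations mentioned above; a simple step in $\rho$ then isolates the effect of the coefficient jump from everything else in the setup.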


> I am simulating a two-phase flow, so both density and pressure are
> discontinuous, but not the velocity (no mass transfer is included at the
> moment).
> Therefore the right-hand side (rhs) of the above equation does not have
> jumps, while $p$ and $\rho$ do.
> In order to deal with such jumps, I am using a Ghost Fluid approach.
> The resulting linear system is therefore slightly different from the
> typical Poisson system, though symmetry and diagonal dominance of the
> matrix are maintained.
> The boundary conditions are periodic in all three space directions,
> therefore the system is singular. Thus I removed the null space of the
> matrix by using:
>
>         call MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE, &
>                                & PETSC_NULL_INTEGER,PETSC_NULL_INTEGER,nullspace,ierr)
>         call KSPSetNullSpace(ksp,nullspace,ierr)
>         call MatNullSpaceDestroy(nullspace,ierr)
>
> Hope this helps. Please let me know if you need any other info.
> Also, I use the pressure at the previous time step as the starting point
> for the solve. Could this be a reason why the convergence is so slow?
> Thanks a lot,
>
> Michele
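Regarding the starting point mentioned above: a minimal sketch, in the same Fortran style as the snippet earlier in the message, of how a previous-time-step pressure is typically supplied as the initial guess (ksp, rhs and x are placeholder names, not taken from Michele's code):

        ! keep the contents of x (the old pressure) as the initial guess;
        ! by default the KSP starts the solve from a zero initial guess
        call KSPSetInitialGuessNonzero(ksp,PETSC_TRUE,ierr)
        call KSPSolve(ksp,rhs,x,ierr)

Without the first call, the contents of x are ignored and the solve starts from zero.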
>
> On 10/02/2013 11:39 AM, Barry Smith wrote:
>
>   Something is wrong, you should be getting better convergence. Please answer my other email.
>
>
> On Oct 2, 2013, at 1:10 PM, Michele Rosso <mrosso at uci.edu> wrote:
>
>
>  Thank you all for your contribution.
> So far the fastest solution is still the initial one proposed by Jed in an earlier round:
>
> -ksp_atol 1e-9  -ksp_monitor_true_residual  -ksp_view  -log_summary -mg_coarse_pc_factor_mat_solver_package superlu_dist
> -mg_coarse_pc_type lu    -mg_levels_ksp_max_it 3 -mg_levels_ksp_type richardson  -options_left -pc_mg_galerkin
> -pc_mg_levels 5  -pc_mg_log  -pc_type mg
>
> where I used  -mg_levels_ksp_max_it 3  as Barry suggested instead of  -mg_levels_ksp_max_it 1.
> I attached the diagnostics for this case. Any further idea?
> Thank you,
>
> Michele
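As a point of reference, options like the ones quoted above live in the PETSc options database and take effect in any code whose solver calls KSPSetFromOptions before the solve; a minimal sketch (ksp, rhs and x are placeholder names):

        call KSPCreate(PETSC_COMM_WORLD,ksp,ierr)
        ! ... set the operators on ksp here ...
        ! pick up -pc_type mg, -pc_mg_levels 5, -mg_levels_* etc.
        ! from the command line
        call KSPSetFromOptions(ksp,ierr)
        call KSPSolve(ksp,rhs,x,ierr)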
>
>
> On 10/01/2013 11:44 PM, Barry Smith wrote:
>
>  On Oct 2, 2013, at 12:28 AM, Jed Brown <jedbrown at mcs.anl.gov> wrote:
>
>
>  "Mark F. Adams" <mfadams at lbl.gov> writes:
>
>  run3.txt uses:
>
> -ksp_type richardson
>
> This is bad and I doubt anyone recommended it intentionally.
>
>     Hell, this is normal multigrid without a Krylov accelerator. Under normal circumstances with geometric multigrid this should be fine, often the best choice.
>
>
>  I would have expected FGMRES, but Barry likes Krylov smoothers and
> Richardson is one of a few methods that can tolerate nonlinear
> preconditioners.
>
>
>  You also have, in this file,
>
> -mg_levels_ksp_type gmres
>
> did you or the recommenders mean
>
> -mg_levels_ksp_type richardson  ???
>
> you are using gmres here, which forces you to use fgmres in the outer solver.  This is safe to use, but if you apply your BCs symmetrically with a low-order discretization then
>
> -ksp_type cg
> -mg_levels_ksp_type richardson
> -mg_levels_pc_type sor
>
> is what I'd recommend.
>
>  I thought that was tried in an earlier round.
>
> I don't understand why SOR preconditioning in the Krylov smoother is so
> drastically more expensive than BJacobi/ILU and why SOR is called so
> many more times even though the number of outer iterations
>
> bjacobi: PCApply              322 1.0 4.1021e+01 1.0 6.44e+09 1.0 3.0e+07 1.6e+03 4.5e+04 74 86 98 88 92 28160064317351226 20106
> bjacobi: KSPSolve              46 1.0 4.6268e+01 1.0 7.52e+09 1.0 3.0e+07 1.8e+03 4.8e+04 83100100 99 99 31670065158291309 20800
>
> sor:     PCApply             1132 1.0 1.5532e+02 1.0 2.30e+10 1.0 1.0e+08 1.6e+03 1.6e+05 69 88 99 88 93 21871774317301274 18987
> sor:     KSPSolve             201 1.0 1.7101e+02 1.0 2.63e+10 1.0 1.1e+08 1.8e+03 1.7e+05 75100100 99 98 24081775248221352 19652
>
>   <best.txt>
>
>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
