[petsc-users] Efficient Use of GAMG for Poisson Equation with Full Neumann Boundary Conditions

Mark Adams mfadams at lbl.gov
Fri Feb 20 11:18:46 CST 2015


On Thu, Feb 19, 2015 at 12:45 PM, Hong <hzhang at mcs.anl.gov> wrote:

> Fabian,
> Too much time was spent on the matrix operations during the setup phase,
> which leaves plenty of room for optimization.
>


Yea, a little slow but not crazy.  With 2M equations on one process you may
be running into cache problems.



> Can you provide us with a stand-alone code from your experiment so we can
> investigate how to make our GAMG more efficient?
>
> Hong
>
> On Wed, Feb 18, 2015 at 12:20 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>
>>
>>   Fabian,
>>
>>    CG requires that the preconditioner be symmetric positive definite.
>> ICC, even when given a symmetric positive definite matrix, can generate
>> an indefinite preconditioner.
>>
>>   Similarly, if an algebraic multigrid application is not "strong enough",
>> it can also result in a preconditioner that is indefinite.
>>
>>   You never want to use ICC for pressure-type problems; it cannot compete
>> with multigrid for large problems, so let's forget about ICC and focus on
>> GAMG.
>>
>> > -pressure_mg_coarse_sub_pc_type svd
>> > -pressure_mg_levels_ksp_rtol 1e-4
>> > -pressure_mg_levels_ksp_type richardson
>> > -pressure_mg_levels_pc_type sor
>> > -pressure_pc_gamg_agg_nsmooths 1
>> > -pressure_pc_type gamg
>>
>>   There are many, many tuning parameters for MG.
>>
>>    First, is your pressure problem changing dramatically at each new
>> solve? That is, for example, is the mesh moving, or are there very
>> different numerical values in the matrix?  Is the nonzero structure of the
>> pressure matrix changing? Currently the entire GAMG setup is done for
>> each new solve. If you use the flag
>>
>> -pressure_pc_gamg_reuse_interpolation true
>>
>> it will create the interpolation needed for GAMG once and reuse it for
>> all the solves. Please try that and see what happens.
>>
>>  Then I will have many more suggestions.
>>
>>
>>   Barry
>>
>>
>>
>> > On Feb 17, 2015, at 9:14 AM, Fabian Gabel <gabel.fabian at gmail.com>
>> wrote:
>> >
>> > Dear PETSc team,
>> >
>> > I am trying to optimize the solver parameters for the linear system I
>> > get when I discretize the pressure correction equation (a Poisson
>> > equation with Neumann boundary conditions) in a SIMPLE-type algorithm
>> > using a finite volume method.
>> >
>> > The resulting system is symmetric and positive semi-definite. A basis
>> > for the associated null space has been provided to the KSP object.
>> >
>> > Using a CG solver with ICC preconditioning, the solver needs many
>> > inner iterations to converge (-ksp_monitor -ksp_view output attached
>> > for a case with approx. 2e6 unknowns; the lines beginning with 000XXXX
>> > show the residual relative to the initial residual of outer iteration
>> > no. 1 for the variables u,v,w,p). Furthermore, I don't quite understand
>> > why the solver reports
>> >
>> > Linear solve did not converge due to DIVERGED_INDEFINITE_PC
>> >
>> > at the later stages of my Picard iteration process (iteration 0001519).
>> >
>> > I then tried CG+GAMG preconditioning, which succeeded in reducing the
>> > number of inner iterations but brought no advantage in wall time
>> > (output attached). The DIVERGED_INDEFINITE_PC reason also shows up
>> > repeatedly, after iteration 0001487. I used the following options
>> >
>> > -pressure_mg_coarse_sub_pc_type svd
>> > -pressure_mg_levels_ksp_rtol 1e-4
>> > -pressure_mg_levels_ksp_type richardson
>> > -pressure_mg_levels_pc_type sor
>> > -pressure_pc_gamg_agg_nsmooths 1
>> > -pressure_pc_type gamg
>> >
>> > I would like to get an opinion on how the solver performance could be
>> > improved further. -log_summary shows that my code spends 80% of the
>> > time solving the linear systems for the pressure correction (STAGE 2:
>> > PRESSCORR). Furthermore, do you know what could be causing the
>> > DIVERGED_INDEFINITE_PC convergence reason?
>> >
>> > Regards,
>> > Fabian Gabel
>> > <gamg.128.out.531314><icc.128.out.429762>
>>
>>
>
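For reference, Barry's reuse suggestion slots directly into the option set
Fabian posted. Gathered in one place (all flags exactly as quoted in the
thread, with only the reuse flag added), the pressure solve would run with:

```text
-pressure_pc_type gamg
-pressure_pc_gamg_agg_nsmooths 1
-pressure_pc_gamg_reuse_interpolation true
-pressure_mg_levels_ksp_type richardson
-pressure_mg_levels_ksp_rtol 1e-4
-pressure_mg_levels_pc_type sor
-pressure_mg_coarse_sub_pc_type svd
```

If the interpolation reuse helps, the setup-phase matrix operations Hong
flagged should drop out of all but the first solve.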
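The null-space issue at the heart of the thread can be reproduced without
PETSc. The sketch below (plain NumPy; the 1D discretization and all names are
illustrative, not taken from Fabian's code) builds a Neumann-Poisson operator
whose null space is the constant vector, makes the right-hand side compatible,
and runs unpreconditioned CG -- a stand-in for what PETSc's MatNullSpace
machinery arranges when a null-space basis is attached as Fabian describes:

```python
import numpy as np

def cg(A, b, tol=1e-10, maxit=1000):
    """Plain conjugate gradients, no preconditioner, zero initial guess."""
    x = np.zeros_like(b)
    r = b.copy()
    p = r.copy()
    rs = r @ r
    for _ in range(maxit):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# 1D Poisson with full Neumann BCs: singular, symmetric positive
# semi-definite, null space = span{1} (every row sums to zero).
n = 50
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
A[0, 0] = A[-1, -1] = 1.0

b = np.sin(np.linspace(0.0, 2.0 * np.pi, n))
b -= b.mean()        # compatibility: b must be orthogonal to the constants

x = cg(A, b)
x -= x.mean()        # pin down the arbitrary additive constant

print(np.linalg.norm(A @ x - b))   # small residual despite singular A
```

With a zero initial guess and a compatible right-hand side, the CG iterates
never leave range(A), which is why the singular system is solved cleanly; an
incompatible b (nonzero mean) would stall, much as the indefinite
preconditioner derails CG in the runs Fabian attached.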

