[petsc-users] Efficient Use of GAMG for Poisson Equation with Full Neumann Boundary Conditions

Barry Smith bsmith at mcs.anl.gov
Fri Feb 20 07:47:32 CST 2015


  Excellent!  Now the only real hopes for improvement are 

1) run on an Intel system with the highest possible achievable memory bandwidth (this could possibly speed up the code by 50%)

2) to decrease the iteration counts a bit, try -pressure_mg_levels_ksp_type chebyshev

3) some optimization of the sequential MatPtAPNumeric(); we would have to do this ourselves, and it could shave off a small amount of time
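Suggestion 2 swaps the Richardson smoother for a Chebyshev semi-iteration on each multigrid level. To illustrate why that can cut iteration counts, here is a minimal numpy sketch of the classical Chebyshev three-term recurrence (following Saad's "Iterative Methods for Sparse Linear Systems", Alg. 12.1) — this is an illustrative stand-alone implementation, not PETSc code, and the test matrix (a 1D Dirichlet Laplacian) and eigenvalue bounds are chosen purely for demonstration:

```python
import numpy as np

def chebyshev(A, b, x0, lam_min, lam_max, iters):
    """Chebyshev semi-iteration for SPD A with spectrum in [lam_min, lam_max]."""
    theta = 0.5 * (lam_max + lam_min)   # center of the spectrum
    delta = 0.5 * (lam_max - lam_min)   # half-width of the spectrum
    sigma1 = theta / delta
    rho = 1.0 / sigma1
    x = x0.copy()
    r = b - A @ x
    d = r / theta                       # first correction
    for _ in range(iters):
        x = x + d
        r = b - A @ x
        rho_new = 1.0 / (2.0 * sigma1 - rho)
        d = rho_new * rho * d + (2.0 * rho_new / delta) * r
        rho = rho_new
    return x

# 1D Dirichlet Laplacian as a stand-in SPD test matrix
n = 50
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
lam = np.linalg.eigvalsh(A)             # exact bounds for the demo
b = np.ones(n)
x = chebyshev(A, b, np.zeros(n), lam[0], lam[-1], 200)
print(np.linalg.norm(b - A @ x))        # residual norm after 200 sweeps
```

Unlike Richardson, Chebyshev exploits the eigenvalue bounds to damp the whole spectrum at once, which is why PETSc estimates those bounds automatically when -pressure_mg_levels_ksp_type chebyshev is used as a smoother.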

  Barry



Here is the percentage of time spent in the pressure correction:

MatMult            11
MatMultAdd          4
MatSOR             57
MatPtAPNumeric     17
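The pressure system discussed below is singular: with pure Neumann boundary conditions the discrete Laplacian annihilates the constant vector, which is exactly the nullspace basis Fabian attaches to the KSP object. A small numpy sketch (not PETSc code; the 1D Neumann Laplacian here is just an illustration) makes this concrete:

```python
import numpy as np

# 1D Neumann (zero-flux) Laplacian: every row sums to zero, so the
# constant vector spans the nullspace and the matrix is only
# positive SEMI-definite.
n = 20
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
A[0, 0] = 1.0       # zero-flux boundary rows
A[-1, -1] = 1.0

ones = np.ones(n)
print(np.linalg.norm(A @ ones))   # constant vector is annihilated

eigs = np.linalg.eigvalsh(A)
print(eigs[0], eigs[1])           # smallest eigenvalue is (numerically) zero
```

This is why a Krylov solve for such a system needs the nullspace registered (so the iterates are projected onto the orthogonal complement of the constants) and why the right-hand side must be consistent, i.e. orthogonal to the constant vector.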





> On Feb 20, 2015, at 3:35 AM, Fabian Gabel <gabel.fabian at gmail.com> wrote:
> 
> Barry,
> 
>> 
>>   First, is your pressure problem changing dramatically at each new solve? That is, for example, is the mesh moving or are there very different numerical values in the matrix?  Is the nonzero structure of the pressure matrix changing? 
> 
> No moving grids, the non-zero structure is maintained throughout the
> entire solution process. I am not sure about the "very different
> numerical values". I determined the minimal matrix coefficient to be
> approx -5e-7 and the maximal matrix coefficient to be 3e-6 (but once I
> use block structured grids with locally refined blocks the range will
> become wider), but there are some rows containing only a 1 on the
> diagonal. This comes from the variable indexing I use, which includes
> boundary values. If this should present a problem, I think I could scale
> the corresponding rows with a factor depending on the maximal/minimal
> element of the matrix.
> 
>> Currently the entire GAMG process is done for each new solve, if you use the flag
>> 
>> -pressure_pc_gamg_reuse_interpolation true
>> 
>> it will create the interpolation needed for GAMG once and reuse it for all the solves. Please try that and see what happens.
> 
> I attached the output for the additional solver option
> (-pressure_pc_gamg_reuse_interpolation). Since there appear to be some inconsistencies
> with the previous output file for the GAMG solve I provided, I'll attach
> the results for the solution process without the flag for reusing the
> interpolation once again. So far wall clock time has been reduced by
> almost 50%.
> 
> Fabian
> 
>> 
>> Then I will have many more suggestions.
>> 
>> 
>>  Barry
>> 
>> 
>> 
>>> On Feb 17, 2015, at 9:14 AM, Fabian Gabel <gabel.fabian at gmail.com> wrote:
>>> 
>>> Dear PETSc team,
>>> 
>>> I am trying to optimize the solver parameters for the linear system I
>>> get when I discretize the pressure correction equation (a Poisson equation
>>> with Neumann boundary conditions) in a SIMPLE-type algorithm using a
>>> finite volume method.
>>> 
>>> The resulting system is symmetric and positive semi-definite. A basis to
>>> the associated nullspace has been provided to the KSP object. 
>>> 
>>> Using a CG solver with ICC preconditioning the solver needs a lot of
>>> inner iterations to converge (-ksp_monitor -ksp_view output attached for
>>> a case with approx. 2e6 unknowns; the lines beginning with 000XXXX show
>>> the relative residual regarding the initial residual in the outer
>>> iteration no. 1 for the variables u,v,w,p). Furthermore I don't quite
>>> understand why the solver reports
>>> 
>>> Linear solve did not converge due to DIVERGED_INDEFINITE_PC
>>> 
>>> at the later stages of my Picard iteration process (iteration 0001519).
>>> 
>>> I then tried out CG+GAMG preconditioning with success regarding the
>>> number of inner iterations, but without advantages regarding wall time
>>> (output attached). Also the DIVERGED_INDEFINITE_PC reason shows up
>>> repeatedly after iteration 0001487. I used the following options
>>> 
>>> -pressure_mg_coarse_sub_pc_type svd
>>> -pressure_mg_levels_ksp_rtol 1e-4
>>> -pressure_mg_levels_ksp_type richardson
>>> -pressure_mg_levels_pc_type sor
>>> -pressure_pc_gamg_agg_nsmooths 1
>>> -pressure_pc_type gamg
>>> 
>>> I would like to get an opinion on how the solver performance could be
>>> increased further. -log_summary shows that my code spends 80% of the
>>> time solving the linear systems for the pressure correction (STAGE 2:
>>> PRESSCORR). Furthermore, do you know what could be causing the
>>> DIVERGED_INDEFINITE_PC convergence reason?
>>> 
>>> Regards,
>>> Fabian Gabel
>>> <gamg.128.out.531314><icc.128.out.429762>
>> 
> 
> 
> <reuse_interpolation.gamg.128.out.544044><gamg.128.out.541300>
