[petsc-users] GAMG speed
Barry Smith
bsmith at mcs.anl.gov
Tue Aug 13 22:03:15 CDT 2013
On Aug 13, 2013, at 9:57 PM, Michele Rosso <mrosso at uci.edu> wrote:
> Hi Jed,
>
> I attached the output for both runs you suggested. At the beginning of each file I included the options I used.
>
> On a side note, I tried to run with a grid of 256^3 (exactly as before) but with fewer levels, i.e. 3 instead of 4 or 5.
> My system stops the run because of an out-of-memory condition. It is really odd since I have not changed anything except
> -pc_mg_levels. I cannot send you any output since there is none. Do you have any guess where the problem comes from?
By default PETSc uses a direct solver (possibly even a sequential one) on the coarsest level; with fewer levels the coarsest grid is much bigger, so the direct solver requires too much memory. You could install PETSc with the ./configure option --download-superlu_dist and then run with

   -mg_coarse_pc_type lu -mg_coarse_pc_factor_mat_solver_package superlu_dist
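As a minimal sketch (the application name ./app.exe, the process count, and the extra configure options are placeholders, not taken from this thread):

   # rebuild PETSc with a parallel direct solver available
   ./configure --download-superlu_dist <your usual configure options>
   make all

   # rerun with SuperLU_DIST factoring only the coarsest level
   mpiexec -n 8 ./app.exe -pc_type mg -pc_mg_galerkin -pc_mg_levels 3 \
       -mg_coarse_pc_type lu \
       -mg_coarse_pc_factor_mat_solver_package superlu_dist

Adding -ksp_view to the run prints the solver configuration on each level, so you can confirm the coarsest grid is actually using SuperLU_DIST.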
Barry
> Thanks,
>
> Michele
>
> On 08/13/2013 07:23 PM, Jed Brown wrote:
>> Michele Rosso <mrosso at uci.edu> writes:
>>
>>> The matrix arises from discretization of the Poisson equation in
>>> incompressible flow calculations.
>>>
>> Can you try the two runs below and send -log_summary?
>>
>> -log_summary -ksp_monitor -ksp_view -ksp_converged_reason -pc_type mg -pc_mg_galerkin -pc_mg_levels 5 -mg_levels_ksp_type richardson -mg_levels_ksp_max_it 1
>>
>>
>> -log_summary -ksp_monitor -ksp_view -ksp_converged_reason -pc_type mg -pc_mg_galerkin -pc_mg_levels 5 -mg_levels_ksp_type richardson -mg_levels_ksp_max_it 1 -pc_mg_type full
>>
>
> <output_mg_1.txt><output_mg_2.txt>
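(The only difference between the two suggested runs is the trailing -pc_mg_type full, which selects full multigrid, starting the cycle on the coarsest grid and working up, instead of the default multiplicative V-cycle; comparing the two -log_summary outputs shows whether the extra coarse-grid work pays off in fewer Krylov iterations.)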