[petsc-dev] [petsc-users] Poor weak scaling when solving successive linear systems
Junchao Zhang
jczhang at mcs.anl.gov
Thu Jun 14 17:45:02 CDT 2018
I tested -pc_gamg_repartition with 216 processors again. First I tested
with these options:
-log_view \
-ksp_rtol 1E-6 \
-ksp_type cg \
-ksp_norm_type unpreconditioned \
-mg_levels_ksp_type richardson \
-mg_levels_ksp_norm_type none \
-mg_levels_pc_type sor \
-mg_levels_ksp_max_it 1 \
-mg_levels_pc_sor_its 1 \
-mg_levels_esteig_ksp_type cg \
-mg_levels_esteig_ksp_max_it 10 \
-pc_type gamg \
-pc_gamg_type agg \
-pc_gamg_threshold 0.05 \
-pc_gamg_type classical \
-gamg_est_ksp_type cg \
-pc_gamg_square_graph 10 \
-pc_gamg_threshold 0.0
(Note the duplicated -pc_gamg_type and -pc_gamg_threshold entries above;
the last value given for an option is the one PETSc uses.) Then I tested
with an extra -pc_gamg_repartition. With repartitioning, the time increased
from 120s to 140s. The code measures the first KSPSolve and the remaining
solves in separate log stages, so the repartitioning time was not counted
in the stage of interest. In fact, -log_view reports the GAMG repartition
time (in the first event stage) as about 1.5 seconds, so it is not a big
deal. I also tested -pc_gamg_square_graph 4; it did not change the time.
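For context, the stage separation mentioned above can be done with PETSc's
log-stage API. Here is a minimal self-contained sketch, not the actual test
code: the stage names, the solve count, and the stand-in 1-D Laplacian are
assumptions (the real problem uses a 3-D 7-point stencil).

  #include <petscksp.h>

  int main(int argc, char **argv)
  {
    KSP            ksp;
    Mat            A;
    Vec            x, b;
    PetscInt       i, n = 100, Istart, Iend, col[3];
    PetscScalar    v[3];
    PetscLogStage  stage1, stage2;
    PetscErrorCode ierr;

    ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;

    /* Stand-in problem: a 1-D Laplacian, just to make the sketch runnable */
    ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
    ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);CHKERRQ(ierr);
    ierr = MatSetFromOptions(A);CHKERRQ(ierr);
    ierr = MatSetUp(A);CHKERRQ(ierr);
    ierr = MatGetOwnershipRange(A, &Istart, &Iend);CHKERRQ(ierr);
    for (i = Istart; i < Iend; i++) {
      col[0] = i - 1; col[1] = i; col[2] = i + 1;
      v[0] = -1.0; v[1] = 2.0; v[2] = -1.0;
      if (i == 0)          {ierr = MatSetValues(A, 1, &i, 2, col+1, v+1, INSERT_VALUES);CHKERRQ(ierr);}
      else if (i == n - 1) {ierr = MatSetValues(A, 1, &i, 2, col, v, INSERT_VALUES);CHKERRQ(ierr);}
      else                 {ierr = MatSetValues(A, 1, &i, 3, col, v, INSERT_VALUES);CHKERRQ(ierr);}
    }
    ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    ierr = MatCreateVecs(A, &x, &b);CHKERRQ(ierr);
    ierr = VecSet(b, 1.0);CHKERRQ(ierr);

    ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
    ierr = KSPSetOperators(ksp, A, A);CHKERRQ(ierr);
    ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr); /* picks up -pc_type gamg etc. */

    /* Stage 1: the first KSPSolve, which includes GAMG setup (and any
       repartitioning), so -log_view reports its cost separately */
    ierr = PetscLogStageRegister("First Solve", &stage1);CHKERRQ(ierr);
    ierr = PetscLogStagePush(stage1);CHKERRQ(ierr);
    ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);
    ierr = PetscLogStagePop();CHKERRQ(ierr);

    /* Stage 2: the remaining solves -- the stage of interest for scaling */
    ierr = PetscLogStageRegister("Remaining Solves", &stage2);CHKERRQ(ierr);
    ierr = PetscLogStagePush(stage2);CHKERRQ(ierr);
    for (i = 1; i < 1000; i++) {ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);}
    ierr = PetscLogStagePop();CHKERRQ(ierr);

    ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
    ierr = MatDestroy(&A);CHKERRQ(ierr);
    ierr = VecDestroy(&x);CHKERRQ(ierr);
    ierr = VecDestroy(&b);CHKERRQ(ierr);
    ierr = PetscFinalize();
    return ierr;
  }

Run with -log_view and the option lists above; the repartitioning cost then
shows up under the "First Solve" stage rather than the one being measured.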
I tested hypre with the options "-log_view -ksp_rtol 1E-6 -ksp_type cg
-ksp_norm_type unpreconditioned -pc_type hypre" and nothing else. The job
ran out of its time limit. In earlier tests, a job (1000 KSPSolves with 7
KSP iterations each) took 4 minutes. With hypre, a single KSPSolve (6 KSP
iterations) takes 6 minutes.
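Concretely, the hypre run looked roughly like the following (the mpiexec
launcher and the executable name ./app are placeholders; the options are
the ones quoted above):

  mpiexec -n 216 ./app -log_view -ksp_rtol 1E-6 -ksp_type cg \
      -ksp_norm_type unpreconditioned -pc_type hypre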
I will test and profile the code on a single node, and apply some
VecScatter optimizations I recently made, to see what happens.
--Junchao Zhang
On Thu, Jun 14, 2018 at 11:03 AM, Mark Adams <mfadams at lbl.gov> wrote:
> And with 7-point stencils and no large material discontinuities you
> probably want -pc_gamg_square_graph 10 -pc_gamg_threshold 0.0, and you
> could test the square graph parameter (e.g., 1, 2, 3, 4).
>
> And I would definitely test hypre.
>
> On Thu, Jun 14, 2018 at 8:54 AM Mark Adams <mfadams at lbl.gov> wrote:
>
>>> Just -pc_type hypre instead of -pc_type gamg.
>>
>> And you need to have configured PETSc with hypre.
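For completeness: a common way to get a hypre-enabled build is to let
PETSc's configure download and build hypre itself (a system install can be
pointed to with --with-hypre-dir instead). A minimal sketch:

  ./configure --download-hypre
  make all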