[petsc-users] Exposing further detail in -log_view for Hypre with PETSc

Mark Adams mfadams at lbl.gov
Wed Apr 10 08:01:06 CDT 2024


I believe there is an option to get hypre to print its own performance data.
Run with -help, grep for "pc_hypre", and look for something that looks like a
logging or view parameter; -pc_hypre_boomeramg_print_statistics, which is
already in your options file below, looks like the relevant one.
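For example, something like this should list the candidates (a sketch;
"solver" stands in for your actual executable and arguments):

  ./solver -pc_type hypre -pc_hypre_type boomeramg -help | grep pc_hypre

Note that -help only reports options for components that are actually active,
so -pc_type hypre needs to be set (on the command line or in the options file)
for the hypre options to show up.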

Mark

On Wed, Apr 10, 2024 at 7:49 AM Khurana, Parv <p.khurana22 at imperial.ac.uk>
wrote:

> Hello PETSc users community,
>
> Thank you in advance for your help as always.
>
> I am using BoomerAMG from Hypre via PETSc as part of the preconditioner in
> my software (Nektar++). I am trying to understand the time-profiling
> information printed by the -log_view option.
>
> I want to understand how much time is spent in the smoothing steps versus
> the time to solve on the coarsest grid I reach. The output I get from
> -log_view (pasted below) gives timings for KSPSolve and MatMult, but I
> think I need more granular information to see a further breakdown of the
> time spent within my routines. Does anyone have recommendations for
> obtaining this information?
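>
> For my own routines on the PETSc side, I believe I could register custom
> log events so that -log_view reports them as separate rows; a minimal
> sketch of what I have in mind (the event name and the timed region are
> illustrative, and I assume this cannot see inside hypre's PCApply):
>
> #include <petscksp.h>
>
> /* once, during setup: register a named event; it then appears
>    as its own row in the -log_view event table */
> PetscLogEvent user_smooth;
> PetscCall(PetscLogEventRegister("UserSmooth", KSP_CLASSID, &user_smooth));
>
> /* around the region to be timed: */
> PetscCall(PetscLogEventBegin(user_smooth, 0, 0, 0, 0));
> /* ... e.g. the smoothing-related part of my own code ... */
> PetscCall(PetscLogEventEnd(user_smooth, 0, 0, 0, 0));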
>
> Best
> Parv Khurana
>
> PETSc database options used for solve:
>
> -ksp_monitor # (source: file)
> -ksp_type preonly # (source: file)
> -log_view # (source: file)
> -pc_hypre_boomeramg_coarsen_type hmis # (source: file)
> -pc_hypre_boomeramg_grid_sweeps_all 2 # (source: file)
> -pc_hypre_boomeramg_interp_type ext+i # (source: file)
> -pc_hypre_boomeramg_max_iter 1 # (source: file)
> -pc_hypre_boomeramg_P_max 2 # (source: file)
> -pc_hypre_boomeramg_print_debug 1 # (source: file)
> -pc_hypre_boomeramg_print_statistics 1 # (source: file)
> -pc_hypre_boomeramg_relax_type_all sor/jacobi # (source: file)
> -pc_hypre_boomeramg_strong_threshold 0.7 # (source: file)
> -pc_hypre_boomeramg_truncfactor 0.3 # (source: file)
> -pc_hypre_type boomeramg # (source: file)
> -pc_type hypre # (source: file)
>
> PETSc log_view output:
>
> ------------------------------------------------------------------------------------------------------------------------
> Event                Count      Time (sec)     Flop                              --- Global ---  --- Stage ----  Total
>                    Max Ratio  Max     Ratio   Max  Ratio  Mess   AvgLen  Reduct  %T %F %M %L %R  %T %F %M %L %R Mflop/s
> ------------------------------------------------------------------------------------------------------------------------
>
> --- Event Stage 0: Main Stage
>
> BuildTwoSided          1 1.0 9.6900e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
> MatMult               14 1.0 1.6315e-01 1.0 1.65e+08 1.0 0.0e+00 0.0e+00 0.0e+00  0 86  0  0  0   0 86  0  0  0  1011
> MatConvert             1 1.0 4.3092e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
> MatAssemblyBegin       3 1.0 3.1680e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
> MatAssemblyEnd         3 1.0 9.4178e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
> MatGetRowIJ            2 1.0 1.1630e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
> MatSetPreallCOO        1 1.0 3.2132e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
> MatSetValuesCOO        1 1.0 2.9956e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
> VecNorm               28 1.0 1.3981e-02 1.0 2.10e+07 1.0 0.0e+00 0.0e+00 0.0e+00  0 11  0  0  0   0 11  0  0  0  1499
> VecSet                13 1.0 6.5185e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
> VecAYPX               14 1.0 7.1511e-03 1.0 5.24e+06 1.0 0.0e+00 0.0e+00 0.0e+00  0  3  0  0  0   0  3  0  0  0   733
> VecAssemblyBegin      14 1.0 1.3998e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
> VecAssemblyEnd        14 1.0 4.2560e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
> VecScatterBegin       14 1.0 8.2761e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
> VecScatterEnd         14 1.0 4.4665e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
> SFSetGraph             1 1.0 6.5993e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
> SFSetUp                1 1.0 7.9212e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
> SFPack                14 1.0 5.8690e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
> SFUnpack              14 1.0 4.3370e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
> KSPSetUp               1 1.0 2.4910e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
> KSPSolve              14 1.0 2.1922e+00 1.0 1.91e+08 1.0 0.0e+00 0.0e+00 0.0e+00  0 100  0  0  0   0 100  0  0  0    87
> PCSetUp                1 1.0 1.3165e+00 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
> PCApply               14 1.0 1.9990e+00 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
> ------------------------------------------------------------------------------------------------------------------------
>