[petsc-users] Exposing further detail in -log_view for Hypre with PETSc
Mark Adams
mfadams at lbl.gov
Wed Apr 10 08:21:21 CDT 2024
Ask hypre how to do this and we can figure out how to get it in PETSc.
Mark
On Wed, Apr 10, 2024 at 9:09 AM Khurana, Parv <p.khurana22 at imperial.ac.uk>
wrote:
> Hi Mark,
>
>
>
> I have had a look at these, and only two options looked relevant:
>
>
>
> -pc_hypre_boomeramg_print_statistics: Print statistics (None)
>
> -pc_hypre_boomeramg_print_debug: Print debug information (None)
>
>
>
> -pc_hypre_boomeramg_print_statistics prints information about the V-cycle
> (number of levels, DoFs per level, etc.). It also prints the per-level
> setup timings below, which are something I can potentially work with:
>
>
>
> Proc = 0 Level = 0 Coarsen Time = 0.022089
> Proc = 0 Level = 0 Build Interp Time = 0.138844
> Proc = 0 Level = 0 Build Coarse Operator Time = 0.354488
> Proc = 0 Level = 1 Coarsen Time = 0.009334
> Proc = 0 Level = 1 Build Interp Time = 0.074119
> Proc = 0 Level = 1 Build Coarse Operator Time = 0.097702
> Proc = 0 Level = 2 Coarsen Time = 0.004301
> Proc = 0 Level = 2 Build Interp Time = 0.035835
> Proc = 0 Level = 2 Build Coarse Operator Time = 0.030501
> Proc = 0 Level = 3 Coarsen Time = 0.001876
> Proc = 0 Level = 3 Build Interp Time = 0.014711
> Proc = 0 Level = 3 Build Coarse Operator Time = 0.008681
> Proc = 0 Level = 4 Coarsen Time = 0.000557
> Proc = 0 Level = 4 Build Interp Time = 0.004307
> Proc = 0 Level = 4 Build Coarse Operator Time = 0.002373
> Proc = 0 Level = 5 Coarsen Time = 0.000268
> Proc = 0 Level = 5 Build Interp Time = 0.001061
> Proc = 0 Level = 5 Build Coarse Operator Time = 0.000589
> Proc = 0 Level = 6 Coarsen Time = 0.000149
> Proc = 0 Level = 6 Build Interp Time = 0.000339
> Proc = 0 Level = 6 Build Coarse Operator Time = 0.000206
> Proc = 0 Level = 7 Coarsen Time = 0.000090
> Proc = 0 Level = 7 Build Interp Time = 0.000148
> Proc = 0 Level = 7 Build Coarse Operator Time = 0.000085
> Proc = 0 Level = 8 Coarsen Time = 0.000054
> Proc = 0 Level = 8 Build Interp Time = 0.000100
> Proc = 0 Level = 8 Build Coarse Operator Time = 0.000053
>
> I have not tried -pc_hypre_boomeramg_print_debug yet.
>
> I think I can get the total coarsening time by summing the per-level times
> above. I am still trying to understand how to get the time spent solving
> the problem at the coarsest level.
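>
> For instance, a quick shell one-liner to sum them (assuming the statistics
> above were saved to a file; stats.log is just an illustrative name):
>
>   grep "Coarsen Time" stats.log | awk '{s += $NF} END {print "Total Coarsen Time =", s}'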
>
>
>
> Best
>
> Parv
>
>
>
> *From:* Mark Adams <mfadams at lbl.gov>
> *Sent:* Wednesday, April 10, 2024 2:01 PM
> *To:* Khurana, Parv <p.khurana22 at imperial.ac.uk>
> *Cc:* petsc-users at mcs.anl.gov
> *Subject:* Re: [petsc-users] Exposing further detail in -log_view for
> Hypre with PETSc
>
> I believe there is an option to get hypre to print its performance data.
>
> Run with -help and grep on "pc_hypre" and look for something that looks
> like a logging or view parameter.
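>
> For example, something along these lines, where the executable name is a
> placeholder for your application:
>
>   ./your_app -pc_type hypre -pc_hypre_type boomeramg -help | grep pc_hypre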
>
>
>
> Mark
>
>
>
> On Wed, Apr 10, 2024 at 7:49 AM Khurana, Parv <p.khurana22 at imperial.ac.uk>
> wrote:
>
> Hello PETSc users community,
>
>
>
> Thank you in advance for your help as always.
>
>
>
> I am using BoomerAMG from Hypre via PETSc as part of the preconditioner in
> my software (Nektar++). I am trying to understand the time profiling
> information that is printed using the -log_view option.
>
>
>
> I want to understand how much time is spent in the smoothing steps versus
> the time spent solving on the coarsest grid I reach. The output I get from
> -log_view (pasted below) gives me information on KSPSolve and MatMult, but
> I think I need more granular timing information to see a further breakdown
> of the time spent within these routines. I would appreciate any
> recommendations on obtaining this information.
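>
> (So far I have only pulled out whole-event times by hand, e.g. with
> something like the line below, where log_view.txt is simply the file I
> saved the output to:
>
>   grep -E "^(KSPSolve|PCSetUp|PCApply)" log_view.txt | awk '{print $1, $4}'
>
> which prints each event name with its max time, but gives no breakdown
> inside PCApply.)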
>
>
>
> Best
>
> Parv Khurana
>
>
>
>
>
> PETSc database options used for solve:
>
>
>
> -ksp_monitor # (source: file)
> -ksp_type preonly # (source: file)
> -log_view # (source: file)
> -pc_hypre_boomeramg_coarsen_type hmis # (source: file)
> -pc_hypre_boomeramg_grid_sweeps_all 2 # (source: file)
> -pc_hypre_boomeramg_interp_type ext+i # (source: file)
> -pc_hypre_boomeramg_max_iter 1 # (source: file)
> -pc_hypre_boomeramg_P_max 2 # (source: file)
> -pc_hypre_boomeramg_print_debug 1 # (source: file)
> -pc_hypre_boomeramg_print_statistics 1 # (source: file)
> -pc_hypre_boomeramg_relax_type_all sor/jacobi # (source: file)
> -pc_hypre_boomeramg_strong_threshold 0.7 # (source: file)
> -pc_hypre_boomeramg_truncfactor 0.3 # (source: file)
> -pc_hypre_type boomeramg # (source: file)
> -pc_type hypre # (source: file)
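>
> (These come from an options file; a run line like the one below would load
> them, where the executable name, process count, and file name are all
> placeholders:
>
>   mpiexec -n 4 ./solver -options_file petsc_opts.txt)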
>
> PETSc log_view output:
>
>
>
>
> ------------------------------------------------------------------------------------------------------------------------
> Event                Count      Time (sec)     Flop                              --- Global ---  --- Stage ----  Total
>                         Max Ratio  Max     Ratio   Max  Ratio  Mess   AvgLen  Reduct  %T %F %M %L %R  %T %F %M %L %R Mflop/s
> ------------------------------------------------------------------------------------------------------------------------
>
> --- Event Stage 0: Main Stage
>
> BuildTwoSided          1 1.0 9.6900e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
> MatMult               14 1.0 1.6315e-01 1.0 1.65e+08 1.0 0.0e+00 0.0e+00 0.0e+00  0 86  0  0  0   0 86  0  0  0  1011
> MatConvert             1 1.0 4.3092e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
> MatAssemblyBegin       3 1.0 3.1680e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
> MatAssemblyEnd         3 1.0 9.4178e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
> MatGetRowIJ            2 1.0 1.1630e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
> MatSetPreallCOO        1 1.0 3.2132e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
> MatSetValuesCOO        1 1.0 2.9956e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
> VecNorm               28 1.0 1.3981e-02 1.0 2.10e+07 1.0 0.0e+00 0.0e+00 0.0e+00  0 11  0  0  0   0 11  0  0  0  1499
> VecSet                13 1.0 6.5185e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
> VecAYPX               14 1.0 7.1511e-03 1.0 5.24e+06 1.0 0.0e+00 0.0e+00 0.0e+00  0  3  0  0  0   0  3  0  0  0   733
> VecAssemblyBegin      14 1.0 1.3998e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
> VecAssemblyEnd        14 1.0 4.2560e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
> VecScatterBegin       14 1.0 8.2761e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
> VecScatterEnd         14 1.0 4.4665e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
> SFSetGraph             1 1.0 6.5993e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
> SFSetUp                1 1.0 7.9212e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
> SFPack                14 1.0 5.8690e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
> SFUnpack              14 1.0 4.3370e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
> KSPSetUp               1 1.0 2.4910e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
> KSPSolve              14 1.0 2.1922e+00 1.0 1.91e+08 1.0 0.0e+00 0.0e+00 0.0e+00  0 100 0  0  0   0 100 0  0  0    87
> PCSetUp                1 1.0 1.3165e+00 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
> PCApply               14 1.0 1.9990e+00 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
> ------------------------------------------------------------------------------------------------------------------------
>