[petsc-users] Sparse matrix vector multiplication performance

Jed Brown jed at 59A2.org
Sat May 1 08:59:43 CDT 2010


On Sat, 1 May 2010 10:34:50 -0300, Bernardo Rocha <bernardosk at gmail.com> wrote:
> Hi everyone,
> 
> I'm wondering if it is possible to measure the CPU time of the sparse matrix
> vector multiplications within an iterative solver such as the CG in PETSc?
> 
> I've been reading the manual, but it seems that the options -log_summary and
> similar ones only track the convergence behaviour.

Actually, -log_summary is exactly what you are looking for; look at the
MatMult event in its output.
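For example (the executable name and process count here are placeholders,
not from the original question), you would run something like

  mpiexec -n 8 ./myapp -ksp_type cg -log_summary

and the timing table is printed when the program finalizes.  You should
see something like the following: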

------------------------------------------------------------------------------------------------------------------------
See the 'Profiling' chapter of the users' manual for details on interpreting output.
Phase summary info:
   Count: number of times phase was executed
   Time and Flops: Max - maximum over all processors
                   Ratio - ratio of maximum to minimum over all processors
   Mess: number of messages sent
   Avg. len: average message length
   Reduct: number of global reductions
   Global: entire computation
   Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop().
      %T - percent time in this phase         %F - percent flops in this phase
      %M - percent messages in this phase     %L - percent message lengths in this phase
      %R - percent reductions in this phase
   Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors)
------------------------------------------------------------------------------------------------------------------------
Event                Count      Time (sec)     Flops                             --- Global ---  --- Stage ---   Total
                   Max Ratio  Max     Ratio   Max  Ratio  Mess   Avg len Reduct  %T %F %M %L %R  %T %F %M %L %R Mflop/s
------------------------------------------------------------------------------------------------------------------------
[...]
MatMult             1268 1.0 3.5579e+01 1.1 5.18e+09 1.1 5.1e+04 8.9e+03 0.0e+00 12 32 13 18  0  12 32 13 18  0  1124
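Reading the MatMult row: the event ran 1268 times, the slowest process
spent 3.5579e+01 = 35.6 seconds in it (the max-to-min time ratio over
processes was 1.1), that accounted for 12% of the total run time and 32%
of the total flops, and the aggregate rate over all processes was 1124
Mflop/s.

If you want MatMult timed for the solve alone, separate from assembly and
setup, the legend above mentions PetscLogStagePush()/PetscLogStagePop().
Here is a minimal sketch, assuming a reasonably recent PETSc; the 1-D
Laplacian problem and the stage name "Solve" are just illustrative, and
error checking is omitted for brevity:

#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat           A;
  Vec           x, b;
  KSP           ksp;
  PetscLogStage stage;
  PetscInt      i, n = 100, Istart, Iend;

  PetscInitialize(&argc, &argv, NULL, NULL);

  /* Assemble a 1-D Laplacian so the sketch is self-contained */
  MatCreate(PETSC_COMM_WORLD, &A);
  MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);
  MatSetFromOptions(A);
  MatSetUp(A);
  MatGetOwnershipRange(A, &Istart, &Iend);
  for (i = Istart; i < Iend; i++) {
    if (i > 0)     MatSetValue(A, i, i - 1, -1.0, INSERT_VALUES);
    MatSetValue(A, i, i, 2.0, INSERT_VALUES);
    if (i < n - 1) MatSetValue(A, i, i + 1, -1.0, INSERT_VALUES);
  }
  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

  MatCreateVecs(A, &x, &b);
  VecSet(b, 1.0);

  KSPCreate(PETSC_COMM_WORLD, &ksp);
  KSPSetOperators(ksp, A, A);
  KSPSetType(ksp, KSPCG);
  KSPSetFromOptions(ksp);

  /* Everything between Push and Pop is attributed to the "Solve" stage,
     so -log_summary (spelled -log_view in newer PETSc) reports MatMult
     for the solve separately from the assembly above */
  PetscLogStageRegister("Solve", &stage);
  PetscLogStagePush(stage);
  KSPSolve(ksp, b, x);
  PetscLogStagePop();

  KSPDestroy(&ksp);
  VecDestroy(&x);
  VecDestroy(&b);
  MatDestroy(&A);
  PetscFinalize();
  return 0;
}

The summary then prints one table per stage, so the MatMult line in the
"Solve" stage counts only the multiplications done inside the CG iteration.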


Jed

