[petsc-users] Understanding PETSc -log_summary
Ajit Desai
aero.aju at gmail.com
Thu Mar 16 10:11:32 CDT 2017
Thanks, Jed.
That is helpful.
*Ajit Desai*
PhD Scholar, Carleton University, Canada
On Thu, Mar 16, 2017 at 8:00 AM, Jed Brown <jed at jedbrown.org> wrote:
> Ajit Desai <aero.aju at gmail.com> writes:
>
> > Hello Everyone,
> > A couple of questions on the *-log_summary* provided by PETSc.
>
> -log_view is the preferred name.
>
> > 1. *Avg-Flops & Avg-Flops/sec* are averaged among the participating cores
> > or averaged over the simulation time or both?
>
> The flop count on each process is just the total logged (by
> PetscLogFlops, which PETSc numerical functions call and which you can
> call too) between PetscInitialize and the point where the log is
> printed (usually PetscFinalize). The time on each process is the
> wall-clock time over that same interval. The Avg, Max, Min, and Total
> columns are then taken over processes.
>
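To make the column definitions concrete, the summary statistics can be reproduced from per-process flop counts like so (a minimal Python sketch using made-up counts for a hypothetical 4-process run, not numbers from either simulation):

```python
# Hypothetical flop counts logged on each of 4 MPI processes
# between PetscInitialize and PetscFinalize.
flops_per_process = [3.2e9, 2.9e9, 3.1e9, 2.5e9]

total = sum(flops_per_process)                           # "Total" column
avg = total / len(flops_per_process)                     # "Avg" column
ratio = max(flops_per_process) / min(flops_per_process)  # "Max/Min" column

print(f"Max {max(flops_per_process):.3e}  Max/Min {ratio:.5f}  "
      f"Avg {avg:.3e}  Total {total:.3e}")
```

A Max/Min ratio close to 1.0 means the flop work is spread evenly across processes; here it is 1.28.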
> > 2. *Max/Min Flops & Flops/sec* is an indication of load balancing?
> > In simulation-2 these ratios are high compared to simulation-1. Does that
> > mean simulation-2 is not well balanced?
>
> Yes, simulation-2 is not well balanced in terms of flops (flops
> correlate with time, but are not a measure of time). Look at the
> per-event entries in the log to see where the imbalance arises.
>
> > Please follow the outputs from two different simulations
> > (Note: the problem size and the number of processors used are different).
> >
> > *Simulation-1*
> >                        Max       Max/Min       Avg         Total
> > Time (sec):        4.208e+02    1.00005    4.208e+02
> > Objects:           7.100e+01    1.00000    7.100e+01
> > Flops:             3.326e+11    1.31175    3.017e+11    4.826e+13
> > Flops/sec:         7.904e+08    1.31175    7.169e+08    1.147e+11
> > MPI Messages:      0.000e+00    0.00000    0.000e+00    0.000e+00
> >
> >
> > *Simulation-2:*
> >                        Max       Max/Min       Avg         Total
> > Time (sec):        8.434e+02    1.00000    8.434e+02
> > Objects:           7.300e+01    1.02817    7.102e+01
> > Flops:             6.555e+11    1.85115    5.798e+11    3.711e+14
> > Flops/sec:         7.772e+08    1.85115    6.874e+08    4.400e+11
> > MPI Messages:      0.000e+00    0.00000    0.000e+00    0.000e+00
> >
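As a rough comparison of the two runs, a flop-balance figure (Avg/Max flops) and the implied process count (Total/Avg) can be computed directly from the summaries above. This is an illustrative sketch only; the balance figure reflects flops, not time:

```python
# Max/Avg/Total flop numbers copied from the two log summaries above.
runs = {
    "simulation-1": {"max": 3.326e11, "avg": 3.017e11, "total": 4.826e13},
    "simulation-2": {"max": 6.555e11, "avg": 5.798e11, "total": 3.711e14},
}

for name, r in runs.items():
    nproc = round(r["total"] / r["avg"])  # implied number of processes
    balance = r["avg"] / r["max"]         # 1.0 would be perfectly balanced
    print(f"{name}: ~{nproc} processes, flop balance {balance:.3f}")
```

By this measure simulation-1 (about 0.91) is somewhat better balanced than simulation-2 (about 0.88), consistent with the larger Max/Min flop ratio in the second run.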
> > We're trying to understand the performance of our solver using
> > these outputs; any comments along those lines would be helpful.
> >
> > Thanks & Regards,
> > *Ajit Desai*
> > PhD Scholar, Carleton University, Canada
>