[petsc-users] Weird behavior for log_summary

Michele Rosso mrosso at uci.edu
Mon Jun 8 12:16:59 CDT 2015


Hi Barry,

I ran a small test case as you suggested: it completed with no error,
but the problem with log_summary still persists.
Please find attached the output of log_summary.
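
For reference, here is the logging pattern I am trying to follow, per the
'Profiling' chapter of the manual (a rough sketch, not my actual code; the
stage name and the solve placeholder are mine; as I understand it, the
summary itself is printed from within PetscFinalize()):

#include <petscsys.h>

int main(int argc, char **argv)
{
  PetscLogStage stage;

  PetscInitialize(&argc, &argv, NULL, NULL);

  PetscLogStageRegister("Solve", &stage);  /* stage name shown in the summary */
  PetscLogStagePush(stage);
  /* ... the actual KSPSolve() etc. would go here ... */
  PetscLogStagePop();

  PetscFinalize();  /* -log_summary output is produced here */
  return 0;
}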

Thanks,
Michele

On Fri, 2015-06-05 at 21:34 -0500, Barry Smith wrote:

> [NID 04001] 2015-06-04 19:07:24 Apid 25022256: initiated application termination
> Application 25022256 exit signals: Killed
> Application 25022256 resources: utime ~271s, stime ~15107s, Rss ~188536, inblocks ~5078831, outblocks ~12517984
> 
>    Usually this kind of message indicates that either the OS or the batch system killed the process, often because it ran out of time or memory.
> 
>    Can you run in batch with a request for more time? Do smaller jobs run through ok?
> 
>    If utime means user time and stime means system time, then this is very bad: the system time (~15107s) is roughly 56 times the user time (~271s).
> 
>   Barry
> 
> 
> 
> 
> > On Jun 5, 2015, at 9:22 PM, Michele Rosso <mrosso at uci.edu> wrote:
> > 
> > Hi,
> > 
> > I am checking the performance of my code via -log_summary, but the output is incomplete (please see the attached file).
> > I configured petsc with the following options:
> > 
> > if __name__ == '__main__':
> >   import sys
> >   import os
> >   sys.path.insert(0, os.path.abspath('config'))
> >   import configure
> >   configure_options = [
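> >     # NOTE: several option strings below end with a stray trailing space;
> >     # configure keeps it in the value (e.g. --with-batch="1 " as echoed in
> >     # the "Configure options" line of the attached log).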
> >     '--with-batch=1 ',
> >     '--known-mpi-shared=0 ',
> >     '--known-mpi-shared-libraries=0',
> >     '--known-memcmp-ok ',
> >     '--with-blas-lapack-lib=-L/opt/acml/5.3.1/gfortran64/lib  -lacml',
> >     '--COPTFLAGS=-march=bdver1 -fopenmp -O3 -ffast-math -fPIC ',
> >     '--FOPTFLAGS=-march=bdver1 -fopenmp -O3 -ffast-math -fPIC ',
> >     '--CXXOPTFLAGS=-march=bdver1 -fopenmp -O3 -ffast-math -fPIC ',
> >     '--with-x=0 ',
> >     '--with-debugging=0',
> >     '--with-clib-autodetect=0 ',
> >     '--with-cxxlib-autodetect=0 ',
> >     '--with-fortranlib-autodetect=0 ',
> >     '--with-shared-libraries=0 ',
> >     '--with-mpi-compilers=1 ',
> >     '--with-cc=cc ',
> >     '--with-cxx=CC ',
> >     '--with-fc=ftn ',
> > #    '--with-64-bit-indices',
> >     '--download-hypre=1',
> >     '--download-blacs=1 ',
> >     '--download-scalapack=1 ',
> >     '--download-superlu_dist=1 ',
> >     '--download-metis=1 ',
> >     '--download-parmetis=1 ',
> >    ]
> >   configure.petsc_configure(configure_options)
> > 
> > Any idea about this issue?
> > Thanks,
> > 
> > Michele
> > 
> > 
> > 
> > 
> > <log_summary>
> 


-------------- next part --------------
 

************************************************************************************************************************
***             WIDEN YOUR WINDOW TO 120 CHARACTERS.  Use 'enscript -r -fCourier9' to print this document            ***
************************************************************************************************************************

---------------------------------------------- PETSc Performance Summary: ----------------------------------------------

./test3.exe on a gnu-opt-32idx named . with 16 processors, by mrosso Mon Jun  8 11:54:16 2015
Using Petsc Release Version 3.5.4, May, 23, 2015 

                         Max       Max/Min        Avg      Total 
Time (sec):           4.387e-01      1.00635   4.373e-01
Objects:              5.410e+02      1.00000   5.410e+02
Flops:                2.648e+07      1.00131   2.647e+07  4.234e+08
Flops/sec:            6.072e+07      1.00727   6.052e+07  9.683e+08
MPI Messages:         1.668e+03      1.06992   1.614e+03  2.582e+04
MPI Message Lengths:  2.914e+06      1.05005   1.763e+03  4.552e+07
MPI Reductions:       1.106e+03      1.00000

Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract)
                            e.g., VecAXPY() for real vectors of length N --> 2N flops
                            and VecAXPY() for complex vectors of length N --> 8N flops

Summary of Stages:   ----- Time ------  ----- Flops -----  --- Messages ---  -- Message Lengths --  -- Reductions --
                        Avg     %Total     Avg     %Total   counts   %Total     Avg         %Total   counts   %Total 
 0:      Main Stage: 4.3732e-01 100.0%  4.2344e+08 100.0%  2.582e+04 100.0%  1.763e+03      100.0%  1.105e+03  99.9% 

------------------------------------------------------------------------------------------------------------------------
See the 'Profiling' chapter of the users' manual for details on interpreting output.
Phase summary info:
   Count: number of times phase was executed
   Time and Flops: Max - maximum over all processors
                   Ratio - ratio of maximum to minimum over all processors
   Mess: number of messages sent
   Avg. len: average message length (bytes)
   Reduct: number of global reductions
   Global: entire computation
   Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop().
      %T - percent time in this phase         %F - percent flops in this phase
      %M - percent messages in this phase     %L - percent message lengths in this phase
      %R - percent reductions in this phase
   Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors)
------------------------------------------------------------------------------------------------------------------------
Event                Count      Time (sec)     Flops                             --- Global ---  --- Stage ---   Total
                   Max Ratio  Max     Ratio   Max  Ratio  Mess   Avg len Reduct  %T %F %M %L %R  %T %F %M %L %R Mflop/s
------------------------------------------------------------------------------------------------------------------------

--- Event Stage 0: Main Stage

------------------------------------------------------------------------------------------------------------------------

Memory usage is given in bytes:

Object Type          Creations   Destructions     Memory  Descendants' Mem.
Reports information only for process 0.

--- Event Stage 0: Main Stage

              Vector   240            134      1196608     0
      Vector Scatter    30              0            0     0
              Matrix    88              0            0     0
   Matrix Null Space     4              0            0     0
    Distributed Mesh    10              0            0     0
Star Forest Bipartite Graph    20              0            0     0
     Discrete System    10              0            0     0
           Index Set    76             68       140480     0
   IS L to G Mapping    10              0            0     0
       Krylov Solver    20              4         4640     0
     DMKSP interface     8              0            0     0
      Preconditioner    20              4         4000     0
              Viewer     5              4         2976     0
========================================================================================================================
Average time to get PetscTime(): 9.53674e-08
Average time for MPI_Barrier(): 6.62804e-06
Average time for zero size MPI_Send(): 2.25008e-06
#PETSc Option Table entries:
-ksp_initial_guess_nonzero yes
-ksp_norm_type unpreconditioned
-ksp_rtol 1e-9
-ksp_type cg
-log_summary
-mg_coarse_ksp_type preonly
-mg_coarse_pc_factor_mat_solver_package superlu_dist
-mg_coarse_pc_type lu
-mg_levels_ksp_max_it 1
-mg_levels_ksp_type richardson
-options_left
-pc_mg_galerkin
-pc_mg_levels 3
-pc_type mg
#End of PETSc Option Table entries
Compiled without FORTRAN kernels
Compiled with full precision matrices (default)
sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4
Configure options: --known-level1-dcache-size=16384 --known-level1-dcache-linesize=64 --known-level1-dcache-assoc=4 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=8 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=8 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=8 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --known-mpi-int64_t=1 --known-mpi-c-double-complex=1 --known-sdot-returns-double=0 --known-snrm2-returns-double=0 --with-batch="1 " --known-mpi-shared="0 " --known-mpi-shared-libraries=0 --known-memcmp-ok  --with-blas-lapack-lib="-L/opt/acml/5.3.1/gfortran64/lib  -lacml" --COPTFLAGS="-march=bdver1 -fopenmp -O3 -ffast-math -fPIC " --FOPTFLAGS="-march=bdver1 -fopenmp -O3 -ffast-math -fPIC " --CXXOPTFLAGS="-march=bdver1 -fopenmp -O3 -ffast-math -fPIC " --with-x="0 " --with-debugging=0 --with-clib-autodetect="0 " --with-cxxlib-autodetect="0 " --with-fortranlib-autodetect="0 " --with-shared-libraries="0 " --with-mpi-compilers="1 " --with-cc="cc " --with-cxx="CC " --with-fc="ftn " --download-hypre=1 --download-blacs="1 " --download-scalapack="1 " --download-superlu_dist="1 " --download-metis="1 " --download-parmetis="1 " PETSC_ARCH=gnu-opt-32idx
-----------------------------------------
Libraries compiled on Wed Jun  3 14:35:20 2015 on h2ologin3 
Machine characteristics: Linux-3.0.101-0.46-default-x86_64-with-SuSE-11-x86_64
Using PETSc directory: /mnt/a/u/sciteam/mrosso/LIBS/petsc-3.5.4
Using PETSc arch: gnu-opt-32idx
-----------------------------------------

Using C compiler: cc  -march=bdver1 -fopenmp -O3 -ffast-math -fPIC  ${COPTFLAGS} ${CFLAGS}
Using Fortran compiler: ftn  -march=bdver1 -fopenmp -O3 -ffast-math -fPIC   ${FOPTFLAGS} ${FFLAGS} 
-----------------------------------------

Using include paths: -I/mnt/a/u/sciteam/mrosso/LIBS/petsc-3.5.4/gnu-opt-32idx/include -I/mnt/a/u/sciteam/mrosso/LIBS/petsc-3.5.4/include -I/mnt/a/u/sciteam/mrosso/LIBS/petsc-3.5.4/include -I/mnt/a/u/sciteam/mrosso/LIBS/petsc-3.5.4/gnu-opt-32idx/include
-----------------------------------------

Using C linker: cc
Using Fortran linker: ftn
Using libraries: -Wl,-rpath,/mnt/a/u/sciteam/mrosso/LIBS/petsc-3.5.4/gnu-opt-32idx/lib -L/mnt/a/u/sciteam/mrosso/LIBS/petsc-3.5.4/gnu-opt-32idx/lib -lpetsc -Wl,-rpath,/mnt/a/u/sciteam/mrosso/LIBS/petsc-3.5.4/gnu-opt-32idx/lib -L/mnt/a/u/sciteam/mrosso/LIBS/petsc-3.5.4/gnu-opt-32idx/lib -lsuperlu_dist_3.3 -lHYPRE -L/opt/acml/5.3.1/gfortran64/lib -lacml -lparmetis -lmetis -lpthread -lssl -lcrypto -ldl  
-----------------------------------------

#PETSc Option Table entries:
-ksp_initial_guess_nonzero yes
-ksp_norm_type unpreconditioned
-ksp_rtol 1e-9
-ksp_type cg
-log_summary
-mg_coarse_ksp_type preonly
-mg_coarse_pc_factor_mat_solver_package superlu_dist
-mg_coarse_pc_type lu
-mg_levels_ksp_max_it 1
-mg_levels_ksp_type richardson
-options_left
-pc_mg_galerkin
-pc_mg_levels 3
-pc_type mg
#End of PETSc Option Table entries
There are no unused options.
[NID 15170] 2015-06-08 11:54:16 Apid 25082027: initiated application termination
Application 25082027 resources: utime ~1s, stime ~9s, Rss ~27320, inblocks ~19758, outblocks ~49423

