Linear solve converged due to CONVERGED_RTOL iterations 8
 1 step time:      2.4196798801422119
 norm1 error:      4.5520041378041393E-007
 norm inf error:   5.6458624529954005E-003

Summary of Memory Usage in PETSc
Maximum (over computational time) process memory:  total 7.7825e+08  max 1.0471e+05  min 9.0404e+04
Current process memory:                            total 7.7825e+08  max 1.0471e+05  min 9.0404e+04

************************************************************************************************************************
***             WIDEN YOUR WINDOW TO 120 CHARACTERS.  Use 'enscript -r -fCourier9' to print this document            ***
************************************************************************************************************************

---------------------------------------------- PETSc Performance Summary: ----------------------------------------------

./test_ksp.exe on a gnu-opt named ¯ÿÿÿ with 8192 processors, by wang11 Tue Oct  4 04:48:34 2016
Using Petsc Development GIT revision: v3.6.3-2059-geab7831  GIT Date: 2016-01-20 10:58:35 -0600

                         Max       Max/Min        Avg      Total
Time (sec):           3.063e+00      1.00245   3.060e+00
Objects:              4.330e+02      1.78189   2.445e+02
Flops:                1.834e+08      1.19492   1.537e+08  1.259e+12
Flops/sec:            5.990e+07      1.19588   5.022e+07  4.114e+11
MPI Messages:         7.318e+03      2.55490   2.921e+03  2.393e+07
MPI Message Lengths:  7.667e+06      1.24890   2.106e+03  5.041e+10
MPI Reductions:       5.350e+02      1.65635

Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract)
                            e.g., VecAXPY() for real vectors of length N --> 2N flops
                            and VecAXPY() for complex vectors of length N --> 8N flops

Summary of Stages:   ----- Time ------  ----- Flops -----  --- Messages ---  -- Message Lengths --  -- Reductions --
                        Avg     %Total     Avg     %Total   counts   %Total     Avg         %Total   counts   %Total
 0:      Main Stage: 3.0599e+00 100.0%  1.2589e+12 100.0%  2.393e+07 100.0%  2.106e+03      100.0%  3.237e+02  60.5%

------------------------------------------------------------------------------------------------------------------------
See the 'Profiling' chapter of the users' manual for details on interpreting output.
Phase summary info:
   Count: number of times phase was executed
   Time and Flops: Max - maximum over all processors
                   Ratio - ratio of maximum to minimum over all processors
   Mess: number of messages sent
   Avg. len: average message length (bytes)
   Reduct: number of global reductions
   Global: entire computation
   Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop().
      %T - percent time in this phase         %F - percent flops in this phase
      %M - percent messages in this phase     %L - percent message lengths in this phase
      %R - percent reductions in this phase
   Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors)
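(Aside, not part of the log: the legend above names PetscLogStagePush()/PetscLogStagePop() as the way to add user-defined stages to this kind of report. Below is a minimal, self-contained sketch of that pattern; it is not taken from test_ksp.exe, and the stage name, vector size, and operations are made up for illustration.)

#include <petscvec.h>

int main(int argc, char **argv)
{
  Vec           x, y;
  PetscLogStage stage;

  PetscInitialize(&argc, &argv, NULL, NULL);

  VecCreateMPI(PETSC_COMM_WORLD, PETSC_DECIDE, 1000000, &x);  /* illustrative global size */
  VecDuplicate(x, &y);
  VecSet(x, 1.0);
  VecSet(y, 2.0);

  PetscLogStageRegister("My Solve Stage", &stage);  /* hypothetical stage name */
  PetscLogStagePush(stage);
  VecAXPY(y, 3.0, x);       /* events here are logged under "My Solve Stage" */
  PetscLogStagePop();

  VecDestroy(&x);
  VecDestroy(&y);
  PetscFinalize();
  return 0;
}

(Run with -log_view, the events between the push and the pop are reported under an extra "My Solve Stage" section in the stage summary and event table, instead of being folded into the Main Stage as in the output that follows.)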
------------------------------------------------------------------------------------------------------------------------
Event                Count      Time (sec)     Flops                              --- Global ---   --- Stage ---   Total
                   Max Ratio  Max     Ratio   Max  Ratio  Mess   Avg len Reduct   %T %F %M %L %R   %T %F %M %L %R Mflop/s
------------------------------------------------------------------------------------------------------------------------

--- Event Stage 0: Main Stage

BuildTwoSidedF         1 1.0 4.4755e-02  2.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00   1  0  0  0  0    1  0  0  0  0       0
VecTDot               16 1.0 3.8015e-02  1.7 4.19e+06 1.0 0.0e+00 0.0e+00 1.6e+01   1  3  0  0  3    1  3  0  0  5  903835
VecNorm                9 1.0 1.3690e-01 16.8 2.36e+06 1.0 0.0e+00 0.0e+00 9.0e+00   4  2  0  0  2    4  2  0  0  3  141177
VecScale              48 2.0 3.8075e-04  3.5 6.67e+04 1.4 0.0e+00 0.0e+00 0.0e+00   0  0  0  0  0    0  0  0  0  0 1029039
VecCopy               10 1.0 2.1766e-02  1.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00   1  0  0  0  0    1  0  0  0  0       0
VecSet               274 1.8 7.7467e-03  1.5 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00   0  0  0  0  0    0  0  0  0  0       0
VecAXPY               32 1.0 4.2971e-02  1.3 8.39e+06 1.0 0.0e+00 0.0e+00 0.0e+00   1  5  0  0  0    1  5  0  0  0 1599219
VecAYPX               71 1.5 3.0426e-02  1.9 4.12e+06 1.0 0.0e+00 0.0e+00 0.0e+00   1  3  0  0  0    1  3  0  0  0 1099039
VecAssemblyBegin       1 1.0 4.4774e-02  2.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00   1  0  0  0  0    1  0  0  0  0       0
VecAssemblyEnd         1 1.0 2.0027e-05 10.5 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00   0  0  0  0  0    0  0  0  0  0       0
VecScatterBegin      287 1.7 3.1764e-02  1.9 0.00e+00 0.0 1.7e+07 2.1e+03 0.0e+00   1  0 69 70  0    1  0 69 70  0       0
VecScatterEnd        287 1.7 1.9770e-01  4.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00   5  0  0  0  0    5  0  0  0  0       0
MatMult               80 1.4 3.0042e-01  1.2 5.08e+07 1.0 6.9e+06 4.2e+03 0.0e+00   9 32 29 57  0    9 32 29 57  0 1331632
MatMultAdd            56 1.8 5.2409e-02  1.3 8.34e+06 1.0 1.8e+06 8.2e+02 0.0e+00   1  5  8  3  0    1  5  8  3  0 1264393
MatMultTranspose      71 1.6 7.7323e-02  1.7 9.38e+06 1.0 2.6e+06 6.6e+02 0.0e+00   2  6 11  3  0    2  6 11  3  0  964174
MatSolve               8 0.0 2.6293e-03  0.0 1.92e+06 0.0 0.0e+00 0.0e+00 0.0e+00   0  0  0  0  0    0  0  0  0  0   46717
MatSOR               112 1.8 4.2231e-01  1.1 5.28e+07 1.1 5.2e+06 6.1e+02 1.2e-01  13 31 22  6  0   13 31 22  6  0  933649
MatLUFactorSym         1 0.0 5.3170e-03  0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00   0  0  0  0  0    0  0  0  0  0       0
MatLUFactorNum         1 0.0 2.2190e-02  0.0 1.95e+07 0.0 0.0e+00 0.0e+00 0.0e+00   0  0  0  0  0    0  0  0  0  0   56284
MatConvert             1 0.0 2.0695e-04  0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00   0  0  0  0  0    0  0  0  0  0       0
MatResidual           56 1.8 1.4632e-01  1.3 2.48e+07 1.1 5.5e+06 2.1e+03 0.0e+00   4 15 23 23  0    4 15 23 23  0 1274804
MatAssemblyBegin      37 1.6 9.2539e-02  1.5 0.00e+00 0.0 0.0e+00 0.0e+00 2.4e+01   3  0  0  0  5    3  0  0  0  7       0
MatAssemblyEnd        37 1.6 3.5879e-01  1.0 0.00e+00 0.0 2.8e+06 3.0e+02 8.8e+01  11  0 12  2 17   11  0 12  2 27       0
MatGetRowIJ            1 0.0 1.5593e-04  0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00   0  0  0  0  0    0  0  0  0  0       0
MatGetSubMatrice       2 2.0 1.5009e-01  4.7 0.00e+00 0.0 4.1e+04 2.1e+03 3.0e+00   2  0  0  0  1    2  0  0  0  1       0
MatGetOrdering         1 0.0 6.5589e-04  0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00   0  0  0  0  0    0  0  0  0  0       0
MatPtAP                8 1.6 7.4498e-01  1.0 2.16e+07 1.1 6.0e+06 2.3e+03 8.5e+01  23 13 25 27 16   23 13 25 27 26  223546
MatPtAPSymbolic        8 1.6 3.9067e-01  1.1 0.00e+00 0.0 3.6e+06 2.8e+03 3.5e+01  12  0 15 20  7   12  0 15 20 11       0
MatPtAPNumeric         8 1.6 3.5955e-01  1.1 2.16e+07 1.1 2.4e+06 1.5e+03 5.0e+01  11 13 10  7  9   11 13 10  7 16  463183
MatRedundantMat        1 0.0 2.0258e-03  0.0 0.00e+00 0.0 0.0e+00 0.0e+00 3.1e-02   0  0  0  0  0    0  0  0  0  0       0
MatMPIConcateSeq       1 0.0 1.1904e-01  0.0 0.00e+00 0.0 3.3e+03 1.4e+02 1.2e-01   0  0  0  0  0    0  0  0  0  0       0
MatGetLocalMat         8 1.6 3.0560e-02  1.5 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00   1  0  0  0  0    1  0  0  0  0       0
MatGetBrAoCol          8 1.6 4.3396e-02  1.3 0.00e+00 0.0 2.7e+06 3.0e+03 0.0e+00   1  0 11 16  0    1  0 11 16  0       0
MatGetSymTrans        16 1.6 1.0850e-02  1.9 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00   0  0  0  0  0    0  0  0  0  0       0
DMCoarsen              7 1.8 9.1541e-02  1.0 0.00e+00 0.0 4.0e+05 4.5e+02 4.4e+01   3  0  2  0  8    3  0  2  0 14       0
DMCreateInterpolation  7 1.8 2.4951e-01  1.0 1.04e+06 1.0 6.9e+05 4.1e+02 6.4e+01   8  1  3  1 12    8  1  3  1 20   33197
KSPSetUp              12 2.0 2.8754e-02  1.1 0.00e+00 0.0 0.0e+00 0.0e+00 1.4e+01   1  0  0  0  3    1  0  0  0  4       0
KSPSolve               1 1.0 2.4205e+00  1.0 1.83e+08 1.2 2.4e+07 2.1e+03 2.9e+02  79 100 99 98 55  79 100 99 98 91  520109
PCSetUp                2 2.0 1.4065e+00  1.2 4.21e+07 2.0 7.3e+06 1.9e+03 2.4e+02  40 14 31 28 45   40 14 31 28 74  125179
PCApply                8 1.0 9.1063e-01  1.0 1.36e+08 1.3 1.6e+07 1.7e+03 1.7e+01  30 69 67 53  3   30 69 67 53  5  954751
------------------------------------------------------------------------------------------------------------------------

Memory usage is given in bytes:

Object Type                   Creations   Destructions      Memory   Descendants' Mem.
Reports information only for process 0.

--- Event Stage 0: Main Stage

                     Vector       183            183      15249968     0.
             Vector Scatter        30             30       1268320     0.
                     Matrix        74             74      42692012     0.
          Matrix Null Space         1              1           592     0.
           Distributed Mesh         9              9         44928     0.
Star Forest Bipartite Graph        18             18         15264     0.
            Discrete System         9              9          7704     0.
                  Index Set        66             66        885836     0.
          IS L to G Mapping         9              9        725648     0.
              Krylov Solver        13             13         16200     0.
             DMKSP interface        7              7          4536     0.
             Preconditioner        13             13         12960     0.
                     Viewer         1              0             0     0.
========================================================================================================================
Average time to get PetscTime(): 2.14577e-07
Average time for MPI_Barrier(): 4.17709e-05
Average time for zero size MPI_Send(): 9.93468e-06
#PETSc Option Table entries:
-ksp_converged_reason
-ksp_initial_guess_nonzero yes
-ksp_norm_type unpreconditioned
-ksp_rtol 1e-7
-ksp_type cg
-log_view
-matptap_scalable
-matrap 0
-memory_view
-mg_coarse_ksp_type preonly
-mg_coarse_pc_telescope_reduction_factor 128
-mg_coarse_pc_type telescope
-mg_coarse_telescope_ksp_type preonly
-mg_coarse_telescope_mg_coarse_ksp_type preonly
-mg_coarse_telescope_mg_coarse_pc_type redundant
-mg_coarse_telescope_mg_levels_ksp_max_it 1
-mg_coarse_telescope_mg_levels_ksp_type richardson
-mg_coarse_telescope_pc_mg_galerkin
-mg_coarse_telescope_pc_mg_levels 4
-mg_coarse_telescope_pc_type mg
-mg_levels_ksp_max_it 1
-mg_levels_ksp_type richardson
-N 1024
-options_left 1
-pc_mg_galerkin
-pc_mg_levels 5
-pc_type mg
-ppe_max_iter 20
-px 32
-py 16
-pz 16
#End of PETSc Option Table entries
Compiled without FORTRAN kernels
Compiled with full precision matrices (default)
sizeof(short) 2  sizeof(int) 4  sizeof(long) 8  sizeof(void*) 8  sizeof(PetscScalar) 8  sizeof(PetscInt) 4
Configure options: --known-level1-dcache-size=16384 --known-level1-dcache-linesize=64 --known-level1-dcache-assoc=4 --known-sizeof-char=1 --known-sizeof-void-p=8 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=8 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=8 --known-bits-per-byte=8 --known-memcmp-ok=1 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --known-mpi-int64_t=1 --known-mpi-c-double-complex=1 --known-sdot-returns-double=0 --known-snrm2-returns-double=0 --known-has-attribute-aligned=1 --with-batch="1 " --known-mpi-shared="0 " --known-mpi-shared-libraries=0 --known-memcmp-ok --with-blas-lapack-lib=/opt/acml/5.3.1/gfortran64/lib/libacml.a --COPTFLAGS="-march=bdver1 -O3 -ffast-math -fPIC " --FOPTFLAGS="-march=bdver1 -O3 -ffast-math -fPIC " --CXXOPTFLAGS="-march=bdver1 -O3 -ffast-math -fPIC " --with-x="0 " --with-debugging="0 " --with-clib-autodetect="0 " --with-cxxlib-autodetect="0 " --with-fortranlib-autodetect="0 " --with-shared-libraries="0 " --with-mpi-compilers="1 " --with-cc="cc " --with-cxx="CC " --with-fc="ftn " --download-hypre="1 " --download-blacs="1 " --download-scalapack="1 " --download-superlu_dist="1 " --download-metis="1 " --download-parmetis="1 " PETSC_ARCH=gnu-opt
-----------------------------------------
Libraries compiled on Tue Feb 16 12:57:46 2016 on h2ologin3
Machine characteristics: Linux-3.0.101-0.46-default-x86_64-with-SuSE-11-x86_64
Using PETSc directory: /mnt/a/u/sciteam/wang11/Sftw/petsc
Using PETSc arch: gnu-opt
-----------------------------------------
Using C compiler: cc -march=bdver1 -O3 -ffast-math -fPIC ${COPTFLAGS} ${CFLAGS}
Using Fortran compiler: ftn -march=bdver1 -O3 -ffast-math -fPIC ${FOPTFLAGS} ${FFLAGS}
-----------------------------------------
Using include paths: -I/mnt/a/u/sciteam/wang11/Sftw/petsc/gnu-opt/include -I/mnt/a/u/sciteam/wang11/Sftw/petsc/include -I/mnt/a/u/sciteam/wang11/Sftw/petsc/include -I/mnt/a/u/sciteam/wang11/Sftw/petsc/gnu-opt/include
-----------------------------------------
Using C linker: cc
Using Fortran linker: ftn
Using libraries: -Wl,-rpath,/mnt/a/u/sciteam/wang11/Sftw/petsc/gnu-opt/lib -L/mnt/a/u/sciteam/wang11/Sftw/petsc/gnu-opt/lib -lpetsc -Wl,-rpath,/mnt/a/u/sciteam/wang11/Sftw/petsc/gnu-opt/lib -L/mnt/a/u/sciteam/wang11/Sftw/petsc/gnu-opt/lib -lsuperlu_dist_4.3 -lHYPRE -lscalapack -Wl,-rpath,/opt/acml/5.3.1/gfortran64/lib -L/opt/acml/5.3.1/gfortran64/lib -lacml -lparmetis -lmetis -lssl -lcrypto -ldl
-----------------------------------------
#PETSc Option Table entries:
-ksp_converged_reason
-ksp_initial_guess_nonzero yes
-ksp_norm_type unpreconditioned
-ksp_rtol 1e-7
-ksp_type cg
-log_view
-matptap_scalable
-matrap 0
-memory_view
-mg_coarse_ksp_type preonly
-mg_coarse_pc_telescope_reduction_factor 128
-mg_coarse_pc_type telescope
-mg_coarse_telescope_ksp_type preonly
-mg_coarse_telescope_mg_coarse_ksp_type preonly
-mg_coarse_telescope_mg_coarse_pc_type redundant
-mg_coarse_telescope_mg_levels_ksp_max_it 1
-mg_coarse_telescope_mg_levels_ksp_type richardson
-mg_coarse_telescope_pc_mg_galerkin
-mg_coarse_telescope_pc_mg_levels 4
-mg_coarse_telescope_pc_type mg
-mg_levels_ksp_max_it 1
-mg_levels_ksp_type richardson
-N 1024
-options_left 1
-pc_mg_galerkin
-pc_mg_levels 5
-pc_type mg
-ppe_max_iter 20
-px 32
-py 16
-pz 16
#End of PETSc Option Table entries
There is one unused database option. It is:
Option left: name:-ppe_max_iter value: 20
Application 48712726 resources: utime ~33377s, stime ~11349s, Rss ~104708, inblocks ~12352393, outblocks ~8092986
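(Aside, not part of the log: the options in the table above, e.g. -ksp_type cg, -ksp_rtol 1e-7, -pc_type mg and the -mg_coarse_* telescope settings, are read at runtime by KSPSetFromOptions(). The sketch below shows only that pattern and is not the author's test_ksp.exe: it assembles a small 1-D Laplacian (the size n is an illustrative parameter) instead of the N=1024 structured problem, and it does not reproduce the DM-based geometric multigrid hierarchy, so -pc_type mg would additionally need a DM or user-supplied interpolation to work on this matrix.)

#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat                A;
  Vec                x, b;
  KSP                ksp;
  PetscInt           i, n = 100, Istart, Iend, its;
  KSPConvergedReason reason;

  PetscInitialize(&argc, &argv, NULL, NULL);

  /* Assemble a 1-D Laplacian as a stand-in operator (illustrative only) */
  MatCreate(PETSC_COMM_WORLD, &A);
  MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);
  MatSetFromOptions(A);
  MatSetUp(A);
  MatGetOwnershipRange(A, &Istart, &Iend);
  for (i = Istart; i < Iend; i++) {
    if (i > 0)     MatSetValue(A, i, i - 1, -1.0, INSERT_VALUES);
    if (i < n - 1) MatSetValue(A, i, i + 1, -1.0, INSERT_VALUES);
    MatSetValue(A, i, i, 2.0, INSERT_VALUES);
  }
  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

  MatCreateVecs(A, &x, &b);
  VecSet(b, 1.0);

  KSPCreate(PETSC_COMM_WORLD, &ksp);
  KSPSetOperators(ksp, A, A);
  KSPSetFromOptions(ksp);   /* picks up -ksp_type, -ksp_rtol, -pc_type, ... from the command line */
  KSPSolve(ksp, b, x);

  KSPGetConvergedReason(ksp, &reason);
  KSPGetIterationNumber(ksp, &its);
  PetscPrintf(PETSC_COMM_WORLD, "converged reason %d after %D iterations\n", (int)reason, its);

  KSPDestroy(&ksp);
  VecDestroy(&x);
  VecDestroy(&b);
  MatDestroy(&A);
  PetscFinalize();
  return 0;
}

(Running such a program with, e.g., mpiexec -n 4 ./sketch -ksp_type cg -ksp_rtol 1e-7 -ksp_converged_reason -log_view -memory_view produces convergence, profiling, and memory output in the same format as the log above.)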