Linear solve converged due to CONVERGED_RTOL iterations 8
 1 step time:    5.4150269031524658
 norm1 error:    2.0137755633475663E-007
 norm inf error: 4.9194723446491315E-003
Summary of Memory Usage in PETSc
Maximum (over computational time) process memory:  total 1.2760e+09  max 1.2200e+05  min 6.8304e+04
Current process memory:                            total 1.2760e+09  max 1.2200e+05  min 6.8304e+04
************************************************************************************************************************
***             WIDEN YOUR WINDOW TO 120 CHARACTERS.  Use 'enscript -r -fCourier9' to print this document            ***
************************************************************************************************************************

---------------------------------------------- PETSc Performance Summary: ----------------------------------------------

./test_ksp.exe on a gnu-opt named ¯ÿÿÿ with 16384 processors, by wang11 Tue Oct 4 04:43:44 2016
Using Petsc Development GIT revision: v3.6.3-2059-geab7831  GIT Date: 2016-01-20 10:58:35 -0600

                         Max       Max/Min        Avg      Total
Time (sec):           5.925e+00      1.00147   5.920e+00
Objects:              3.850e+02      1.58436   2.474e+02
Flops:                2.310e+09     30.18287   1.463e+08  2.398e+12
Flops/sec:            3.900e+08     30.17591   2.471e+07  4.049e+11
MPI Messages:         9.792e+03      3.52656   3.017e+03  4.943e+07
MPI Message Lengths:  4.433e+06      1.14818   1.286e+03  6.358e+10
MPI Reductions:       4.870e+02      1.50774

Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract)
                            e.g., VecAXPY() for real vectors of length N --> 2N flops
                            and VecAXPY() for complex vectors of length N --> 8N flops

Summary of Stages:   ----- Time ------  ----- Flops -----  --- Messages ---  -- Message Lengths --  -- Reductions --
                        Avg     %Total     Avg     %Total   counts   %Total     Avg         %Total   counts   %Total
 0:      Main Stage: 5.9199e+00 100.0%  2.3977e+12 100.0%  4.943e+07 100.0%  1.286e+03      100.0%  3.271e+02  67.2%

------------------------------------------------------------------------------------------------------------------------
See the 'Profiling' chapter of the users' manual for details on interpreting output.
Phase summary info:
   Count: number of times phase was executed
   Time and Flops: Max - maximum over all processors
                   Ratio - ratio of maximum to minimum over all processors
   Mess: number of messages sent
   Avg. len: average message length (bytes)
   Reduct: number of global reductions
   Global: entire computation
   Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop().
      %T - percent time in this phase         %F - percent flops in this phase
      %M - percent messages in this phase     %L - percent message lengths in this phase
      %R - percent reductions in this phase
   Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors)
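Only the default Main Stage appears in the stage summary above. As a reference for the PetscLogStagePush()/PetscLogStagePop() calls mentioned in the note, here is a minimal, self-contained sketch of user-defined stage logging; the stage name "Solve" and the toy VecAXPY workload are illustrative only and not taken from test_ksp.exe.

#include <petscvec.h>

int main(int argc, char **argv)
{
  PetscLogStage  solve_stage;                     /* illustrative name, not from the run above */
  Vec            x, y;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
  ierr = VecCreateMPI(PETSC_COMM_WORLD, PETSC_DECIDE, 1000, &x);CHKERRQ(ierr);
  ierr = VecDuplicate(x, &y);CHKERRQ(ierr);
  ierr = VecSet(x, 1.0);CHKERRQ(ierr);
  ierr = VecSet(y, 2.0);CHKERRQ(ierr);

  /* Everything logged between Push and Pop is attributed to "Solve" rather than "Main Stage". */
  ierr = PetscLogStageRegister("Solve", &solve_stage);CHKERRQ(ierr);
  ierr = PetscLogStagePush(solve_stage);CHKERRQ(ierr);
  ierr = VecAXPY(y, 3.0, x);CHKERRQ(ierr);        /* counted as 2N flops, per the convention above */
  ierr = PetscLogStagePop();CHKERRQ(ierr);

  ierr = VecDestroy(&x);CHKERRQ(ierr);
  ierr = VecDestroy(&y);CHKERRQ(ierr);
  ierr = PetscFinalize();                         /* run with -log_view to see the per-stage summary */
  return ierr;
}

Run with -log_view, such a program reports the "Solve" stage as an extra row in the stage summary and attributes the events executed between push and pop to it.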
------------------------------------------------------------------------------------------------------------------------
Event                Count      Time (sec)     Flops                             --- Global ---  --- Stage ---   Total
                   Max Ratio  Max     Ratio   Max  Ratio  Mess   Avg len Reduct  %T %F %M %L %R  %T %F %M %L %R Mflop/s
------------------------------------------------------------------------------------------------------------------------

--- Event Stage 0: Main Stage

BuildTwoSidedF         1 1.0 5.3220e-02 3.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  1  0  0  0  0   1  0  0  0  0       0
VecTDot               16 1.0 3.3422e-02 2.0 2.10e+06 1.0 0.0e+00 0.0e+00 1.6e+01  0  1  0  0  3   0  1  0  0  5 1028050
VecNorm                9 1.0 4.6150e-02 8.1 1.18e+06 1.0 0.0e+00 0.0e+00 9.0e+00  1  1  0  0  2   1  1  0  0  3  418794
VecScale              40 1.7 1.4820e-0318.9 3.58e+04 1.2 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0  340679
VecCopy               10 1.0 1.4787e-02 4.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0       0
VecSet               245 1.6 2.3055e-03 1.4 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0       0
VecAXPY               32 1.0 1.2739e-02 1.2 4.19e+06 1.0 0.0e+00 0.0e+00 0.0e+00  0  3  0  0  0   0  3  0  0  0 5394542
VecAYPX               63 1.3 1.0296e-02 1.8 2.05e+06 1.0 0.0e+00 0.0e+00 0.0e+00  0  1  0  0  0   0  1  0  0  0 3247767
VecAssemblyBegin       1 1.0 5.3233e-02 3.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  1  0  0  0  0   1  0  0  0  0       0
VecAssemblyEnd         1 1.0 5.1975e-0527.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0       0
VecScatterBegin      254 1.5 3.6260e-02 2.8 0.00e+00 0.0 3.5e+07 1.3e+03 0.0e+00  0  0 70 70  0   0  0 70 70  0       0
VecScatterEnd        254 1.5 3.7343e+0023.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 61  0  0  0  0  61  0  0  0  0       0
MatMult               72 1.3 2.6418e-01 1.8 2.47e+07 1.0 1.3e+07 2.7e+03 0.0e+00  3 17 27 57  0   3 17 27 57  0 1514304
MatMultAdd            48 1.5 3.6875e-02 1.9 4.07e+06 1.0 3.7e+06 5.1e+02 0.0e+00  0  3  8  3  0   0  3  8  3  0 1797004
MatMultTranspose      62 1.4 5.4392e-02 2.2 4.58e+06 1.0 4.7e+06 4.6e+02 0.0e+00  1  3 10  3  0   1  3 10  3  0 1370657
MatSolve               8 0.0 6.2803e-02 0.0 5.01e+07 0.0 0.0e+00 0.0e+00 0.0e+00  0  1  0  0  0   0  1  0  0  0  408472
MatSOR                96 1.5 2.2264e-01 1.2 2.44e+07 1.0 1.0e+07 3.9e+02 5.0e-01  3 16 21  6  0   3 16 21  6  0 1756728
MatLUFactorSym         1 0.0 1.3425e-01 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0       0
MatLUFactorNum         1 0.0 3.0468e+00 0.0 2.18e+09 0.0 0.0e+00 0.0e+00 0.0e+00  2 47  0  0  0   2 47  0  0  0  366778
MatConvert             1 0.0 1.4038e-03 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0       0
MatResidual           48 1.5 1.6796e-01 2.5 1.16e+07 1.0 1.1e+07 1.3e+03 0.0e+00  1  8 23 23  0   1  8 23 23  0 1110512
MatAssemblyBegin      33 1.4 1.1824e-01 1.5 0.00e+00 0.0 0.0e+00 0.0e+00 2.4e+01  2  0  0  0  5   2  0  0  0  7       0
MatAssemblyEnd        33 1.4 3.2890e-01 1.1 0.00e+00 0.0 5.6e+06 1.9e+02 8.9e+01  5  0 11  2 18   5  0 11  2 27       0
MatGetRowIJ            1 0.0 1.1580e-03 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0       0
MatGetSubMatrice       2 2.0 5.4987e-02 5.9 0.00e+00 0.0 7.9e+04 1.1e+03 3.1e+00  0  0  0  0  1   0  0  0  0  1       0
MatGetOrdering         1 0.0 8.1720e-03 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0       0
MatPtAP                7 1.4 6.2958e-01 1.0 1.03e+07 1.0 1.2e+07 1.4e+03 8.6e+01 10  7 24 27 18  10  7 24 27 26  264868
MatPtAPSymbolic        7 1.4 3.6594e-01 1.0 0.00e+00 0.0 7.1e+06 1.8e+03 3.5e+01  6  0 14 20  7   6  0 14 20 11       0
MatPtAPNumeric         7 1.4 2.6525e-01 1.0 1.03e+07 1.0 4.8e+06 9.4e+02 5.1e+01  4  7 10  7 10   4  7 10  7 15  628685
MatRedundantMat        1 0.0 1.7978e-02 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.2e-01  0  0  0  0  0   0  0  0  0  0       0
MatMPIConcateSeq       1 0.0 3.7515e-02 0.0 0.00e+00 0.0 2.7e+04 4.0e+01 4.7e-01  0  0  0  0  0   0  0  0  0  0       0
MatGetLocalMat         7 1.4 1.1893e-02 1.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0       0
MatGetBrAoCol          7 1.4 4.4069e-02 1.4 0.00e+00 0.0 5.5e+06 1.9e+03 0.0e+00  1  0 11 16  0   1  0 11 16  0       0
MatGetSymTrans        14 1.4 3.2480e-03 1.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0       0
DMCoarsen              6 1.5 1.4491e-01 1.0 0.00e+00 0.0 8.0e+05 2.8e+02 4.5e+01  2  0  2  0  9   2  0  2  0 14       0
DMCreateInterpolation  6 1.5 2.7302e-01 1.0 5.09e+05 1.0 1.4e+06 2.6e+02 6.5e+01  5  0  3  1 13   5  0  3  1 20   30339
KSPSetUp              11 1.8 2.6836e-02 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 1.4e+01  0  0  0  0  3   0  0  0  0  4       0
KSPSolve               1 1.0 5.4161e+00 1.0 2.31e+0930.2 4.9e+07 1.3e+03 3.0e+02 91100 99 98 61  91100 99 98 91  442709
PCSetUp                2 2.0 4.6810e+00 4.0 2.19e+09205.4 1.5e+07 1.2e+03 2.4e+02 22 54 30 28 50  22 54 30 28 74  276126
PCApply                8 1.0 4.0840e+00 1.0 2.29e+0943.3 3.4e+07 9.9e+02 2.0e+01 69 84 69 53  4  69 84 69 53  6  491686
------------------------------------------------------------------------------------------------------------------------
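A rough cross-check of the rightmost (Mflop/s) column: KSPSolve accounts for essentially 100% of the flops, so its aggregate flop count is about the stage total of 2.3977e+12; dividing by its maximum time of 5.4161 s gives roughly 4.43e+11 flop/s, i.e. about 4.4e+05 Mflop/s, consistent with the 442709 reported in the table.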
Memory usage is given in bytes:

Object Type          Creations   Destructions     Memory  Descendants' Mem.
Reports information only for process 0.

--- Event Stage 0: Main Stage

              Vector   160            160      7611312     0.
      Vector Scatter    27             27       633040     0.
              Matrix    66             66     59623768     0.
   Matrix Null Space     1              1          592     0.
    Distributed Mesh     8              8        39936     0.
Star Forest Bipartite Graph    16             16        13568     0.
     Discrete System     8              8         6848     0.
           Index Set    60             60       538064     0.
   IS L to G Mapping     8              8       368384     0.
       Krylov Solver    12             12        14920     0.
     DMKSP interface     6              6         3888     0.
      Preconditioner    12             12        11984     0.
              Viewer     1              0            0     0.
========================================================================================================================
Average time to get PetscTime(): 9.53674e-08
Average time for MPI_Barrier(): 0.000100613
Average time for zero size MPI_Send(): 1.10291e-05
#PETSc Option Table entries:
-ksp_converged_reason
-ksp_initial_guess_nonzero yes
-ksp_norm_type unpreconditioned
-ksp_rtol 1e-7
-ksp_type cg
-log_view
-matptap_scalable
-matrap 0
-memory_view
-mg_coarse_ksp_type preonly
-mg_coarse_pc_telescope_reduction_factor 32
-mg_coarse_pc_type telescope
-mg_coarse_telescope_ksp_type preonly
-mg_coarse_telescope_mg_coarse_ksp_type preonly
-mg_coarse_telescope_mg_coarse_pc_type redundant
-mg_coarse_telescope_mg_levels_ksp_max_it 1
-mg_coarse_telescope_mg_levels_ksp_type richardson
-mg_coarse_telescope_pc_mg_galerkin
-mg_coarse_telescope_pc_mg_levels 3
-mg_coarse_telescope_pc_type mg
-mg_levels_ksp_max_it 1
-mg_levels_ksp_type richardson
-N 1024
-options_left 1
-pc_mg_galerkin
-pc_mg_levels 5
-pc_type mg
-ppe_max_iter 20
-px 32
-py 32
-pz 16
#End of PETSc Option Table entries
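The option table above configures the whole solver at runtime: the options take effect when the application calls KSPSetFromOptions() (and during the subsequent KSPSetUp()/PCSetUp()). The source of test_ksp.exe is not included here, so the following is only a minimal DMDA-based sketch of the same pattern for a 3-D Poisson-type problem; the file name, grid size, right-hand side, and boundary handling are placeholder assumptions rather than the author's code.

/* poisson_mg_sketch.c -- hypothetical file name, not the author's test_ksp.exe */
#include <petscdmda.h>
#include <petscksp.h>

/* Constant right-hand side; a stand-in for the application's actual RHS. */
static PetscErrorCode ComputeRHS(KSP ksp, Vec b, void *ctx)
{
  PetscErrorCode ierr;
  ierr = VecSet(b, 1.0);CHKERRQ(ierr);
  return 0;
}

/* 7-point finite-difference Laplacian on the DMDA attached to this KSP level. */
static PetscErrorCode ComputeMatrix(KSP ksp, Mat A, Mat B, void *ctx)
{
  DM             da;
  DMDALocalInfo  info;
  PetscInt       i, j, k, n;
  MatStencil     row, col[7];
  PetscScalar    v[7];
  PetscErrorCode ierr;

  ierr = KSPGetDM(ksp, &da);CHKERRQ(ierr);
  ierr = DMDAGetLocalInfo(da, &info);CHKERRQ(ierr);
  for (k = info.zs; k < info.zs + info.zm; k++) {
    for (j = info.ys; j < info.ys + info.ym; j++) {
      for (i = info.xs; i < info.xs + info.xm; i++) {
        row.i = i; row.j = j; row.k = k;
        if (i == 0 || j == 0 || k == 0 || i == info.mx - 1 || j == info.my - 1 || k == info.mz - 1) {
          v[0] = 6.0;                                   /* crude Dirichlet boundary row */
          ierr = MatSetValuesStencil(B, 1, &row, 1, &row, v, INSERT_VALUES);CHKERRQ(ierr);
        } else {
          n = 0;
          col[n] = row;                 v[n++] = 6.0;   /* diagonal */
          col[n] = row; col[n].i = i-1; v[n++] = -1.0;  /* six off-diagonal neighbours */
          col[n] = row; col[n].i = i+1; v[n++] = -1.0;
          col[n] = row; col[n].j = j-1; v[n++] = -1.0;
          col[n] = row; col[n].j = j+1; v[n++] = -1.0;
          col[n] = row; col[n].k = k-1; v[n++] = -1.0;
          col[n] = row; col[n].k = k+1; v[n++] = -1.0;
          ierr = MatSetValuesStencil(B, 1, &row, 7, col, v, INSERT_VALUES);CHKERRQ(ierr);
        }
      }
    }
  }
  ierr = MatAssemblyBegin(B, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(B, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  return 0;
}

int main(int argc, char **argv)
{
  DM             da;
  KSP            ksp;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
  /* Fine-grid DMDA (17^3 is a placeholder size); with -pc_type mg, PCMG coarsens it
     to build the level hierarchy, e.g. -pc_mg_levels 5 -pc_mg_galerkin as above.   */
  ierr = DMDACreate3d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                      DMDA_STENCIL_STAR, 17, 17, 17, PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE,
                      1, 1, NULL, NULL, NULL, &da);CHKERRQ(ierr);
  ierr = DMSetFromOptions(da);CHKERRQ(ierr);
  ierr = DMSetUp(da);CHKERRQ(ierr);
  ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
  ierr = KSPSetDM(ksp, da);CHKERRQ(ierr);
  ierr = KSPSetComputeRHS(ksp, ComputeRHS, NULL);CHKERRQ(ierr);
  ierr = KSPSetComputeOperators(ksp, ComputeMatrix, NULL);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);          /* -ksp_type cg -pc_type mg ... applied here */
  ierr = KSPSolve(ksp, NULL, NULL);CHKERRQ(ierr);       /* rhs/solution vectors come from the DM */
  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  ierr = DMDestroy(&da);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

Launching such a program with the option table above plus -log_view yields a performance summary in the same format; the -pc_type mg, -pc_mg_galerkin, -pc_mg_levels, and -mg_coarse_pc_type telescope settings are picked up from the options database without any source changes.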
Compiled without FORTRAN kernels
Compiled with full precision matrices (default)
sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4
Configure options: --known-level1-dcache-size=16384 --known-level1-dcache-linesize=64 --known-level1-dcache-assoc=4 --known-sizeof-char=1 --known-sizeof-void-p=8 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=8 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=8 --known-bits-per-byte=8 --known-memcmp-ok=1 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --known-mpi-int64_t=1 --known-mpi-c-double-complex=1 --known-sdot-returns-double=0 --known-snrm2-returns-double=0 --known-has-attribute-aligned=1 --with-batch="1 " --known-mpi-shared="0 " --known-mpi-shared-libraries=0 --known-memcmp-ok --with-blas-lapack-lib=/opt/acml/5.3.1/gfortran64/lib/libacml.a --COPTFLAGS="-march=bdver1 -O3 -ffast-math -fPIC " --FOPTFLAGS="-march=bdver1 -O3 -ffast-math -fPIC " --CXXOPTFLAGS="-march=bdver1 -O3 -ffast-math -fPIC " --with-x="0 " --with-debugging="0 " --with-clib-autodetect="0 " --with-cxxlib-autodetect="0 " --with-fortranlib-autodetect="0 " --with-shared-libraries="0 " --with-mpi-compilers="1 " --with-cc="cc " --with-cxx="CC " --with-fc="ftn " --download-hypre="1 " --download-blacs="1 " --download-scalapack="1 " --download-superlu_dist="1 " --download-metis="1 " --download-parmetis="1 " PETSC_ARCH=gnu-opt
-----------------------------------------
Libraries compiled on Tue Feb 16 12:57:46 2016 on h2ologin3
Machine characteristics: Linux-3.0.101-0.46-default-x86_64-with-SuSE-11-x86_64
Using PETSc directory: /mnt/a/u/sciteam/wang11/Sftw/petsc
Using PETSc arch: gnu-opt
-----------------------------------------
Using C compiler: cc -march=bdver1 -O3 -ffast-math -fPIC ${COPTFLAGS} ${CFLAGS}
Using Fortran compiler: ftn -march=bdver1 -O3 -ffast-math -fPIC ${FOPTFLAGS} ${FFLAGS}
-----------------------------------------
Using include paths: -I/mnt/a/u/sciteam/wang11/Sftw/petsc/gnu-opt/include -I/mnt/a/u/sciteam/wang11/Sftw/petsc/include -I/mnt/a/u/sciteam/wang11/Sftw/petsc/include -I/mnt/a/u/sciteam/wang11/Sftw/petsc/gnu-opt/include
-----------------------------------------
Using C linker: cc
Using Fortran linker: ftn
Using libraries: -Wl,-rpath,/mnt/a/u/sciteam/wang11/Sftw/petsc/gnu-opt/lib -L/mnt/a/u/sciteam/wang11/Sftw/petsc/gnu-opt/lib -lpetsc -Wl,-rpath,/mnt/a/u/sciteam/wang11/Sftw/petsc/gnu-opt/lib -L/mnt/a/u/sciteam/wang11/Sftw/petsc/gnu-opt/lib -lsuperlu_dist_4.3 -lHYPRE -lscalapack -Wl,-rpath,/opt/acml/5.3.1/gfortran64/lib -L/opt/acml/5.3.1/gfortran64/lib -lacml -lparmetis -lmetis -lssl -lcrypto -ldl
-----------------------------------------
#PETSc Option Table entries:
-ksp_converged_reason
-ksp_initial_guess_nonzero yes
-ksp_norm_type unpreconditioned
-ksp_rtol 1e-7
-ksp_type cg
-log_view
-matptap_scalable
-matrap 0
-memory_view
-mg_coarse_ksp_type preonly
-mg_coarse_pc_telescope_reduction_factor 32
-mg_coarse_pc_type telescope
-mg_coarse_telescope_ksp_type preonly
-mg_coarse_telescope_mg_coarse_ksp_type preonly
-mg_coarse_telescope_mg_coarse_pc_type redundant
-mg_coarse_telescope_mg_levels_ksp_max_it 1
-mg_coarse_telescope_mg_levels_ksp_type richardson
-mg_coarse_telescope_pc_mg_galerkin
-mg_coarse_telescope_pc_mg_levels 3
-mg_coarse_telescope_pc_type mg
-mg_levels_ksp_max_it 1
-mg_levels_ksp_type richardson
-N 1024
-options_left 1
-pc_mg_galerkin
-pc_mg_levels 5
-pc_type mg
-ppe_max_iter 20
-px 32
-py 32
-pz 16
#End of PETSc Option Table entries
There is one unused database option. It is:
Option left: name:-ppe_max_iter value: 20
Application 48712716 resources: utime ~129251s, stime ~23539s, Rss ~122000, inblocks ~28312847, outblocks ~16155044