Linear solve converged due to CONVERGED_RTOL iterations 8
 1 step time:    5.6067509651184082
 norm1 error:    8.6823275619066220E-008
 norm inf error: 4.1513369925681139E-003
Summary of Memory Usage in PETSc
Maximum (over computational time) process memory:   total 2.6721e+09  max 1.2921e+05  min 6.3376e+04
Current process memory:                             total 2.6721e+09  max 1.2921e+05  min 6.3376e+04
************************************************************************************************************************
***             WIDEN YOUR WINDOW TO 120 CHARACTERS.  Use 'enscript -r -fCourier9' to print this document            ***
************************************************************************************************************************

---------------------------------------------- PETSc Performance Summary: ----------------------------------------------

./test_ksp.exe on a gnu-opt named ßÿÿÿ with 32768 processors, by wang11 Fri Sep 30 17:25:17 2016
Using Petsc Development GIT revision: v3.6.3-2059-geab7831  GIT Date: 2016-01-20 10:58:35 -0600

                         Max       Max/Min        Avg      Total
Time (sec):           6.075e+00      1.00129   6.070e+00
Objects:              3.850e+02      1.58436   2.452e+02
Flops:                2.272e+09     59.49877   7.309e+07  2.395e+12
Flops/sec:            3.741e+08     59.49159   1.204e+07  3.944e+11
MPI Messages:         1.019e+04      3.67063   2.914e+03  9.548e+07
MPI Message Lengths:  2.911e+06      1.24640   8.049e+02  7.686e+10
MPI Reductions:       4.870e+02      1.50774

Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract)
                            e.g., VecAXPY() for real vectors of length N --> 2N flops
                            and VecAXPY() for complex vectors of length N --> 8N flops

Summary of Stages:   ----- Time ------  ----- Flops -----  --- Messages ---  -- Message Lengths --  -- Reductions --
                        Avg     %Total     Avg     %Total   counts   %Total     Avg         %Total   counts   %Total
 0:      Main Stage: 6.0704e+00 100.0%  2.3949e+12 100.0%  9.548e+07 100.0%  8.049e+02      100.0%  3.246e+02  66.6%

------------------------------------------------------------------------------------------------------------------------
See the 'Profiling' chapter of the users' manual for details on interpreting output.
Phase summary info:
   Count: number of times phase was executed
   Time and Flops: Max - maximum over all processors
                   Ratio - ratio of maximum to minimum over all processors
   Mess: number of messages sent
   Avg. len: average message length (bytes)
   Reduct: number of global reductions
   Global: entire computation
   Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop().
      %T - percent time in this phase          %F - percent flops in this phase
      %M - percent messages in this phase      %L - percent message lengths in this phase
      %R - percent reductions in this phase
   Total Mflop/s: 1e-6 * (sum of flops over all processors)/(max time over all processors)
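Note: every event in the table below falls into the default Main Stage. The PetscLogStagePush()/PetscLogStagePop() calls named in the legend let a run be split into per-phase stages that -log_view reports separately. A minimal sketch; the stage name "Solve" and the surrounding program are illustrative, not taken from test_ksp.exe:

#include <petscsys.h>

int main(int argc, char **argv)
{
  PetscLogStage stage;

  PetscInitialize(&argc, &argv, NULL, NULL);

  /* Register a named stage; -log_view then reports it as its own
     block in addition to the Main Stage. */
  PetscLogStageRegister("Solve", &stage);

  PetscLogStagePush(stage);
  /* ... events logged here (KSPSolve, MatMult, ...) are attributed to "Solve" ... */
  PetscLogStagePop();

  PetscFinalize();
  return 0;
}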
------------------------------------------------------------------------------------------------------------------------
Event                Count      Time (sec)     Flops                             --- Global ---  --- Stage ---   Total
                   Max Ratio  Max     Ratio   Max  Ratio  Mess   Avg len Reduct  %T %F %M %L %R  %T %F %M %L %R Mflop/s
------------------------------------------------------------------------------------------------------------------------

--- Event Stage 0: Main Stage

BuildTwoSidedF         1 1.0 6.6517e-02  4.3 0.00e+00  0.0 0.0e+00 0.0e+00 0.0e+00  1  0  0  0  0   1  0  0  0  0        0
VecTDot               16 1.0 2.5051e-02  2.8 1.05e+06  1.0 0.0e+00 0.0e+00 1.6e+01  0  1  0  0  3   0  1  0  0  5  1371551
VecNorm                9 1.0 1.1081e-01  8.0 5.90e+05  1.0 0.0e+00 0.0e+00 9.0e+00  2  1  0  0  2   2  1  0  0  3   174416
VecScale              40 1.7 3.4213e-04  5.8 2.41e+04  1.3 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0  1828178
VecCopy               10 1.0 2.8744e-03  2.6 0.00e+00  0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0        0
VecSet               245 1.6 1.1914e-03  1.5 0.00e+00  0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0        0
VecAXPY               32 1.0 5.2736e-03  1.4 2.10e+06  1.0 0.0e+00 0.0e+00 0.0e+00  0  3  0  0  0   0  3  0  0  0 13030895
VecAYPX               63 1.3 4.8871e-03  1.9 1.03e+06  1.0 0.0e+00 0.0e+00 0.0e+00  0  1  0  0  0   0  1  0  0  0  6842365
VecAssemblyBegin       1 1.0 6.6530e-02  4.3 0.00e+00  0.0 0.0e+00 0.0e+00 0.0e+00  1  0  0  0  0   1  0  0  0  0        0
VecAssemblyEnd         1 1.0 3.6955e-05 19.4 0.00e+00  0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0        0
VecScatterBegin      254 1.5 3.0157e-02  2.9 0.00e+00  0.0 6.6e+07 8.1e+02 0.0e+00  0  0 70 70  0   0  0 70 70  0        0
VecScatterEnd        254 1.5 3.8389e+00 29.6 0.00e+00  0.0 0.0e+00 0.0e+00 0.0e+00 62  0  0  0  0  62  0  0  0  0        0
MatMult               72 1.3 1.5445e-01  2.1 1.24e+07  1.0 2.6e+07 1.7e+03 0.0e+00  1 17 28 57  0   1 17 28 57  0  2590109
MatMultAdd            48 1.5 2.3583e-02  2.4 2.05e+06  1.0 7.4e+06 3.2e+02 0.0e+00  0  3  8  3  0   0  3  8  3  0  2809812
MatMultTranspose      62 1.4 3.8921e-02  3.4 2.31e+06  1.0 9.4e+06 2.8e+02 0.0e+00  0  3 10  3  0   0  3 10  3  0  1915483
MatSolve               8 0.0 6.3777e-02  0.0 5.01e+07  0.0 0.0e+00 0.0e+00 0.0e+00  0  1  0  0  0   0  1  0  0  0   402233
MatSOR                96 1.5 1.1042e-01  1.3 1.24e+07  1.0 2.1e+07 2.4e+02 2.5e-01  2 16 22  7  0   2 16 22  7  0  3514088
MatLUFactorSym         1 0.0 1.3715e-01  0.0 0.00e+00  0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0        0
MatLUFactorNum         1 0.0 3.0408e+00  0.0 2.18e+09  0.0 0.0e+00 0.0e+00 0.0e+00  1 47  0  0  0   1 47  0  0  0   367497
MatConvert             1 0.0 1.3850e-03  0.0 0.00e+00  0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0        0
MatResidual           48 1.5 1.0893e-01  3.1 5.94e+06  1.0 2.2e+07 8.0e+02 0.0e+00  1  8 23 23  0   1  8 23 23  0  1712251
MatAssemblyBegin      33 1.4 1.6177e-01  1.2 0.00e+00  0.0 0.0e+00 0.0e+00 2.4e+01  2  0  0  0  5   2  0  0  0  7        0
MatAssemblyEnd        33 1.4 3.8307e-01  1.1 0.00e+00  0.0 1.1e+07 1.2e+02 8.9e+01  6  0 12  2 18   6  0 12  2 27        0
MatGetRowIJ            1 0.0 1.1508e-03  0.0 0.00e+00  0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0        0
MatGetSubMatrice       2 2.0 1.0705e-01  5.0 0.00e+00  0.0 1.6e+05 5.4e+02 3.1e+00  1  0  0  0  1   1  0  0  0  1        0
MatGetOrdering         1 0.0 9.1419e-03  0.0 0.00e+00  0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0        0
MatPtAP                7 1.4 6.5116e-01  1.0 5.25e+06  1.0 2.4e+07 8.8e+02 8.6e+01 11  7 25 27 18  11  7 25 27 26   256435
MatPtAPSymbolic        7 1.4 3.7687e-01  1.0 0.00e+00  0.0 1.4e+07 1.1e+03 3.5e+01  6  0 15 20  7   6  0 15 20 11        0
MatPtAPNumeric         7 1.4 2.7777e-01  1.0 5.25e+06  1.0 9.6e+06 5.7e+02 5.0e+01  4  7 10  7 10   4  7 10  7 16   601149
MatRedundantMat        1 0.0 1.6249e-02  0.0 0.00e+00  0.0 0.0e+00 0.0e+00 6.2e-02  0  0  0  0  0   0  0  0  0  0        0
MatMPIConcateSeq       1 0.0 9.2777e-02  0.0 0.00e+00  0.0 2.7e+04 4.0e+01 2.3e-01  0  0  0  0  0   0  0  0  0  0        0
MatGetLocalMat         7 1.4 5.4631e-03  1.3 0.00e+00  0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0        0
MatGetBrAoCol          7 1.4 2.7932e-02  2.0 0.00e+00  0.0 1.1e+07 1.1e+03 0.0e+00  0  0 11 16  0   0  0 11 16  0        0
MatGetSymTrans        14 1.4 1.5435e-03  1.6 0.00e+00  0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0        0
DMCoarsen              6 1.5 2.2486e-01  1.1 0.00e+00  0.0 1.6e+06 1.7e+02 4.4e+01  4  0  2  0  9   4  0  2  0 14        0
DMCreateInterpolation  6 1.5 2.9050e-01  1.0 2.57e+05  1.0 2.8e+06 1.6e+02 6.4e+01  5  0  3  1 13   5  0  3  1 20    28514
KSPSetUp              11 1.8 4.8519e-02  1.1 0.00e+00  0.0 0.0e+00 0.0e+00 1.4e+01  1  0  0  0  3   1  0  0  0  4        0
KSPSolve               1 1.0 5.6078e+00  1.0 2.27e+09  59.5 9.5e+07 7.9e+02 2.9e+02 92 100 99 98 60  92 100 99 98 91  427066
PCSetUp                2 2.0 5.0961e+00  3.7 2.19e+09 409.3 2.9e+07 7.4e+02 2.4e+02 24  54 31 28 49  24  54 31 28 74  253678
PCApply                8 1.0 4.0242e+00  1.0 2.26e+09  86.0 6.5e+07 6.3e+02 1.8e+01 66  84 68 53  4  66  84 68 53  6  498229
------------------------------------------------------------------------------------------------------------------------

Memory usage is given in bytes:

Object Type                   Creations   Destructions     Memory  Descendants' Mem.
Reports information only for process 0.

--- Event Stage 0: Main Stage

              Vector              160            160     4041376     0.
      Vector Scatter               27             27      333392     0.
              Matrix               66             66    50342800     0.
   Matrix Null Space                1              1         592     0.
    Distributed Mesh                8              8       39936     0.
Star Forest Bipartite Graph        16             16       13568     0.
     Discrete System                8              8        6848     0.
           Index Set               60             60      341192     0.
   IS L to G Mapping                8              8      195776     0.
       Krylov Solver               12             12       14920     0.
     DMKSP interface                6              6        3888     0.
      Preconditioner               12             12       11984     0.
              Viewer                1              0           0     0.
========================================================================================================================
Average time to get PetscTime(): 1.19209e-07
Average time for MPI_Barrier(): 0.000212812
Average time for zero size MPI_Send(): 1.5335e-05
#PETSc Option Table entries:
-ksp_converged_reason
-ksp_initial_guess_nonzero yes
-ksp_norm_type unpreconditioned
-ksp_rtol 1e-7
-ksp_type cg
-log_view
-matptap_scalable
-matrap 0
-memory_view
-mg_coarse_ksp_type preonly
-mg_coarse_pc_telescope_reduction_factor 64
-mg_coarse_pc_type telescope
-mg_coarse_telescope_ksp_type preonly
-mg_coarse_telescope_mg_coarse_ksp_type preonly
-mg_coarse_telescope_mg_coarse_pc_type redundant
-mg_coarse_telescope_mg_levels_ksp_max_it 1
-mg_coarse_telescope_mg_levels_ksp_type richardson
-mg_coarse_telescope_pc_mg_galerkin
-mg_coarse_telescope_pc_mg_levels 3
-mg_coarse_telescope_pc_type mg
-mg_levels_ksp_max_it 1
-mg_levels_ksp_type richardson
-N 1024
-options_left 1
-P 32
-pc_mg_galerkin
-pc_mg_levels 5
-pc_type mg
-ppe_max_iter 20
#End of PETSc Option Table entries
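The option table above fully determines the solver: CG with the unpreconditioned norm and rtol 1e-7, preconditioned by 5-level Galerkin multigrid whose coarse level is handed to telescope (64x rank reduction), which in turn runs its own 3-level multigrid with a redundant coarse solve. An application picks all of this up through a single KSPSetFromOptions() call. A minimal sketch, assuming the operator A, vectors b and x, and the DM dm behind the DMCoarsen/DMCreateInterpolation events are created elsewhere (names illustrative, not from test_ksp.exe):

#include <petscksp.h>

/* Sketch of the solve phase. The option table supplies -ksp_type cg,
   -pc_type mg, the telescope coarse solve, etc. at runtime. */
PetscErrorCode SolvePPE(DM dm, Mat A, Vec b, Vec x)
{
  KSP ksp;

  KSPCreate(PETSC_COMM_WORLD, &ksp);
  KSPSetDM(ksp, dm);                   /* lets PCMG build the grid hierarchy */
  KSPSetDMActive(ksp, PETSC_FALSE);    /* operator is supplied explicitly below */
  KSPSetOperators(ksp, A, A);
  KSPSetInitialGuessNonzero(ksp, PETSC_TRUE);  /* -ksp_initial_guess_nonzero yes */
  KSPSetFromOptions(ksp);              /* reads the option table above */
  KSPSolve(ksp, b, x);
  KSPDestroy(&ksp);
  return 0;
}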
Compiled without FORTRAN kernels
Compiled with full precision matrices (default)
sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4
Configure options: --known-level1-dcache-size=16384 --known-level1-dcache-linesize=64 --known-level1-dcache-assoc=4 --known-sizeof-char=1 --known-sizeof-void-p=8 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=8 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=8 --known-bits-per-byte=8 --known-memcmp-ok=1 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --known-mpi-int64_t=1 --known-mpi-c-double-complex=1 --known-sdot-returns-double=0 --known-snrm2-returns-double=0 --known-has-attribute-aligned=1 --with-batch="1 " --known-mpi-shared="0 " --known-mpi-shared-libraries=0 --known-memcmp-ok --with-blas-lapack-lib=/opt/acml/5.3.1/gfortran64/lib/libacml.a --COPTFLAGS="-march=bdver1 -O3 -ffast-math -fPIC " --FOPTFLAGS="-march=bdver1 -O3 -ffast-math -fPIC " --CXXOPTFLAGS="-march=bdver1 -O3 -ffast-math -fPIC " --with-x="0 " --with-debugging="0 " --with-clib-autodetect="0 " --with-cxxlib-autodetect="0 " --with-fortranlib-autodetect="0 " --with-shared-libraries="0 " --with-mpi-compilers="1 " --with-cc="cc " --with-cxx="CC " --with-fc="ftn " --download-hypre="1 " --download-blacs="1 " --download-scalapack="1 " --download-superlu_dist="1 " --download-metis="1 " --download-parmetis="1 " PETSC_ARCH=gnu-opt
-----------------------------------------
Libraries compiled on Tue Feb 16 12:57:46 2016 on h2ologin3
Machine characteristics: Linux-3.0.101-0.46-default-x86_64-with-SuSE-11-x86_64
Using PETSc directory: /mnt/a/u/sciteam/wang11/Sftw/petsc
Using PETSc arch: gnu-opt
-----------------------------------------
Using C compiler: cc -march=bdver1 -O3 -ffast-math -fPIC ${COPTFLAGS} ${CFLAGS}
Using Fortran compiler: ftn -march=bdver1 -O3 -ffast-math -fPIC ${FOPTFLAGS} ${FFLAGS}
-----------------------------------------
Using include paths: -I/mnt/a/u/sciteam/wang11/Sftw/petsc/gnu-opt/include -I/mnt/a/u/sciteam/wang11/Sftw/petsc/include -I/mnt/a/u/sciteam/wang11/Sftw/petsc/include -I/mnt/a/u/sciteam/wang11/Sftw/petsc/gnu-opt/include
-----------------------------------------
Using C linker: cc
Using Fortran linker: ftn
Using libraries: -Wl,-rpath,/mnt/a/u/sciteam/wang11/Sftw/petsc/gnu-opt/lib -L/mnt/a/u/sciteam/wang11/Sftw/petsc/gnu-opt/lib -lpetsc -Wl,-rpath,/mnt/a/u/sciteam/wang11/Sftw/petsc/gnu-opt/lib -L/mnt/a/u/sciteam/wang11/Sftw/petsc/gnu-opt/lib -lsuperlu_dist_4.3 -lHYPRE -lscalapack -Wl,-rpath,/opt/acml/5.3.1/gfortran64/lib -L/opt/acml/5.3.1/gfortran64/lib -lacml -lparmetis -lmetis -lssl -lcrypto -ldl
-----------------------------------------
#PETSc Option Table entries:
-ksp_converged_reason
-ksp_initial_guess_nonzero yes
-ksp_norm_type unpreconditioned
-ksp_rtol 1e-7
-ksp_type cg
-log_view
-matptap_scalable
-matrap 0
-memory_view
-mg_coarse_ksp_type preonly
-mg_coarse_pc_telescope_reduction_factor 64
-mg_coarse_pc_type telescope
-mg_coarse_telescope_ksp_type preonly
-mg_coarse_telescope_mg_coarse_ksp_type preonly
-mg_coarse_telescope_mg_coarse_pc_type redundant
-mg_coarse_telescope_mg_levels_ksp_max_it 1
-mg_coarse_telescope_mg_levels_ksp_type richardson
-mg_coarse_telescope_pc_mg_galerkin
-mg_coarse_telescope_pc_mg_levels 3
-mg_coarse_telescope_pc_type mg
-mg_levels_ksp_max_it 1
-mg_levels_ksp_type richardson
-N 1024
-options_left 1
-P 32
-pc_mg_galerkin
-pc_mg_levels 5
-pc_type mg
-ppe_max_iter 20
#End of PETSc Option Table entries
There is one unused database option. It is:
Option left: name:-ppe_max_iter value: 20
Application 48682697 resources: utime ~261520s, stime ~39163s, Rss ~129212, inblocks ~42413034, outblocks ~32251484
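The one leftover option, -ppe_max_iter, looks like an application-level parameter rather than a PETSc one; -options_left flags any option that was never queried before the report. A hypothetical sketch of how the application could consume it, using the PetscOptionsGetInt() signature of the 3.6-era PETSc used here (newer releases prepend a PetscOptions argument):

#include <petscsys.h>

/* Hypothetical helper: query -ppe_max_iter so that -options_left
   stops reporting it as unused. Function and variable names are
   illustrative, not from test_ksp.exe. */
static PetscErrorCode GetPPEMaxIter(PetscInt *max_iter)
{
  PetscBool set;

  *max_iter = 20;  /* default; this run passed -ppe_max_iter 20 */
  PetscOptionsGetInt(NULL, "-ppe_max_iter", max_iter, &set);
  return 0;
}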