0: [0] PetscFinalize(): PetscFinalize() called
1: [1] PetscFinalize(): PetscFinalize() called
0: [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784
1: [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784
1: [1] PetscGetHostName(): Rejecting domainname, likely is NIS cn013.(none)
0: ************************************************************************************************************************
0: ***             WIDEN YOUR WINDOW TO 120 CHARACTERS.  Use 'enscript -r -fCourier9' to print this document            ***
0: ************************************************************************************************************************
0: 
0: ---------------------------------------------- PETSc Performance Summary: ----------------------------------------------
0: 
0: [0] PetscGetHostName(): Rejecting domainname, likely is NIS cn013.(none)
0: Unknown Name on a arch-linu named cn013 with 2 processors, by sellenet Fri Jun 29 10:10:34 2012
0: Using Petsc Release Version 3.2.0, Patch 7, Thu Mar 15 09:30:51 CDT 2012
0: 
0:                          Max       Max/Min        Avg      Total
0: Time (sec):           3.109e+01      1.00000   3.109e+01
0: Objects:              5.000e+01      1.00000   5.000e+01
0: Flops:                3.148e+10      1.03707   3.092e+10  6.183e+10
0: Flops/sec:            1.012e+09      1.03706   9.942e+08  1.988e+09
0: MPI Messages:         4.155e+02      1.00000   4.155e+02  8.310e+02
0: MPI Message Lengths:  3.206e+07      1.00082   7.713e+04  6.409e+07
0: MPI Reductions:       8.350e+02      1.00000
0: 
0: Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract)
0:                             e.g., VecAXPY() for real vectors of length N --> 2N flops
0:                             and VecAXPY() for complex vectors of length N --> 8N flops
0: 
0: Summary of Stages:   ----- Time ------  ----- Flops -----  --- Messages ---  -- Message Lengths --  -- Reductions --
0:                         Avg     %Total     Avg     %Total   counts   %Total     Avg         %Total   counts   %Total
0:  0:      Main Stage: 3.1094e+01 100.0%  6.1830e+10 100.0%  8.310e+02 100.0%  7.713e+04      100.0%  8.340e+02  99.9%
0: 
0: ------------------------------------------------------------------------------------------------------------------------
0: See the 'Profiling' chapter of the users' manual for details on interpreting output.
0: Phase summary info:
0:    Count: number of times phase was executed
0:    Time and Flops: Max - maximum over all processors
0:                    Ratio - ratio of maximum to minimum over all processors
0:    Mess: number of messages sent
0:    Avg. len: average message length
0:    Reduct: number of global reductions
0:    Global: entire computation
0:    Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop().
0:       %T - percent time in this phase         %f - percent flops in this phase
0:       %M - percent messages in this phase     %L - percent message lengths in this phase
0:       %R - percent reductions in this phase
0:    Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors)
0: ------------------------------------------------------------------------------------------------------------------------
0: Event                Count      Time (sec)      Flops                            --- Global ---  --- Stage ---   Total
0:                    Max Ratio  Max      Ratio   Max  Ratio  Mess   Avg len Reduct %T %f %M %L %R  %T %f %M %L %R Mflop/s
0: ------------------------------------------------------------------------------------------------------------------------
0: 
0: --- Event Stage 0: Main Stage
0: 
0: VecMDot              397 1.0 2.3842e+00  1.0 4.21e+09 1.0 0.0e+00 0.0e+00 4.0e+02  8 13  0  0 48   8 13  0  0 48  3469
0: VecNorm              413 1.0 1.1969e+00  1.2 2.86e+08 1.0 0.0e+00 0.0e+00 4.1e+02  3  1  0  0 49   3  1  0  0 50   470
0: VecScale             411 1.0 6.9009e-02  1.0 1.42e+08 1.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0  4056
0: VecCopy               14 1.0 7.2660e-03  1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
0: VecSet                16 1.0 4.7505e-03  1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
0: VecAXPY               27 1.0 1.1338e-02  1.0 1.87e+07 1.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0  3243
0: VecAYPX                1 1.0 8.8406e-04  1.1 3.47e+05 1.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0   770
0: VecMAXPY             411 1.0 3.0013e+00  1.1 4.49e+09 1.0 0.0e+00 0.0e+00 0.0e+00  9 14  0  0  0   9 14  0  0  0  2936
0: VecAssemblyBegin       1 1.0 1.3020e-03  1.6 0.00e+00 0.0 2.0e+00 7.3e+04 3.0e+00  0  0  0  0  0   0  0  0  0  0     0
0: VecAssemblyEnd         1 1.0 6.8903e-05  2.6 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
0: VecPointwiseMult     411 1.0 3.6661e-01  1.1 1.42e+08 1.0 0.0e+00 0.0e+00 0.0e+00  1  0  0  0  0   1  0  0  0  0   763
0: VecScatterBegin      411 1.0 1.2212e-02  1.0 0.00e+00 0.0 8.2e+02 6.5e+04 0.0e+00  0  0 99 84  0   0  0 99 84  0     0
0: VecScatterEnd        411 1.0 5.4960e-01 12.8 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  1  0  0  0  0   1  0  0  0  0     0
0: VecNormalize         411 1.0 1.2618e+00  1.2 4.27e+08 1.0 0.0e+00 0.0e+00 4.1e+02  4  1  0  0 49   4  1  0  0 49   665
0: MatMult              411 1.0 1.5956e+01  1.0 2.22e+10 1.0 8.2e+02 6.5e+04 0.0e+00 51 70 99 84  0  51 70 99 84  0  2732
0: MatAssemblyBegin       1 1.0 3.0897e-02  6.2 0.00e+00 0.0 3.0e+00 3.4e+06 2.0e+00  0  0  0 16  0   0  0  0 16  0     0
0: MatAssemblyEnd         1 1.0 1.6195e-01  1.0 0.00e+00 0.0 4.0e+00 8.2e+03 9.0e+00  1  0  0  0  1   1  0  0  0  1     0
0: KSPGMRESOrthog       397 1.0 5.1464e+00  1.0 8.42e+09 1.0 0.0e+00 0.0e+00 4.0e+02 16 27  0  0 48  16 27  0  0 48  3214
0: KSPSetup               1 1.0 1.7998e-03  1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
0: KSPSolve               1 1.0 2.2715e+01  1.0 3.14e+10 1.0 8.2e+02 6.5e+04 8.1e+02 73 100 99 83 97 73 100 99 83 97  2717
0: PCSetUp                1 1.0 3.6001e-05  1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
0: PCApply              411 1.0 4.0137e-01  1.1 1.42e+08 1.0 0.0e+00 0.0e+00 2.0e+00  1  0  0  0  0   1  0  0  0  0   697
0: ------------------------------------------------------------------------------------------------------------------------
0: 
0: Memory usage is given in bytes:
0: 
0: Object Type          Creations   Destructions   Memory  Descendants' Mem.
0: Reports information only for process 0.
0: 
0: --- Event Stage 0: Main Stage
0: 
0:               Vector    41             40      105538336     0
0:       Vector Scatter     1              1           1036     0
0:               Matrix     3              3      341061584     0
0:            Index Set     2              2           1480     0
0:        Krylov Solver     1              1          18288     0
0:       Preconditioner     1              1            792     0
0:               Viewer     1              0              0     0
0: ========================================================================================================================
0: Average time to get PetscTime(): 0
0: Average time for MPI_Barrier(): 5.72205e-07
0: Average time for zero size MPI_Send(): 2.02656e-06
0: #PETSc Option Table entries:
0: -info
0: -log_summary
1: [1] Petsc_DelViewer(): Removing viewer data attribute in an MPI_Comm -2080374784
0: #End of PETSc Option Table entries
0: Compiled without FORTRAN kernels
0: Compiled with full precision matrices (default)
0: sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8
0: Configure run at: Fri Mar 23 19:24:03 2012
0: Configure options: --prefix=/home/desoza/EL_18296/petsc-3.2-p7-install-mpi --with-mpi=1 --with-debugging=0 --with-x=0 --with-cc=mpiicc --with-fc=mpiifort --with-blas-lapack-dir=/logiciels/intel/Compiler/11.1/046/mkl --PETSC_ARCH=arch-linux2-c-opt-mpi
0: -----------------------------------------
0: Libraries compiled on Fri Mar 23 19:24:03 2012 on frontal1
0: Machine characteristics: Linux-2.6.26-2-amd64-x86_64-with-debian-5.0.4
0: Using PETSc directory: /aster/public/petsc-3.2-p7
0: Using PETSc arch: arch-linux2-c-opt-mpi
0: -----------------------------------------
0: 
0: Using C compiler: mpiicc -wd1572 -Qoption,cpp,--extended_float_type -O3 ${COPTFLAGS} ${CFLAGS}
0: Using Fortran compiler: mpiifort -O3 ${FOPTFLAGS} ${FFLAGS}
0: -----------------------------------------
0: 
0: Using include paths: -I/aster/public/petsc-3.2-p7/arch-linux2-c-opt-mpi/include -I/aster/public/petsc-3.2-p7/include -I/aster/public/petsc-3.2-p7/include -I/aster/public/petsc-3.2-p7/arch-linux2-c-opt-mpi/include -I/logiciels/impi/intel64/include
0: -----------------------------------------
0: 
0: Using C linker: mpiicc
0: Using Fortran linker: mpiifort
0: Using libraries: -Wl,-rpath,/aster/public/petsc-3.2-p7/arch-linux2-c-opt-mpi/lib -L/aster/public/petsc-3.2-p7/arch-linux2-c-opt-mpi/lib -lpetsc -lpthread -Wl,-rpath,/logiciels/intel/Compiler/11.1/046/mkl -L/logiciels/intel/Compiler/11.1/046/mkl -lmkl_intel_lp64 -lmkl_intel_thread -lmkl_core -liomp5 -lpthread -ldl -L/logiciels/impi/intel64/lib -lmpi -lmpigf -lmpigi -lpthread -lrt -L/logiciels/intel/Compiler/11.1/046/mkl/lib/em64t -L/logiciels/intel/Compiler/11.1/046/lib/intel64 -L/logiciels/intel/Compiler/11.1/046/ipp/em64t/lib -L/logiciels/intel/Compiler/11.1/046/tbb/em64t/cc3.4.3_libc2.3.4_kernel2.6.9/lib -L/usr/lib/gcc/x86_64-linux-gnu/4.3.2 -limf -lsvml -lipgo -ldecimal -lirc -lgcc_s -lirc_s -Wl,-rpath,/logiciels/aster/public/petsc-3.2-p7/-Xlinker -Wl,-rpath,/logiciels/impi/intel64/lib -Wl,-rpath,/opt/intel/mpi-rt/4.0.0 -lifport -lifcore -lm -lm -ldl -lmpi -lmpigf -lmpigi -lpthread -lrt -limf -lsvml -lipgo -ldecimal -lirc -lgcc_s -lirc_s -ldl
0: -----------------------------------------
0: 
0: [0] Petsc_DelViewer(): Removing viewer data attribute in an MPI_Comm -2080374784
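
For reference, the machinery this summary mentions is driven from ordinary PETSc code: stages are registered with PetscLogStageRegister() and bracketed with PetscLogStagePush()/PetscLogStagePop(), and PetscFinalize() prints the table when the program is run with -log_summary (the PETSc 3.2-era option shown above; recent releases spell it -log_view). The C sketch below is illustrative only, not the application that produced this log: the 1-D Laplacian, the size n, and the stage name "Solve" are invented for the example, and it uses the PetscCall() error-checking macro from current PETSc (3.17+; the 3.2-era idiom was CHKERRQ). It does mirror what the event table implies, namely GMRES (KSPGMRESOrthog) with a pointwise Jacobi preconditioner (each PCApply is a VecPointwiseMult).

#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat           A;
  Vec           x, b;
  KSP           ksp;
  PC            pc;
  PetscLogStage solve;                      /* gets its own row under "Summary of Stages" */
  PetscInt      i, rstart, rend, n = 1000;  /* illustrative size, not the real problem */

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));

  /* Assemble an illustrative 1-D Laplacian (the MatAssemblyBegin/End events). */
  PetscCall(MatCreate(PETSC_COMM_WORLD, &A));
  PetscCall(MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n));
  PetscCall(MatSetFromOptions(A));
  PetscCall(MatSetUp(A));
  PetscCall(MatGetOwnershipRange(A, &rstart, &rend));
  for (i = rstart; i < rend; i++) {
    if (i > 0) PetscCall(MatSetValue(A, i, i - 1, -1.0, INSERT_VALUES));
    if (i < n - 1) PetscCall(MatSetValue(A, i, i + 1, -1.0, INSERT_VALUES));
    PetscCall(MatSetValue(A, i, i, 2.0, INSERT_VALUES));
  }
  PetscCall(MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY));

  PetscCall(MatCreateVecs(A, &x, &b));
  PetscCall(VecSet(b, 1.0));

  /* GMRES + Jacobi, matching what the event table above implies. */
  PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
  PetscCall(KSPSetOperators(ksp, A, A));
  PetscCall(KSPSetType(ksp, KSPGMRES));
  PetscCall(KSPGetPC(ksp, &pc));
  PetscCall(PCSetType(pc, PCJACOBI));
  PetscCall(KSPSetFromOptions(ksp));   /* command-line options still override */

  /* Bracket the solve in a named stage; without this, everything is logged
     under the default "Main Stage", exactly as in the summary above. */
  PetscCall(PetscLogStageRegister("Solve", &solve));
  PetscCall(PetscLogStagePush(solve));
  PetscCall(KSPSolve(ksp, b, x));      /* the KSPSolve, MatMult, VecMDot, ... events */
  PetscCall(PetscLogStagePop());

  PetscCall(KSPDestroy(&ksp));
  PetscCall(MatDestroy(&A));
  PetscCall(VecDestroy(&x));
  PetscCall(VecDestroy(&b));
  PetscCall(PetscFinalize());          /* prints the performance summary when logging is on */
  return 0;
}

Run with, e.g., mpiexec -n 2 ./sketch -info -log_summary (or -log_view on recent releases). Because PetscFinalize() does the printing, the [0]/[1] PetscFinalize() info lines appear just before the performance summary, as at the top of this log.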