TS Object: 64 MPI processes
  type: cn
  maximum steps=25
  maximum time=5e-06
  total number of linear solver iterations=247
  total number of linear solve failures=0
  total number of rejected steps=0
  using relative error tolerance of 0.0001, using absolute error tolerance of 0.0001
  TSAdapt Object: 64 MPI processes
    type: none
  SNES Object: 64 MPI processes
    type: ksponly
    maximum iterations=50, maximum function evaluations=10000
    tolerances: relative=1e-08, absolute=1e-50, solution=1e-08
    total number of linear solver iterations=10
    total number of function evaluations=1
    norm schedule ALWAYS
    SNESLineSearch Object: 64 MPI processes
      type: basic
      maxstep=1.000000e+08, minlambda=1.000000e-12
      tolerances: relative=1.000000e-08, absolute=1.000000e-15, lambda=1.000000e-08
      maximum iterations=1
    KSP Object: 64 MPI processes
      type: gmres
        restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
        happy breakdown tolerance 1e-30
      maximum iterations=10000, initial guess is zero
      tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
      left preconditioning
      using PRECONDITIONED norm type for convergence test
    PC Object: 64 MPI processes
      type: gamg
        type is MULTIPLICATIVE, levels=1 cycles=v
          Cycles per PCApply=1
          Using externally computed Galerkin coarse grid matrices
        GAMG specific options
          Threshold for dropping small values in graph on each level =
          Threshold scaling factor for each level not specified = 1.
          AGG specific options
            Symmetric graph false
            Number of levels to square graph 1
            Number smoothing steps 1
          Complexity:    grid = 1.
      Coarse grid solver -- level -------------------------------
        KSP Object: (mg_levels_0_) 64 MPI processes
          type: preonly
          maximum iterations=2, initial guess is zero
          tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
          left preconditioning
          using NONE norm type for convergence test
        PC Object: (mg_levels_0_) 64 MPI processes
          type: sor
            type = local_symmetric, iterations = 1, local iterations = 1, omega = 1.
          linear system matrix followed by preconditioner matrix:
          Mat Object: 64 MPI processes
            type: mpiaij
            rows=25000000, cols=25000000
            total: nonzeros=124920016, allocated nonzeros=124920016
            total number of mallocs used during MatSetValues calls =0
              not using I-node (on process 0) routines
          Mat Object: 64 MPI processes
            type: mpiaij
            rows=25000000, cols=25000000
            total: nonzeros=124920016, allocated nonzeros=124920016
            total number of mallocs used during MatSetValues calls =0
              not using I-node (on process 0) routines
  linear system matrix followed by preconditioner matrix:
  Mat Object: 64 MPI processes
    type: mpiaij
    rows=25000000, cols=25000000
    total: nonzeros=124920016, allocated nonzeros=124920016
    total number of mallocs used during MatSetValues calls =0
      not using I-node (on process 0) routines
  Mat Object: 64 MPI processes
    type: mpiaij
    rows=25000000, cols=25000000
    total: nonzeros=124920016, allocated nonzeros=124920016
    total number of mallocs used during MatSetValues calls =0
      not using I-node (on process 0) routines

************************************************************************************************************************
***             WIDEN YOUR WINDOW TO 120 CHARACTERS.  Use 'enscript -r -fCourier9' to print this document            ***
************************************************************************************************************************

---------------------------------------------- PETSc Performance Summary: ----------------------------------------------

./ex_modify on a  named apsxrmd-0001 with 64 processors, by sajid Wed Feb 27 16:01:55 2019
Using Petsc Development GIT revision: 4dbc1805575afffed4e440f1353fcfccbc893081  GIT Date: 2019-02-20 13:54:54 -0600

                         Max       Max/Min     Avg       Total
Time (sec):           4.461e+02     1.000   4.461e+02
Objects:              5.020e+02     1.000   5.020e+02
Flop:                 1.938e+10     1.004   1.938e+10  1.240e+12
Flop/sec:             4.345e+07     1.004   4.344e+07  2.780e+09
MPI Messages:         8.000e+02     2.000   7.875e+02  5.040e+04
MPI Message Lengths:  4.970e+07     2.000   6.213e+04  3.131e+09
MPI Reductions:       1.116e+03     1.000

Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract)
                            e.g., VecAXPY() for real vectors of length N --> 2N flop
                            and VecAXPY() for complex vectors of length N --> 8N flop

Summary of Stages:   ----- Time ------  ----- Flop ------  --- Messages ---  -- Message Lengths --  -- Reductions --
                        Avg     %Total     Avg     %Total    Count   %Total     Avg        %Total    Count   %Total
 0:      Main Stage: 4.4609e+02 100.0%  1.2403e+12 100.0%  5.040e+04 100.0%  6.213e+04     100.0%  1.109e+03  99.4%

------------------------------------------------------------------------------------------------------------------------
See the 'Profiling' chapter of the users' manual for details on interpreting output.
Phase summary info:
   Count: number of times phase was executed
   Time and Flop: Max - maximum over all processors
                  Ratio - ratio of maximum to minimum over all processors
   Mess: number of messages sent
   AvgLen: average message length (bytes)
   Reduct: number of global reductions
   Global: entire computation
   Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop().
      %T - percent time in this phase          %F - percent flop in this phase
      %M - percent messages in this phase      %L - percent message lengths in this phase
      %R - percent reductions in this phase
   Total Mflop/s: 1e-6 * (sum of flop over all processors)/(max time over all processors)
------------------------------------------------------------------------------------------------------------------------
Event                Count      Time (sec)      Flop                               --- Global ---   --- Stage ----   Total
                   Max Ratio  Max      Ratio   Max  Ratio  Mess   AvgLen  Reduct   %T %F %M %L %R   %T %F %M %L %R  Mflop/s
------------------------------------------------------------------------------------------------------------------------

--- Event Stage 0: Main Stage

BuildTwoSided          1 1.0 9.3928e-02 26.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0      0
BuildTwoSidedF        59 1.0 1.2350e+01 74.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  3  0  0  0  0   3  0  0  0  0      0
VecView                1 1.0 2.9198e+01  1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  7  0  0  0  0   7  0  0  0  0      0
VecMDot              247 1.0 2.3725e+02  1.3 4.20e+09 1.0 0.0e+00 0.0e+00 2.5e+02 50 22  0  0 22  50 22  0  0 22   1134
VecNorm              272 1.0 2.3704e+01  1.6 8.50e+08 1.0 0.0e+00 0.0e+00 2.7e+02  5  4  0  0 24   5  4  0  0 25   2295
VecScale             297 1.0 3.3818e+00  1.2 4.64e+08 1.0 0.0e+00 0.0e+00 0.0e+00  1  2  0  0  0   1  2  0  0  0   8782
VecCopy              100 1.0 1.2190e+00  1.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0      0
VecSet               679 1.0 3.3912e+00  2.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  1  0  0  0  0   1  0  0  0  0      0
VecAXPY               75 1.0 1.0435e+00  1.3 2.34e+08 1.0 0.0e+00 0.0e+00 0.0e+00  0  1  0  0  0   0  1  0  0  0  14375
VecAYPX               50 1.0 5.7769e-01  1.6 7.81e+07 1.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0   8655
VecAXPBYCZ            25 1.0 4.0112e-01  1.5 1.17e+08 1.0 0.0e+00 0.0e+00 0.0e+00  0  1  0  0  0   0  1  0  0  0  18698
VecMAXPY             272 1.0 1.5910e+01  1.2 4.98e+09 1.0 0.0e+00 0.0e+00 0.0e+00  3 26  0  0  0   3 26  0  0  0  20012
VecAssemblyBegin       5 1.0 1.1083e-01  6.6 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0      0
VecAssemblyEnd         5 1.0 1.2672e-03  5.8 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0      0
VecLoad                1 1.0 6.6392e-01  1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0      0
VecScatterBegin      303 1.0 1.7568e-01  2.0 0.00e+00 0.0 3.8e+04 8.0e+04 0.0e+00  0  0 74 96  0   0  0 74 96  0      0
VecScatterEnd        303 1.0 1.2503e+01 16.7 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  1  0  0  0  0   1  0  0  0  0      0
VecNormalize         272 1.0 2.6531e+01  1.5 1.28e+09 1.0 0.0e+00 0.0e+00 2.7e+02  6  7  0  0 24   6  7  0  0 25   3076
TSStep                25 1.0 4.1271e+02  1.0 1.94e+10 1.0 5.0e+04 6.2e+04 1.1e+03 92 100 100 100 97  92 100 100 100 97   3005
TSFunctionEval        50 1.0 6.2466e+00  1.6 7.81e+08 1.0 6.3e+03 8.0e+04 0.0e+00  1  4 12 16  0   1  4 12 16  0   7999
TSJacobianEval        75 1.0 6.8585e+01  1.0 0.00e+00 0.0 1.3e+04 1.0e+04 5.0e+02 15  0 25  4 45  15  0 25  4 45      0
SNESSolve             25 1.0 4.0951e+02  1.0 1.90e+10 1.0 4.7e+04 6.1e+04 1.1e+03 92 98 93 92 97  92 98 93 92 97   2962
SNESFunctionEval      25 1.0 3.2792e+00  1.5 5.08e+08 1.0 3.2e+03 8.0e+04 0.0e+00  1  3  6  8  0   1  3  6  8  0   9906
SNESJacobianEval      25 1.0 6.8586e+01  1.0 0.00e+00 0.0 1.3e+04 1.0e+04 5.0e+02 15  0 25  4 45  15  0 25  4 45      0
SFSetGraph             1 1.0 2.8300e-04 31.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0      0
SFSetUp                1 1.0 9.5369e-02 11.8 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0      0
SFBcastBegin           3 1.0 1.0011e-03  5.7 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0      0
SFBcastEnd             3 1.0 3.2377e-04  5.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0      0
MatMult              297 1.0 3.1536e+01  1.7 4.18e+09 1.0 3.7e+04 8.0e+04 0.0e+00  5 22 74 96  0   5 22 74 96  0   8470
MatSOR               272 1.0 7.9240e+01  2.1 4.27e+09 1.0 0.0e+00 0.0e+00 0.0e+00 10 22  0  0  0  10 22  0  0  0   3445
MatConvert             3 1.0 6.2609e-01  1.4 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0      0
MatScale               1 1.0 2.2956e-01  1.7 1.56e+07 1.0 1.3e+02 8.0e+04 0.0e+00  0  0  0  0  0   0  0  0  0  0   4353
MatAssemblyBegin     108 1.0 1.2253e+01 45.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  3  0  0  0  0   3  0  0  0  0      0
MatAssemblyEnd       108 1.0 1.0386e+01  1.0 0.00e+00 0.0 1.3e+04 1.0e+04 4.3e+02  2  0 26  4 38   2  0 26  4 39      0
MatCoarsen             1 1.0 6.7729e-01  1.1 0.00e+00 0.0 0.0e+00 0.0e+00 1.0e+00  0  0  0  0  0   0  0  0  0  0      0
MatZeroEntries        50 1.0 8.1896e-01  2.7 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0      0
MatView                4 1.0 2.7413e-02  1.2 0.00e+00 0.0 0.0e+00 0.0e+00 4.0e+00  0  0  0  0  0   0  0  0  0  0      0
MatAXPY               50 1.0 6.1071e+01  1.0 0.00e+00 0.0 1.3e+04 1.0e+04 5.0e+02 13  0 25  4 45  13  0 25  4 45      0
MatTrnMatMult          1 1.0 9.1609e-01  1.0 3.12e+06 1.0 0.0e+00 0.0e+00 1.6e+01  0  0  0  0  1   0  0  0  0  1    218
MatTrnMatMultSym       1 1.0 6.4303e-01  1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.6e+01  0  0  0  0  1   0  0  0  0  1      0
MatTrnMatMultNum       1 1.0 2.7484e-01  1.0 3.12e+06 1.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0    728
MatGetLocalMat         2 1.0 1.0381e-01  1.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0      0
KSPSetUp              50 1.0 8.1054e-02  2.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0      0
KSPSolve              25 1.0 3.3837e+02  1.0 1.83e+10 1.0 3.1e+04 8.0e+04 5.8e+02 76 94 62 80 52  76 94 62 80 52   3459
KSPGMRESOrthog       247 1.0 2.4871e+02  1.3 8.41e+09 1.0 0.0e+00 0.0e+00 2.5e+02 53 43  0  0 22  53 43  0  0 22   2163
PCGAMGGraph_AGG        1 1.0 1.2211e+00  1.0 1.56e+07 1.0 1.3e+02 8.0e+04 1.2e+01  0  0  0  0  1   0  0  0  0  1    818
PCGAMGCoarse_AGG       1 1.0 2.1430e+00  1.0 3.12e+06 1.0 0.0e+00 0.0e+00 2.1e+01  0  0  0  0  2   0  0  0  0  2     93
PCGAMGProl_AGG         1 1.0 5.1474e-02  1.4 0.00e+00 0.0 0.0e+00 0.0e+00 4.0e+00  0  0  0  0  0   0  0  0  0  0      0
GAMG: createProl       1 1.0 3.7042e+00  1.0 1.87e+07 1.0 1.3e+02 8.0e+04 3.7e+01  1  0  0  0  3   1  0  0  0  3    324
  Graph                2 1.0 1.2051e+00  1.0 1.56e+07 1.0 1.3e+02 8.0e+04 1.2e+01  0  0  0  0  1   0  0  0  0  1    829
  MIS/Agg              1 1.0 6.7757e-01  1.1 0.00e+00 0.0 0.0e+00 0.0e+00 1.0e+00  0  0  0  0  0   0  0  0  0  0      0
PCSetUp               25 1.0 4.6405e+00  1.1 1.87e+07 1.0 1.3e+02 8.0e+04 6.2e+01  1  0  0  0  6   1  0  0  0  6    258
PCApply              272 1.0 8.0476e+01  2.1 4.27e+09 1.0 0.0e+00 0.0e+00 0.0e+00 11 22  0  0  0  11 22  0  0  0   3392
------------------------------------------------------------------------------------------------------------------------

Memory usage is given in bytes:

Object Type          Creations   Destructions     Memory  Descendants' Mem.
Reports information only for process 0.

--- Event Stage 0: Main Stage

              Viewer     4              3         2504     0.
              Vector   138            138    167039720     0.
             TSAdapt     1              1         1240     0.
                  TS     1              1         2056     0.
                DMTS     1              1          744     0.
                SNES     1              1         1404     0.
              DMSNES     2              2         1344     0.
      SNESLineSearch     1              1          992     0.
    Distributed Mesh     2              2         9712     0.
           Index Set   107            107      1104336     0.
   Star Forest Graph     5              5         4064     0.
     Discrete System     2              2         1856     0.
              Matrix   174            174   3578947404     0.
      Matrix Coarsen     1              1          636     0.
         Vec Scatter    56             56        74368     0.
       Krylov Solver     2              2        36496     0.
     DMKSP interface     1              1          656     0.
      Preconditioner     2              2         2348     0.
         PetscRandom     1              1          662     0.
========================================================================================================================
Average time to get PetscTime(): 1.82152e-05
Average time for MPI_Barrier(): 0.000781012
Average time for zero size MPI_Send(): 7.12834e-05
#PETSc Option Table entries:
-log_view
-pc_type gamg
-prop_steps 25
-ts_monitor
-ts_type cn
#End of PETSc Option Table entries
Compiled without FORTRAN kernels
Compiled with full precision matrices (default)
sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 16 sizeof(PetscInt) 4
Configure options: --prefix=/blues/gpfs/home/sajid/packages/spack/opt/spack/linux-centos7-x86_64/intel-17.0.4/petsc-develop-ecs35zatgyriagebycvjn7u5kmho5abk --with-ssl=0 --download-c2html=0 --download-sowing=0 --download-hwloc=0 CFLAGS=-axCOMMON-AVX512,MIC-AVX512,CORE-AVX512,CORE-AVX2,AVX FFLAGS=-axCOMMON-AVX512,MIC-AVX512,CORE-AVX512,CORE-AVX2,AVX CXXFLAGS=-axCOMMON-AVX512,MIC-AVX512,CORE-AVX512,CORE-AVX2,AVX --with-cc=/soft/spack-0.10.1/opt/spack/linux-centos7-x86_64/intel-17.0.4/intel-mpi-2017.3-dfphq6kavje2olnichisvjjndtridrok/compilers_and_libraries_2017.4.196/linux/mpi/intel64/bin/mpiicc --with-cxx=/soft/spack-0.10.1/opt/spack/linux-centos7-x86_64/intel-17.0.4/intel-mpi-2017.3-dfphq6kavje2olnichisvjjndtridrok/compilers_and_libraries_2017.4.196/linux/mpi/intel64/bin/mpiicpc
--with-fc=/soft/spack-0.10.1/opt/spack/linux-centos7-x86_64/intel-17.0.4/intel-mpi-2017.3-dfphq6kavje2olnichisvjjndtridrok/compilers_and_libraries_2017.4.196/linux/mpi/intel64/bin/mpiifort --FC_LINKER_FLAGS=-lintlc --with-precision=double --with-scalar-type=complex --with-shared-libraries=1 --with-debugging=0 --with-64-bit-indices=0 COPTFLAGS= FOPTFLAGS= CXXOPTFLAGS= --with-blaslapack-lib="/blues/gpfs/home/software/spack-0.10.1/opt/spack/linux-centos7-x86_64/intel-17.0.4/intel-mkl-2017.3.196-v7uuj6zmthzln35n2hb7i5u5ybncv5ev/compilers_and_libraries_2017.4.196/linux/mkl/lib/intel64/libmkl_intel_lp64.so /blues/gpfs/home/software/spack-0.10.1/opt/spack/linux-centos7-x86_64/intel-17.0.4/intel-mkl-2017.3.196-v7uuj6zmthzln35n2hb7i5u5ybncv5ev/compilers_and_libraries_2017.4.196/linux/mkl/lib/intel64/libmkl_sequential.so /blues/gpfs/home/software/spack-0.10.1/opt/spack/linux-centos7-x86_64/intel-17.0.4/intel-mkl-2017.3.196-v7uuj6zmthzln35n2hb7i5u5ybncv5ev/compilers_and_libraries_2017.4.196/linux/mkl/lib/intel64/libmkl_core.so /lib64/libpthread.so /lib64/libm.so /lib64/libdl.so" --with-x=0 --with-clanguage=C --with-scalapack=0 --with-metis=1 --with-metis-dir=/blues/gpfs/home/sajid/packages/spack/opt/spack/linux-centos7-x86_64/intel-17.0.4/metis-5.1.0-onchwjrnnhpm2sbw4bhponru3yxfnr6d --with-hdf5=1 --with-hdf5-dir=/blues/gpfs/home/sajid/packages/spack/opt/spack/linux-centos7-x86_64/intel-17.0.4/hdf5-1.10.4-k6ccj3qc4olurjwtiy3w7iruq6wlqbql --with-hypre=0 --with-parmetis=1 --with-parmetis-dir=/blues/gpfs/home/sajid/packages/spack/opt/spack/linux-centos7-x86_64/intel-17.0.4/parmetis-4.0.3-zr4z75e6frp2hnuy2rdhn2mrjuci635k --with-mumps=0 --with-trilinos=0 --with-cxx-dialect=C++11 --with-superlu_dist-include=/blues/gpfs/home/sajid/packages/spack/opt/spack/linux-centos7-x86_64/intel-17.0.4/superlu-dist-develop-gkxn3ohp5uuhrnxcteqz6az35qfpaof2/include 
--with-superlu_dist-lib=/blues/gpfs/home/sajid/packages/spack/opt/spack/linux-centos7-x86_64/intel-17.0.4/superlu-dist-develop-gkxn3ohp5uuhrnxcteqz6az35qfpaof2/lib/libsuperlu_dist.a --with-superlu_dist=1 --with-suitesparse=0 --with-zlib-include=/blues/gpfs/home/sajid/packages/spack/opt/spack/linux-centos7-x86_64/intel-17.0.4/zlib-1.2.11-efom5ik43mamshxvhvuoo4sxtctqjhjq/include --with-zlib-lib="-L/blues/gpfs/home/sajid/packages/spack/opt/spack/linux-centos7-x86_64/intel-17.0.4/zlib-1.2.11-efom5ik43mamshxvhvuoo4sxtctqjhjq/lib -lz" --with-zlib=1
-----------------------------------------
Libraries compiled on 2019-02-21 18:36:46 on apsxrmd-0001
Machine characteristics: Linux-3.10.0-957.5.1.el7.x86_64-x86_64-with-centos-7.6.1810-Core
Using PETSc directory: /blues/gpfs/home/sajid/packages/spack/opt/spack/linux-centos7-x86_64/intel-17.0.4/petsc-develop-ecs35zatgyriagebycvjn7u5kmho5abk
Using PETSc arch:
-----------------------------------------
Using C compiler: /soft/spack-0.10.1/opt/spack/linux-centos7-x86_64/intel-17.0.4/intel-mpi-2017.3-dfphq6kavje2olnichisvjjndtridrok/compilers_and_libraries_2017.4.196/linux/mpi/intel64/bin/mpiicc -axCOMMON-AVX512,MIC-AVX512,CORE-AVX512,CORE-AVX2,AVX -fPIC
Using Fortran compiler: /soft/spack-0.10.1/opt/spack/linux-centos7-x86_64/intel-17.0.4/intel-mpi-2017.3-dfphq6kavje2olnichisvjjndtridrok/compilers_and_libraries_2017.4.196/linux/mpi/intel64/bin/mpiifort -axCOMMON-AVX512,MIC-AVX512,CORE-AVX512,CORE-AVX2,AVX -fPIC
-----------------------------------------
Using include paths: -I/blues/gpfs/home/sajid/packages/spack/opt/spack/linux-centos7-x86_64/intel-17.0.4/petsc-develop-ecs35zatgyriagebycvjn7u5kmho5abk/include -I/blues/gpfs/home/sajid/packages/spack/opt/spack/linux-centos7-x86_64/intel-17.0.4/superlu-dist-develop-gkxn3ohp5uuhrnxcteqz6az35qfpaof2/include -I/blues/gpfs/home/sajid/packages/spack/opt/spack/linux-centos7-x86_64/intel-17.0.4/hdf5-1.10.4-k6ccj3qc4olurjwtiy3w7iruq6wlqbql/include
-I/blues/gpfs/home/sajid/packages/spack/opt/spack/linux-centos7-x86_64/intel-17.0.4/parmetis-4.0.3-zr4z75e6frp2hnuy2rdhn2mrjuci635k/include -I/blues/gpfs/home/sajid/packages/spack/opt/spack/linux-centos7-x86_64/intel-17.0.4/metis-5.1.0-onchwjrnnhpm2sbw4bhponru3yxfnr6d/include -I/blues/gpfs/home/sajid/packages/spack/opt/spack/linux-centos7-x86_64/intel-17.0.4/zlib-1.2.11-efom5ik43mamshxvhvuoo4sxtctqjhjq/include
-----------------------------------------
Using C linker: /soft/spack-0.10.1/opt/spack/linux-centos7-x86_64/intel-17.0.4/intel-mpi-2017.3-dfphq6kavje2olnichisvjjndtridrok/compilers_and_libraries_2017.4.196/linux/mpi/intel64/bin/mpiicc
Using Fortran linker: /soft/spack-0.10.1/opt/spack/linux-centos7-x86_64/intel-17.0.4/intel-mpi-2017.3-dfphq6kavje2olnichisvjjndtridrok/compilers_and_libraries_2017.4.196/linux/mpi/intel64/bin/mpiifort
Using libraries: -Wl,-rpath,/blues/gpfs/home/sajid/packages/spack/opt/spack/linux-centos7-x86_64/intel-17.0.4/petsc-develop-ecs35zatgyriagebycvjn7u5kmho5abk/lib -L/blues/gpfs/home/sajid/packages/spack/opt/spack/linux-centos7-x86_64/intel-17.0.4/petsc-develop-ecs35zatgyriagebycvjn7u5kmho5abk/lib -lpetsc -Wl,-rpath,/blues/gpfs/home/sajid/packages/spack/opt/spack/linux-centos7-x86_64/intel-17.0.4/superlu-dist-develop-gkxn3ohp5uuhrnxcteqz6az35qfpaof2/lib -L/blues/gpfs/home/sajid/packages/spack/opt/spack/linux-centos7-x86_64/intel-17.0.4/superlu-dist-develop-gkxn3ohp5uuhrnxcteqz6az35qfpaof2/lib -Wl,-rpath,/blues/gpfs/home/software/spack-0.10.1/opt/spack/linux-centos7-x86_64/intel-17.0.4/intel-mkl-2017.3.196-v7uuj6zmthzln35n2hb7i5u5ybncv5ev/compilers_and_libraries_2017.4.196/linux/mkl/lib/intel64 -L/blues/gpfs/home/software/spack-0.10.1/opt/spack/linux-centos7-x86_64/intel-17.0.4/intel-mkl-2017.3.196-v7uuj6zmthzln35n2hb7i5u5ybncv5ev/compilers_and_libraries_2017.4.196/linux/mkl/lib/intel64 /lib64/libpthread.so /lib64/libm.so /lib64/libdl.so
-Wl,-rpath,/blues/gpfs/home/sajid/packages/spack/opt/spack/linux-centos7-x86_64/intel-17.0.4/hdf5-1.10.4-k6ccj3qc4olurjwtiy3w7iruq6wlqbql/lib -L/blues/gpfs/home/sajid/packages/spack/opt/spack/linux-centos7-x86_64/intel-17.0.4/hdf5-1.10.4-k6ccj3qc4olurjwtiy3w7iruq6wlqbql/lib -Wl,-rpath,/blues/gpfs/home/sajid/packages/spack/opt/spack/linux-centos7-x86_64/intel-17.0.4/parmetis-4.0.3-zr4z75e6frp2hnuy2rdhn2mrjuci635k/lib -L/blues/gpfs/home/sajid/packages/spack/opt/spack/linux-centos7-x86_64/intel-17.0.4/parmetis-4.0.3-zr4z75e6frp2hnuy2rdhn2mrjuci635k/lib -Wl,-rpath,/blues/gpfs/home/sajid/packages/spack/opt/spack/linux-centos7-x86_64/intel-17.0.4/metis-5.1.0-onchwjrnnhpm2sbw4bhponru3yxfnr6d/lib -L/blues/gpfs/home/sajid/packages/spack/opt/spack/linux-centos7-x86_64/intel-17.0.4/metis-5.1.0-onchwjrnnhpm2sbw4bhponru3yxfnr6d/lib -L/blues/gpfs/home/sajid/packages/spack/opt/spack/linux-centos7-x86_64/intel-17.0.4/zlib-1.2.11-efom5ik43mamshxvhvuoo4sxtctqjhjq/lib -Wl,-rpath,/soft/spack-0.10.1/opt/spack/linux-centos7-x86_64/intel-17.0.4/intel-mpi-2017.3-dfphq6kavje2olnichisvjjndtridrok/compilers_and_libraries_2017.4.196/linux/mpi/intel64/lib/release_mt -L/soft/spack-0.10.1/opt/spack/linux-centos7-x86_64/intel-17.0.4/intel-mpi-2017.3-dfphq6kavje2olnichisvjjndtridrok/compilers_and_libraries_2017.4.196/linux/mpi/intel64/lib/release_mt -Wl,-rpath,/soft/spack-0.10.1/opt/spack/linux-centos7-x86_64/intel-17.0.4/intel-mpi-2017.3-dfphq6kavje2olnichisvjjndtridrok/compilers_and_libraries_2017.4.196/linux/mpi/intel64/lib -L/soft/spack-0.10.1/opt/spack/linux-centos7-x86_64/intel-17.0.4/intel-mpi-2017.3-dfphq6kavje2olnichisvjjndtridrok/compilers_and_libraries_2017.4.196/linux/mpi/intel64/lib -Wl,-rpath,/blues/gpfs/home/software/spack-0.10.1/opt/spack/linux-centos7-x86_64/gcc-4.8.5/intel-17.0.4-74uvhjiulyqgvsmywifbbuo46v5n42xc/compilers_and_libraries_2017.4.193/linux/compiler/lib/intel64_lin 
-L/blues/gpfs/home/software/spack-0.10.1/opt/spack/linux-centos7-x86_64/gcc-4.8.5/intel-17.0.4-74uvhjiulyqgvsmywifbbuo46v5n42xc/compilers_and_libraries_2017.4.193/linux/compiler/lib/intel64_lin -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/4.8.5 -L/usr/lib/gcc/x86_64-redhat-linux/4.8.5 -Wl,-rpath,/opt/intel/mpi-rt/2017.0.0/intel64/lib/release_mt -Wl,-rpath,/opt/intel/mpi-rt/2017.0.0/intel64/lib -lsuperlu_dist -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lhdf5hl_fortran -lhdf5_fortran -lhdf5_hl -lhdf5 -lparmetis -lmetis -lz -lintlc -lstdc++ -ldl -lmpifort -lmpi -lmpigi -lrt -lpthread -lifport -lifcoremt_pic -limf -lsvml -lm -lipgo -lirc -lgcc_s -lirc_s -lstdc++ -ldl
-----------------------------------------
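The log states two conventions worth spot-checking: complex VecAXPY counts 8N flop per call (the run uses complex scalars, sizeof(PetscScalar)=16), and the rate column is Mflop/s = 1e-6 * (sum of flop over all ranks) / (max time over all ranks). A minimal sketch, using only numbers copied from the log above (the per-rank vector length is an assumption derived from the 25,000,000 rows split evenly over 64 ranks):

```python
# Spot-check two conventions stated in the log; nothing here is measured.

# 1) Complex VecAXPY counts 8N flop per call.
#    75 VecAXPY calls, per-rank max flop 2.34e+08 (from the event table).
n_local = 25_000_000 // 64           # assumed even split of rows over 64 ranks
flop_per_call = 8 * n_local          # 8N flop for complex vectors of length N
rel_err = abs(75 * flop_per_call - 2.34e8) / 2.34e8
assert rel_err < 0.01                # matches the logged 2.34e+08 within 1%

# 2) Rate column: Mflop/s = 1e-6 * (total flop over ranks) / (max time).
#    VecMDot: max time 2.3725e+02 s, 4.20e+09 flop per rank, 64 ranks,
#    reported rate 1134 Mflop/s.
mflops = 1e-6 * (4.20e9 * 64) / 2.3725e2
assert abs(mflops - 1134) < 10       # agrees up to rounding of the logged inputs
```

Both assertions pass on the rounded figures from the table, which is a quick way to confirm an event row has been read correctly.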