[petsc-users] Error reported by MUMPS in numerical factorization phase

Danyang Su danyang.su at gmail.com
Tue Dec 1 17:02:08 CST 2015


Hi All,

My code fails due to an error in an external library. It runs fine for
the first 2000+ timesteps but then crashes.

[4]PETSC ERROR: Error in external library
[4]PETSC ERROR: Error reported by MUMPS in numerical factorization 
phase: INFO(1)=-1, INFO(2)=0

The full error message is attached.
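
For context, MUMPS uses INFO(1)=-1 to flag that an error occurred on
another process (INFO(2) names that process); in the attached log the
underlying failure appears to be INFO(1)=-9 on rank 0, which means the
MUMPS internal workspace was too small (INFO(2) reports the shortfall).
A common workaround is raising ICNTL(14), the percentage by which MUMPS
pads its estimated workspace. Below is a minimal sketch of how this can
be passed through PETSc's runtime options; the -pc_type lu and
solver-package flags are assumptions about how min3p_thcm selects its
solver:

   # Hedged sketch: raise the MUMPS workspace margin from the default
   # 20% to 50%. ICNTL(14) is the percentage increase applied to the
   # working-space estimate from the analysis phase.
   mpiexec -n 8 ../min3p_thcm \
       -pc_type lu \
       -pc_factor_mat_solver_package mumps \
       -mat_mumps_icntl_14 50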

I then tried the same simulation on another machine with the same
number of processors, and it does not fail there.

Is there anything wrong with the configuration, or could something be
wrong in my code?
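
In case it helps, MUMPS's own diagnostics can be turned up at run time
to see what it reports before the failure; a sketch, with the same
illustrative command line as above (both options are standard
PETSc/MUMPS controls):

   # ICNTL(4) sets MUMPS's message verbosity (0 = none ... 4 = full);
   # -ksp_converged_reason reports how each linear solve ended.
   mpiexec -n 8 ../min3p_thcm -mat_mumps_icntl_4 2 -ksp_converged_reason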

Thanks,

Danyang
-------------- next part --------------
[4]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[4]PETSC ERROR: Error in external library
[4]PETSC ERROR: Error reported by MUMPS in numerical factorization phase: INFO(1)=-1, INFO(2)=0

[4]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[4]PETSC ERROR: Petsc Release Version 3.6.2, Oct, 02, 2015 
[4]PETSC ERROR: ../min3p_thcm on a arch-linux2-c-debug named pod26b15 by danyangs Tue Dec  1 11:29:00 2015
[4]PETSC ERROR: Configure options --prefix=/global/software/lib64/intel/petsc-3.6.2 --with-64-bit-pointers=0 --with-pthread=0 --with-pthreadclasses=0 --with-cc=/global/software/openmpi-1.6.5/intel/bin/mpicc --CFLAGS="-O3 -axSSE4.2,SSE4.1 -xSSSE3  -I/global/software/intel/composerxe/mkl/include -I/global/software/lib64/intel/ncsa-tools/hdf5-1.8.15p1/include -I/global/software/lib64/intel/petsc-3.6.2/include" --with-cxx=/global/software/openmpi-1.6.5/intel/bin/mpicxx --CXXFLAGS="-O3 -axSSE4.2,SSE4.1 -xSSSE3  -I/global/software/intel/composerxe/mkl/include -I/global/software/lib64/intel/ncsa-tools/hdf5-1.8.15p1/include -I/global/software/lib64/intel/petsc-3.6.2/include" --with-fc=/global/software/openmpi-1.6.5/intel/bin/mpif90 --FFLAGS="-O3 -axSSE4.2,SSE4.1 -xSSSE3  -I/global/software/intel/composerxe/mkl/include -I/global/software/lib64/intel/ncsa-tools/hdf5-1.8.15p1/include -I/global/software/lib64/intel/petsc-3.6.2/include" --with-cxx-dialect=C++11 --with-single-library=1 --with-shared-libraries --with-shared-ld=/global/software/openmpi-1.6.5/intel-2011/bin/mpicc --sharedLibraryFlags="-fpic -fPIC " --with-large-file-io=1 --with-mpi=1 --with-mpi-shared=1 --with-mpirun=/global/software/openmpi-1.6.5/intel/bin/mpiexec --with-mpi-compilers=1 --with-x=yes --with-blas-lapack-dir=/global/software/intel/composerxe/mkl/lib/intel64 --with-ptscotch=0 --with-x=1 --with-hdf5=1 --with-hdf5-dir=/global/software/lib64/intel/ncsa-tools/hdf5-1.8.15p1 --with-netcdf=0 --with-fftw=1 --with-fftw-dir=/global/software/lib64/intel/fftw-3.3.3 --download-blacs=yes --download-scalapack=yes --download-superlu_dist=yes --download-mumps=yes --download-metis=yes --download-parmetis=yes --download-spooles=yes --download-cproto=yes --download-suitesparse=yes --download-hypre=yes --download-amd=yes --download-adifor=yes --download-euclid=yes --download-spai=yes --download-sprng=yes --download-ml=yes --download-boost=yes --download-triangle=yes --download-generator=yes --with-boost=1 --with-petsc4py=0 --with-numpy=1 exit 0
[4]PETSC ERROR: #1 MatFactorNumeric_MUMPS() line 1172 in /tmp/petsc-3.6.2/src/mat/impls/aij/mpi/mumps/mumps.c
[5]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[5]PETSC ERROR: Error in external library
[5]PETSC ERROR: Error reported by MUMPS in numerical factorization phase: INFO(1)=-1, INFO(2)=0

[5]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[5]PETSC ERROR: Petsc Release Version 3.6.2, Oct, 02, 2015 
[5]PETSC ERROR: ../min3p_thcm on a arch-linux2-c-debug named pod26b15 by danyangs Tue Dec  1 11:29:00 2015
[5]PETSC ERROR: #1 MatFactorNumeric_MUMPS() line 1172 in /tmp/petsc-3.6.2/src/mat/impls/aij/mpi/mumps/mumps.c
[7]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[7]PETSC ERROR: Error in external library
[7]PETSC ERROR: Error reported by MUMPS in numerical factorization phase: INFO(1)=-1, INFO(2)=0

[7]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[7]PETSC ERROR: Petsc Release Version 3.6.2, Oct, 02, 2015 
[7]PETSC ERROR: ../min3p_thcm on a arch-linux2-c-debug named pod26b15 by danyangs Tue Dec  1 11:29:00 2015
[7]PETSC ERROR: #1 MatFactorNumeric_MUMPS() line 1172 in /tmp/petsc-3.6.2/src/mat/impls/aij/mpi/mumps/mumps.c
[7]PETSC ERROR: #2 MatLUFactorNumeric() line 2946 in /tmp/petsc-3.6.2/src/mat/interface/matrix.c
[7]PETSC ERROR: #3 PCSetUp_LU() line 152 in /tmp/petsc-3.6.2/src/ksp/pc/impls/factor/lu/lu.c
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Error in external library
[0]PETSC ERROR: Error reported by MUMPS in numerical factorization phase: INFO(1)=-9, INFO(2)=113254

[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.6.2, Oct, 02, 2015 
[0]PETSC ERROR: ../min3p_thcm on a arch-linux2-c-debug named pod26b15 by danyangs Tue Dec  1 11:29:00 2015
[0]PETSC ERROR: #1 MatFactorNumeric_MUMPS() line 1172 in /tmp/petsc-3.6.2/src/mat/impls/aij/mpi/mumps/mumps.c
[0]PETSC ERROR: #2 MatLUFactorNumeric() line 2946 in /tmp/petsc-3.6.2/src/mat/interface/matrix.c
[0]PETSC ERROR: #3 PCSetUp_LU() line 152 in /tmp/petsc-3.6.2/src/ksp/pc/impls/factor/lu/lu.c
[0]PETSC ERROR: #4 PCSetUp() line 983 in /tmp/petsc-3.6.2/src/ksp/pc/interface/precon.c
[1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[1]PETSC ERROR: Error in external library
[1]PETSC ERROR: Error reported by MUMPS in numerical factorization phase: INFO(1)=-1, INFO(2)=0

[1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[1]PETSC ERROR: Petsc Release Version 3.6.2, Oct, 02, 2015 
[1]PETSC ERROR: ../min3p_thcm on a arch-linux2-c-debug named pod26b15 by danyangs Tue Dec  1 11:29:00 2015
[1]PETSC ERROR: #1 MatFactorNumeric_MUMPS() line 1172 in /tmp/petsc-3.6.2/src/mat/impls/aij/mpi/mumps/mumps.c
[1]PETSC ERROR: #2 MatLUFactorNumeric() line 2946 in /tmp/petsc-3.6.2/src/mat/interface/matrix.c
[1]PETSC ERROR: #3 PCSetUp_LU() line 152 in /tmp/petsc-3.6.2/src/ksp/pc/impls/factor/lu/lu.c
[1]PETSC ERROR: #4 PCSetUp() line 983 in /tmp/petsc-3.6.2/src/ksp/pc/interface/precon.c
[2]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[2]PETSC ERROR: Error in external library
[2]PETSC ERROR: Error reported by MUMPS in numerical factorization phase: INFO(1)=-1, INFO(2)=0

[2]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[2]PETSC ERROR: Petsc Release Version 3.6.2, Oct, 02, 2015 
[2]PETSC ERROR: ../min3p_thcm on a arch-linux2-c-debug named pod26b15 by danyangs Tue Dec  1 11:29:00 2015
[2]PETSC ERROR: #1 MatFactorNumeric_MUMPS() line 1172 in /tmp/petsc-3.6.2/src/mat/impls/aij/mpi/mumps/mumps.c
[2]PETSC ERROR: #2 MatLUFactorNumeric() line 2946 in /tmp/petsc-3.6.2/src/mat/interface/matrix.c
[2]PETSC ERROR: #3 PCSetUp_LU() line 152 in /tmp/petsc-3.6.2/src/ksp/pc/impls/factor/lu/lu.c
[2]PETSC ERROR: #4 PCSetUp() line 983 in /tmp/petsc-3.6.2/src/ksp/pc/interface/precon.c
[2]PETSC ERROR: #5 KSPSetUp() line 332 in /tmp/petsc-3.6.2/src/ksp/ksp/interface/itfunc.c
[2]PETSC ERROR: #6 KSPSolve() line 546 in /tmp/petsc-3.6.2/src/ksp/ksp/interface/itfunc.c
[3]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[3]PETSC ERROR: Error in external library
[3]PETSC ERROR: Error reported by MUMPS in numerical factorization phase: INFO(1)=-1, INFO(2)=0

[3]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[3]PETSC ERROR: Petsc Release Version 3.6.2, Oct, 02, 2015 
[3]PETSC ERROR: ../min3p_thcm on a arch-linux2-c-debug named pod26b15 by danyangs Tue Dec  1 11:29:00 2015
[3]PETSC ERROR: #1 MatFactorNumeric_MUMPS() line 1172 in /tmp/petsc-3.6.2/src/mat/impls/aij/mpi/mumps/mumps.c
[3]PETSC ERROR: #2 MatLUFactorNumeric() line 2946 in /tmp/petsc-3.6.2/src/mat/interface/matrix.c
[3]PETSC ERROR: #3 PCSetUp_LU() line 152 in /tmp/petsc-3.6.2/src/ksp/pc/impls/factor/lu/lu.c
[3]PETSC ERROR: #4 PCSetUp() line 983 in /tmp/petsc-3.6.2/src/ksp/pc/interface/precon.c
[3]PETSC ERROR: #5 KSPSetUp() line 332 in /tmp/petsc-3.6.2/src/ksp/ksp/interface/itfunc.c
[3]PETSC ERROR: #6 KSPSolve() line 546 in /tmp/petsc-3.6.2/src/ksp/ksp/interface/itfunc.c
[4]PETSC ERROR: #2 MatLUFactorNumeric() line 2946 in /tmp/petsc-3.6.2/src/mat/interface/matrix.c
[4]PETSC ERROR: #3 PCSetUp_LU() line 152 in /tmp/petsc-3.6.2/src/ksp/pc/impls/factor/lu/lu.c
[4]PETSC ERROR: #4 PCSetUp() line 983 in /tmp/petsc-3.6.2/src/ksp/pc/interface/precon.c
[4]PETSC ERROR: #5 KSPSetUp() line 332 in /tmp/petsc-3.6.2/src/ksp/ksp/interface/itfunc.c
[4]PETSC ERROR: #6 KSPSolve() line 546 in /tmp/petsc-3.6.2/src/ksp/ksp/interface/itfunc.c
[5]PETSC ERROR: #2 MatLUFactorNumeric() line 2946 in /tmp/petsc-3.6.2/src/mat/interface/matrix.c
[5]PETSC ERROR: #3 PCSetUp_LU() line 152 in /tmp/petsc-3.6.2/src/ksp/pc/impls/factor/lu/lu.c
[5]PETSC ERROR: #4 PCSetUp() line 983 in /tmp/petsc-3.6.2/src/ksp/pc/interface/precon.c
[5]PETSC ERROR: #5 KSPSetUp() line 332 in /tmp/petsc-3.6.2/src/ksp/ksp/interface/itfunc.c
[5]PETSC ERROR: #6 KSPSolve() line 546 in /tmp/petsc-3.6.2/src/ksp/ksp/interface/itfunc.c
[7]PETSC ERROR: #4 PCSetUp() line 983 in /tmp/petsc-3.6.2/src/ksp/pc/interface/precon.c
[7]PETSC ERROR: #5 KSPSetUp() line 332 in /tmp/petsc-3.6.2/src/ksp/ksp/interface/itfunc.c
[7]PETSC ERROR: #6 KSPSolve() line 546 in /tmp/petsc-3.6.2/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: #5 KSPSetUp() line 332 in /tmp/petsc-3.6.2/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: #6 KSPSolve() line 546 in /tmp/petsc-3.6.2/src/ksp/ksp/interface/itfunc.c
[1]PETSC ERROR: #5 KSPSetUp() line 332 in /tmp/petsc-3.6.2/src/ksp/ksp/interface/itfunc.c
[1]PETSC ERROR: #6 KSPSolve() line 546 in /tmp/petsc-3.6.2/src/ksp/ksp/interface/itfunc.c
[6]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[6]PETSC ERROR: Error in external library
[6]PETSC ERROR: Error reported by MUMPS in numerical factorization phase: INFO(1)=-1, INFO(2)=0

[6]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[6]PETSC ERROR: Petsc Release Version 3.6.2, Oct, 02, 2015 
[6]PETSC ERROR: ../min3p_thcm on a arch-linux2-c-debug named pod26b15 by danyangs Tue Dec  1 11:29:00 2015
[6]PETSC ERROR: #1 MatFactorNumeric_MUMPS() line 1172 in /tmp/petsc-3.6.2/src/mat/impls/aij/mpi/mumps/mumps.c
[6]PETSC ERROR: #2 MatLUFactorNumeric() line 2946 in /tmp/petsc-3.6.2/src/mat/interface/matrix.c
[6]PETSC ERROR: #3 PCSetUp_LU() line 152 in /tmp/petsc-3.6.2/src/ksp/pc/impls/factor/lu/lu.c
[6]PETSC ERROR: #4 PCSetUp() line 983 in /tmp/petsc-3.6.2/src/ksp/pc/interface/precon.c
[6]PETSC ERROR: #5 KSPSetUp() line 332 in /tmp/petsc-3.6.2/src/ksp/ksp/interface/itfunc.c
[6]PETSC ERROR: #6 KSPSolve() line 546 in /tmp/petsc-3.6.2/src/ksp/ksp/interface/itfunc.c
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 7 in communicator MPI_COMM_WORLD 
with errorcode 76.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpiexec has exited due to process rank 6 with PID 17428 on
node pod26b15 exiting improperly. There are two reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

This may have caused other processes in the application to be
terminated by signals sent by mpiexec (as reported here).
--------------------------------------------------------------------------
[pod26b15:17421] 7 more processes have sent help message help-mpi-api.txt / mpi-abort
[pod26b15:17421] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

