[petsc-dev] elemental fail in master
hong at aspiritech.org
Mon Mar 2 17:36:02 CST 2015
I cannot reproduce it, even with the same configure options on a PETSc machine:
Configure Options: --configModules=PETSc.Configure
--optionsModule=config.compilerOptions --download-metis
--download-openmpich --with-debugging=0 --with-cc=mpicc.openmpi
--with-cxx=mpicxx.openmpi --with-fc=mpif90.openmpi
--with-mpiexec=mpiexec.openmpi --download-fblaslapack=1 --download-mumps
--download-parmetis --download-scalapack --download-elemental
--with-cxx-dialect=C++11 --download-cmake PETSC_ARCH=arch-openmpi-o
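For reference, the failing cases presumably correspond to runs along these
lines (the -n counts are only guessed from the [rank] prefixes and the
"rank 5" in the log below; the ex104_elemental_2 / ex145_2 targets may use
different counts and options):

    cd $PETSC_DIR/src/mat/examples/tests
    make ex104 ex145
    mpiexec.openmpi -n 3 ./ex104 -mat_type elemental
    mpiexec.openmpi -n 6 ./ex145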
Any suggestions?
Hong
On Mon, Mar 2, 2015 at 2:34 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>
> http://ftp.mcs.anl.gov/pub/petsc/nightlylogs/archive/2015/03/01/examples_master_arch-linux-pkgs-opt_crank.log
>
> ******* Testing: testexamples_ELEMENTAL *******
> 0a1,55
> > [0]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> > [0]PETSC ERROR: Object is in wrong state
> > [0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> > [1]PETSC ERROR: Object is in wrong state
> > [1]PETSC ERROR: C != D
> > [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> > [1]PETSC ERROR: Petsc Development GIT revision: v3.5.3-2131-g5620d6d
> GIT Date: 2015-02-28 21:26:35 -0600
> > [1]PETSC ERROR: ./ex104 on a arch-linux-pkgs-opt named crank by petsc
> Sun Mar 1 14:11:43 2015
> > [1]PETSC ERROR: Configure options --with-debugging=0
> --with-cc=mpicc.openmpi --with-cxx=mpicxx.openmpi --with-fc=mpif90.openmpi
> --with-mpiexec=mpiexec.openmpi --download-fblaslapack=1 --download-hypre=1
> --download-cmake=1 --download-metis=1 --download-parmetis=1
> --download-ptscotch=1 --download-suitesparse=1 --download-triangle=1
> --download-superlu=1 --download-superlu_dist=1 --download-scalapack=1
> --download-mumps=1 --download-elemental=1 --with-cxx-dialect=C++11
> --download-spai=1 --download-parms=1 --download-moab=1 --download-chaco=1
> --download-saws --with-no-output -PETSC_ARCH=arch-linux-pkgs-opt
> -PETSC_DIR=/sandbox/petsc/petsc.clone
> > [1]PETSC ERROR: #1 main() line 67 in
> /sandbox/petsc/petsc.clone/src/mat/examples/tests/ex104.c
> > [2]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> > [2]PETSC ERROR: Object is in wrong state
> > [2]PETSC ERROR: C != D
> > [2]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> > [2]PETSC ERROR: Petsc Development GIT revision: v3.5.3-2131-g5620d6d
> GIT Date: 2015-02-28 21:26:35 -0600
> > [2]PETSC ERROR: ./ex104 on a arch-linux-pkgs-opt named crank by petsc
> Sun Mar 1 14:11:43 2015
> > [2]PETSC ERROR: Configure options --with-debugging=0
> --with-cc=mpicc.openmpi --with-cxx=mpicxx.openmpi --with-fc=mpif90.openmpi
> --with-mpiexec=mpiexec.openmpi --download-fblaslapack=1 --download-hypre=1
> --download-cmake=1 --download-metis=1 --download-parmetis=1
> --download-ptscotch=1 --download-suitesparse=1 --download-triangle=1
> --download-superlu=1 --download-superlu_dist=1 --download-scalapack=1
> --download-mumps=1 --download-elemental=1 --with-cxx-dialect=C++11
> --download-spai=1 --download-parms=1 --download-moab=1 --download-chaco=1
> --download-saws --with-no-output -PETSC_ARCH=arch-linux-pkgs-opt
> -PETSC_DIR=/sandbox/petsc/petsc.clone
> > [2]PETSC ERROR: #1 main() line 67 in
> /sandbox/petsc/petsc.clone/src/mat/examples/tests/ex104.c
> > [2]PETSC ERROR: PETSc Option Table entries:
> > [2]PETSC ERROR: -display 140.221.10.20:0.0
> > [2]PETSC ERROR: -malloc_dump
> > C != D
> > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> > [0]PETSC ERROR: Petsc Development GIT revision: v3.5.3-2131-g5620d6d
> GIT Date: 2015-02-28 21:26:35 -0600
> > [0]PETSC ERROR: ./ex104 on a arch-linux-pkgs-opt named crank by petsc
> Sun Mar 1 14:11:43 2015
> > [0]PETSC ERROR: Configure options --with-debugging=0
> --with-cc=mpicc.openmpi --with-cxx=mpicxx.openmpi --with-fc=mpif90.openmpi
> --with-mpiexec=mpiexec.openmpi --download-fblaslapack=1 --download-hypre=1
> --download-cmake=1 --download-metis=1 --download-parmetis=1
> --download-ptscotch=1 --download-suitesparse=1 --download-triangle=1
> --download-superlu=1 --download-superlu_dist=1 --download-scalapack=1
> --download-mumps=1 --download-elemental=1 --with-cxx-dialect=C++11
> --download-spai=1 --download-parms=1 --download-moab=1 --download-chaco=1
> --download-saws --with-no-output -PETSC_ARCH=arch-linux-pkgs-opt
> -PETSC_DIR=/sandbox/petsc/petsc.clone
> > [0]PETSC ERROR: #1 main() line 67 in
> /sandbox/petsc/petsc.clone/src/mat/examples/tests/ex104.c
> > [0]PETSC ERROR: PETSc Option Table entries:
> > [0]PETSC ERROR: -display 140.221.10.20:0.0
> > [0]PETSC ERROR: -malloc_dump
> > [0]PETSC ERROR: -mat_type elemental
> > [0]PETSC ERROR: ----------------End of Error Message -------send entire
> error message to petsc-maint at mcs.anl.gov----------
> > [1]PETSC ERROR: PETSc Option Table entries:
> > [1]PETSC ERROR: -display 140.221.10.20:0.0
> > [1]PETSC ERROR: -malloc_dump
> > [1]PETSC ERROR: -mat_type elemental
> > [1]PETSC ERROR: ----------------End of Error Message -------send entire
> error message to petsc-maint at mcs.anl.gov----------
> > [2]PETSC ERROR: -mat_type elemental
> > [2]PETSC ERROR: ----------------End of Error Message -------send entire
> error message to petsc-maint at mcs.anl.gov----------
> >
> --------------------------------------------------------------------------
> > MPI_ABORT was invoked on rank 2 in communicator MPI_COMM_WORLD
> > with errorcode 73.
> >
> > NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> > You may or may not see output from other processes, depending on
> > exactly when Open MPI kills them.
> >
> --------------------------------------------------------------------------
> >
> --------------------------------------------------------------------------
> > mpiexec.openmpi has exited due to process rank 2 with PID 21714 on
> > node crank exiting without calling "finalize". This may
> > have caused other processes in the application to be
> > terminated by signals sent by mpiexec.openmpi (as reported here).
> >
> --------------------------------------------------------------------------
> > [crank:21711] 2 more processes have sent help message help-mpi-api.txt /
> mpi-abort
> > [crank:21711] Set MCA parameter "orte_base_help_aggregate" to 0 to see
> all help / error messages
> /sandbox/petsc/petsc.clone/src/mat/examples/tests
> Possible problem with ex104_elemental_2, diffs above
> =========================================
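The "Object is in wrong state" / "C != D" failure above looks consistent with
a MatEqual() check around line 67 of ex104.c; note that the MPI_ABORT
errorcode 73 is PETSC_ERR_ARG_WRONGSTATE. A minimal sketch of that kind of
check, not the actual ex104.c source (A and B stand for whatever operands the
test really compares):

    #include <petscmat.h>

    /* Form the product A*B twice and require the results to match; in the
       real test C and D come from different code paths. */
    static PetscErrorCode CheckProductsEqual(Mat A, Mat B)
    {
      Mat            C, D;
      PetscBool      equal;
      PetscErrorCode ierr;

      PetscFunctionBeginUser;
      ierr = MatMatMult(A, B, MAT_INITIAL_MATRIX, PETSC_DEFAULT, &C);CHKERRQ(ierr);
      ierr = MatMatMult(A, B, MAT_INITIAL_MATRIX, PETSC_DEFAULT, &D);CHKERRQ(ierr);
      ierr = MatEqual(C, D, &equal);CHKERRQ(ierr);
      if (!equal) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_WRONGSTATE, "C != D");
      ierr = MatDestroy(&C);CHKERRQ(ierr);
      ierr = MatDestroy(&D);CHKERRQ(ierr);
      PetscFunctionReturn(0);
    }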
> 6c6,30
> < Test LU Solver
> ---
> > terminate called after throwing an instance of 'std::logic_error'
> > what(): A was not numerically HPD
> >
> > [crank:21810] *** Process received signal ***
> > [crank:21810] Signal: Aborted (6)
> > [crank:21810] Signal code: (-6)
> > [crank:21810] [ 0] /lib/x86_64-linux-gnu/libpthread.so.0(+0xfcb0)
> [0x2b997a66bcb0]
> > [crank:21810] [ 1] /lib/x86_64-linux-gnu/libc.so.6(gsignal+0x35)
> [0x2b997a8af0d5]
> > [crank:21810] [ 2] /lib/x86_64-linux-gnu/libc.so.6(abort+0x17b)
> [0x2b997a8b283b]
> > [crank:21810] [ 3]
> /nfs/software/linux-ubuntu_precise_amd64/apps/packages/gcc-4.9.0/lib64/libstdc++.so.6(_ZN9__gnu_cxx27__verbose_terminate_handlerEv+0x175)
> [0x2b9976a592d5]
> > [crank:21810] [ 4]
> /nfs/software/linux-ubuntu_precise_amd64/apps/packages/gcc-4.9.0/lib64/libstdc++.so.6(+0x5e336)
> [0x2b9976a57336]
> > [crank:21810] [ 5]
> /nfs/software/linux-ubuntu_precise_amd64/apps/packages/gcc-4.9.0/lib64/libstdc++.so.6(+0x5e381)
> [0x2b9976a57381]
> > [crank:21810] [ 6]
> /nfs/software/linux-ubuntu_precise_amd64/apps/packages/gcc-4.9.0/lib64/libstdc++.so.6(+0x5e598)
> [0x2b9976a57598]
> > [crank:21810] [ 7]
> /sandbox/petsc/petsc.clone/arch-linux-pkgs-opt/lib/libEl.so(_ZN2El10LogicErrorIJPKcEEEvDpT_+0xaa)
> [0x2b9974d9dffa]
> > [crank:21810] [ 8]
> /sandbox/petsc/petsc.clone/arch-linux-pkgs-opt/lib/libEl.so(_ZN2El8CholeskyIdEEvNS_14UpperOrLowerNS12UpperOrLowerERNS_6MatrixIT_EE+0x281)
> [0x2b997557b391]
> > [crank:21810] [ 9]
> /sandbox/petsc/petsc.clone/arch-linux-pkgs-opt/lib/libEl.so(_ZN2El8cholesky5UVar3IdEEvRNS_18AbstractDistMatrixIT_EE+0x20c)
> [0x2b997558460c]
> > [crank:21810] [10]
> /sandbox/petsc/petsc.clone/arch-linux-pkgs-opt/lib/libpetsc.so.3.05(+0x56fd37)
> [0x2b9973546d37]
> > [crank:21810] [11]
> /sandbox/petsc/petsc.clone/arch-linux-pkgs-opt/lib/libpetsc.so.3.05(MatCholeskyFactor+0x230)
> [0x2b99732a79aa]
> > [crank:21810] [12] ./ex145(main+0x24b7) [0x4044ff]
> > [crank:21810] [13]
> /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xed) [0x2b997a89a76d]
> > [crank:21810] [14] ./ex145() [0x401f59]
> > [crank:21810] *** End of error message ***
> >
> --------------------------------------------------------------------------
> > mpiexec.openmpi noticed that process rank 5 with PID 21810 on node crank
> exited on signal 6 (Aborted).
> >
> --------------------------------------------------------------------------
> /sandbox/petsc/petsc.clone/src/mat/examples/tests
> Possible problem with ex145_2, diffs above
> =========================================
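The ex145 abort comes from Elemental's Cholesky (frames [7]-[11] in the
backtrace): MatCholeskyFactor() on a MATELEMENTAL matrix ends up in
El::Cholesky, which throws std::logic_error("A was not numerically HPD")
when the matrix it is handed is not numerically Hermitian positive definite,
and with no handler that terminates the run as above. A minimal sketch of
that call path (hypothetical helper, not the ex145 source; A is assumed to
be a MATELEMENTAL matrix that should be SPD):

    #include <petscmat.h>

    /* In-place Cholesky; for MATELEMENTAL this dispatches to El::Cholesky,
       which refuses matrices that are not numerically HPD. */
    static PetscErrorCode CholeskySketch(Mat A)
    {
      MatFactorInfo  info;
      PetscErrorCode ierr;

      PetscFunctionBeginUser;
      ierr = MatFactorInfoInitialize(&info);CHKERRQ(ierr);
      ierr = MatCholeskyFactor(A, NULL, &info);CHKERRQ(ierr);
      PetscFunctionReturn(0);
    }

So the optimized run presumably hands Elemental a matrix that is no longer
numerically SPD, rather than hitting a PETSc-side error.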
>
>