hello from 0 of 2
hello from 1 of 2
petsc initalized for 1 time(s) with errcode= 0 @ 0
petsc initalized for 1 time(s) with errcode= 0 @ 1
set mat for gpu with errcode= 0 @ 0 matsettype
set mat for gpu with errcode= 0 @ 1 matsettype
mat setup with errcode= 0 @ 0
mat ownership range with errcode= 0 @ 0
mat setup with errcode= 0 @ 1
mat ownership range with errcode= 0 @ 1
mat value set with errcode= 0 @ 0
mat value set with errcode= 0 @ 1
mat assumbly with errcode= 0 @ 0
mat assumbly with errcode= 0 @ 1
setup vec from mat with errcode= 0 @ 0
setup vec from mat with errcode= 0 @ 1
vec duplication with errcode= 0 @ 0
vec duplication with errcode= 0 @ 1
MatMult for rhs with errcode= 0 @ 0
MatMult for rhs with errcode= 0 @ 1
KSP solve the system with errcode= 0 @ 0
AxPy for error estimation with errcode= 0 @ 0
KSP solve the system with errcode= 0 @ 1
AxPy for error estimation with errcode= 0 @ 1
run 1 Norm of error 0.6808E-03,iterations 25
KSP distroyed with errcode= 0 @ 1
vecs distroyed with errcode= 0 @ 1
KSP distroyed with errcode= 0 @ 0
vecs distroyed with errcode= 0 @ 0
mat distroyed with errcode= 0 @ 1
mat distroyed with errcode= 0 @ 0
PETSc finalized with errcode= 0 @ 1
PETSc finalized with errcode= 0 @ 0
petsc initalized for 2 time(s) with errcode= 0 @ 0
petsc initalized for 2 time(s) with errcode= 0 @ 1
set mat for gpu with errcode= 0 @ 0 matsettype
set mat for gpu with errcode= 0 @ 1 matsettype
mat setup with errcode= 0 @ 0
mat ownership range with errcode= 0 @ 0
mat setup with errcode= 0 @ 1
mat ownership range with errcode= 0 @ 1
mat value set with errcode= 0 @ 0
mat value set with errcode= 0 @ 1
mat assumbly with errcode= 0 @ 0
mat assumbly with errcode= 0 @ 1
setup vec from mat with errcode= 0 @ 0
setup vec from mat with errcode= 0 @ 1
vec duplication with errcode= 0 @ 0
vec duplication with errcode= 0 @ 1
MatMult for rhs with errcode= 0 @ 1
MatMult for rhs with errcode= 0 @ 0
KSP solve the system with errcode= 0 @ 1
AxPy for error estimation with errcode= 0 @ 1
KSP solve the system with errcode= 0 @ 0
AxPy for error estimation with errcode= 0 @ 0
run 2 Norm of error 0.6808E-03,iterations 25
KSP distroyed with errcode= 0 @ 1
vecs distroyed with errcode= 0 @ 1
KSP distroyed with errcode= 0 @ 0
vecs distroyed with errcode= 0 @ 0
mat distroyed with errcode= 0 @ 1
mat distroyed with errcode= 0 @ 0
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: GPU error
[0]PETSC ERROR: cuda error 709 (cudaErrorContextIsDestroyed) : context is destroyed
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.16.3, unknown
[0]PETSC ERROR: ex11fc on a named XPS-15 by hao Tue Jan 25 17:30:19 2022
[0]PETSC ERROR: Configure options --prefix=/opt/petsc/debug --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 --with-scalar-type=complex --with-precision=double --with-cuda-dir=/usr/local/cuda --with-fortran-kernels=1 --with-cxx-dialect=cxx14 --with-cuda-dialect=cxx14 --with-debugging=1
[0]PETSC ERROR: #1 PetscFinalize() at /home/hao/packages/petsc-current/src/sys/objects/pinit.c:1638
[1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[1]PETSC ERROR: GPU error
[1]PETSC ERROR: cuda error 201 (cudaErrorDeviceUninitialized) : invalid device context
[1]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[1]PETSC ERROR: Petsc Release Version 3.16.3, unknown
[1]PETSC ERROR: ex11fc on a named XPS-15 by hao Tue Jan 25 17:30:19 2022
[1]PETSC ERROR: Configure options --prefix=/opt/petsc/debug --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 --with-scalar-type=complex --with-precision=double --with-cuda-dir=/usr/local/cuda --with-fortran-kernels=1 --with-cxx-dialect=cxx14 --with-cuda-dialect=cxx14 --with-debugging=1
[1]PETSC ERROR: #1 PetscFinalize() at /home/hao/packages/petsc-current/src/sys/objects/pinit.c:1638
[1]PETSC ERROR: #2 User provided function() at User file:0
PETSc finalized with errcode= 97 @ 0
[0]PETSC ERROR: #2 User provided function() at User file:0
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_SELF
with errorcode 97.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
PETSc finalized with errcode= 97 @ 1
[XPS-15:22663] 127 more processes have sent help message help-mpi-common-cuda.txt / cuEventCreate failed
[XPS-15:22663] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
[XPS-15:22663] 127 more processes have sent help message help-mpi-common-cuda.txt / cuIpcGetEventHandle failed
[XPS-15:22663] 99 more processes have sent help message help-mpi-common-cuda.txt / cuIpcGetMemHandle failed
[XPS-15:22663] 1 more process has sent help message help-mpi-api.txt / mpi-abort
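For reference, the run above brings PETSc up and down twice inside a single MPI session: each "petsc initalized for N time(s)" / "PETSc finalized" pair in the log is one full cycle, and the failure (errcode 97, cudaErrorContextIsDestroyed / cudaErrorDeviceUninitialized) appears only at the second PetscFinalize(). Below is a minimal C sketch of just that call sequence. It is not the original source: ex11fc is a Fortran example, and the print strings, the loop bound, and the commented-out solve step are placeholders standing in for what the log traces.

    #include <stdio.h>
    #include <petscksp.h>

    int main(int argc, char **argv)
    {
      PetscErrorCode ierr;
      int            rank, size;

      /* MPI is initialized once by the caller; PetscFinalize() then does not
         call MPI_Finalize(), which is what allows PetscInitialize() to be
         called again afterwards. */
      MPI_Init(&argc, &argv);
      MPI_Comm_rank(MPI_COMM_WORLD, &rank);
      MPI_Comm_size(MPI_COMM_WORLD, &size);
      printf("hello from %d of %d\n", rank, size);

      for (int run = 1; run <= 2; run++) {  /* two init/finalize cycles, as in the log */
        ierr = PetscInitialize(&argc, &argv, NULL, NULL);
        printf("petsc initialized for %d time(s) with errcode= %d @ %d\n", run, ierr, rank);

        /* ... placeholder for the work the log traces: MatCreate/MatSetType
           (a GPU matrix type), assembly, VecDuplicate, MatMult for the rhs,
           KSPSolve, VecAXPY for the error norm, and the *Destroy() calls ... */

        ierr = PetscFinalize();  /* returns 97 (GPU error) on the second cycle above */
        printf("PETSc finalized with errcode= %d @ %d\n", ierr, rank);
      }
      MPI_Finalize();
      return 0;
    }

Note that every PETSc object is destroyed and both cycles solve correctly (identical error norm and iteration counts); only the CUDA teardown inside the second PetscFinalize() fails.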