[petsc-users] Error MPI_ABORT

Ivan Luthfi ivanluthfi5 at gmail.com
Fri Jul 26 22:24:06 CDT 2024


Hi friends,

I am working through my second PETSc exercise from Eijkhout's lecture.
I ran it with the command: mpiexec -n 2 ./vec -vec_view on my MacBook, which has 2 cores.

But I got an error. Here is the error message:

[1]PETSC ERROR: --------------------- Error Message
--------------------------------------------------------------
[1]PETSC ERROR: Argument out of range
[1]PETSC ERROR: Out of range index value 200 maximum 200
[1]PETSC ERROR: WARNING! There are unused option(s) set! Could be the
program crashed before usage or a spelling mistake, etc!
[1]PETSC ERROR:   Option left: name:-vec_view (no value) source: command
line
[1]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[1]PETSC ERROR: Petsc Release Version 3.21.3, unknown
[1]PETSC ERROR: ./vec on a arch-darwin-c-debug named ivanpro.local by jkh
Sat Jul 27 11:21:26 2024
[1]PETSC ERROR: Configure options
--with-cc=/Users/jkh/projects/openmpi/opt-5.0.4/bin/mpicc
--with-cxx=/Users/jkh/projects/openmpi/opt-5.0.4/bin/mpicxx --with-fc=0
--download-make --download-cmake --download-bison --with-x=0
--download-f2cblaslapack --download-metis --download-parmetis
--download-ptscotch --download-superlu_dist
[1]PETSC ERROR: #1 VecSetValues_MPI() at
/Users/jkh/projects/petsc/src/vec/vec/impls/mpi/pdvec.c:859
[1]PETSC ERROR: #2 VecSetValues() at
/Users/jkh/projects/petsc/src/vec/vec/interface/rvector.c:917
[1]PETSC ERROR: #3 main() at vec.c:70
[1]PETSC ERROR: PETSc Option Table entries:
[1]PETSC ERROR: -vec_view (source: command line)
[1]PETSC ERROR: ----------------End of Error Message -------send entire
error message to petsc-maint at mcs.anl.gov----------
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_SELF
  Proc: [[953,1],1]
  Errorcode: 63

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
prterun has exited due to process rank 1 with PID 0 on node ivanpro calling
"abort". This may have caused other processes in the application to be
terminated by signals sent by prterun (as reported here).
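
While trying to narrow this down, I wrote the minimal sketch below, which
reproduces the same "Out of range index value 200 maximum 200" message on
2 ranks. This is not my actual vec.c (mine follows the exercise), just a
reconstruction with made-up variable names, so I am only guessing that my
loop around the VecSetValues call at vec.c:70 has the same off-by-one in
its upper bound:

    #include <petsc.h>

    int main(int argc, char **argv)
    {
      Vec         x;
      PetscInt    i, rstart, rend, N = 200; /* N = 200 matches "maximum 200" */
      PetscScalar v;

      PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
      PetscCall(VecCreate(PETSC_COMM_WORLD, &x));
      PetscCall(VecSetSizes(x, PETSC_DECIDE, N));
      PetscCall(VecSetFromOptions(x));
      PetscCall(VecGetOwnershipRange(x, &rstart, &rend));
      /* rend is one PAST the last locally owned index, so i <= rend lets
         the last rank write global index 200, which is out of range for
         a vector of size 200 (valid indices are 0..199). */
      for (i = rstart; i <= rend; i++) { /* bug: should be i < rend */
        v = (PetscScalar)i;
        PetscCall(VecSetValues(x, 1, &i, &v, INSERT_VALUES));
      }
      PetscCall(VecAssemblyBegin(x));
      PetscCall(VecAssemblyEnd(x));
      PetscCall(VecViewFromOptions(x, NULL, "-vec_view"));
      PetscCall(VecDestroy(&x));
      PetscCall(PetscFinalize());
      return 0;
    }

If I change the loop condition to i < rend, the sketch runs cleanly and
-vec_view prints the vector, so I suspect something similar is happening
in my code.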

Can you guys help me?