[petsc-users] Error MPI_ABORT

Junchao Zhang junchao.zhang at gmail.com
Sat Jul 27 08:55:49 CDT 2024


The error message is a little confusing: it means the valid indices are in
[0, 200), i.e., the maximum valid index is 199, but you passed the
out-of-range index 200 to VecSetValues.

--Junchao Zhang


On Fri, Jul 26, 2024 at 10:24 PM Ivan Luthfi <ivanluthfi5 at gmail.com> wrote:

> Hi friend,
>
> I am working through my second PETSc exercise, from a lecture by Eijkhout.
> I run it with the command: mpiexec -n 2 ./vec_view on my MacBook with 2
> cores, but I get an error. Here is the error message:
>
> [1]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> [1]PETSC ERROR: Argument out of range
> [1]PETSC ERROR: Out of range index value 200 maximum 200
> [1]PETSC ERROR: WARNING! There are unused option(s) set! Could be the
> program crashed before usage or a spelling mistake, etc!
> [1]PETSC ERROR:   Option left: name:-vec_view (no value) source: command
> line
> [1]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
> [1]PETSC ERROR: Petsc Release Version 3.21.3, unknown
> [1]PETSC ERROR: ./vec on a arch-darwin-c-debug named ivanpro.local by jkh
> Sat Jul 27 11:21:26 2024
> [1]PETSC ERROR: Configure options
> --with-cc=/Users/jkh/projects/openmpi/opt-5.0.4/bin/mpicc
> --with-cxx=/Users/jkh/projects/openmpi/opt-5.0.4/bin/mpicxx --with-fc=0
> --download-make --download-cmake --download-bison --with-x=0
> --download-f2cblaslapack --download-metis --download-parmetis
> --download-ptscotch --download-superlu_dist
> [1]PETSC ERROR: #1 VecSetValues_MPI() at
> /Users/jkh/projects/petsc/src/vec/vec/impls/mpi/pdvec.c:859
> [1]PETSC ERROR: #2 VecSetValues() at
> /Users/jkh/projects/petsc/src/vec/vec/interface/rvector.c:917
> [1]PETSC ERROR: #3 main() at vec.c:70
> [1]PETSC ERROR: PETSc Option Table entries:
> [1]PETSC ERROR: -vec_view (source: command line)
> [1]PETSC ERROR: ----------------End of Error Message -------send entire
> error message to petsc-maint at mcs.anl.gov----------
> --------------------------------------------------------------------------
> MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_SELF
>   Proc: [[953,1],1]
>   Errorcode: 63
>
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> You may or may not see output from other processes, depending on
> exactly when Open MPI kills them.
> --------------------------------------------------------------------------
> --------------------------------------------------------------------------
> prterun has exited due to process rank 1 with PID 0 on node ivanpro calling
> "abort". This may have caused other processes in the application to be
> terminated by signals sent by prterun (as reported here).
>
> Can you help me, guys?
>

