[petsc-users] About MPIRUN

김성익 ksi2443 at gmail.com
Wed Dec 7 04:02:52 CST 2022


This error was caused by inconsistent indices passed to VecGetValues() in the
mpirun case.

For example, with a global vector of size 4 run under mpirun -np 2, each
process should only request its 2 locally owned values with VecGetValues(),
but my code tried to get all 4 values on each process, which caused the problem.

How can I solve this?
I want every process to end up with the same scalar array, containing the full
global vector (both its size and its values).
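
One common way to get that is VecScatterCreateToAll(), which copies the whole
parallel vector into a sequential Vec of the full global length on every rank.
Below is a minimal sketch, not taken from the original code: it assumes the
assembled global Vec is named x and uses the PetscCall() error checking of
PETSc 3.18.

  Vec                xall;     /* sequential Vec holding all global entries on every rank */
  VecScatter         scatter;
  const PetscScalar *vals;
  PetscInt           n;

  PetscCall(VecScatterCreateToAll(x, &scatter, &xall));
  PetscCall(VecScatterBegin(scatter, x, xall, INSERT_VALUES, SCATTER_FORWARD));
  PetscCall(VecScatterEnd(scatter, x, xall, INSERT_VALUES, SCATTER_FORWARD));

  PetscCall(VecGetSize(xall, &n));            /* n equals the global size of x */
  PetscCall(VecGetArrayRead(xall, &vals));    /* vals[0..n-1] is valid on every rank */
  /* ... use vals ... */
  PetscCall(VecRestoreArrayRead(xall, &vals));

  PetscCall(VecScatterDestroy(&scatter));
  PetscCall(VecDestroy(&xall));

VecGetValues() can only read entries owned by the calling rank, which is why
asking each of two ranks for all four entries fails; the scatter above is the
supported way to replicate the whole vector on every process.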

Thanks,
Hyung Kim

On Wed, Dec 7, 2022 at 1:34 PM 김성익 <ksi2443 at gmail.com> wrote:

> I have already called VecAssemblyBegin/End().
>
> However, these outputs appear only in the mpirun case.
> There is additional error output, shown below.
>
> --------------------------------------------------------------------------
> MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_SELF
> with errorcode 73.
>
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> You may or may not see output from other processes, depending on
> exactly when Open MPI kills them.
> --------------------------------------------------------------------------
> [ubuntu:02473] PMIX ERROR: UNREACHABLE in file
> ../../../src/server/pmix_server.c at line 2193
> [ubuntu:02473] PMIX ERROR: UNREACHABLE in file
> ../../../src/server/pmix_server.c at line 2193
> [ubuntu:02473] PMIX ERROR: UNREACHABLE in file
> ../../../src/server/pmix_server.c at line 2193
> [ubuntu:02473] 3 more processes have sent help message help-mpi-api.txt /
> mpi-abort
> [ubuntu:02473] Set MCA parameter "orte_base_help_aggregate" to 0 to see
> all help / error messages
>
> Could this be the cause of the earlier PETSc error?
>
>
> Thanks,
> Hyung Kim
>
> On Tue, Dec 6, 2022 at 10:58 PM Matthew Knepley <knepley at gmail.com> wrote:
>
>> On Tue, Dec 6, 2022 at 6:45 AM 김성익 <ksi2443 at gmail.com> wrote:
>>
>>> Hello,
>>>
>>>
>>> I have a code that runs fine without mpirun and also runs under
>>> mpi_linear_solver_server.
>>> However, it fails only in the plain mpirun case, such as mpirun -np
>>> ./program.
>>> The error output is below.
>>> [0]PETSC ERROR: --------------------- Error Message
>>> --------------------------------------------------------------
>>> [0]PETSC ERROR: Object is in wrong state
>>> [0]PETSC ERROR: Not for unassembled vector
>>> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
>>> [0]PETSC ERROR: Petsc Release Version 3.18.1, unknown
>>> [0]PETSC ERROR: ./app on a arch-linux-c-debug named ubuntu by ksi2443
>>> Tue Dec  6 03:39:13 2022
>>> [0]PETSC ERROR: Configure options -download-mumps -download-scalapack
>>> -download-parmetis -download-metis
>>> [0]PETSC ERROR: #1 VecCopy() at
>>> /home/ksi2443/petsc/src/vec/vec/interface/vector.c:1625
>>> [0]PETSC ERROR: #2 KSPInitialResidual() at
>>> /home/ksi2443/petsc/src/ksp/ksp/interface/itres.c:60
>>> [0]PETSC ERROR: #3 KSPSolve_GMRES() at
>>> /home/ksi2443/petsc/src/ksp/ksp/impls/gmres/gmres.c:227
>>> [0]PETSC ERROR: #4 KSPSolve_Private() at
>>> /home/ksi2443/petsc/src/ksp/ksp/interface/itfunc.c:899
>>> [0]PETSC ERROR: #5 KSPSolve() at
>>> /home/ksi2443/petsc/src/ksp/ksp/interface/itfunc.c:1071
>>> [0]PETSC ERROR: #6 main() at /home/ksi2443/Downloads/coding/a1.c:450
>>> [0]PETSC ERROR: No PETSc Option Table entries
>>> [0]PETSC ERROR: ----------------End of Error Message -------send entire
>>> error message to petsc-maint at mcs.anl.gov----------
>>>
>>> How can I fix this??
>>>
>>
>> It looks like we do not check the assembled state in parallel, since it
>> cannot cause a problem, but every time you
>> update values with VecSetValues(), you should call VecAssemblyBegin/End().
>>
>>   Thanks
>>
>>       Matt
>>
>>
>>> Thanks,
>>> Hyung Kim
>>>
>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/
>>
>
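
For reference, here is a minimal sketch of the pattern described in the quoted
reply above (the names b, x, and ksp are placeholders, not taken from the
original a1.c): every batch of VecSetValues() calls must be closed with
VecAssemblyBegin()/VecAssemblyEnd() before the vector is used, e.g. passed to
KSPSolve().

  /* b and x are parallel Vecs created elsewhere, ksp an already set-up KSP */
  PetscInt    row = 0;          /* a global index this rank contributes */
  PetscScalar val = 1.0;

  PetscCall(VecSetValues(b, 1, &row, &val, INSERT_VALUES));
  /* ... possibly more VecSetValues() calls ... */

  PetscCall(VecAssemblyBegin(b));   /* required after the last VecSetValues() */
  PetscCall(VecAssemblyEnd(b));     /* otherwise VecCopy() inside KSPSolve()
                                       fails with "Not for unassembled vector" */

  PetscCall(KSPSolve(ksp, b, x));   /* safe: b is now assembled */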

