<div dir="ltr"><div dir="ltr">On Wed, Dec 7, 2022 at 5:03 AM 김성익 <<a href="mailto:ksi2443@gmail.com">ksi2443@gmail.com</a>> wrote:<br></div><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr">This error was caused by the inconsistent index of vecgetvalues in the mpirun case.<br><br>For example, for the problem that the global vector size is 4, when mpirun -np 2, the value obtained from each process with vecgetvalues should be 2, but in my code tried to get 4 values, so it became a problem.<br><br>How to solve this problem?<br>I want to get a scalar array so that all process array has the same value with global vector size and values.</div></blockquote><div><br></div><div>This is a fundamentally nonscalable operation. Are you sure you want to do this?</div><div><br></div><div>If so, you can use</div><div><br></div><div> <a href="https://petsc.org/main/docs/manualpages/PetscSF/VecScatterCreateToZero/">https://petsc.org/main/docs/manualpages/PetscSF/VecScatterCreateToZero/</a></div><div><br></div><div> Thanks</div><div><br></div><div> Matt</div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div>Thanks,<br>Hyung Kim</div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">2022년 12월 7일 (수) 오후 1:34, 김성익 <<a href="mailto:ksi2443@gmail.com" target="_blank">ksi2443@gmail.com</a>>님이 작성:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr">I already done VecAssemblyBegin/End().<div><br></div><div>However, only mpirun case these outputs are represented.</div><div>There are more error outputs as below.</div><div> --------------------------------------------------------------------------<br></div>MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_SELF<br>with errorcode 73.<br><br>NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.<br>You may or may not see output from other processes, depending on<br>exactly when Open MPI kills them.<br>--------------------------------------------------------------------------<br>[ubuntu:02473] PMIX ERROR: UNREACHABLE in file ../../../src/server/pmix_server.c at line 2193<br>[ubuntu:02473] PMIX ERROR: UNREACHABLE in file ../../../src/server/pmix_server.c at line 2193<br>[ubuntu:02473] PMIX ERROR: UNREACHABLE in file ../../../src/server/pmix_server.c at line 2193<br>[ubuntu:02473] 3 more processes have sent help message help-mpi-api.txt / mpi-abort<br>[ubuntu:02473] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages<div><br></div><div>Could this be the cause of the former petsc error??</div><div><br></div><div><br></div><div>Thanks,</div><div>Hyung Kim</div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">2022년 12월 6일 (화) 오후 10:58, Matthew Knepley <<a href="mailto:knepley@gmail.com" target="_blank">knepley@gmail.com</a>>님이 작성:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div dir="ltr">On Tue, Dec 6, 2022 at 6:45 AM 김성익 <<a href="mailto:ksi2443@gmail.com" target="_blank">ksi2443@gmail.com</a>> wrote:<br></div><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div 
dir="ltr">Hello,<div><br></div><div><br></div><div>There is a code which can run in not mpirun and also it can run in mpi_linear_solver_server.</div><div>However, it has an error in just mpirun case such as mpirun -np ./program.</div><div>The error output is as below.</div><div>[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------<br>[0]PETSC ERROR: Object is in wrong state<br>[0]PETSC ERROR: Not for unassembled vector<br>[0]PETSC ERROR: See <a href="https://petsc.org/release/faq/" target="_blank">https://petsc.org/release/faq/</a> for trouble shooting.<br>[0]PETSC ERROR: Petsc Release Version 3.18.1, unknown <br>[0]PETSC ERROR: ./app on a arch-linux-c-debug named ubuntu by ksi2443 Tue Dec 6 03:39:13 2022<br>[0]PETSC ERROR: Configure options -download-mumps -download-scalapack -download-parmetis -download-metis<br>[0]PETSC ERROR: #1 VecCopy() at /home/ksi2443/petsc/src/vec/vec/interface/vector.c:1625<br>[0]PETSC ERROR: #2 KSPInitialResidual() at /home/ksi2443/petsc/src/ksp/ksp/interface/itres.c:60<br>[0]PETSC ERROR: #3 KSPSolve_GMRES() at /home/ksi2443/petsc/src/ksp/ksp/impls/gmres/gmres.c:227<br>[0]PETSC ERROR: #4 KSPSolve_Private() at /home/ksi2443/petsc/src/ksp/ksp/interface/itfunc.c:899<br>[0]PETSC ERROR: #5 KSPSolve() at /home/ksi2443/petsc/src/ksp/ksp/interface/itfunc.c:1071<br>[0]PETSC ERROR: #6 main() at /home/ksi2443/Downloads/coding/a1.c:450<br>[0]PETSC ERROR: No PETSc Option Table entries<br>[0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint@mcs.anl.gov----------<br></div><div><br></div><div>How can I fix this??</div></div></blockquote><div><br></div><div>It looks like we do not check the assembled state in parallel, since it cannot cause a problem, but every time you</div><div>update values with VecSetValues(), you should call VecAssemblyBegin/End().</div><div><br></div><div> Thanks</div><div><br></div><div> Matt</div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div>Thanks,</div><div>Hyung Kim</div></div>
--
What most experimenters take for granted before they begin their experiments
is infinitely more interesting than any results to which their experiments
lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/