Yes, we can turn it off. The code without real use is just a maintenance burden.

--Junchao Zhang

On Tue, Sep 14, 2021 at 10:45 AM Barry Smith <bsmith@petsc.dev> wrote:

  Ok, so it could be a bug in PETSc, but if it appears only with particular MPI implementations, shouldn't we turn off the support in the cases where we know it will fail?

  Barry

On Sep 14, 2021, at 11:10 AM, Junchao Zhang <junchao.zhang@gmail.com> wrote:

MPI one-sided is tricky and needs careful synchronization (like OpenMP). Incorrect code can work with one interface yet fail with another.

--Junchao Zhang

On Tue, Sep 14, 2021 at 10:01 AM Barry Smith <bsmith@petsc.dev> wrote:

  It sounds reproducible and related to particular versions of OpenMPI, and even to particular interfaces.

  Barry

On Tue, Sep 14, 2021 at 2:35 AM Stefano Zampini <stefano.zampini@gmail.com> wrote:

I can reproduce it even with OpenMPI 4.1.1 on a different machine (Ubuntu 18 + AMD Milan + clang from AOCC), and it is definitely an OpenMPI bug in the vader BTL. If I use tcp, everything runs smoothly.

On Sep 14, 2021, at 10:54 AM, Junchao Zhang <junchao.zhang@gmail.com> wrote:

Without a standalone, valid MPI example that reproduces the error, we cannot be sure it is an OpenMPI bug.

--Junchao Zhang
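To make that request concrete, a minimal standalone candidate might look like the sketch below. It is untested and only loosely modeled on what the window PetscSF exercises (MPI_Win_create plus MPI_Get under MPI_Win_fence synchronization); the values and the neighbor pattern are invented for illustration.

/* Sketch of a standalone one-sided reproducer: each rank exposes one
 * integer in a window and reads its left neighbor's value with MPI_Get
 * under fence synchronization. Run with several ranks, e.g. 4. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
  int     rank, size, left, mine, got = -1;
  MPI_Win win;

  MPI_Init(&argc, &argv);
  MPI_Comm_rank(MPI_COMM_WORLD, &rank);
  MPI_Comm_size(MPI_COMM_WORLD, &size);
  left = (rank + size - 1) % size;
  mine = 1000 * (rank + 1);             /* the value this rank exposes */
  MPI_Win_create(&mine, sizeof(int), sizeof(int), MPI_INFO_NULL,
                 MPI_COMM_WORLD, &win);
  MPI_Win_fence(0, win);                /* open the access/exposure epoch */
  MPI_Get(&got, 1, MPI_INT, left, 0, 1, MPI_INT, win);
  MPI_Win_fence(0, win);                /* close it; got is now valid */
  printf("[%d] got %d from rank %d\n", rank, got, left);
  MPI_Win_free(&win);
  MPI_Finalize();
  return 0;
}

With a correct MPI implementation, every rank should print its left neighbor's value; garbage values, as in the diffs below, would point at the implementation rather than at PETSc.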
On Tue, Sep 14, 2021 at 6:17 AM Matthew Knepley <knepley@gmail.com> wrote:

Okay, we have to send this to OpenMPI. Volunteers?

Maybe we should note this in the FAQ or the installation documentation, so we remember how to fix it if someone else asks?

  Thanks,

     Matt

On Tue, Sep 14, 2021 at 2:35 AM Stefano Zampini <stefano.zampini@gmail.com> wrote:

I can reproduce it even with OpenMPI 4.1.1 on a different machine (Ubuntu 18 + AMD Milan + clang from AOCC), and it is definitely an OpenMPI bug in the vader BTL. If I use tcp, everything runs smoothly.

zampins@kanary:~/Devel/petsc$ cat /home/zampins/local/etc/openmpi-mca-params.conf | grep btl
btl = tcp,self
zampins@kanary:~/Devel/petsc$ make -f gmakefile.test vec_is_sf_tutorials-ex1_4
Using MAKEFLAGS:
        TEST arch-debug/tests/counts/vec_is_sf_tutorials-ex1_4.counts
 ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create
 ok diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create
 ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-dynamic
 ok diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-dynamic
 ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-allocate
 ok diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-allocate
 ok vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-create
 ok diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-create
 ok vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-dynamic
 ok diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-dynamic
 ok vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-allocate
 ok diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-allocate
 ok vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-create
 ok diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-create
 ok vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-dynamic
 ok diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-dynamic
 ok vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-allocate
 ok diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-allocate

zampins@kanary:~/Devel/petsc$ cat /home/zampins/local/etc/openmpi-mca-params.conf | grep btl
btl = vader,tcp,self
zampins@kanary:~/Devel/petsc$ make -f gmakefile.test vec_is_sf_tutorials-ex1_4
Using MAKEFLAGS:
        TEST arch-debug/tests/counts/vec_is_sf_tutorials-ex1_4.counts
 ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create
not ok diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create # Error code: 1
#   43,46c43,46
#   < [0] 0: 4001 2000 2002 3002 4002
#   < [1] 0: 1001 3000
#   < [2] 0: 2001 4000
#   < [3] 0: 3001 1000
#   ---
#   > [0] 0: 2002 2146435072 2 2146435072 38736240
#   > [1] 0: 3000 2146435072
#   > [2] 0: 2001 2146435072
#   > [3] 0: 3001 2146435072
 ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-dynamic
not ok diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-dynamic # Error code: 1
#   43,46c43,46
#   < [0] 0: 4001 2000 2002 3002 4002
#   < [1] 0: 1001 3000
#   < [2] 0: 2001 4000
#   < [3] 0: 3001 1000
#   ---
#   > [0] 0: 2002 2146435072 2 2146435072 0
#   > [1] 0: 3000 2146435072
#   > [2] 0: 2001 2146435072
#   > [3] 0: 3001 2146435072
 ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-allocate
 ok diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-allocate
# retrying vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-create
not ok vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-create # Error code: 98
#   [1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
#   [1]PETSC ERROR: General MPI error
#   [1]PETSC ERROR: MPI error 6 MPI_ERR_RANK: invalid rank
#   [1]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
#   [1]PETSC ERROR: Petsc Development GIT revision: v3.15.4-783-g168bb9f76b  GIT Date: 2021-09-13 14:01:22 +0000
#   [1]PETSC ERROR: ../ex1 on a arch-debug named kanary.kaust.edu.sa by zampins Tue Sep 14 09:31:42 2021
#   [1]PETSC ERROR: [2]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
#   [2]PETSC ERROR: General MPI error
#   [2]PETSC ERROR: MPI error 6 MPI_ERR_RANK: invalid rank
#   [2]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
#   [2]PETSC ERROR: Petsc Development GIT revision: v3.15.4-783-g168bb9f76b  GIT Date: 2021-09-13 14:01:22 +0000
#   [2]PETSC ERROR: ../ex1 on a arch-debug named kanary.kaust.edu.sa by zampins Tue Sep 14 09:31:42 2021
#   [2]PETSC ERROR: Configure options --with-cc=/home/zampins/local/bin/mpicc --with-cxx-dialect=c++14 --with-cxx=/home/zampins/local/bin/mpicxx --with-debugging=1 --with-fc=/home/zampins/local/bin/mpifort --with-fortran-bindings=0 --with-hip-dir=/opt/rocm --with-hip=1 --with-hypre-dir=/home/zampins/local-petsc --with-kokkos-dir=/home/zampins/local-petsc --with-kokkos-kernels-dir=/home/zampins/local-petsc --with-blaslapack-include=/home/zampins/local-aocl/aocc/3.0-6/include --with-blaslapack-lib="[/home/zampins/local-aocl/aocc/3.0-6/lib/liblapacke.a,/home/zampins/local-aocl/aocc/3.0-6/lib/libflame.a,/home/zampins/local-aocl/aocc/3.0-6/lib/libblis-mt.a]" HIPFLAGS=--amdgpu-target=gfx908 HIPPPFLAGS=-I/home/zampins/local-petsc/include PETSC_ARCH=arch-debug
#   [2]PETSC ERROR: #1 PetscSFGetGroups() at /home/zampins/Devel/petsc/src/vec/is/sf/interface/sf.c:1166
#   [2]PETSC ERROR: #2 PetscSFSetUp_Window() at /home/zampins/Devel/petsc/src/vec/is/sf/impls/window/sfwindow.c:708
#   [2]PETSC ERROR: #3 PetscSFSetUp() at /home/zampins/Devel/petsc/src/vec/is/sf/interface/sf.c:318
#   [2]PETSC ERROR: #4 main() at /home/zampins/Devel/petsc/src/vec/is/sf/tutorials/ex1.c:172
#   [2]PETSC ERROR: PETSc Option Table entries:
#   [2]PETSC ERROR: -sf_type window
#   [2]PETSC ERROR: -sf_window_flavor create
#   [2]PETSC ERROR: -sf_window_sync active
#   [2]PETSC ERROR: -test_gather
#   [2]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint@mcs.anl.gov----------
#   [3]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
#   [3]PETSC ERROR: General MPI error
#   [3]PETSC ERROR: MPI error 6 MPI_ERR_RANK: invalid rank
#   [3]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
#   [3]PETSC ERROR: Petsc Development GIT revision: v3.15.4-783-g168bb9f76b  GIT Date: 2021-09-13 14:01:22 +0000
#   [3]PETSC ERROR: ../ex1 on a arch-debug named kanary.kaust.edu.sa by zampins Tue Sep 14 09:31:42 2021
#   [3]PETSC ERROR: Configure options --with-cc=/home/zampins/local/bin/mpicc --with-cxx-dialect=c++14 --with-cxx=/home/zampins/local/bin/mpicxx --with-debugging=1 --with-fc=/home/zampins/local/bin/mpifort --with-fortran-bindings=0 --with-hip-dir=/opt/rocm --with-hip=1 --with-hypre-dir=/home/zampins/local-petsc --with-kokkos-dir=/home/zampins/local-petsc --with-kokkos-kernels-dir=/home/zampins/local-petsc --with-blaslapack-include=/home/zampins/local-aocl/aocc/3.0-6/include --with-blaslapack-lib="[/home/zampins/local-aocl/aocc/3.0-6/lib/liblapacke.a,/home/zampins/local-aocl/aocc/3.0-6/lib/libflame.a,/home/zampins/local-aocl/aocc/3.0-6/lib/libblis-mt.a]" HIPFLAGS=--amdgpu-target=gfx908 HIPPPFLAGS=-I/home/zampins/local-petsc/include PETSC_ARCH=arch-debug
#   [3]PETSC ERROR: #1 PetscSFGetGroups() at /home/zampins/Devel/petsc/src/vec/is/sf/interface/sf.c:1166
#   [3]PETSC ERROR: #2 PetscSFSetUp_Window() at /home/zampins/Devel/petsc/src/vec/is/sf/impls/window/sfwindow.c:708
#   [3]PETSC ERROR: #3 PetscSFSetUp() at /home/zampins/Devel/petsc/src/vec/is/sf/interface/sf.c:318
#   [3]PETSC ERROR: #4 main() at /home/zampins/Devel/petsc/src/vec/is/sf/tutorials/ex1.c:172
#   [3]PETSC ERROR: PETSc Option Table entries:
#   [3]PETSC ERROR: -sf_type window
#   [3]PETSC ERROR: -sf_window_flavor create
#   [3]PETSC ERROR: Configure options --with-cc=/home/zampins/local/bin/mpicc --with-cxx-dialect=c++14 --with-cxx=/home/zampins/local/bin/mpicxx --with-debugging=1 --with-fc=/home/zampins/local/bin/mpifort --with-fortran-bindings=0 --with-hip-dir=/opt/rocm --with-hip=1 --with-hypre-dir=/home/zampins/local-petsc --with-kokkos-dir=/home/zampins/local-petsc --with-kokkos-kernels-dir=/home/zampins/local-petsc --with-blaslapack-include=/home/zampins/local-aocl/aocc/3.0-6/include --with-blaslapack-lib="[/home/zampins/local-aocl/aocc/3.0-6/lib/liblapacke.a,/home/zampins/local-aocl/aocc/3.0-6/lib/libflame.a,/home/zampins/local-aocl/aocc/3.0-6/lib/libblis-mt.a]" HIPFLAGS=--amdgpu-target=gfx908 HIPPPFLAGS=-I/home/zampins/local-petsc/include PETSC_ARCH=arch-debug
#   [1]PETSC ERROR: #1 PetscSFGetGroups() at /home/zampins/Devel/petsc/src/vec/is/sf/interface/sf.c:1166
#   [1]PETSC ERROR: #2 PetscSFSetUp_Window() at /home/zampins/Devel/petsc/src/vec/is/sf/impls/window/sfwindow.c:708
#   [1]PETSC ERROR: #3 PetscSFSetUp() at /home/zampins/Devel/petsc/src/vec/is/sf/interface/sf.c:318
#   [1]PETSC ERROR: #4 main() at /home/zampins/Devel/petsc/src/vec/is/sf/tutorials/ex1.c:172
#   [1]PETSC ERROR: PETSc Option Table entries:
#   [1]PETSC ERROR: -sf_type window
#   [1]PETSC ERROR: -sf_window_flavor create
#   [1]PETSC ERROR: -sf_window_sync active
#   [1]PETSC ERROR: -test_gather
#   [1]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint@mcs.anl.gov----------
#   -sf_window_sync active
#   [3]PETSC ERROR: -test_gather
#   [3]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint@mcs.anl.gov----------
#   --------------------------------------------------------------------------
#   MPI_ABORT was invoked on rank 2 in communicator MPI_COMM_WORLD
#   with errorcode 98.
#
#   NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
#   You may or may not see output from other processes, depending on
#   exactly when Open MPI kills them.
#   --------------------------------------------------------------------------
#   [kanary.kaust.edu.sa:115527] 2 more processes have sent help message help-mpi-api.txt / mpi-abort
#   [kanary.kaust.edu.sa:115527] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
 ok vec_is_sf_tutorials-ex1_4 # SKIP Command failed so no diff
# retrying vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-dynamic
not ok vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-dynamic # Error code: 98
#   [1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
#   [1]PETSC ERROR: General MPI error
#   [1]PETSC ERROR: MPI error 6 MPI_ERR_RANK: invalid rank
#   [1]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
#   [1]PETSC ERROR: Petsc Development GIT revision: v3.15.4-783-g168bb9f76b  GIT Date: 2021-09-13 14:01:22 +0000
#   [1]PETSC ERROR: ../ex1 on a arch-debug named kanary.kaust.edu.sa by zampins Tue Sep 14 09:31:45 2021
#   [1]PETSC ERROR: Configure options --with-cc=/home/zampins/local/bin/mpicc --with-cxx-dialect=c++14 --with-cxx=/home/zampins/local/bin/mpicxx --with-debugging=1 --with-fc=/home/zampins/local/bin/mpifort --with-fortran-bindings=0 --with-hip-dir=/opt/rocm --with-hip=1 --with-hypre-dir=/home/zampins/local-petsc --with-kokkos-dir=/home/zampins/local-petsc --with-kokkos-kernels-dir=/home/zampins/local-petsc --with-blaslapack-include=/home/zampins/local-aocl/aocc/3.0-6/include --with-blaslapack-lib="[/home/zampins/local-aocl/aocc/3.0-6/lib/liblapacke.a,/home/zampins/local-aocl/aocc/3.0-6/lib/libflame.a,/home/zampins/local-aocl/aocc/3.0-6/lib/libblis-mt.a]" HIPFLAGS=--amdgpu-target=gfx908 HIPPPFLAGS=-I/home/zampins/local-petsc/include PETSC_ARCH=arch-debug
#   [1]PETSC ERROR: #1 PetscSFGetGroups() at /home/zampins/Devel/petsc/src/vec/is/sf/interface/sf.c:1166
#   [1]PETSC ERROR: #2 PetscSFSetUp_Window() at /home/zampins/Devel/petsc/src/vec/is/sf/impls/window/sfwindow.c:708
#   [1]PETSC ERROR: #3 PetscSFSetUp() at /home/zampins/Devel/petsc/src/vec/is/sf/interface/sf.c:318
#   [1]PETSC ERROR: #4 main() at /home/zampins/Devel/petsc/src/vec/is/sf/tutorials/ex1.c:172
#   [1]PETSC ERROR: PETSc Option Table entries:
#   [1]PETSC ERROR: -sf_type window
#   [1]PETSC ERROR: [2]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
#   [2]PETSC ERROR: General MPI error
#   [2]PETSC ERROR: MPI error 6 MPI_ERR_RANK: invalid rank
#   [2]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
#   [2]PETSC ERROR: Petsc Development GIT revision: v3.15.4-783-g168bb9f76b  GIT Date: 2021-09-13 14:01:22 +0000
#   [2]PETSC ERROR: ../ex1 on a arch-debug named kanary.kaust.edu.sa by zampins Tue Sep 14 09:31:45 2021
#   [2]PETSC ERROR: Configure options --with-cc=/home/zampins/local/bin/mpicc --with-cxx-dialect=c++14 --with-cxx=/home/zampins/local/bin/mpicxx --with-debugging=1 --with-fc=/home/zampins/local/bin/mpifort --with-fortran-bindings=0 --with-hip-dir=/opt/rocm --with-hip=1 --with-hypre-dir=/home/zampins/local-petsc --with-kokkos-dir=/home/zampins/local-petsc --with-kokkos-kernels-dir=/home/zampins/local-petsc --with-blaslapack-include=/home/zampins/local-aocl/aocc/3.0-6/include --with-blaslapack-lib="[/home/zampins/local-aocl/aocc/3.0-6/lib/liblapacke.a,/home/zampins/local-aocl/aocc/3.0-6/lib/libflame.a,/home/zampins/local-aocl/aocc/3.0-6/lib/libblis-mt.a]" HIPFLAGS=--amdgpu-target=gfx908 HIPPPFLAGS=-I/home/zampins/local-petsc/include PETSC_ARCH=arch-debug
#   [2]PETSC ERROR: #1 PetscSFGetGroups() at /home/zampins/Devel/petsc/src/vec/is/sf/interface/sf.c:1166
#   [2]PETSC ERROR: #2 PetscSFSetUp_Window() at /home/zampins/Devel/petsc/src/vec/is/sf/impls/window/sfwindow.c:708
#   [2]PETSC ERROR: #3 PetscSFSetUp() at /home/zampins/Devel/petsc/src/vec/is/sf/interface/sf.c:318
#   [2]PETSC ERROR: #4 main() at /home/zampins/Devel/petsc/src/vec/is/sf/tutorials/ex1.c:172
#   [2]PETSC ERROR: PETSc Option Table entries:
#   [2]PETSC ERROR: -sf_type window
#   [2]PETSC ERROR: -sf_window_flavor dynamic
#   [2]PETSC ERROR: -sf_window_sync active
#   [2]PETSC ERROR: -test_gather
#   [2]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint@mcs.anl.gov----------
#   [3]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
#   [3]PETSC ERROR: General MPI error
#   [3]PETSC ERROR: MPI error 6 MPI_ERR_RANK: invalid rank
#   [3]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
#   [3]PETSC ERROR: Petsc Development GIT revision: v3.15.4-783-g168bb9f76b  GIT Date: 2021-09-13 14:01:22 +0000
#   [3]PETSC ERROR: ../ex1 on a arch-debug named kanary.kaust.edu.sa by zampins Tue Sep 14 09:31:45 2021
#   [3]PETSC ERROR: Configure options --with-cc=/home/zampins/local/bin/mpicc --with-cxx-dialect=c++14 --with-cxx=/home/zampins/local/bin/mpicxx --with-debugging=1 --with-fc=/home/zampins/local/bin/mpifort --with-fortran-bindings=0 --with-hip-dir=/opt/rocm --with-hip=1 --with-hypre-dir=/home/zampins/local-petsc --with-kokkos-dir=/home/zampins/local-petsc --with-kokkos-kernels-dir=/home/zampins/local-petsc --with-blaslapack-include=/home/zampins/local-aocl/aocc/3.0-6/include --with-blaslapack-lib="[/home/zampins/local-aocl/aocc/3.0-6/lib/liblapacke.a,/home/zampins/local-aocl/aocc/3.0-6/lib/libflame.a,/home/zampins/local-aocl/aocc/3.0-6/lib/libblis-mt.a]" HIPFLAGS=--amdgpu-target=gfx908 HIPPPFLAGS=-I/home/zampins/local-petsc/include PETSC_ARCH=arch-debug
#   [3]PETSC ERROR: #1 PetscSFGetGroups() at /home/zampins/Devel/petsc/src/vec/is/sf/interface/sf.c:1166
#   [3]PETSC ERROR: #2 PetscSFSetUp_Window() at /home/zampins/Devel/petsc/src/vec/is/sf/impls/window/sfwindow.c:708
#   [3]PETSC ERROR: #3 PetscSFSetUp() at /home/zampins/Devel/petsc/src/vec/is/sf/interface/sf.c:318
#   [3]PETSC ERROR: #4 main() at /home/zampins/Devel/petsc/src/vec/is/sf/tutorials/ex1.c:172
#   [3]PETSC ERROR: PETSc Option Table entries:
#   [3]PETSC ERROR: -sf_type window
#   [3]PETSC ERROR: -sf_window_flavor dynamic
#   [3]PETSC ERROR: -sf_window_sync active
#   [3]PETSC ERROR: -test_gather
#   [3]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint@mcs.anl.gov----------
#   -sf_window_flavor dynamic
#   [1]PETSC ERROR: -sf_window_sync active
#   [1]PETSC ERROR: -test_gather
#   [1]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint@mcs.anl.gov----------
#   --------------------------------------------------------------------------
#   MPI_ABORT was invoked on rank 3 in communicator MPI_COMM_WORLD
#   with errorcode 98.
#
#   NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
#   You may or may not see output from other processes, depending on
#   exactly when Open MPI kills them.
#   --------------------------------------------------------------------------
#   [kanary.kaust.edu.sa:115572] 2 more processes have sent help message help-mpi-api.txt / mpi-abort
#   [kanary.kaust.edu.sa:115572] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
 ok vec_is_sf_tutorials-ex1_4 # SKIP Command failed so no diff
 ok vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-allocate
 ok diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-allocate
 ok vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-create
not ok diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-create # Error code: 1
#   43,46c43,46
#   < [0] 0: 4001 2000 2002 3002 4002
#   < [1] 0: 1001 3000
#   < [2] 0: 2001 4000
#   < [3] 0: 3001 1000
#   ---
#   > [0] 0: 4002 2146435072 2 2146435072 34619728
#   > [1] 0: 3000 2146435072
#   > [2] 0: 4000 2146435072
#   > [3] 0: 3001 2146435072
 ok vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-dynamic
not ok diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-dynamic # Error code: 1
#   43,46c43,46
#   < [0] 0: 4001 2000 2002 3002 4002
#   < [1] 0: 1001 3000
#   < [2] 0: 2001 4000
#   < [3] 0: 3001 1000
#   ---
#   > [0] 0: 4002 2146435072 2 2146435072 0
#   > [1] 0: 3000 2146435072
#   > [2] 0: 4000 2146435072
#   > [3] 0: 3001 2146435072
 ok vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-allocate
 ok diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-allocate
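For reference, this workaround does not have to live in openmpi-mca-params.conf: Open MPI also accepts MCA parameters on the mpirun command line, so a one-off run of the failing case could look something like this (the executable path and rank count are illustrative; the options are taken from the failing option table above):

  mpirun --mca btl tcp,self -n 4 ./ex1 -sf_type window -sf_window_sync active -sf_window_flavor create -test_gather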
On Tue, Sep 14, 2021 at 7:44 AM Stefano Zampini <stefano.zampini@gmail.com> wrote:

I'll see if I can reproduce.

On Tue, Sep 14, 2021 at 6:58 AM Junchao Zhang <junchao.zhang@gmail.com> wrote:

Hi, Stefano,
   Pinging you again to see if you want to resolve this problem before petsc-3.16.

--Junchao Zhang

On Sun, Sep 12, 2021 at 3:06 PM Antonio T. sagitter <sagitter@fedoraproject.org> wrote:

Unfortunately, that is not possible. I must use the OpenMPI provided by the
Fedora build system (these RPM builds of PETSc are for Fedora's
repositories); downloading external software is not permitted.
On 9/12/21 21:10, Pierre Jolivet wrote:
>
>> On 12 Sep 2021, at 8:56 PM, Matthew Knepley <knepley@gmail.com> wrote:
>>
>> On Sun, Sep 12, 2021 at 2:49 PM Antonio T. sagitter
>> <sagitter@fedoraproject.org> wrote:
>>
>>     Those attached are configure.log/make.log from an MPI build in
>>     Fedora 34 x86_64 where the error below occurred.
>>
>> This is OpenMPI 4.1.0. Is that the only MPI you build? My first
>> inclination is that this is an MPI implementation bug.
>>
>> Junchao, do we have an OpenMPI build in the CI?
>
> config/examples/arch-ci-linux-cuda-double-64idx.py:   '--download-openmpi=1',
> config/examples/arch-ci-linux-pkgs-dbg-ftn-interfaces.py:   '--download-openmpi=1',
> config/examples/arch-ci-linux-pkgs-opt.py:  '--download-openmpi=1',
>
> config/BuildSystem/config/packages/OpenMPI.py uses version 4.1.0 as well.
> I'm not sure PETSc is to blame here, Antonio. You may want to ditch the
> OpenMPI shipped by your package manager and try --download-openmpi as
> well, just for a quick sanity check.
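> Something like this, for instance (a sketch; keep whatever other configure
> options your build already uses, then rerun the failing test):
>
>     ./configure --download-openmpi=1 [your other options]
>     make
>     make -f gmakefile.test vec_is_sf_tutorials-ex1_4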
>
> Thanks,
> Pierre
--
---
Antonio Trande
Fedora Project
mailto: sagitter@fedoraproject.org
GPG key: 0x29FBC85D7A51CC2F
GPG key server: https://keyserver1.pgp.com/

--
Stefano

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/