[petsc-dev] Cannot locate file: share/petsc/datafiles/matrices/small

Barry Smith bsmith at petsc.dev
Tue Sep 14 10:45:03 CDT 2021


  Ok, so it could be a bug in PETSc, but if it appears only with particular MPI implementations, shouldn't we turn off the support in those cases where we know it will fail?

  Barry
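[Editorial aside: Open MPI already provides a run-time knob for this kind of workaround. BTL selection is controlled by MCA parameters, so the shared-memory (vader) BTL can be excluded without rebuilding anything. A sketch, assuming a standard Open MPI installation; the executable and options are those from the failing test below:]

```shell
# Exclude the vader (shared-memory) BTL for a single run;
# the caret means "everything except what follows".
mpirun --mca btl ^vader -n 4 ./ex1 -sf_type window

# Equivalent explicit selection of transports:
mpirun --mca btl tcp,self -n 4 ./ex1 -sf_type window

# Or make the setting persistent for all runs in
# $PREFIX/etc/openmpi-mca-params.conf:
#   btl = tcp,self
```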


> On Sep 14, 2021, at 11:10 AM, Junchao Zhang <junchao.zhang at gmail.com> wrote:
> 
> MPI one-sided is tricky and needs careful synchronization (like OpenMP).  Incorrect code can work with one implementation but fail with another.
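[Editorial aside: the synchronization requirement described above can be sketched with a minimal fence-synchronized example. This is illustrative code, not code from this thread; PETSc's `ex1` exercises similar MPI window calls through PetscSF. Every MPI_Put must happen inside an access epoch, here opened and closed by MPI_Win_fence:]

```c
/* Minimal active-target (fence) one-sided example: each rank puts a
 * value into its right neighbor's window. Reading `recv` before the
 * closing fence would be a synchronization bug. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
  int rank, size, recv = -1;
  MPI_Win win;

  MPI_Init(&argc, &argv);
  MPI_Comm_rank(MPI_COMM_WORLD, &rank);
  MPI_Comm_size(MPI_COMM_WORLD, &size);

  /* Each rank exposes one int to remote access */
  MPI_Win_create(&recv, sizeof(int), sizeof(int),
                 MPI_INFO_NULL, MPI_COMM_WORLD, &win);

  MPI_Win_fence(0, win);                 /* open the access epoch   */
  int val = 1000 + rank;
  MPI_Put(&val, 1, MPI_INT,              /* write into neighbor...  */
          (rank + 1) % size,             /* ...at target rank       */
          0, 1, MPI_INT, win);
  MPI_Win_fence(0, win);                 /* close epoch; recv valid */

  printf("[%d] received %d\n", rank, recv);
  MPI_Win_free(&win);
  MPI_Finalize();
  return 0;
}
```

Build and run with, e.g., `mpicc ex_fence.c && mpirun -n 4 ./a.out`.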
> 
> --Junchao Zhang
> 
> 
> On Tue, Sep 14, 2021 at 10:01 AM Barry Smith <bsmith at petsc.dev> wrote:
> 
>    It sounds reproducible and related to using particular versions of OpenMPI and even particular interfaces.
> 
>   Barry
> 
>    On Tue, Sep 14, 2021 at 2:35 AM Stefano Zampini <stefano.zampini at gmail.com> wrote:
> I can reproduce it even with OpenMPI 4.1.1 on a different machine (Ubuntu 18 + AMD Milan + clang from AOCC), and it is definitely an OpenMPI bug in the vader BTL. If I use tcp, everything runs smoothly.
> 
> 
> 
> 
>> On Sep 14, 2021, at 10:54 AM, Junchao Zhang <junchao.zhang at gmail.com> wrote:
>> 
>> Without a standalone, valid MPI example that reproduces the error, we cannot say for sure that it is an OpenMPI bug.
>> 
>> --Junchao Zhang
>> 
>> 
>> On Tue, Sep 14, 2021 at 6:17 AM Matthew Knepley <knepley at gmail.com> wrote:
>> Okay, we have to send this to OpenMPI. Volunteers?
>> 
>> Maybe we should note this in the FAQ, or installation, so we remember how to fix it if someone else asks?
>> 
>>   Thanks,
>> 
>>      Matt
>> 
>> On Tue, Sep 14, 2021 at 2:35 AM Stefano Zampini <stefano.zampini at gmail.com> wrote:
>> I can reproduce it even with OpenMPI 4.1.1 on a different machine (Ubuntu 18 + AMD Milan + clang from AOCC), and it is definitely an OpenMPI bug in the vader BTL. If I use tcp, everything runs smoothly.
>> 
>> zampins at kanary:~/Devel/petsc$ cat /home/zampins/local/etc/openmpi-mca-params.conf | grep btl
>> btl = tcp,self
>> zampins at kanary:~/Devel/petsc$ make -f gmakefile.test vec_is_sf_tutorials-ex1_4 
>> Using MAKEFLAGS:
>>         TEST arch-debug/tests/counts/vec_is_sf_tutorials-ex1_4.counts
>>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create
>>  ok diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create
>>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-dynamic
>>  ok diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-dynamic
>>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-allocate
>>  ok diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-allocate
>>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-create
>>  ok diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-create
>>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-dynamic
>>  ok diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-dynamic
>>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-allocate
>>  ok diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-allocate
>>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-create
>>  ok diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-create
>>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-dynamic
>>  ok diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-dynamic
>>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-allocate
>>  ok diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-allocate
>> 
>> 
>> zampins at kanary:~/Devel/petsc$ cat /home/zampins/local/etc/openmpi-mca-params.conf | grep btl
>> btl = vader,tcp,self
>> zampins at kanary:~/Devel/petsc$ make -f gmakefile.test vec_is_sf_tutorials-ex1_4 
>> Using MAKEFLAGS:
>>         TEST arch-debug/tests/counts/vec_is_sf_tutorials-ex1_4.counts
>>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create
>> not ok diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-create # Error code: 1
>> # 43,46c43,46
>> # < [0] 0: 4001 2000 2002 3002 4002
>> # < [1] 0: 1001 3000
>> # < [2] 0: 2001 4000
>> # < [3] 0: 3001 1000
>> # ---
>> # > [0] 0: 2002 2146435072 2 2146435072 38736240
>> # > [1] 0: 3000 2146435072
>> # > [2] 0: 2001 2146435072
>> # > [3] 0: 3001 2146435072
>>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-dynamic
>> not ok diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-dynamic # Error code: 1
>> # 43,46c43,46
>> # < [0] 0: 4001 2000 2002 3002 4002
>> # < [1] 0: 1001 3000
>> # < [2] 0: 2001 4000
>> # < [3] 0: 3001 1000
>> # ---
>> # > [0] 0: 2002 2146435072 2 2146435072 0
>> # > [1] 0: 3000 2146435072
>> # > [2] 0: 2001 2146435072
>> # > [3] 0: 3001 2146435072
>>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-allocate
>>  ok diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-fence_sf_window_flavor-allocate
>> # retrying vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-create
>> not ok vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-create # Error code: 98
>> # [1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>> # [1]PETSC ERROR: General MPI error 
>> # [1]PETSC ERROR: MPI error 6 MPI_ERR_RANK: invalid rank
>> # [1]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
>> # [1]PETSC ERROR: Petsc Development GIT revision: v3.15.4-783-g168bb9f76b  GIT Date: 2021-09-13 14:01:22 +0000
>> # [1]PETSC ERROR: ../ex1 on a arch-debug named kanary.kaust.edu.sa by zampins Tue Sep 14 09:31:42 2021
>> # [1]PETSC ERROR: [2]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>> # [2]PETSC ERROR: General MPI error 
>> # [2]PETSC ERROR: MPI error 6 MPI_ERR_RANK: invalid rank
>> # [2]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
>> # [2]PETSC ERROR: Petsc Development GIT revision: v3.15.4-783-g168bb9f76b  GIT Date: 2021-09-13 14:01:22 +0000
>> # [2]PETSC ERROR: ../ex1 on a arch-debug named kanary.kaust.edu.sa by zampins Tue Sep 14 09:31:42 2021
>> # [2]PETSC ERROR: Configure options --with-cc=/home/zampins/local/bin/mpicc --with-cxx-dialect=c++14 --with-cxx=/home/zampins/local/bin/mpicxx --with-debugging=1 --with-fc=/home/zampins/local/bin/mpifort --with-fortran-bindings=0 --with-hip-dir=/opt/rocm --with-hip=1 --with-hypre-dir=/home/zampins/local-petsc --with-kokkos-dir=/home/zampins/local-petsc --with-kokkos-kernels-dir=/home/zampins/local-petsc --with-blaslapack-include=/home/zampins/local-aocl/aocc/3.0-6/include --with-blaslapack-lib="[/home/zampins/local-aocl/aocc/3.0-6/lib/liblapacke.a,/home/zampins/local-aocl/aocc/3.0-6/lib/libflame.a,/home/zampins/local-aocl/aocc/3.0-6/lib/libblis-mt.a]" HIPFLAGS=--amdgpu-target=gfx908 HIPPPFLAGS=-I/home/zampins/local-petsc/include PETSC_ARCH=arch-debug
>> # [2]PETSC ERROR: #1 PetscSFGetGroups() at /home/zampins/Devel/petsc/src/vec/is/sf/interface/sf.c:1166
>> # [2]PETSC ERROR: #2 PetscSFSetUp_Window() at /home/zampins/Devel/petsc/src/vec/is/sf/impls/window/sfwindow.c:708
>> # [2]PETSC ERROR: #3 PetscSFSetUp() at /home/zampins/Devel/petsc/src/vec/is/sf/interface/sf.c:318
>> # [2]PETSC ERROR: #4 main() at /home/zampins/Devel/petsc/src/vec/is/sf/tutorials/ex1.c:172
>> # [2]PETSC ERROR: PETSc Option Table entries:
>> # [2]PETSC ERROR: -sf_type window
>> # [2]PETSC ERROR: -sf_window_flavor create
>> # [2]PETSC ERROR: -sf_window_sync active
>> # [2]PETSC ERROR: -test_gather
>> # [2]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------
>> # [3]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>> # [3]PETSC ERROR: General MPI error 
>> # [3]PETSC ERROR: MPI error 6 MPI_ERR_RANK: invalid rank
>> # [3]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
>> # [3]PETSC ERROR: Petsc Development GIT revision: v3.15.4-783-g168bb9f76b  GIT Date: 2021-09-13 14:01:22 +0000
>> # [3]PETSC ERROR: ../ex1 on a arch-debug named kanary.kaust.edu.sa by zampins Tue Sep 14 09:31:42 2021
>> # [3]PETSC ERROR: Configure options --with-cc=/home/zampins/local/bin/mpicc --with-cxx-dialect=c++14 --with-cxx=/home/zampins/local/bin/mpicxx --with-debugging=1 --with-fc=/home/zampins/local/bin/mpifort --with-fortran-bindings=0 --with-hip-dir=/opt/rocm --with-hip=1 --with-hypre-dir=/home/zampins/local-petsc --with-kokkos-dir=/home/zampins/local-petsc --with-kokkos-kernels-dir=/home/zampins/local-petsc --with-blaslapack-include=/home/zampins/local-aocl/aocc/3.0-6/include --with-blaslapack-lib="[/home/zampins/local-aocl/aocc/3.0-6/lib/liblapacke.a,/home/zampins/local-aocl/aocc/3.0-6/lib/libflame.a,/home/zampins/local-aocl/aocc/3.0-6/lib/libblis-mt.a]" HIPFLAGS=--amdgpu-target=gfx908 HIPPPFLAGS=-I/home/zampins/local-petsc/include PETSC_ARCH=arch-debug
>> # [3]PETSC ERROR: #1 PetscSFGetGroups() at /home/zampins/Devel/petsc/src/vec/is/sf/interface/sf.c:1166
>> # [3]PETSC ERROR: #2 PetscSFSetUp_Window() at /home/zampins/Devel/petsc/src/vec/is/sf/impls/window/sfwindow.c:708
>> # [3]PETSC ERROR: #3 PetscSFSetUp() at /home/zampins/Devel/petsc/src/vec/is/sf/interface/sf.c:318
>> # [3]PETSC ERROR: #4 main() at /home/zampins/Devel/petsc/src/vec/is/sf/tutorials/ex1.c:172
>> # [3]PETSC ERROR: PETSc Option Table entries:
>> # [3]PETSC ERROR: -sf_type window
>> # [3]PETSC ERROR: -sf_window_flavor create
>> # [3]PETSC ERROR: Configure options --with-cc=/home/zampins/local/bin/mpicc --with-cxx-dialect=c++14 --with-cxx=/home/zampins/local/bin/mpicxx --with-debugging=1 --with-fc=/home/zampins/local/bin/mpifort --with-fortran-bindings=0 --with-hip-dir=/opt/rocm --with-hip=1 --with-hypre-dir=/home/zampins/local-petsc --with-kokkos-dir=/home/zampins/local-petsc --with-kokkos-kernels-dir=/home/zampins/local-petsc --with-blaslapack-include=/home/zampins/local-aocl/aocc/3.0-6/include --with-blaslapack-lib="[/home/zampins/local-aocl/aocc/3.0-6/lib/liblapacke.a,/home/zampins/local-aocl/aocc/3.0-6/lib/libflame.a,/home/zampins/local-aocl/aocc/3.0-6/lib/libblis-mt.a]" HIPFLAGS=--amdgpu-target=gfx908 HIPPPFLAGS=-I/home/zampins/local-petsc/include PETSC_ARCH=arch-debug
>> # [1]PETSC ERROR: #1 PetscSFGetGroups() at /home/zampins/Devel/petsc/src/vec/is/sf/interface/sf.c:1166
>> # [1]PETSC ERROR: #2 PetscSFSetUp_Window() at /home/zampins/Devel/petsc/src/vec/is/sf/impls/window/sfwindow.c:708
>> # [1]PETSC ERROR: #3 PetscSFSetUp() at /home/zampins/Devel/petsc/src/vec/is/sf/interface/sf.c:318
>> # [1]PETSC ERROR: #4 main() at /home/zampins/Devel/petsc/src/vec/is/sf/tutorials/ex1.c:172
>> # [1]PETSC ERROR: PETSc Option Table entries:
>> # [1]PETSC ERROR: -sf_type window
>> # [1]PETSC ERROR: -sf_window_flavor create
>> # [1]PETSC ERROR: -sf_window_sync active
>> # [1]PETSC ERROR: -test_gather
>> # [1]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------
>> # -sf_window_sync active
>> # [3]PETSC ERROR: -test_gather
>> # [3]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------
>> # --------------------------------------------------------------------------
>> # MPI_ABORT was invoked on rank 2 in communicator MPI_COMM_WORLD
>> # with errorcode 98.
>> # 
>> # NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
>> # You may or may not see output from other processes, depending on
>> # exactly when Open MPI kills them.
>> # --------------------------------------------------------------------------
>> # [kanary.kaust.edu.sa:115527] 2 more processes have sent help message help-mpi-api.txt / mpi-abort
>> # [kanary.kaust.edu.sa:115527] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
>>  ok vec_is_sf_tutorials-ex1_4 # SKIP Command failed so no diff
>> # retrying vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-dynamic
>> not ok vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-dynamic # Error code: 98
>> # [1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>> # [1]PETSC ERROR: General MPI error 
>> # [1]PETSC ERROR: MPI error 6 MPI_ERR_RANK: invalid rank
>> # [1]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
>> # [1]PETSC ERROR: Petsc Development GIT revision: v3.15.4-783-g168bb9f76b  GIT Date: 2021-09-13 14:01:22 +0000
>> # [1]PETSC ERROR: ../ex1 on a arch-debug named kanary.kaust.edu.sa by zampins Tue Sep 14 09:31:45 2021
>> # [1]PETSC ERROR: Configure options --with-cc=/home/zampins/local/bin/mpicc --with-cxx-dialect=c++14 --with-cxx=/home/zampins/local/bin/mpicxx --with-debugging=1 --with-fc=/home/zampins/local/bin/mpifort --with-fortran-bindings=0 --with-hip-dir=/opt/rocm --with-hip=1 --with-hypre-dir=/home/zampins/local-petsc --with-kokkos-dir=/home/zampins/local-petsc --with-kokkos-kernels-dir=/home/zampins/local-petsc --with-blaslapack-include=/home/zampins/local-aocl/aocc/3.0-6/include --with-blaslapack-lib="[/home/zampins/local-aocl/aocc/3.0-6/lib/liblapacke.a,/home/zampins/local-aocl/aocc/3.0-6/lib/libflame.a,/home/zampins/local-aocl/aocc/3.0-6/lib/libblis-mt.a]" HIPFLAGS=--amdgpu-target=gfx908 HIPPPFLAGS=-I/home/zampins/local-petsc/include PETSC_ARCH=arch-debug
>> # [1]PETSC ERROR: #1 PetscSFGetGroups() at /home/zampins/Devel/petsc/src/vec/is/sf/interface/sf.c:1166
>> # [1]PETSC ERROR: #2 PetscSFSetUp_Window() at /home/zampins/Devel/petsc/src/vec/is/sf/impls/window/sfwindow.c:708
>> # [1]PETSC ERROR: #3 PetscSFSetUp() at /home/zampins/Devel/petsc/src/vec/is/sf/interface/sf.c:318
>> # [1]PETSC ERROR: #4 main() at /home/zampins/Devel/petsc/src/vec/is/sf/tutorials/ex1.c:172
>> # [1]PETSC ERROR: PETSc Option Table entries:
>> # [1]PETSC ERROR: -sf_type window
>> # [1]PETSC ERROR: [2]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>> # [2]PETSC ERROR: General MPI error 
>> # [2]PETSC ERROR: MPI error 6 MPI_ERR_RANK: invalid rank
>> # [2]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
>> # [2]PETSC ERROR: Petsc Development GIT revision: v3.15.4-783-g168bb9f76b  GIT Date: 2021-09-13 14:01:22 +0000
>> # [2]PETSC ERROR: ../ex1 on a arch-debug named kanary.kaust.edu.sa by zampins Tue Sep 14 09:31:45 2021
>> # [2]PETSC ERROR: Configure options --with-cc=/home/zampins/local/bin/mpicc --with-cxx-dialect=c++14 --with-cxx=/home/zampins/local/bin/mpicxx --with-debugging=1 --with-fc=/home/zampins/local/bin/mpifort --with-fortran-bindings=0 --with-hip-dir=/opt/rocm --with-hip=1 --with-hypre-dir=/home/zampins/local-petsc --with-kokkos-dir=/home/zampins/local-petsc --with-kokkos-kernels-dir=/home/zampins/local-petsc --with-blaslapack-include=/home/zampins/local-aocl/aocc/3.0-6/include --with-blaslapack-lib="[/home/zampins/local-aocl/aocc/3.0-6/lib/liblapacke.a,/home/zampins/local-aocl/aocc/3.0-6/lib/libflame.a,/home/zampins/local-aocl/aocc/3.0-6/lib/libblis-mt.a]" HIPFLAGS=--amdgpu-target=gfx908 HIPPPFLAGS=-I/home/zampins/local-petsc/include PETSC_ARCH=arch-debug
>> # [2]PETSC ERROR: #1 PetscSFGetGroups() at /home/zampins/Devel/petsc/src/vec/is/sf/interface/sf.c:1166
>> # [2]PETSC ERROR: #2 PetscSFSetUp_Window() at /home/zampins/Devel/petsc/src/vec/is/sf/impls/window/sfwindow.c:708
>> # [2]PETSC ERROR: #3 PetscSFSetUp() at /home/zampins/Devel/petsc/src/vec/is/sf/interface/sf.c:318
>> # [2]PETSC ERROR: #4 main() at /home/zampins/Devel/petsc/src/vec/is/sf/tutorials/ex1.c:172
>> # [2]PETSC ERROR: PETSc Option Table entries:
>> # [2]PETSC ERROR: -sf_type window
>> # [2]PETSC ERROR: -sf_window_flavor dynamic
>> # [2]PETSC ERROR: -sf_window_sync active
>> # [2]PETSC ERROR: -test_gather
>> # [2]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------
>> # [3]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>> # [3]PETSC ERROR: General MPI error 
>> # [3]PETSC ERROR: MPI error 6 MPI_ERR_RANK: invalid rank
>> # [3]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
>> # [3]PETSC ERROR: Petsc Development GIT revision: v3.15.4-783-g168bb9f76b  GIT Date: 2021-09-13 14:01:22 +0000
>> # [3]PETSC ERROR: ../ex1 on a arch-debug named kanary.kaust.edu.sa by zampins Tue Sep 14 09:31:45 2021
>> # [3]PETSC ERROR: Configure options --with-cc=/home/zampins/local/bin/mpicc --with-cxx-dialect=c++14 --with-cxx=/home/zampins/local/bin/mpicxx --with-debugging=1 --with-fc=/home/zampins/local/bin/mpifort --with-fortran-bindings=0 --with-hip-dir=/opt/rocm --with-hip=1 --with-hypre-dir=/home/zampins/local-petsc --with-kokkos-dir=/home/zampins/local-petsc --with-kokkos-kernels-dir=/home/zampins/local-petsc --with-blaslapack-include=/home/zampins/local-aocl/aocc/3.0-6/include --with-blaslapack-lib="[/home/zampins/local-aocl/aocc/3.0-6/lib/liblapacke.a,/home/zampins/local-aocl/aocc/3.0-6/lib/libflame.a,/home/zampins/local-aocl/aocc/3.0-6/lib/libblis-mt.a]" HIPFLAGS=--amdgpu-target=gfx908 HIPPPFLAGS=-I/home/zampins/local-petsc/include PETSC_ARCH=arch-debug
>> # [3]PETSC ERROR: #1 PetscSFGetGroups() at /home/zampins/Devel/petsc/src/vec/is/sf/interface/sf.c:1166
>> # [3]PETSC ERROR: #2 PetscSFSetUp_Window() at /home/zampins/Devel/petsc/src/vec/is/sf/impls/window/sfwindow.c:708
>> # [3]PETSC ERROR: #3 PetscSFSetUp() at /home/zampins/Devel/petsc/src/vec/is/sf/interface/sf.c:318
>> # [3]PETSC ERROR: #4 main() at /home/zampins/Devel/petsc/src/vec/is/sf/tutorials/ex1.c:172
>> # [3]PETSC ERROR: PETSc Option Table entries:
>> # [3]PETSC ERROR: -sf_type window
>> # [3]PETSC ERROR: -sf_window_flavor dynamic
>> # [3]PETSC ERROR: -sf_window_sync active
>> # [3]PETSC ERROR: -test_gather
>> # [3]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------
>> # -sf_window_flavor dynamic
>> # [1]PETSC ERROR: -sf_window_sync active
>> # [1]PETSC ERROR: -test_gather
>> # [1]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------
>> # --------------------------------------------------------------------------
>> # MPI_ABORT was invoked on rank 3 in communicator MPI_COMM_WORLD
>> # with errorcode 98.
>> # 
>> # NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
>> # You may or may not see output from other processes, depending on
>> # exactly when Open MPI kills them.
>> # --------------------------------------------------------------------------
>> # [kanary.kaust.edu.sa:115572] 2 more processes have sent help message help-mpi-api.txt / mpi-abort
>> # [kanary.kaust.edu.sa:115572] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
>>  ok vec_is_sf_tutorials-ex1_4 # SKIP Command failed so no diff
>>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-allocate
>>  ok diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-active_sf_window_flavor-allocate
>>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-create
>> not ok diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-create # Error code: 1
>> # 43,46c43,46
>> # < [0] 0: 4001 2000 2002 3002 4002
>> # < [1] 0: 1001 3000
>> # < [2] 0: 2001 4000
>> # < [3] 0: 3001 1000
>> # ---
>> # > [0] 0: 4002 2146435072 2 2146435072 34619728
>> # > [1] 0: 3000 2146435072
>> # > [2] 0: 4000 2146435072
>> # > [3] 0: 3001 2146435072
>>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-dynamic
>> not ok diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-dynamic # Error code: 1
>> # 43,46c43,46
>> # < [0] 0: 4001 2000 2002 3002 4002
>> # < [1] 0: 1001 3000
>> # < [2] 0: 2001 4000
>> # < [3] 0: 3001 1000
>> # ---
>> # > [0] 0: 4002 2146435072 2 2146435072 0
>> # > [1] 0: 3000 2146435072
>> # > [2] 0: 4000 2146435072
>> # > [3] 0: 3001 2146435072
>>  ok vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-allocate
>>  ok diff-vec_is_sf_tutorials-ex1_4+sf_window_sync-lock_sf_window_flavor-allocate
>> 
>> 
>> 
>> On Tue, Sep 14, 2021 at 07:44 Stefano Zampini <stefano.zampini at gmail.com> wrote:
>> I'll see if I can reproduce 
>> 
>> On Tue, Sep 14, 2021, 06:58 Junchao Zhang <junchao.zhang at gmail.com> wrote:
>> Hi, Stefano,
>>    Ping you again to see if you want to resolve this problem before petsc-3.16 
>> 
>> --Junchao Zhang
>> 
>> 
>> On Sun, Sep 12, 2021 at 3:06 PM Antonio T. sagitter <sagitter at fedoraproject.org> wrote:
>> Unfortunately, it's not possible. I must use the OpenMPI provided by 
>> the Fedora build system (these rpm builds of PETSc are for Fedora's 
>> repositories); downloading external software is not permitted.
>> 
>> On 9/12/21 21:10, Pierre Jolivet wrote:
>> > 
>> >> On 12 Sep 2021, at 8:56 PM, Matthew Knepley <knepley at gmail.com> wrote:
>> >>
>> >> On Sun, Sep 12, 2021 at 2:49 PM Antonio T. sagitter <sagitter at fedoraproject.org> wrote:
>> >>
>> >>     Those attached are configure.log/make.log from a MPI build in
>> >>     Fedora 34
>> >>     x86_64 where the error below occurred.
>> >>
>> >>
>> >> This is OpenMPI 4.1.0. Is that the only MPI you build? My first 
>> >> inclination is that this is an MPI implementation bug.
>> >>
>> >> Junchao, do we have an OpenMPI build in the CI?
>> > 
>> > config/examples/arch-ci-linux-cuda-double-64idx.py:   
>> >   '--download-openmpi=1',
>> > config/examples/arch-ci-linux-pkgs-dbg-ftn-interfaces.py: 
>> >   '--download-openmpi=1',
>> > config/examples/arch-ci-linux-pkgs-opt.py:  '--download-openmpi=1',
>> > 
>> > config/BuildSystem/config/packages/OpenMPI.py uses version 4.1.0 as well.
>> > I’m not sure PETSc is to blame here, Antonio. You may want to ditch 
>> > the OpenMPI shipped by your package manager and try 
>> > --download-openmpi as well, just for a quick sanity check.
>> > 
>> > Thanks,
>> > Pierre
>> > 
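[Editorial aside: the sanity check suggested above would look roughly like the following. `--download-openmpi` is a real PETSc configure option; the arch name is illustrative:]

```shell
# Build PETSc against its own freshly downloaded Open MPI,
# bypassing the distribution-provided one entirely.
./configure PETSC_ARCH=arch-test-ompi --download-openmpi --with-debugging=1
make PETSC_ARCH=arch-test-ompi all
make PETSC_ARCH=arch-test-ompi check   # rerun the failing tests against this build
```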
>> 
>> -- 
>> ---
>> Antonio Trande
>> Fedora Project
>> mailto: sagitter at fedoraproject.org
>> GPG key: 0x29FBC85D7A51CC2F
>> GPG key server: https://keyserver1.pgp.com/
>> 
>> 
>> -- 
>> Stefano
>> 
>> 
>> -- 
>> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
>> -- Norbert Wiener
>> 
>> https://www.cse.buffalo.edu/~knepley/
> 
