<div dir="ltr"><div>Just found out that if we configure with CUDA and then want to run on CPU only using CUDA_VISIBLE_DEVICES=-1, PETSc errors out. Is this intended behavior? I would expect it to work.</div><div>This is with main.<br></div><div><br></div><div>(ecrcml-cuda) zampins@qaysar:~/miniforge/Devel/petsc$ make check</div>Running check examples to verify correct installation<br>Using PETSC_DIR=/home/zampins/miniforge/Devel/petsc and PETSC_ARCH=arch-ecrcml-cuda-double<br>C/C++ example src/snes/tutorials/ex19 run successfully with 1 MPI process<br>C/C++ example src/snes/tutorials/ex19 run successfully with 2 MPI processes<br>C/C++ example src/snes/tutorials/ex19 run successfully with cuda<br>Completed test examples<br><div><br></div><div>(ecrcml-cuda) zampins@qaysar:~/miniforge/Devel/petsc$ make check CUDA_VISIBLE_DEVICES=1</div>Running check examples to verify correct installation<br>Using PETSC_DIR=/home/zampins/miniforge/Devel/petsc and PETSC_ARCH=arch-ecrcml-cuda-double<br>C/C++ example src/snes/tutorials/ex19 run successfully with 1 MPI process<br>C/C++ example src/snes/tutorials/ex19 run successfully with 2 MPI processes<br>C/C++ example src/snes/tutorials/ex19 run successfully with cuda<br>Completed test examples<br><div><br></div><div>(ecrcml-cuda) zampins@qaysar:~/miniforge/Devel/petsc$ make check CUDA_VISIBLE_DEVICES=-1</div>Running check examples to verify correct installation<br>Using PETSC_DIR=/home/zampins/miniforge/Devel/petsc and PETSC_ARCH=arch-ecrcml-cuda-double<br>Possible error running C/C++ src/snes/tutorials/ex19 with 1 MPI process<br>See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html">http://www.mcs.anl.gov/petsc/documentation/faq.html</a><br>[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------<br>[0]PETSC ERROR: GPU error <br>[0]PETSC ERROR: cuda error 100 (cudaErrorNoDevice) : no CUDA-capable device is detected<br>[0]PETSC ERROR: See <a 
href="https://petsc.org/release/faq/">https://petsc.org/release/faq/</a> for trouble shooting.<br>[0]PETSC ERROR: Petsc Development GIT revision: v3.16.0-368-g72b201b202  GIT Date: 2021-10-29 14:48:19 +0300<br>[0]PETSC ERROR: ./ex19 on a arch-ecrcml-cuda-double named <a href="http://qaysar.kaust.edu.sa">qaysar.kaust.edu.sa</a> by zampins Mon Nov  1 18:06:12 2021<br>[0]PETSC ERROR: Configure options --with-blaslapack-include=/home/zampins/miniforge/envs/ecrcml-cuda/include --with-blaslapack-lib=/home/zampins/miniforge/envs/ecrcml-cuda/lib/libmkl_rt.so --download-h2opus --with-cuda --with-kblas-dir=/home/zampins/miniforge/envs/ecrcml-cuda --with-magma-dir=/home/zampins/miniforge/envs/ecrcml-cuda --LDFLAGS=/usr/lib/x86_64-linux-gnu/libcuda.so --with-debugging=1 --with-openmp --with-precision=double --with-fc=0 PETSC_ARCH=arch-ecrcml-cuda-double PETSC_DIR=/home/zampins/miniforge/Devel/petsc<br>[0]PETSC ERROR: #1 initialize() at /home/zampins/miniforge/Devel/petsc/src/sys/objects/device/impls/cupm/cupmdevice.cxx:302<br>[0]PETSC ERROR: #2 PetscDeviceInitializeTypeFromOptions_Private() at /home/zampins/miniforge/Devel/petsc/src/sys/objects/device/interface/device.cxx:292<br>[0]PETSC ERROR: #3 PetscDeviceInitializeFromOptions_Internal() at /home/zampins/miniforge/Devel/petsc/src/sys/objects/device/interface/device.cxx:417<br>[0]PETSC ERROR: #4 PetscInitialize_Common() at /home/zampins/miniforge/Devel/petsc/src/sys/objects/pinit.c:956<br>[0]PETSC ERROR: #5 PetscInitialize() at /home/zampins/miniforge/Devel/petsc/src/sys/objects/pinit.c:1231<br>--------------------------------------------------------------------------<br>Primary job  terminated normally, but 1 process returned<br>a non-zero exit code. Per user-direction, the job has been aborted.<br>--------------------------------------------------------------------------<br>--------------------------------------------------------------------------<br><br clear="all"></div>