[petsc-users] [MPI][GPU]

Barry Smith bsmith at petsc.dev
Sat Aug 30 14:47:07 CDT 2025


Did you try the additional option -vec_type cuda with ex10.c?
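
If it helps, here is a minimal sketch of the kind of run I have in mind (assuming your matrix file is the one you already pass to ex10 via -f0; apart from -vec_type cuda these are just the usual GPU-related options, adjust to your setup):

    mpiexec -n 2 ./ex10 -f0 <your_matrix_file> \
        -vec_type cuda -mat_type aijcusparse \
        -use_gpu_aware_mpi 1 -log_view

With -vec_type cuda (and -mat_type aijcusparse) the vectors and the matrix live on the GPU, so with a CUDA-aware MPI the VecScatter neighbor exchanges should stay on the device. The -log_view summary should also report CpuToGpu/GpuToCpu copy counts, which lets you check whether the D2H transfers go away.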



> On Aug 30, 2025, at 1:16 PM, LEDAC Pierre <Pierre.LEDAC at cea.fr> wrote:
> 
> Hello,
> 
> My code is built with PETSc 3.23 + OpenMPI 4.1.6 (CUDA support enabled), and profiling indicates that MPI communications are done directly between GPUs everywhere in the code except in the PETSc part, where D2H transfers occur.
> 
> I reproduced the PETSc issue with the example src/ksp/ksp/tutorials/ex10 on 2 MPI ranks; see the output in the attached ex10.log.
> 
> Also below is the Nsight Systems (nsys) profile of ex10, showing the D2H and H2D copies before/after the MPI calls.
> 
> Thanks for your help,
> 
> <pastedImage.png>
> 
> 
> Pierre LEDAC
> Commissariat à l’énergie atomique et aux énergies alternatives
> Centre de SACLAY
> DES/ISAS/DM2S/SGLS/LCAN
> Bâtiment 451 – point courrier n°41
> F-91191 Gif-sur-Yvette
> +33 1 69 08 04 03
> +33 6 83 42 05 79
> <ex10.log>
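
And in case the question comes up for your own code (not ex10): a minimal sketch, assuming a CUDA-enabled PETSc build (and PETSc >= 3.18 for PetscCall), of how an application can put its PETSc vectors on the GPU so the MPI exchanges inside PETSc can stay device-resident. This is illustrative only, not taken from your code.

    #include <petscvec.h>

    int main(int argc, char **argv)
    {
      Vec      x;
      PetscInt n = 100;

      PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
      PetscCall(VecCreate(PETSC_COMM_WORLD, &x));
      PetscCall(VecSetSizes(x, PETSC_DECIDE, n));
      /* Honor -vec_type cuda given on the command line; alternatively,
         hard-code the back end with VecSetType(x, VECCUDA). */
      PetscCall(VecSetFromOptions(x));
      PetscCall(VecSet(x, 1.0));
      PetscCall(VecDestroy(&x));
      PetscCall(PetscFinalize());
      return 0;
    }

Vectors obtained from a DM or created from a Mat follow the corresponding -dm_vec_type / -mat_type settings instead.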


