[petsc-users] [MPI][GPU]

LEDAC Pierre Pierre.LEDAC at cea.fr
Sat Aug 30 12:16:08 CDT 2025


Hello,


My code is built with PETSc 3.23 + OpenMPI 4.1.6 (CUDA support enabled), and profiling indicates that MPI communications are done GPU-to-GPU everywhere in the code except in the PETSc part, where D2H transfers occur around the MPI calls.
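In case it helps, here is roughly how I check that the MPI library itself is CUDA-aware (a sketch; the exact command and key name may vary with the OpenMPI installation):

    # Build-time check: should report ...:value:true
    ompi_info --parsable --all | grep mpi_built_with_cuda_support:value

PETSc also has a -use_gpu_aware_mpi runtime option which, if I understand correctly, controls whether PETSc passes device buffers directly to MPI.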


I reproduced the PETSc issue with the example under src/ksp/ksp/tutorials/ex10 on 2 MPI ranks; see the output in the attached ex10.log.
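For context, this is roughly the command line used for the reproduction (a sketch; the matrix file name is a placeholder and the actual options may differ):

    mpirun -np 2 ./ex10 -f0 matrix.petsc -mat_type aijcusparse -vec_type cuda -log_view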


Below is the Nsight Systems (nsys) profile of ex10, showing the D2H and H2D copies before/after the MPI calls.
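For reference, the profile was collected roughly like this (a sketch; the exact Nsight Systems invocation and trace options may differ):

    mpirun -np 2 nsys profile --trace=cuda,mpi,nvtx \
        -o ex10_rank%q{OMPI_COMM_WORLD_RANK} \
        ./ex10 -f0 matrix.petsc -mat_type aijcusparse -vec_type cuda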


Thanks for your help,


[Nsight Systems timeline screenshot showing D2H/H2D copies around the MPI calls: see attached pastedImage.png]



Pierre LEDAC
Commissariat à l’énergie atomique et aux énergies alternatives
Centre de SACLAY
DES/ISAS/DM2S/SGLS/LCAN
Bâtiment 451 – point courrier n°41
F-91191 Gif-sur-Yvette
+33 1 69 08 04 03
+33 6 83 42 05 79
-------------- next part --------------
A non-text attachment was scrubbed...
Name: pastedImage.png
Type: image/png
Size: 123750 bytes
Desc: pastedImage.png
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20250830/a580f91a/attachment-0001.png>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: ex10.log
Type: text/x-log
Size: 31221 bytes
Desc: ex10.log
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20250830/a580f91a/attachment-0001.bin>

