[petsc-users] [GPU] Jacobi preconditioner

LEDAC Pierre Pierre.LEDAC at cea.fr
Mon Jul 28 02:50:53 CDT 2025


Hello all,


We are using PETSc to solve a linear system that is updated every time step (the stencil is constant but the coefficients change).


The matrix is preallocated once with MatSetPreallocationCOO() and then filled each time step with MatSetValuesCOO(); we pass device pointers for coo_i, coo_j, and the coefficient values.
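
For reference, our assembly follows this pattern (a minimal sketch; d_coo_i, d_coo_j, and d_vals are illustrative names for device buffers we manage ourselves, not our actual code):

    #include <petscmat.h>

    /* Sketch of the per-time-step assembly described above. The
     * d_coo_* arguments are assumed to be device pointers. */
    PetscErrorCode AssembleStep(Mat A, PetscCount ncoo,
                                PetscInt *d_coo_i, PetscInt *d_coo_j,
                                PetscScalar *d_vals, PetscBool first)
    {
      PetscFunctionBeginUser;
      if (first) {
        PetscCall(MatSetType(A, MATAIJCUSPARSE));
        /* pattern set once; PETSc keeps it for later MatSetValuesCOO() */
        PetscCall(MatSetPreallocationCOO(A, ncoo, d_coo_i, d_coo_j));
      }
      /* called every time step with the new coefficients */
      PetscCall(MatSetValuesCOO(A, d_vals, INSERT_VALUES));
      PetscFunctionReturn(PETSC_SUCCESS);
    }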


This works fine with a GMRES KSP solver and a Jacobi PC, but we are surprised to see that at every time step, during PCSetUp, MatGetDiagonal_SeqAIJ is called even though the matrix is on the device. Looking at the API, it seems there is no MatGetDiagonal_SeqAIJCUSPARSE(), only a MatGetDiagonal_SeqAIJKOKKOS().
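
As I understand it, the Jacobi setup essentially amounts to the following (a sketch, not the actual PETSc source), which is why the MatGetDiagonal implementation determines whether the setup stays on the device:

    #include <petscmat.h>

    /* Sketch of what PCSetUp for PCJACOBI amounts to, as I understand
     * it: extract the diagonal and invert it entrywise. If the matrix
     * type only provides MatGetDiagonal_SeqAIJ, the diagonal is
     * extracted on the host even though the matrix data is on the GPU. */
    static PetscErrorCode JacobiSetupSketch(Mat A, Vec *invdiag)
    {
      PetscFunctionBeginUser;
      PetscCall(MatCreateVecs(A, invdiag, NULL));
      PetscCall(MatGetDiagonal(A, *invdiag)); /* host fallback for AIJCUSPARSE? */
      PetscCall(VecReciprocal(*invdiag));     /* runs on device for GPU vecs */
      PetscFunctionReturn(PETSC_SUCCESS);
    }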


Does this mean we should use the Kokkos backend in PETSc to have the Jacobi preconditioner built directly on the device? Or am I doing something wrong?
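
If so, I assume switching would just mean selecting the Kokkos types (assuming a PETSc build configured with Kokkos), e.g.:

    -mat_type aijkokkos -vec_type kokkos

or MatSetType(A, MATAIJKOKKOS) in code.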

NB: GMRES itself runs well on the device.


I could use -ksp_reuse_preconditioner to avoid the Jacobi preconditioner being rebuilt on the host at each solve, but that significantly increases the number of iterations.
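
For completeness, what I tried is the in-code equivalent of that option:

    /* Keep the existing preconditioner across solves instead of
     * rebuilding it each time step; with changing coefficients the
     * Jacobi diagonal then goes stale, hence the extra iterations. */
    PetscCall(KSPSetReusePreconditioner(ksp, PETSC_TRUE));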


Thanks,





Pierre LEDAC
Commissariat à l’énergie atomique et aux énergies alternatives
Centre de SACLAY
DES/ISAS/DM2S/SGLS/LCAN
Bâtiment 451 – point courrier n°41
F-91191 Gif-sur-Yvette
+33 1 69 08 04 03
+33 6 83 42 05 79