[petsc-users] [GPU] Jacobi preconditioner

LEDAC Pierre Pierre.LEDAC at cea.fr
Thu Jul 31 05:46:41 CDT 2025


Thanks Barry, I agree but didn't dare ask for that.

Pierre LEDAC
Commissariat à l’énergie atomique et aux énergies alternatives
Centre de SACLAY
DES/ISAS/DM2S/SGLS/LCAN
Bâtiment 451 – point courrier n°41
F-91191 Gif-sur-Yvette
+33 1 69 08 04 03
+33 6 83 42 05 79
________________________________
De : Barry Smith <bsmith at petsc.dev>
Envoyé : mercredi 30 juillet 2025 20:34:26
À : Junchao Zhang
Cc : LEDAC Pierre; petsc-users at mcs.anl.gov
Objet : Re: [petsc-users] [GPU] Jacobi preconditioner


   We absolutely should have a MatGetDiagonal_SeqAIJCUSPARSE(). It's somewhat embarrassing that we don't provide this.

   I have found some potential code at https://stackoverflow.com/questions/60311408/how-to-get-the-diagonal-of-a-sparse-matrix-in-cusparse

   Barry




On Jul 28, 2025, at 11:43 AM, Junchao Zhang <junchao.zhang at gmail.com> wrote:

Yes, MatGetDiagonal_SeqAIJCUSPARSE hasn't been implemented.  The petsc/cuda and petsc/kokkos backends are separate code.
If petsc/kokkos meets your needs, then just use it.  For PETSc users, we hope the only difference will be the extra --download-kokkos --download-kokkos-kernels at configure time.
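
For concreteness, the switch would look roughly like the following (a config fragment only; the --with-cuda flag and anything else site-specific are assumptions about your existing build):

```shell
# Hypothetical PETSc configure invocation; keep your existing options
# and add the two Kokkos download flags.
./configure --with-cuda --download-kokkos --download-kokkos-kernels
```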

--Junchao Zhang


On Mon, Jul 28, 2025 at 2:51 AM LEDAC Pierre <Pierre.LEDAC at cea.fr> wrote:

Hello all,


We are solving with PETSc a linear system updated every time step (constant stencil but coefficients changing).


The matrix is preallocated once with MatSetPreallocationCOO() and then filled each time step with MatSetValuesCOO(); we use device pointers for coo_i, coo_j, and the coefficient values.


It works fine with a GMRES KSP solver and PC Jacobi, but we are surprised to see that at every time step, during PCSetUp, MatGetDiagonal_SeqAIJ is called even though the matrix is on the device. Looking at the API, it seems there is no MatGetDiagonal_SeqAIJCUSPARSE(), only a MatGetDiagonal_SeqAIJKOKKOS().


Does this mean we should use the Kokkos backend in PETSc to have the Jacobi preconditioner built directly on the device? Or am I doing something wrong?

NB: GMRES is running well on the device.


I could use -ksp_reuse_preconditioner to avoid the Jacobi preconditioner being rebuilt on the host at each solve, but it significantly increases the number of iterations.


Thanks,







