[petsc-users] [GPU] Jacobi preconditioner

Junchao Zhang junchao.zhang at gmail.com
Tue Jul 29 10:36:45 CDT 2025


On Tue, Jul 29, 2025 at 2:23 AM LEDAC Pierre <Pierre.LEDAC at cea.fr> wrote:

> Thanks for your confirmation. If I read you correctly:
>
> *The memory on the host won't be written until needed,  so it won't affect
> performance if all operations are done on the device.*
>
>
> That means I am performing an operation that forces the matrix to be copied
> back to the host, probably the Jacobi preconditioner setup.
>
Yes. With the petsc/cuda backend, since MatGetDiagonal is not
implemented on the device, PETSc will copy the matrix from the device to the
host and do the work there.


> And many thanks for the work done on the new API with COO format !
>
>
> Pierre LEDAC
> Commissariat à l’énergie atomique et aux énergies alternatives
> Centre de SACLAY
> DES/ISAS/DM2S/SGLS/LCAN
> Bâtiment 451 – point courrier n°41
> F-91191 Gif-sur-Yvette
> +33 1 69 08 04 03
> +33 6 83 42 05 79
> ------------------------------
> *From:* Junchao Zhang <junchao.zhang at gmail.com>
> *Sent:* Monday, July 28, 2025 19:25:20
> *To:* LEDAC Pierre
> *Cc:* petsc-users at mcs.anl.gov
> *Subject:* Re: [petsc-users] [GPU] Jacobi preconditioner
>
> Currently we always allocate matrices on the host, so that when you call
> an operation not yet implemented on the device, we have a fallback.  The
> memory on the host won't be written until needed,  so it won't affect
> performance if all operations are done on the device.
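[Editor's note: one way to check whether an operation is silently falling back to the host is PETSc's performance logging; assuming a reasonably recent PETSc build with GPU support, running with the options below makes the log summary report per-event CPU-GPU data transfers and GPU time.]

```
-log_view -log_view_gpu_time
```

A spike in device-to-host copy counts under the PCSetUp event would confirm the fallback described above.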
>
> --Junchao Zhang
>
>
> On Mon, Jul 28, 2025 at 11:45 AM LEDAC Pierre <Pierre.LEDAC at cea.fr> wrote:
>
>> Thanks, I will give the Kokkos backend a try.
>>
>> I have just noticed that even if we use MatSetPreallocationCOO() with
>> device pointers, the matrix seems to be preallocated on the host as well. Am
>> I wrong, or is the strategy to keep the matrix on both host and device even
>> if only the latter is needed?
>>
>> Thanks
>> ------------------------------
>> *From:* Junchao Zhang <junchao.zhang at gmail.com>
>> *Sent:* Monday, July 28, 2025 17:43:56
>> *To:* LEDAC Pierre
>> *Cc:* petsc-users at mcs.anl.gov
>> *Subject:* Re: [petsc-users] [GPU] Jacobi preconditioner
>>
>> Yes, MatGetDiagonal_SeqAIJCUSPARSE hasn't been implemented.  The petsc/cuda
>> and petsc/kokkos backends are separate code.
>> If petsc/kokkos meets your needs, then just use it.  For PETSc users, we
>> hope the only difference will be the extra --download-kokkos
>> --download-kokkos-kernels options at configure time.
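[Editor's note: a sketch of what this looks like in practice, to adapt to your environment. Building with the Kokkos backend and then selecting the Kokkos matrix/vector types at run time gives MatGetDiagonal a device implementation:]

```
./configure --with-cuda --download-kokkos --download-kokkos-kernels
```

```
-mat_type aijkokkos -vec_type kokkos
```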
>>
>> --Junchao Zhang
>>
>>
>> On Mon, Jul 28, 2025 at 2:51 AM LEDAC Pierre <Pierre.LEDAC at cea.fr> wrote:
>>
>>> Hello all,
>>>
>>>
>>> We are using PETSc to solve a linear system that is updated at every time
>>> step (constant stencil, but changing coefficients).
>>>
>>>
>>> The matrix is preallocated once with MatSetPreallocationCOO() and then
>>> filled at each time step with MatSetValuesCOO(), using device pointers
>>> for coo_i, coo_j, and the coefficient values.
>>>
>>>
>>> It is working fine with a GMRES KSP solver and a Jacobi preconditioner,
>>> but we are surprised to see that at every time step, during PCSetUp,
>>> MatGetDiagonal_SeqAIJ is called even though the matrix is on the device.
>>> Looking at the API, it seems there is no MatGetDiagonal_SeqAIJCUSPARSE(),
>>> but there is a MatGetDiagonal_SeqAIJKOKKOS().
>>>
>>>
>>> Does this mean we should use the Kokkos backend in PETSc to have the
>>> Jacobi preconditioner built directly on the device? Or am I doing
>>> something wrong?
>>>
>>> NB: GMRES itself is running well on the device.
>>>
>>>
>>> I could use -ksp_reuse_preconditioner to avoid the Jacobi preconditioner
>>> being rebuilt on the host at each solve, but it significantly increases
>>> the number of iterations.
>>>
>>>
>>> Thanks,
>>>
>>>
>>>
>>>
>>>
>>> Pierre LEDAC
>>> Commissariat à l’énergie atomique et aux énergies alternatives
>>> Centre de SACLAY
>>> DES/ISAS/DM2S/SGLS/LCAN
>>> Bâtiment 451 – point courrier n°41
>>> F-91191 Gif-sur-Yvette
>>> +33 1 69 08 04 03
>>> +33 6 83 42 05 79
>>>
>>

