[petsc-users] Port existing GMRES+ILU(0) implementation to GPU

Junchao Zhang junchao.zhang at gmail.com
Mon Feb 9 17:18:04 CST 2026


Hi Feng,
  As a first step, you don't need to change your CPU implementation at all.
Then profile to see where your effort is best spent.  You may eventually
need to assemble your matrices and vectors on the GPU as well, but you can
decide that at a later stage.
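  For the profiling step, PETSc's built-in -log_view summary is a good
starting point.  A minimal sketch (the executable name ./my_solver is
hypothetical; substitute your own binary):

```shell
# Run the existing, unmodified CPU build with PETSc's performance
# summary enabled.  The "Event" table in the output shows where time
# is spent (e.g. KSPSolve, MatMult, PCApply), which indicates what is
# worth moving to the GPU.
mpiexec -n 4 ./my_solver -log_view
```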

  Thanks!
--Junchao Zhang


On Mon, Feb 9, 2026 at 4:31 PM feng wang <snailsoar at hotmail.com> wrote:

> Hi Junchao,
>
> Many thanks for your reply.
>
> This is great!  Do I need to change anything in my current CPU
> implementation? Or can I just link against a version of PETSc configured
> with CUDA, make sure the necessary data are copied to the "device", and
> then let PETSc do the rest of the magic for me?
>
> Thanks,
> Feng
> ------------------------------
> *From:* Junchao Zhang <junchao.zhang at gmail.com>
> *Sent:* 09 February 2026 1:55
> *To:* feng wang <snailsoar at hotmail.com>
> *Cc:* petsc-users at mcs.anl.gov <petsc-users at mcs.anl.gov>
> *Subject:* Re: [petsc-users] Port existing GMRES+ILU(0) implementation to
> GPU
>
> Hello Feng,
>   It is possible to run GMRES with ILU(0) on GPUs.  You need to
> configure PETSc with CUDA (--with-cuda --with-cudac=nvcc) or Kokkos (with
> the extra options --download-kokkos --download-kokkos-kernels).  Then run
> with -mat_type {aijcusparse or aijkokkos}  -vec_type {cuda or kokkos}.
>   Be aware that the triangular solves in ILU(0) are not GPU friendly, so
> performance might be poor.  It is still worth trying, though.
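>
>   For reference, a minimal sketch of the two steps above for the CUDA
> path (the executable name ./my_solver is hypothetical):
>
> ```shell
> # Configure PETSc with CUDA support; for the Kokkos path, add
> # --download-kokkos --download-kokkos-kernels as well.
> ./configure --with-cuda --with-cudac=nvcc
>
> # Run the unmodified solver, switching the matrix and vector types
> # to their GPU-backed variants at the command line.
> mpiexec -n 2 ./my_solver -ksp_type gmres -pc_type ilu \
>     -mat_type aijcusparse -vec_type cuda
> ```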
>
> Thanks!
> --Junchao Zhang
>
> On Sun, Feb 8, 2026 at 5:46 PM feng wang <snailsoar at hotmail.com> wrote:
>
> Dear All,
>
I have an existing implementation of GMRES with ILU(0), and it works well
on the CPU.  Going through the PETSc documentation, it seems PETSc has some
support for GPUs. Is it possible for me to run GMRES with ILU(0) on GPUs?
>
> Many thanks for your help in advance,
> Feng
>
>

