[petsc-users] Kokkos Interface for PETSc

Jed Brown jed at jedbrown.org
Tue Feb 15 10:43:12 CST 2022

We need to make these docs more explicit, but the short answer is configure with --download-kokkos --download-kokkos-kernels and run almost any example with -dm_mat_type aijkokkos -dm_vec_type kokkos. If you run with -log_view, you should see that all the flops take place on the device and there are few host->device transfers. Message packing is done on the device and it'll use GPU-aware MPI. There are a few examples of residual evaluation and matrix assembly on the device using Kokkos. You can also see libCEED examples for assembly on the device into Kokkos matrices and vectors without touching host memory.
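The steps above can be sketched as a command sequence. The configure flags and runtime options are the ones named in the answer; the example binary and tutorial directory are assumptions for illustration (most DM-based PETSc examples accept these options).

```shell
# Configure PETSc with Kokkos and Kokkos Kernels (flags from the answer above)
./configure --download-kokkos --download-kokkos-kernels
make

# Run an example with Kokkos-backed matrix and vector types, and print the
# performance log to confirm flops happen on the device.
# (src/snes/tutorials/ex19 is just an illustrative choice of example.)
cd src/snes/tutorials && make ex19
./ex19 -dm_mat_type aijkokkos -dm_vec_type kokkos -log_view
```

In the -log_view output, check that the GPU flop counts dominate and that the number of host-to-device copies stays small.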

"Fackler, Philip via petsc-users" <petsc-users at mcs.anl.gov> writes:

> We're intending to transition Xolotl's interfaces with PETSc.
> I am hoping someone can point us to some documentation (and examples) for using PETSc's Kokkos-based interface. If this does not yet exist, then perhaps some slides (like the ones Richard Mills showed at the NE-SciDAC all-hands meeting) showing some examples could get us started.
> Thanks for any help that can be provided,
> Philip Fackler
> Research Software Engineer, Application Engineering Group
> Advanced Computing Systems Research Section
> Computer Science and Mathematics Division
> Oak Ridge National Laboratory