[petsc-users] [EXTERNAL] Re: Kokkos Interface for PETSc

Fackler, Philip facklerpw at ornl.gov
Wed Feb 23 08:47:03 CST 2022


Thanks Jed, Satish, and Richard for the quick and thorough responses.

Philip Fackler
Research Software Engineer, Application Engineering Group
Advanced Computing Systems Research Section
Computer Science and Mathematics Division
Oak Ridge National Laboratory
________________________________
From: petsc-users <petsc-users-bounces at mcs.anl.gov> on behalf of Richard Tran Mills via petsc-users <petsc-users at mcs.anl.gov>
Sent: Thursday, February 17, 2022 18:33
To: petsc-users <petsc-users at mcs.anl.gov>
Cc: Blondel, Sophie <sblondel at utk.edu>; Roth, Philip <rothpc at ornl.gov>; xolotl-psi-development at lists.sourceforge.net <xolotl-psi-development at lists.sourceforge.net>
Subject: [EXTERNAL] Re: [petsc-users] Kokkos Interface for PETSc

Hi Philip,

Sorry to be a bit late in my reply. Jed has explained the gist of what's involved with using the Kokkos/Kokkos-kernels back-end for the PETSc solves, though, depending on exactly how Xolotl creates its vectors, there may be a bit of work required to ensure that the command-line options specifying the matrix and GPU types get applied to the right objects, and that non-GPU types are not being hardcoded somewhere (by a call like "DMSetMatType(dm,MATAIJ)").
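
For illustration, here is a minimal sketch (not Xolotl's actual code; the DMDA creation and the sizes are just placeholders) of creating the Jacobian and solution vector through the DM, so that runtime options like "-dm_mat_type aijkokkos -dm_vec_type kokkos" take effect rather than a hardcoded host type:

#include <petscdmda.h>

int main(int argc,char **argv)
{
  DM             dm;
  Mat            J;
  Vec            x;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc,&argv,NULL,NULL);if (ierr) return ierr;
  ierr = DMDACreate1d(PETSC_COMM_WORLD,DM_BOUNDARY_NONE,128,1,1,NULL,&dm);CHKERRQ(ierr);
  ierr = DMSetFromOptions(dm);CHKERRQ(ierr);         /* honors -dm_mat_type / -dm_vec_type from the command line */
  ierr = DMSetUp(dm);CHKERRQ(ierr);
  ierr = DMCreateMatrix(dm,&J);CHKERRQ(ierr);        /* J gets the runtime-selected matrix type */
  ierr = DMCreateGlobalVector(dm,&x);CHKERRQ(ierr);  /* x gets the runtime-selected vector type */
  /* ... hand dm, J, and x to TS/SNES as usual ... */
  ierr = MatDestroy(&J);CHKERRQ(ierr);
  ierr = VecDestroy(&x);CHKERRQ(ierr);
  ierr = DMDestroy(&dm);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}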

In addition to looking at the -log_view output, since Xolotl uses TS you can specify "-ts_view" and look at the output that describes the solver hierarchy that Xolotl sets up. If matrix types are being set correctly, you'll see things like

      Mat Object: 1 MPI processes
        type: seqaijkokkos

(I note that I've also sent a related message about getting Xolotl working with Kokkos back-ends on Summit to you, Sophie, and Phil, in reply to an old thread about this.)

Were you also asking about how to use Kokkos for PETSc matrix assembly, or is that a question for later?

Cheers,
Richard

On 2/15/22 09:07, Satish Balay via petsc-users wrote:

Also - perhaps the following info might be useful

Satish

----

balay at sb /home/balay/petsc (main=)
$ git grep -l download-kokkos-kernels config/examples
config/examples/arch-ci-freebsd-cxx-cmplx-pkgs-dbg.py
config/examples/arch-ci-linux-cuda-double.py
config/examples/arch-ci-linux-gcc-ifc-cmplx.py
config/examples/arch-ci-linux-hip-double.py
config/examples/arch-ci-linux-pkgs-dbg-ftn-interfaces.py
config/examples/arch-ci-linux-pkgs-valgrind.py
config/examples/arch-ci-osx-cxx-pkgs-opt.py
config/examples/arch-nvhpc.py
config/examples/arch-olcf-crusher.py
config/examples/arch-olcf-spock.py
balay at sb /home/balay/petsc (main=)
$ git grep -l "requires:.*kokkos_kernels"
src/ksp/ksp/tests/ex3.c
src/ksp/ksp/tests/ex43.c
src/ksp/ksp/tests/ex60.c
src/ksp/ksp/tutorials/ex7.c
src/mat/tests/ex123.c
src/mat/tests/ex132.c
src/mat/tests/ex2.c
src/mat/tests/ex250.c
src/mat/tests/ex251.c
src/mat/tests/ex252.c
src/mat/tests/ex254.c
src/mat/tests/ex5.c
src/mat/tests/ex62.c
src/mat/tutorials/ex5k.kokkos.cxx
src/snes/tests/ex13.c
src/snes/tutorials/ex13.c
src/snes/tutorials/ex3k.kokkos.cxx
src/snes/tutorials/ex56.c
src/ts/utils/dmplexlandau/tutorials/ex1.c
src/ts/utils/dmplexlandau/tutorials/ex1f90.F90
src/ts/utils/dmplexlandau/tutorials/ex2.c
src/vec/vec/tests/ex21.c
src/vec/vec/tests/ex22.c
src/vec/vec/tests/ex23.c
src/vec/vec/tests/ex28.c
src/vec/vec/tests/ex34.c
src/vec/vec/tests/ex37.c
src/vec/vec/tests/ex38.c
src/vec/vec/tests/ex4.c
src/vec/vec/tests/ex43.c
src/vec/vec/tests/ex60.c
src/vec/vec/tutorials/ex1.c
balay at sb /home/balay/petsc (main=)
$

On Tue, 15 Feb 2022, Satish Balay via petsc-users wrote:



Also - best to use petsc repo - 'main' branch.

And for install on crusher - check config/examples/arch-olcf-crusher.py

Satish

On Tue, 15 Feb 2022, Jed Brown wrote:



We need to make these docs more explicit, but the short answer is configure with --download-kokkos --download-kokkos-kernels and run almost any example with -dm_mat_type aijkokkos -dm_vec_type kokkos. If you run with -log_view, you should see that all the flops take place on the device and there are few host->device transfers. Message packing is done on the device and it'll use GPU-aware MPI. There are a few examples of residual evaluation and matrix assembly on the device using Kokkos. You can also see libCEED examples for assembly on the device into Kokkos matrices and vectors without touching host memory.
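
As a rough sketch (assuming PETSC_DIR/PETSC_ARCH are set appropriately; the GPU flag and the example chosen here are placeholders, e.g. use --with-hip instead of --with-cuda on an AMD machine like Crusher):

  ./configure --download-kokkos --download-kokkos-kernels --with-cuda
  make all
  cd src/snes/tutorials
  make ex13
  ./ex13 -dm_mat_type aijkokkos -dm_vec_type kokkos -log_view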

"Fackler, Philip via petsc-users" <petsc-users at mcs.anl.gov><mailto:petsc-users at mcs.anl.gov> writes:



We're intending to transition the Xolotl interfaces with PETSc to the Kokkos-based back-end.

I am hoping someone can point us to some documentation (and examples) for using PETSc's Kokkos-based interface. If this does not yet exist, then perhaps some slides (like the ones Richard Mills showed at the NE-SciDAC all-hands meeting) showing some examples could get us started.

Thanks for any help that can be provided,

Philip Fackler
Research Software Engineer, Application Engineering Group
Advanced Computing Systems Research Section
Computer Science and Mathematics Division
Oak Ridge National Laboratory