[petsc-users] Dense Matrix Factorization/Solve

Junchao Zhang junchao.zhang at gmail.com
Wed Jul 24 17:03:58 CDT 2024


Currently we don't support Kokkos dense matrices and their solvers.  You can
use MATSEQDENSECUDA/HIP instead.

--Junchao Zhang
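
For reference, the single-rank GPU path suggested in this thread boils down to
the run-time options below. This is only a sketch: ./myapp stands in for the
user's application binary, and it assumes the application calls
MatSetFromOptions/KSPSetFromOptions so these options are honored.

```shell
# Dense SPD system on one GPU: store A as MATSEQDENSECUDA and select the
# dense Cholesky factorization from the "cupm" solver package.
./myapp -mat_type seqdensecuda -pc_type cholesky -pc_factor_mat_solver_type cupm
```

When driven through KSPSolve, the factorization is computed on the first solve
and reused for subsequent right-hand sides as long as the matrix is unchanged,
which covers the "factor beforehand, triangular-solve later" use case in the
original question.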


On Wed, Jul 24, 2024 at 2:08 PM Barry Smith <bsmith at petsc.dev> wrote:

>
>    For one MPI rank, it looks like you can use -pc_type cholesky
> -pc_factor_mat_solver_type cupm, though it is not documented in
> https://petsc.org/release/overview/linear_solve_table/#direct-solvers
>
>    Or, if you also ./configure --download-kokkos --download-kokkos-kernels,
> you can use -pc_factor_mat_solver_type kokkos. This may also work for
> multiple GPUs, but that is not documented in the table either (Junchao).
> Nor are the sparse Kokkos or CUDA solvers (if they exist) documented in
> the table.
>
>
>    Barry
>
>
>
> On Jul 24, 2024, at 2:44 PM, Sreeram R Venkat <srvenkat at utexas.edu> wrote:
>
> I have an SPD dense matrix of size NxN, where N can range from 10^4-10^5.
> Are there any Cholesky factorization/solve routines for it in PETSc (or in
> any of the external libraries)? If possible, I want to use GPU acceleration
> with 1 or more GPUs. The matrix type can be MATSEQDENSE/MATMPIDENSE or
> MATSEQDENSECUDA/MATMPIDENSECUDA accordingly. If it is possible to do the
> factorization beforehand and store it to do the triangular solves later,
> that would be great.
>
> Thanks,
> Sreeram
>
>
>
