[petsc-users] request to add an option similar to use_omp_threads for mumps to cusparse solver

Mark Adams mfadams at lbl.gov
Tue Oct 12 12:05:01 CDT 2021


On Tue, Oct 12, 2021 at 10:24 AM Junchao Zhang <junchao.zhang at gmail.com>
wrote:

> Hi, Chang,
>    For the mumps solver, we usually transfer matrix and vector data
> within a compute node.  For the idea you propose, it looks like we need to
> gather data within MPI_COMM_WORLD, right?
>
>    Mark, I remember you said the cusparse solve is slow and you would
> rather do it on the CPU. Is that right?
>

Yes, I find that the cuSparse solve is slower than the (old) CPU solve for
the triangular solves that come out of our CPU LU factorizations. I have an
MR to allow the use of the CPU solve with LU and cusparse.
I am running many fairly small problems, and the factorization is done on
the CPU, so a CPU solve keeps the factors on the CPU.
I would imagine cuSparse solves become faster at some point as you scale
up.
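
For reference, this is the kind of run line I mean (the executable name is
just a placeholder, and I am assuming the MR exposes the switch as an
options-database flag along the lines of -mat_cusparse_use_cpu_solve, so
the exact name may differ):

  ./my_app -mat_type seqaijcusparse -ksp_type gmres -pc_type lu \
    -pc_factor_mat_solver_type cusparse -mat_cusparse_use_cpu_solve

The factorization is done on the CPU either way; the flag just keeps the
triangular solves on the CPU as well instead of handing them to cuSparse.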


>
> --Junchao Zhang
>
>
> On Mon, Oct 11, 2021 at 10:25 PM Chang Liu via petsc-users <
> petsc-users at mcs.anl.gov> wrote:
>
>> Hi,
>>
>> Currently, it is possible to use the mumps solver in PETSc with the
>> -mat_mumps_use_omp_threads option, so that multiple MPI processes
>> transfer the matrix and rhs data to the master rank, and then the
>> master rank calls mumps with OpenMP to solve the matrix.
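>>
>> For concreteness, a typical run line for that path looks something like
>> this (the executable name and counts are placeholders; the options are
>> the documented PETSc/mumps ones):
>>
>>   mpiexec -n 16 ./my_app -ksp_type preonly -pc_type lu \
>>     -pc_factor_mat_solver_type mumps -mat_mumps_use_omp_threads 4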
>>
>> I wonder if someone could develop a similar option for the cusparse
>> solver. Right now, this solver does not work with mpiaijcusparse. I
>> think a possible workaround is to transfer all the matrix data to one
>> MPI process, and then upload the data to the GPU to solve. In this way,
>> one could use the cusparse solver for an MPI program.
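>>
>> To illustrate the kind of interface I have in mind, something analogous
>> on the command line could look like this (the last option is purely
>> hypothetical and does not exist today):
>>
>>   mpiexec -n 16 ./my_app -ksp_type preonly -pc_type lu \
>>     -pc_factor_mat_solver_type cusparse \
>>     -mat_cusparse_gather_to_single_rank  # hypothetical option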
>>
>> Chang
>> --
>> Chang Liu
>> Staff Research Physicist
>> +1 609 243 3438
>> cliu at pppl.gov
>> Princeton Plasma Physics Laboratory
>> 100 Stellarator Rd, Princeton NJ 08540, USA
>>
>