[petsc-users] Using Petsc with multiple RHS

Xiangdong Liang xdliang at gmail.com
Mon Dec 12 11:56:33 CST 2011


Hi Alexander,

I am just curious how other sparse direct solvers (e.g. PaStiX) work on
your problem. Right now I only have a single RHS, but I will need
multiple RHS soon. For my single-RHS problem, PaStiX works better for my
sparse matrix pattern (Helmholtz-like). If you have tried PaStiX, I would
be happy to hear how these solvers compare. Thanks.
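
(In case it helps the comparison: below is a minimal sketch of how I
select the direct solver through PETSc. It assumes 3.2-era API names, a
PETSc build configured with PaStiX/MUMPS, and an already assembled
matrix A; error checking is omitted.

    KSP ksp; PC pc;
    KSPCreate(PETSC_COMM_WORLD, &ksp);
    KSPSetOperators(ksp, A, A, SAME_NONZERO_PATTERN);
    KSPSetType(ksp, KSPPREONLY);                       /* direct solve only */
    KSPGetPC(ksp, &pc);
    PCSetType(pc, PCLU);
    PCFactorSetMatSolverPackage(pc, MATSOLVERPASTIX);  /* or MATSOLVERMUMPS */
    KSPSetFromOptions(ksp);

The same choice can also be made at runtime with
-pc_factor_mat_solver_package pastix (or mumps).)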

Xiangdong

On Mon, Dec 12, 2011 at 10:22 AM, Alexander Grayver
<agrayver at gfz-potsdam.de> wrote:
> Matt, Barry & Hong,
>
> Thanks a lot for response.
> I may be wrong, but I think people choose direct solvers for their
> robustness and their ability to solve for many RHS very quickly.
> Of course I don't expect my solution to be sparse, and it is not, but at
> least in electromagnetics it is pretty common to have a sparse RHS.
> I am wondering how it is in other disciplines?
>
> I don't know about other solvers, but when I use multiple RHS in MUMPS it
> runs several times faster than solving for them sequentially. That is just
> my experience.
>
> I can also imagine that exploiting sparsity is another obvious advantage
> if you have a sparse RHS.
> Right now, for 1000 RHS and a system of order 10^6, I need 32 GB to use
> MatMatSolve with double-complex arithmetic (10^6 x 1000 complex doubles
> is 16 GB for each of the RHS and solution matrices). For comparison, the
> LDLt factorization of a matrix of order 10^6 takes ~80 GB.
> One option would be to use MatMatSolve(A,X,X), thus halving the memory
> consumption.
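>
> (For concreteness, the call sequence I mean is sketched below. It assumes
> 3.2-era API names, a factorization that implements MatMatSolve (here
> PETSc's built-in LU), A already assembled, B and X created as dense
> n x nrhs matrices, and no error checking:
>
>     Mat F;
>     MatFactorInfo info;
>     IS rperm, cperm;
>     MatFactorInfoInitialize(&info);
>     MatGetOrdering(A, MATORDERINGNATURAL, &rperm, &cperm);
>     MatGetFactor(A, MATSOLVERPETSC, MAT_FACTOR_LU, &F);
>     MatLUFactorSymbolic(F, A, rperm, cperm, &info);
>     MatLUFactorNumeric(F, A, &info);
>     MatMatSolve(F, B, X);        /* B and X must both be dense */
>
> Storing B and X as dense matrices is where the 32 GB figure above comes
> from.)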
>
>
>>> I doubt that is feasible for thousands of RHS.
>
>
> You're right, it is not. It is feasible only if you have a sparse RHS.
>
> Regards,
> Alexander
>
>
> On 12.12.2011 15:37, Hong Zhang wrote:
>>
>> Reading the MUMPS manual, MUMPS only supports centralized dense and
>> sparse RHS, while the solution can be centralized or distributed, but
>> only in dense format (the manual does not mention a sparse solution).
>>
>> It would be easy to add MatMatSolve() to the PETSc-MUMPS interface for a
>> dense RHS, but it would require a VecScatter of the PETSc distributed
>> dense RHS matrix to the centralized dense RHS that MUMPS expects - I
>> doubt that is feasible for thousands of RHS.
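>>
>> (Per RHS column, that gather onto the root process would look roughly
>> like the sketch below - illustrative only, this is not code that exists
>> in the interface:
>>
>>     Vec b, b0;       /* b: distributed RHS column, b0: copy on rank 0 */
>>     VecScatter scat;
>>     VecScatterCreateToZero(b, &scat, &b0);
>>     VecScatterBegin(scat, b, b0, INSERT_VALUES, SCATTER_FORWARD);
>>     VecScatterEnd(scat, b, b0, INSERT_VALUES, SCATTER_FORWARD);
>>     VecScatterDestroy(&scat);
>>
>> Repeating this for thousands of columns is the cost I doubt is feasible.)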
>>
>> Hong
>>
>>
>> On Mon, Dec 12, 2011 at 8:26 AM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>>>
>>>   Alexander,
>>>
>>>    Do you expect that solving with a sparse right-hand side will result
>>> in an answer that is also sparse? Normally, once you've done the lower
>>> triangular solve and then the upper triangular solve, won't the result
>>> be dense?
>>>
>>>    Barry
>>>
>>> On Dec 12, 2011, at 3:10 AM, Alexander Grayver wrote:
>>>
>>>> Hi Barry,
>>>>
>>>> Thanks for the answer. I should have asked about that from the very
>>>> beginning, actually.
>>>> I see a 2x decrease in performance with 20 RHS, and I can imagine how
>>>> slow it will be when I use thousands of them; moreover, it will require
>>>> a lot of memory to store them as a dense matrix.
>>>>
>>>> On 11.12.2011 18:50, Barry Smith wrote:
>>>>>
>>>>> On Dec 11, 2011, at 9:29 AM, Alexander Grayver wrote:
>>>>>    One by one
>>>>
>>>> I'm wondering why. All the main direct packages, such as MUMPS,
>>>> SuperLU_DIST and PaStiX, support multiple RHS.
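>>>>
>>>> (If I understand correctly, "one by one" amounts to a loop like the
>>>> sketch below, where B is the dense RHS matrix and ksp, b, x are set up
>>>> beforehand; copying x into the solution matrix is omitted:
>>>>
>>>>     PetscInt j;
>>>>     for (j = 0; j < nrhs; j++) {
>>>>       MatGetColumnVector(B, b, j);   /* extract the j-th RHS column */
>>>>       KSPSolve(ksp, b, x);           /* one solve per RHS */
>>>>     }
>>>> )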
>>>>
>>>>>    We do not handle a sparse right hand side.
>>>>
>>>> Since I have already moved my code to PETSc anyway, my question now is
>>>> whether it is possible to implement multiple sparse RHS and solve for
>>>> them simultaneously?
>>>> Something like MatMatSolveSparse. I would implement it myself; the
>>>> question is whether there is a way to integrate it into PETSc as a patch
>>>> or something like that?
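>>>>
>>>> (Just to illustrate what I have in mind - this routine does not exist
>>>> in PETSc, the name and signature are only a suggestion:
>>>>
>>>>     PetscErrorCode MatMatSolveSparse(Mat fact, Mat Bsparse, Mat X);
>>>>     /* fact:    factored matrix, e.g. from MatGetFactor()
>>>>        Bsparse: AIJ matrix whose columns are the sparse RHS vectors
>>>>        X:       dense solution matrix */
>>>> )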
>>>>
>>>> Regards,
>>>> Alexander
>
>

