[petsc-users] Using Petsc with multiple RHS
Hong Zhang
hzhang at mcs.anl.gov
Mon Dec 12 08:37:47 CST 2011
Reading the MUMPS manual, MUMPS only supports a centralized dense or sparse rhs,
while the solution can be centralized or distributed, but only in
dense format (the manual
does not mention a sparse solution).
It would be easy to add MatMatSolve() to the PETSc-MUMPS interface for a dense rhs,
but it would require a vecscatter of the PETSc distributed dense rhs matrix
to the centralized MUMPS
dense rhs - I doubt that is feasible for thousands of rhs.
Hong
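[Editor's note: Barry's question below, about whether a sparse rhs yields a sparse solution, can be illustrated with a tiny sketch. The code is plain Python (not the PETSc or MUMPS API) doing forward and back substitution by hand; the matrices are made-up examples. Even with a single nonzero in b, the triangular solves couple all unknowns and the solution comes out dense.]

```python
# Sketch (plain Python, no PETSc/MUMPS): a sparse right-hand side b
# generally produces a dense solution x after the L and U triangular solves.

def forward_sub(L, b):
    """Solve L y = b for lower-triangular L by forward substitution."""
    n = len(b)
    y = [0.0] * n
    for i in range(n):
        s = sum(L[i][j] * y[j] for j in range(i))
        y[i] = (b[i] - s) / L[i][i]
    return y

def back_sub(U, y):
    """Solve U x = y for upper-triangular U by back substitution."""
    n = len(y)
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(U[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (y[i] - s) / U[i][i]
    return x

# Example factors with nonzero off-diagonals, so every unknown is coupled.
L = [[1.0, 0.0, 0.0, 0.0],
     [0.5, 1.0, 0.0, 0.0],
     [0.0, 0.5, 1.0, 0.0],
     [0.0, 0.0, 0.5, 1.0]]
U = [[2.0, 1.0, 0.0, 0.0],
     [0.0, 2.0, 1.0, 0.0],
     [0.0, 0.0, 2.0, 1.0],
     [0.0, 0.0, 0.0, 2.0]]

b = [1.0, 0.0, 0.0, 0.0]   # sparse rhs: a single nonzero entry
x = back_sub(U, forward_sub(L, b))
print(x)                    # every entry is nonzero: the solution is dense
```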
On Mon, Dec 12, 2011 at 8:26 AM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>
> Alexander,
>
> Do you expect that solving with a sparse right hand side will result in an answer that is also sparse? Normally once you've done the lower triangular solve and then the upper triangular solve won't the result be dense?
>
> Barry
>
> On Dec 12, 2011, at 3:10 AM, Alexander Grayver wrote:
>
>> Hi Barry,
>>
>> Thanks for the answer. I should have asked about that from the very beginning, actually.
>> I get a 2x decrease in performance with 20 RHS; I can imagine how slow it will be when I use thousands of them.
>> Moreover, it will require a lot of memory to store them as a dense matrix.
>>
>> On 11.12.2011 18:50, Barry Smith wrote:
>>> On Dec 11, 2011, at 9:29 AM, Alexander Grayver wrote:
>>> One by one
>>
>> I'm wondering why? All the main direct solver packages, like MUMPS, SuperLU_DIST, and PaStiX, support multiple RHS.
>>
>>> We do not handle a sparse right hand side.
>>
>> Since I already transferred my code to PETSc anyway, my question now is whether it's possible to implement a sparse multiple RHS and solve them simultaneously?
>> Something like MatMatSolveSparse. I would implement it myself; the question is whether there is a way to integrate it into PETSc as a patch or something like that?
>>
>> Regards,
>> Alexander
>
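[Editor's note: the "one by one" approach and the MatMatSolve idea discussed above share the key optimization that the expensive factorization is done once and only the cheap triangular solves are repeated per rhs column. A minimal sketch of that idea in plain Python (not the PETSc API; the matrices are made-up examples, and the LU has no pivoting, which is fine for this diagonally dominant case):]

```python
# Sketch: factor A = L U once, then reuse the factors for every
# right-hand-side column, so k solves cost k triangular solves
# instead of k full factorizations.

def lu_factor(A):
    """Doolittle LU without pivoting (assumes no zero pivots arise)."""
    n = len(A)
    L = [[float(i == j) for j in range(n)] for i in range(n)]
    U = [row[:] for row in A]
    for k in range(n):
        for i in range(k + 1, n):
            m = U[i][k] / U[k][k]
            L[i][k] = m
            for j in range(k, n):
                U[i][j] -= m * U[k][j]
    return L, U

def lu_solve(L, U, b):
    """Forward substitution with L, then back substitution with U."""
    n = len(b)
    y = [0.0] * n
    for i in range(n):
        y[i] = b[i] - sum(L[i][j] * y[j] for j in range(i))
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (y[i] - sum(U[i][j] * x[j] for j in range(i + 1, n))) / U[i][i]
    return x

A = [[4.0, 1.0, 0.0],
     [1.0, 4.0, 1.0],
     [0.0, 1.0, 4.0]]
B = [[1.0, 0.0],          # two rhs columns, stored dense
     [0.0, 1.0],
     [0.0, 0.0]]

L, U = lu_factor(A)       # factor once
X = [lu_solve(L, U, [B[i][k] for i in range(3)]) for k in range(2)]
for k, x in enumerate(X):
    print(k, x)
```

The memory concern raised in the thread is visible here too: B must be held as a full dense array, which is what becomes costly for thousands of columns.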