[petsc-users] Best practices for solving Dense Linear systems
Nidish
nb25 at rice.edu
Fri Aug 7 00:30:18 CDT 2020
Thank you for the response.
I've just been running some tests with dense matrices of dimension up to 2e4. When I compared the solution times for "-mat_type elemental" and "-mat_type mpiaij" running with 4 cores, I found the mpidense version running much faster than elemental. I have not been able to get the elemental version to finish for the 2e4 case so far (my patience runs out first).
What's going on here? I thought elemental was supposed to be superior for dense matrices.
I can share the code if that's appropriate for this forum (sorry, I'm new here).
Nidish
On Aug 6, 2020, at 23:01, Barry Smith <bsmith at petsc.dev> wrote:
>
>
>> On Aug 6, 2020, at 7:32 PM, Nidish <nb25 at rice.edu> wrote:
>>
>> I'm relatively new to PETSc, and my applications involve (for the
>most part) dense matrix solves.
>>
>> I read in the documentation that this is an area PETSc does not
>specialize in but instead recommends external libraries such as
>Elemental. I'm wondering if there are any "best" practices in this
>regard. Some questions I'd like answered are:
>>
>> 1. Can I just declare my dense matrix as a sparse one and fill the
>whole matrix up? Do any of the others go this route? What're possible
>pitfalls/unfavorable outcomes for this? I understand the memory
>overhead probably shoots up.
>
> No, this isn't practical; the performance will be terrible.
>
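[Editorial aside: a rough, PETSc-free illustration of why filling a sparse (AIJ-style) format with a fully dense matrix is wasteful. CSR storage keeps a column index alongside every value, so a dense matrix stored sparsely costs roughly 1.5-2x the memory of a plain dense array, on top of indirect addressing on every access. SciPy's csr_matrix stands in here for PETSc's AIJ format; the matrix size is an arbitrary choice for the sketch.]

```python
import numpy as np
from scipy.sparse import csr_matrix

n = 500  # arbitrary size, for illustration only
A_dense = np.random.default_rng(0).random((n, n))

# Dense storage: just the n*n values.
dense_bytes = A_dense.nbytes

# CSR storage of the same (fully dense) matrix: values + a column
# index per value + row pointers -- every entry counts as "nonzero".
A_csr = csr_matrix(A_dense)
csr_bytes = A_csr.data.nbytes + A_csr.indices.nbytes + A_csr.indptr.nbytes

print(f"dense: {dense_bytes} bytes, CSR: {csr_bytes} bytes "
      f"({csr_bytes / dense_bytes:.2f}x)")
```

With 8-byte values and 4-byte indices the sparse copy comes out around 1.5x the dense one, before counting the slower indexed access during the solve.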
>> 2. Are there any specific guidelines on when I can expect elemental
>to perform better in parallel than in serial?
>
>Because the computation-to-communication ratio for dense matrices is
>higher than for sparse ones, you will see better parallel performance
>for dense problems of a given size than for sparse problems of a
>similar size. In other words, parallelism can help for dense matrices
>even for relatively small problems; of course, the specifics of your
>machine's hardware and software also play a role.
>
> Barry
>
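[Editorial aside: the ratio argument above can be made concrete with a back-of-the-envelope count. A dense matrix-vector product does about 2n^2 flops while each process exchanges O(n) vector entries; a sparse matvec with ~k nonzeros per row does only ~2kn flops for similar communication volume. The values of n and k below are illustrative assumptions, not measurements from the thread.]

```python
n = 20_000          # matrix dimension (illustrative, echoing the 2e4 case)
k = 50              # assumed nonzeros per row for the sparse comparison

dense_flops = 2 * n * n      # dense matvec: one multiply-add per entry
sparse_flops = 2 * k * n     # sparse matvec: one multiply-add per nonzero
comm_words = n               # both cases move O(n) vector entries

print("dense flops per word moved :", dense_flops // comm_words)   # ~2n
print("sparse flops per word moved:", sparse_flops // comm_words)  # ~2k
```

So the dense case does ~2n flops per word communicated versus ~2k for the sparse case; the larger that ratio, the better the communication cost is hidden, which is why parallelism pays off at smaller sizes for dense problems.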
>>
>> Of course, I'm interested in any other details that may be important
>in this regard.
>>
>> Thank you,
>> Nidish