[petsc-users] Best practices for solving Dense Linear systems
Nidish
nb25 at rice.edu
Fri Aug 7 01:25:31 CDT 2020
Indeed - I was just using the default solver (GMRES with ILU).
Using standard LU (a direct solve with "-pc_type lu -ksp_type
preonly"), I find elemental to be extremely slow even for a 1000x1000
matrix. For MPIAIJ, it throws an error when I try "-pc_type lu". I'm
attaching the code here in case you'd like to look at what I've been
trying to do.
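Since the attached ksps.cpp is not reproduced here, the following is a
minimal sketch of the kind of driver described above (a reconstruction,
not the original attachment; the fill values are arbitrary placeholders
chosen only so the iterative solvers converge):

#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat            A;
  Vec            x, b;
  KSP            ksp;
  PetscInt       N = 1000, i, j, Istart, Iend;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  ierr = PetscOptionsGetInt(NULL, NULL, "-N", &N, NULL);CHKERRQ(ierr);

  /* The matrix type is left to the command line
     (-mat_type mpiaij, mpidense, elemental, ...). */
  ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
  ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, N, N);CHKERRQ(ierr);
  ierr = MatSetFromOptions(A);CHKERRQ(ierr);
  ierr = MatSetUp(A);CHKERRQ(ierr);

  /* Fill the whole matrix with a diagonally dominant placeholder.
     Note: contiguous row ownership is assumed here, which holds for
     AIJ/DENSE but not for elemental's cyclic distribution (there one
     would use MatGetOwnershipIS instead). */
  ierr = MatGetOwnershipRange(A, &Istart, &Iend);CHKERRQ(ierr);
  for (i = Istart; i < Iend; i++)
    for (j = 0; j < N; j++) {
      PetscScalar v = (i == j) ? (PetscScalar)N : 1.0;
      ierr = MatSetValue(A, i, j, v, INSERT_VALUES);CHKERRQ(ierr);
    }
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

  ierr = MatCreateVecs(A, &x, &b);CHKERRQ(ierr);
  ierr = VecSet(b, 1.0);CHKERRQ(ierr);

  /* -ksp_type, -pc_type, -ksp_view etc. are picked up here. */
  ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp, A, A);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
  ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);

  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  ierr = VecDestroy(&x);CHKERRQ(ierr);
  ierr = VecDestroy(&b);CHKERRQ(ierr);
  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}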
The two configurations of interest are:
$> mpirun -n 4 ./ksps -N 1000 -mat_type mpiaij
$> mpirun -n 4 ./ksps -N 1000 -mat_type elemental
(for the GMRES with ILU) and
$> mpirun -n 4 ./ksps -N 1000 -mat_type mpiaij -pc_type lu -ksp_type preonly
$> mpirun -n 4 ./ksps -N 1000 -mat_type elemental -pc_type lu -ksp_type preonly
(for the direct solve).
elemental seems to perform poorly in both cases.
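For reference, -ksp_view (a standard PETSc option) can be appended to
any of these runs to print exactly which solver and matrix format were
actually used; it also makes it easy to try the MPIDENSE format Barry
suggests below:
$> mpirun -n 4 ./ksps -N 1000 -mat_type mpiaij -ksp_view
$> mpirun -n 4 ./ksps -N 1000 -mat_type mpidense -ksp_view
$> mpirun -n 4 ./ksps -N 1000 -mat_type elemental -pc_type lu -ksp_type preonly -ksp_view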
Nidish
On 8/7/20 12:50 AM, Barry Smith wrote:
>
> What is the output of -ksp_view for the two cases?
>
> It is not only the matrix format but also the matrix solver that
> matters. For example, if you are using an iterative solver, the
> elemental format won't be faster; you should use the PETSc MPIDENSE
> format instead. The elemental format is really intended for use with
> a direct LU solver. For tiny matrices like this, an iterative solver
> could easily be faster than a direct solver; it depends on the
> conditioning (eigenstructure) of the dense matrix. Also, with a
> sparse format the default PETSc solver uses block Jacobi with ILU on
> each process, and ILU applied to a dense matrix is actually LU, so
> your solver probably differs between the MPIAIJ and elemental runs
> as well.
>
> Barry
>
>> On Aug 7, 2020, at 12:30 AM, Nidish <nb25 at rice.edu> wrote:
>>
>> Thank you for the response.
>>
>> I've just been running some tests with dense matrices of dimension
>> up to 2e4. When I compared the solution times for "-mat_type
>> elemental" and "-mat_type mpiaij" running on 4 cores, I found the
>> mpiaij version running far faster than elemental. I have not been
>> able to get the elemental version to finish for 2e4 so far (my
>> patience runs out first).
>>
>> What's going on here? I thought elemental was supposed to be superior
>> for dense matrices.
>>
>> I can share the code if that's appropriate for this forum (sorry, I'm
>> new here).
>>
>> Nidish
>> On Aug 6, 2020, at 23:01, Barry Smith <bsmith at petsc.dev> wrote:
>>
>>
>> On Aug 6, 2020, at 7:32 PM, Nidish <nb25 at rice.edu> wrote:
>>
>> I'm relatively new to PETSc, and my applications involve (for the
>> most part) dense matrix solves. I read in the documentation that
>> this is an area PETSc does not specialize in but instead recommends
>> external libraries such as Elemental. I'm wondering if there are
>> any "best" practices in this regard. Some questions I'd like
>> answered are:
>>
>> 1. Can I just declare my dense matrix as a sparse one and fill the
>> whole matrix up? Do any of the others go this route? What are the
>> possible pitfalls/unfavorable outcomes of this? I understand the
>> memory overhead probably shoots up.
>>
>>
>> No, this isn't practical; the performance will be terrible.
>>
>> 2. Are there any specific guidelines on when I can expect
>> elemental to perform better in parallel than in serial?
>>
>>
>> Because the computation-to-communication ratio is higher for dense
>> matrices than for sparse ones, you will see better parallel
>> performance for dense problems of a given size than for sparse
>> problems of a similar size. In other words, parallelism can help
>> for dense matrices at relatively small problem sizes; of course,
>> the specifics of your machine's hardware and software also play a
>> role.
>>
>> Barry
>>
>> Of course, I'm interested in any other details that may be
>> important in this regard. Thank you, Nidish
>>
>>
>
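To put rough numbers on Barry's ratio argument (an illustrative
back-of-envelope, not figures from the thread): a dense MatMult on an
N x N matrix costs about 2*N^2 flops while each process exchanges on
the order of N vector entries, whereas a sparse matrix with ~10
nonzeros per row costs only about 2*10*N flops for the same O(N)
communication. At N = 1000 that is roughly 2e6 flops against 1e3
values moved (~2000:1) for dense, versus 2e4 flops against 1e3 values
(~20:1) for sparse, which is why dense problems can benefit from
parallelism at much smaller sizes.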
--
Nidish