[petsc-users] Sparse linear system solving

Mark Adams mfadams at lbl.gov
Mon May 30 21:42:05 CDT 2022


And if you see NO change in performance, I suspect the solver/matrix is
all on one process.
(PETSc does not use threads by default, so threads should not change
anything.)
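
A quick way to check the distribution (a minimal sketch using recent PETSc
error-checking macros; it assumes your assembled matrix is in a variable A):

  PetscMPIInt size;
  PetscInt    rstart, rend;

  PetscCallMPI(MPI_Comm_size(PETSC_COMM_WORLD, &size));  /* number of MPI ranks */
  PetscCall(MatGetOwnershipRange(A, &rstart, &rend));    /* half-open row range owned here */
  PetscCall(PetscSynchronizedPrintf(PETSC_COMM_WORLD,
            "of %d ranks, this one owns rows %" PetscInt_FMT " to %" PetscInt_FMT "\n",
            size, rstart, rend));
  PetscCall(PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT));

If all but one rank report an empty row range, everything is on one process.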

As Matt said, it is best to start with a PETSc example that does something
like what you want (a parallel linear solve; see src/ksp/ksp/tutorials for
examples), and then add your code to it.
That way the basic infrastructure, which is pretty obscure to the
uninitiated, is already in place for you.
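
For reference, here is a minimal sketch of such a parallel solve. The 1-D
Laplacian, the size n, and the right-hand side are placeholder assumptions
for illustration; with an older PETSc, replace PetscCall() with the
CHKERRQ() idiom:

  #include <petscksp.h>

  int main(int argc, char **argv)
  {
    Mat      A;
    Vec      x, b;
    KSP      ksp;
    PetscInt i, rstart, rend, n = 100;

    PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));

    /* Create a matrix distributed across all MPI processes */
    PetscCall(MatCreate(PETSC_COMM_WORLD, &A));
    PetscCall(MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n));
    PetscCall(MatSetFromOptions(A));
    PetscCall(MatSetUp(A));

    /* Each rank fills only the rows it owns (here: a 1-D Laplacian) */
    PetscCall(MatGetOwnershipRange(A, &rstart, &rend));
    for (i = rstart; i < rend; i++) {
      if (i > 0)     PetscCall(MatSetValue(A, i, i - 1, -1.0, INSERT_VALUES));
      PetscCall(MatSetValue(A, i, i, 2.0, INSERT_VALUES));
      if (i < n - 1) PetscCall(MatSetValue(A, i, i + 1, -1.0, INSERT_VALUES));
    }
    PetscCall(MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY));
    PetscCall(MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY));

    /* Vectors with a layout compatible with A; b = 1 as a stand-in RHS */
    PetscCall(MatCreateVecs(A, &x, &b));
    PetscCall(VecSet(b, 1.0));

    /* Solver and preconditioner come from the command line,
       e.g. -ksp_type gmres -pc_type gamg */
    PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
    PetscCall(KSPSetOperators(ksp, A, A));
    PetscCall(KSPSetFromOptions(ksp));
    PetscCall(KSPSolve(ksp, b, x));

    PetscCall(KSPDestroy(&ksp));
    PetscCall(VecDestroy(&x));
    PetscCall(VecDestroy(&b));
    PetscCall(MatDestroy(&A));
    PetscCall(PetscFinalize());
    return 0;
  }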

Mark

On Mon, May 30, 2022 at 10:18 PM Matthew Knepley <knepley at gmail.com> wrote:

> On Mon, May 30, 2022 at 10:12 PM Lidia <lidia.varsh at mail.ioffe.ru> wrote:
>
>> Dear colleagues,
>>
>> Is there anyone here who has solved big sparse linear systems using PETSc?
>>
>
> There are lots of publications with this kind of data. Here is one recent
> one: https://arxiv.org/abs/2204.01722
>
>
>> We have found NO performance improvement when using more and more MPI
>> processes (1, 2, 3) and OpenMP threads (from 1 to 72). Has anyone
>> faced this problem? Does anyone know any possible reasons for such
>> behaviour?
>>
>
> Solver behavior is dependent on the input matrix. The only general-purpose
> solvers
> are direct, but they do not scale linearly and have high memory
> requirements.
>
> Thus, in order to make progress you will have to be specific about your
> matrices.
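>
> As a first piece of data, running with something like
>
>   -mat_view ::ascii_info
>
> prints the matrix type, global dimensions, and nonzero counts, which is a
> good start for being specific.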
>
>
>> We use an AMG preconditioner and the GMRES solver from the KSP package, as
>> our matrix is large (from 100,000 to 1e+6 rows and columns), sparse,
>> non-symmetric, and includes both positive and negative values. But the
>> performance problems also occur when using CG solvers with symmetric
>> matrices.
>>
>
> There are many PETSc examples, such as example 5 for the Laplacian, that
> exhibit
> good scaling with both AMG and GMG.
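>
> For example, a quick scaling check along these lines (a sketch; ex5 here
> stands for whichever KSP tutorial you pick, so adjust the example and
> problem size to your setup):
>
>   cd $PETSC_DIR/src/ksp/ksp/tutorials
>   make ex5
>   mpiexec -n 1 ./ex5 -ksp_type gmres -pc_type gamg -log_view
>   mpiexec -n 4 ./ex5 -ksp_type gmres -pc_type gamg -log_view
>
> and compare the KSPSolve time in the two -log_view outputs.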
>
>
>> Could anyone help us set appropriate options for the preconditioner
>> and solver? We currently use the default parameters; maybe they are not
>> the best, but we do not know a good combination. Or maybe you could
>> suggest other preconditioner+solver pairs for such tasks?
>>
>> I can provide more information: the matrices that we solve, the C++ code
>> that runs the solve using PETSc, and any statistics obtained from our runs.
>>
>
> First, please provide a description of the linear system, and the output of
>
>   -ksp_view -ksp_monitor_true_residual -ksp_converged_reason -log_view
>
> for each test case.
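>
> For instance (the executable name and process count here are placeholders):
>
>   mpiexec -n 4 ./your_app -ksp_view -ksp_monitor_true_residual \
>       -ksp_converged_reason -log_view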
>
>   Thanks,
>
>      Matt
>
>
>> Thank you in advance!
>>
>> Best regards,
>> Lidiia Varshavchik,
>> Ioffe Institute, St. Petersburg, Russia
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
>