[petsc-users] Newbie question : iterative solver - algorithm and performance

Lukas van de Wiel lukas.drinkt.thee at gmail.com
Wed Feb 15 13:29:02 CST 2017


The wonderful people of PETSc have made a comprehensive list of all the
included algorithms and whether they work in parallel or not.

It is here:
http://www.mcs.anl.gov/petsc/documentation/linearsolvertable.html

Have a great day
Lukas

On Wed, Feb 15, 2017 at 8:26 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:

>
> > On Feb 15, 2017, at 7:03 AM, lixin chu <lixin_chu at yahoo.com> wrote:
> >
> > I think chapter 4 of the user manual more or less answers my first
> > question already ...
> >
> > rgds
> > lixin
> >
> >
> > On Wednesday, 15 February 2017, 14:40, lixin chu <lixin_chu at yahoo.com>
> wrote:
> >
> >
> > Hello,
> > New to PETSc, appreciate any help -
> >
> > I have done some experiments with MUMPS (the direct solver for sparse
> > matrices), and I am very interested in trying out PETSc now. I have a
> > large sparse symmetric matrix (3 million x 3 million, complex data type).
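
For a matrix of this type, PETSc must be built with complex scalars
(configure with --with-scalar-type=complex). A minimal sketch of
preallocating such a distributed matrix follows; the nonzeros-per-row
estimates are placeholders, not values from this thread:

    #include <petscmat.h>

    /* Create a distributed 3,000,000 x 3,000,000 complex AIJ matrix. */
    PetscErrorCode CreateMatrix(Mat *A)
    {
      PetscErrorCode ierr;
      PetscInt       N = 3000000;

      ierr = MatCreateAIJ(PETSC_COMM_WORLD,
                          PETSC_DECIDE, PETSC_DECIDE, /* rows split over ranks */
                          N, N,
                          30, NULL, /* placeholder: nonzeros/row, diagonal block */
                          30, NULL, /* placeholder: nonzeros/row, off-diagonal block */
                          A);CHKERRQ(ierr);
      /* ... MatSetValues(), MatAssemblyBegin/End() ... */
      return 0;
    }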
> >
> > Some questions I have:
> > - I am assuming that I should select one of the "Krylov methods"; which
> > algorithm is a good option: GMRES, CG, or others? (I am not a domain
> > expert, but am helping to develop a program to test the matrix.)
>
>    GMRES is a "safe" choice. You can try -ksp_type cg -ksp_cg_type
> symmetric (or, if the matrix is Hermitian, i.e. the transpose of A equals
> the complex conjugate of A, use -ksp_cg_type hermitian instead).
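
A minimal sketch of where these options enter a PETSc code (assuming a Mat
A and Vecs b, x are already assembled; the Krylov method is then picked at
run time from the options above):

    #include <petscksp.h>

    /* Solve A x = b; KSPSetFromOptions() honors -ksp_type, -ksp_cg_type, ... */
    PetscErrorCode SolveSystem(Mat A, Vec b, Vec x)
    {
      KSP            ksp;
      PetscErrorCode ierr;

      ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
      ierr = KSPSetOperators(ksp,A,A);CHKERRQ(ierr); /* operator and preconditioner matrix */
      ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);   /* reads the command-line options */
      ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);
      ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
      return 0;
    }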
>
>
> > - do all the algorithms support distributed architectures (multiple
> > machines, multiple cores)?
>
>     Yes
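
The same binary runs on one process or many via MPI; for example (the
executable name ./solver is hypothetical):

    mpiexec -n 8 ./solver -ksp_type cg -ksp_cg_type hermitian

No source changes are needed to move between a laptop and a cluster.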
>
> > - are there any performance test data? (total run time, for example)
>
>    Run your code with -log_view and it will print, at the end, a summary
> of where the time was spent in the computation.
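
For example (./solver again hypothetical):

    mpiexec -n 8 ./solver -ksp_type gmres -log_view

The summary printed at exit breaks the run time down per event (MatMult,
KSPSolve, PCApply, ...), with flop rates and message counts per process.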
>
>
> >
> >
> > thank you very much,
> > LX
> >
> >
>
>