[petsc-users] Benchmarking

Vijay Gopal Chilkuri vijay.gopal.c at gmail.com
Wed Dec 28 10:14:12 CST 2016


Dear Jose,

Thanks for the references.

I'm only calculating the smallest 10-100 eigenvalues using the Krylov-Schur
algorithm.

How important is it to use a preconditioner? My matrix has at most 48
nonzero elements per row.
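
For reference, a run like this can be selected entirely through SLEPc's
standard runtime options (the executable name and the tolerance below are
illustrative, not taken from this thread):

```
# Krylov-Schur solver, requesting the 100 smallest eigenvalues;
# -eps_monitor prints convergence information during the solve
./my_solver -eps_type krylovschur \
            -eps_smallest_real \
            -eps_nev 100 \
            -eps_tol 1e-8 \
            -eps_monitor
```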


Best,
 Vijay

On Wed, Dec 28, 2016 at 4:44 PM, Jose E. Roman <jroman at dsic.upv.es> wrote:

>
> > On 28 Dec 2016, at 16:26, Vijay Gopal Chilkuri <
> vijay.gopal.c at gmail.com> wrote:
> >
> > Dear developers,
> >
> > I'm doing exact diagonalization studies of a phenomenological model
> Hamiltonian. In this study I have to diagonalize large sparse matrices in
> the Hilbert space of Slater determinants many times.
> >
> > I've successfully used PETSc + SLEPc to compute a few of the smallest
> eigenvalues.
> > For example, I've been able to diagonalize a matrix of dimension
> 91454220 with 990 processors. This diagonalization took 15328.695847
> seconds (about 4.26 hours).
> >
> > I have two questions:
> >
> > 1. Is this time reasonable? If not, is it possible to optimize further?
>
> It depends on how many eigenvalues are being computed. If you are
> computing more than 1000 eigenvalues, it is very important to set the
> mpd parameter; see section 2.6.5 of the SLEPc users manual.
>
> >
> > 2. I've tried a quick Google search but could not find a comprehensive
> benchmark of the SLEPc library for sparse matrix diagonalization. Could
> you point me to a publication/resource that contains such benchmarks?
>
> Some papers in the list of applications have performance results.
> http://slepc.upv.es/material/appli.htm
> See for instance [Moran et al 2011] for results up to 2048 cores. See also
> [Steiger et al 2011].
>
> Jose
>
>
> >
> > Thanks for your help.
> >
> > PETSc Version: master branch commit: b33322e
> > SLEPc Version: master branch commit: c596d1c
> >
> > Best,
> >  Vijay
> >
> >
>
>
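
The mpd advice above corresponds to SLEPc's EPSSetDimensions() interface,
or equivalently its runtime options; a minimal sketch, with made-up values
(nev and mpd must be tuned to the actual problem):

```
# mpd caps the dimension of the projected eigenproblem so that memory
# and orthogonalization costs stay bounded when nev is large
# (see section 2.6.5 of the SLEPc users manual)
-eps_nev 2000 -eps_mpd 600
```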