[petsc-users] SLEPc Bogus eigenvalues for large -eps_nev
Vijay Gopal Chilkuri
vijay.gopal.c at gmail.com
Fri May 15 08:26:12 CDT 2015
On Fri, May 15, 2015 at 3:18 PM, Jose E. Roman <jroman at dsic.upv.es> wrote:
>
> On 15/05/2015, at 15:00, Vijay Gopal Chilkuri wrote:
>
> > Yes, those seem to be the right eigenvalues.
> > OK, so the solution is to recompile PETSc/SLEPc with a basic
> > configuration and test with --with-debugging=1.
>
> Yes, it is always good to work with --with-debugging=1 until you know your
> code is working correctly. But anyway, I did the computation with
> --with-debugging=0, with the GNU compilers. It is more important, for the
> moment, to avoid multiple threads, both in the LAPACK linked with SLEPc
> (avoid MKL) and in the code that generates the matrix (just in case there
> was a problem in computing the coefficients).
>
>
> > Would it make a difference if I used -eps_type lanczos or some other
> > diagonalization procedure?
>
> Do not use lanczos; it is simply a worse algorithm than the default solver
> (explicit vs. implicit restart).
>
> In principle, methods of the CG type are appropriate for your problem, but
> convergence may be too slow. You can try RQCG with the icc preconditioner. I
> am currently implementing LOBPCG in the development version; I will try it
> with your matrices.
>
>
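For reference, the RQCG + ICC combination suggested above can be selected at
run time (with -eps_type rqcg plus the ST preconditioner options) or from the
C API, roughly as in the sketch below. The helper name is hypothetical, error
checking is abbreviated, and the call sequence is only a reading of the SLEPc
manual, not code taken from problem.c:

  #include <slepceps.h>

  /* Hypothetical helper: compute the nev smallest eigenvalues of an assembled
     symmetric matrix A with the RQCG solver and an ICC preconditioner. */
  PetscErrorCode SolveWithRQCG(Mat A,PetscInt nev)
  {
    EPS            eps;
    ST             st;
    KSP            ksp;
    PC             pc;
    PetscInt       i,nconv;
    PetscScalar    kr,ki;
    PetscErrorCode ierr;

    ierr = EPSCreate(PetscObjectComm((PetscObject)A),&eps);CHKERRQ(ierr);
    ierr = EPSSetOperators(eps,A,NULL);CHKERRQ(ierr);
    ierr = EPSSetProblemType(eps,EPS_HEP);CHKERRQ(ierr);    /* symmetric problem */
    ierr = EPSSetWhichEigenpairs(eps,EPS_SMALLEST_REAL);CHKERRQ(ierr);
    ierr = EPSSetType(eps,EPSRQCG);CHKERRQ(ierr);           /* CG-type eigensolver */
    ierr = EPSSetDimensions(eps,nev,PETSC_DEFAULT,PETSC_DEFAULT);CHKERRQ(ierr);
    /* The preconditioner is the PC inside the ST's KSP. Note that PETSc's ICC
       is sequential; in parallel one would normally use block Jacobi with ICC
       on the local blocks instead. */
    ierr = EPSGetST(eps,&st);CHKERRQ(ierr);
    ierr = STGetKSP(st,&ksp);CHKERRQ(ierr);
    ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
    ierr = PCSetType(pc,PCICC);CHKERRQ(ierr);
    ierr = EPSSetFromOptions(eps);CHKERRQ(ierr);            /* allow -eps_... overrides */
    ierr = EPSSolve(eps);CHKERRQ(ierr);
    ierr = EPSGetConverged(eps,&nconv);CHKERRQ(ierr);
    for (i=0;i<nconv;i++) {
      ierr = EPSGetEigenpair(eps,i,&kr,&ki,NULL,NULL);CHKERRQ(ierr);
      ierr = PetscPrintf(PetscObjectComm((PetscObject)A),"%d: %g\n",
                         (int)i,(double)PetscRealPart(kr));CHKERRQ(ierr);
    }
    ierr = EPSDestroy(&eps);CHKERRQ(ierr);
    return 0;
  }
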
Yes, please do try LOBPCG with my matrices; that would be great!
At the moment I'm having a lot of trouble with the assembly time for large
matrices, although the actual diagonalization with Krylov-Schur also takes
considerable time. Please let me know if the matrix assembly performance can
be improved for large matrices.
That said, the implicit restart in Krylov-Schur seems to be working great for
my type of matrices.
Please keep in touch if you find something interesting.
Thanks again,
Vijay
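
Regarding the assembly time mentioned above: a very common cause of slow
MatSetValues/MatAssembly in PETSc is missing preallocation. The following is
only a generic sketch (the global size matches the test matrix, but the
per-row nonzero bound maxnz and everything else is hypothetical, not taken
from problem.c):

  #include <petscmat.h>

  int main(int argc,char **argv)
  {
    Mat            A;
    PetscInt       n = 540540;   /* global size, as in the test matrix */
    PetscInt       maxnz = 200;  /* hypothetical upper bound on nonzeros per row */
    PetscErrorCode ierr;

    ierr = PetscInitialize(&argc,&argv,NULL,NULL);if (ierr) return ierr;
    ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr);
    ierr = MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,n,n);CHKERRQ(ierr);
    ierr = MatSetFromOptions(A);CHKERRQ(ierr);
    /* Preallocate the diagonal and off-diagonal blocks before any MatSetValues
       call; without this, every insertion that overflows the current storage
       triggers a reallocation and assembly becomes extremely slow. */
    ierr = MatMPIAIJSetPreallocation(A,maxnz,NULL,maxnz,NULL);CHKERRQ(ierr);
    ierr = MatSeqAIJSetPreallocation(A,maxnz,NULL);CHKERRQ(ierr);
    /* ... fill the matrix with MatSetValues here ... */
    ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    ierr = MatDestroy(&A);CHKERRQ(ierr);
    ierr = PetscFinalize();
    return ierr;
  }

If preallocation is already in place, running with -info should report zero
mallocs during the MatSetValues calls, and the slowdown lies elsewhere.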
> Jose
>
>
> >
> > I'll run the test with a new version of PETSc/SLEPc and report back.
> >
> > Thanks a lot,
> > Vijay
> >
> > On Fri, May 15, 2015 at 2:55 PM, Jose E. Roman <jroman at dsic.upv.es> wrote:
> >
> > On 14/05/2015, at 19:13, Vijay Gopal Chilkuri wrote:
> >
> > > Oops, sorry, I sent you a smaller one (540540); this should finish in a
> > > few minutes.
> > >
> > > It requires the same makefile and the irpf90.a library, so just replace
> > > the old problem.c file with this one and it should compile.
> > >
> > > Thanks again,
> > > Vijay
> > >
> >
> > I was able to compute 300 eigenvalues of this matrix of size 540540. All
> > eigenvalues are in the range -4.70811 .. -4.613807, and the associated
> > residuals are always below 1e-9.
> >
> > There must be something in your software configuration that is causing
> > problems. I would suggest trying a basic PETSc/SLEPc configuration, with
> > no OpenMP flags, using --download-fblaslapack (instead of MKL). Also,
> > although it should not make any difference, you may want to try with a
> > smaller number of MPI processes (rather than 741).
> >
> > Jose
> >
> >
>
>
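
Regarding the residual check quoted above (all residuals below 1e-9): one way
to verify it independently of the solver's own error estimate is to recompute
||A*x - lambda*x|| explicitly for every converged pair. A minimal sketch, with
a hypothetical helper name and assuming eps has already been solved on the
matrix A:

  #include <slepceps.h>

  /* Hypothetical helper: print each converged eigenvalue of a standard
     Hermitian problem together with its explicit 2-norm residual. */
  PetscErrorCode CheckResiduals(EPS eps,Mat A)
  {
    Vec            x,r;
    PetscScalar    kr,ki;
    PetscReal      norm;
    PetscInt       i,nconv;
    PetscErrorCode ierr;

    ierr = MatCreateVecs(A,&x,NULL);CHKERRQ(ierr);
    ierr = VecDuplicate(x,&r);CHKERRQ(ierr);
    ierr = EPSGetConverged(eps,&nconv);CHKERRQ(ierr);
    for (i=0;i<nconv;i++) {
      ierr = EPSGetEigenpair(eps,i,&kr,&ki,x,NULL);CHKERRQ(ierr);
      ierr = MatMult(A,x,r);CHKERRQ(ierr);       /* r = A*x        */
      ierr = VecAXPY(r,-kr,x);CHKERRQ(ierr);     /* r = A*x - kr*x */
      ierr = VecNorm(r,NORM_2,&norm);CHKERRQ(ierr);
      ierr = PetscPrintf(PetscObjectComm((PetscObject)A),
                         "lambda = %.9f   ||A x - lambda x|| = %g\n",
                         (double)PetscRealPart(kr),(double)norm);CHKERRQ(ierr);
    }
    ierr = VecDestroy(&x);CHKERRQ(ierr);
    ierr = VecDestroy(&r);CHKERRQ(ierr);
    return 0;
  }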