[petsc-users] Slepc: Computing first 4 smallest eigenvalues and eigenvectors of a large graph Laplacian

Barry Smith bsmith at mcs.anl.gov
Sat Mar 25 22:22:51 CDT 2017


> On Mar 25, 2017, at 8:52 PM, Bodhisatta Pramanik <bodhi91 at iastate.edu> wrote:
> 
> Hi,
> 
> I apologize for the SLEPc question. I could not find a SLEPc user list, so I'm hoping someone here might be able to offer some guidance.
> 
> Problem Definition:
> I am working on a graph partitioning problem. I have the Laplacian of a large graph (500,000 nodes) and am interested in extracting its global information in order to find a good partitioning. One approach is to compute the first few (4-5) eigenvalues and use that information to formulate the partitioning algorithm.
> 
> I am leveraging the EPS solvers of the SLEPc library. It appears that the Jacobi-Davidson eigensolver gives me the eigenvalues in the least time compared to the others (Krylov-Schur, Rayleigh quotient, Lanczos, etc.). I use this eigensolver with the conjugate gradient linear solver and the block-Jacobi preconditioner. So this is what I am basically passing on the command line: 
> 
> ./RunPart -eps_type jd -eps_nev 4 -st_ksp_type cg -st_ksp_rtol 0.001 -eps_tol 0.001 -st_pc_type bjacobi -eps_smallest_real
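
   For reference, a bare-bones version of this setup in C (just a sketch, assuming the Laplacian has already been assembled into a Mat L; the solver type, CG, and the preconditioner are still taken from the command line) looks like:

      #include <slepceps.h>

      /* Sketch: compute the 4 smallest eigenvalues of the symmetric Laplacian L.
         The JD type and the -st_ksp_*/-st_pc_* options are picked up by
         EPSSetFromOptions(). */
      PetscErrorCode SolveSmallestFour(Mat L)
      {
        EPS            eps;
        PetscErrorCode ierr;

        ierr = EPSCreate(PETSC_COMM_WORLD,&eps);CHKERRQ(ierr);
        ierr = EPSSetOperators(eps,L,NULL);CHKERRQ(ierr);
        ierr = EPSSetProblemType(eps,EPS_HEP);CHKERRQ(ierr);        /* symmetric problem */
        ierr = EPSSetWhichEigenpairs(eps,EPS_SMALLEST_REAL);CHKERRQ(ierr);
        ierr = EPSSetDimensions(eps,4,PETSC_DEFAULT,PETSC_DEFAULT);CHKERRQ(ierr);
        ierr = EPSSetFromOptions(eps);CHKERRQ(ierr);
        ierr = EPSSolve(eps);CHKERRQ(ierr);
        ierr = EPSDestroy(&eps);CHKERRQ(ierr);
        return 0;
      }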

  Try     -st_pc_type gamg 

   On one process the default bjacobi results in ILU, which is not a great preconditioner.

  Also try -st_pc_type sor
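
   For example, the same run with the preconditioner swapped (these are just your flags from above, otherwise unchanged):

   ./RunPart -eps_type jd -eps_nev 4 -eps_smallest_real -eps_tol 0.001 -st_ksp_type cg -st_ksp_rtol 0.001 -st_pc_type gamg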

> 
> Question:
> The time it takes to compute the first 4-5 eigenvectors of a matrix of size 200k is about 60 seconds (CPU: Intel Xeon, 2 GHz). I am using a single processor to run my code. Is there any way I can get a major speedup over what I am getting?

  Run with -log_view to see where the computation is taking the most time.
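
   For example, appending it to your existing command:

   ./RunPart -eps_type jd -eps_nev 4 -st_ksp_type cg -st_ksp_rtol 0.001 -eps_tol 0.001 -st_pc_type bjacobi -eps_smallest_real -log_view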

> 
> Is it possible to obtain the eigenvalues of such huge matrices within 10-15 seconds even if I do not use multiple processors?
> 
> Can someone provide me with some guidance?

   Since the matrix is symmetric, are you using the SBAIJ format instead of AIJ?
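
   As a rough sketch (illustrative only; 'n' stands for the number of vertices in your graph), the assembly would change to something like:

      Mat            L;
      PetscErrorCode ierr;

      ierr = MatCreate(PETSC_COMM_WORLD,&L);CHKERRQ(ierr);
      ierr = MatSetSizes(L,PETSC_DECIDE,PETSC_DECIDE,n,n);CHKERRQ(ierr);
      ierr = MatSetType(L,MATSBAIJ);CHKERRQ(ierr);   /* symmetric storage instead of MATAIJ */
      ierr = MatSetUp(L);CHKERRQ(ierr);
      /* MatSetValues(...) as before, but insert only entries with column >= row */
      ierr = MatAssemblyBegin(L,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
      ierr = MatAssemblyEnd(L,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);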

> 
> Thanks,
> Bodhi
