[petsc-users] SLEPc spectrum slicing
Jeff Steward
jeffsteward at gmail.com
Thu May 19 15:28:01 CDT 2016
I have some questions regarding spectrum slicing in SLEPc, especially in
the new version 3.7, as I see a line in the Change Notes that I don't quite
understand ("in spectrum slicing in multi-communicator mode now it is
possible to update the problem matrices directly on the sub-communicators.")
1) In light of this statement, how should I divide my Mat to best work with
eps_krylovschur_partitions? Let's say I have 384 processors and
eps_krylovschur_partitions=48, so there will be 8 processors in each group.
Should I distribute the matrix over all 384 processors (so let's say this
gives 32 rows per processor) and have the entire communicator call
EPSSolve, or should I (can I?) distribute the matrix over each of the 48
groups (giving say 1536 rows per processor) and have the 48
subcommunicators call EPSSolve? Am I correct in thinking that distributing
over all 384 processors requires collecting and duplicating the matrix to
the 48 different groups?
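For concreteness, here is the arithmetic behind my numbers above (a plain
Python sketch, nothing SLEPc-specific; the global size of 12288 rows is
implied by my figures):

```python
# Sanity-check the two distribution layouts for an n x n matrix.
# n = 12288 is implied by my numbers (384 processes x 32 rows each).
n = 384 * 32  # 12288 global rows

# Layout A: distribute over the full communicator of 384 processes.
procs_global = 384
rows_per_proc_global = n // procs_global  # 32 rows per process

# Layout B: duplicate the matrix on each of 48 groups of 8 processes.
partitions = 48
procs_per_group = procs_global // partitions  # 8 processes per group
rows_per_proc_group = n // procs_per_group    # 1536 rows per process

print(rows_per_proc_global, procs_per_group, rows_per_proc_group)
```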
2) If I'm understanding it correctly, the default approach for Krylov-Schur
spectrum slicing described in the SLEPc manual seems wasteful. From what I
gather, the interval is divided into equal-width subintervals, and
different subintervals are bound to end up with different (and potentially
vastly different) numbers of eigenpairs. I understand the user can provide
their own subintervals, but wouldn't it be better for SLEPc to first
compute the matrix inertias at some initial trial shifts, interpolate, and
then adjust the subinterval endpoints so that each subinterval contains
approximately the same number of eigenpairs? An option for logarithmically
spaced points rather than linearly spaced points would be helpful as well,
since for the problem I am looking at the spectrum decays in this way (a
few large eigenvalues with an exponential decay down to many smaller
eigenvalues). I require eigenpairs with eigenvalues that vary by several
orders of magnitude (1e-3 to 1e3), so the linear equidistant strategy is
hopeless.
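To make the suggestion concrete, here is a small Python sketch of both
ideas (my own illustration, not SLEPc API): logarithmically spaced
endpoints over my 1e-3 to 1e3 range, and rebalancing endpoints by linear
interpolation of inertia counts sampled at a few trial shifts. The sample
inertia values at the bottom are made up purely for illustration:

```python
import math

def log_spaced_endpoints(a, b, npart):
    """npart+1 endpoints, logarithmically spaced between a and b (a, b > 0)."""
    la, lb = math.log10(a), math.log10(b)
    return [10 ** (la + (lb - la) * i / npart) for i in range(npart + 1)]

def balanced_endpoints(shifts, inertias, npart):
    """Given inertia counts (number of eigenvalues below each trial shift,
    monotone nondecreasing), choose endpoints so each subinterval holds
    roughly the same number of eigenvalues, by linear interpolation."""
    total = inertias[-1] - inertias[0]
    targets = [inertias[0] + total * k / npart for k in range(npart + 1)]
    endpoints = []
    for t in targets:
        # find the bracketing samples and interpolate the shift for count t
        for i in range(len(inertias) - 1):
            if inertias[i] <= t <= inertias[i + 1]:
                c0, c1 = inertias[i], inertias[i + 1]
                frac = 0.0 if c1 == c0 else (t - c0) / (c1 - c0)
                endpoints.append(shifts[i] + frac * (shifts[i + 1] - shifts[i]))
                break
    return endpoints

print(log_spaced_endpoints(1e-3, 1e3, 6))
# made-up inertia samples: eigenvalue counts below shifts 0, 1, 2, 4
print(balanced_endpoints([0.0, 1.0, 2.0, 4.0], [0, 10, 90, 100], 2))
```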
3) The example given for spectrum slicing in the user manual is
mpiexec -n 20 ./ex25 -eps_interval 0.4,0.8 -eps_krylovschur_partitions 4
-st_type sinvert -st_ksp_type preonly -st_pc_type cholesky
-st_pc_factor_mat_solver_package mumps
-mat_mumps_icntl_13 1
which requires a direct solver. If I can compute the matrix inertias myself
and come up with spectral regions and subcommunicators as described above,
is there a way to use SLEPc efficiently with an iterative solver? How about
with a shell matrix? (I'm getting greedy now ^_^).
I would really appreciate any help on these questions. Thank you for your
time.
Best wishes,
Jeff