[petsc-users] Retrieve eigenvectors from a parallel job / Spectrum slicing in order to solve a big eigenvalue problem

Jan Grießer griesser.jan at googlemail.com
Tue Sep 18 10:53:33 CDT 2018


But I am still struggling a little bit with copying the vector entries from
all processes to one sequential process. I tried to use the suggested
Scatter().toZero() in Python with the following code:
vecs = np.zeros((69999, nconv))
for i in range(nconv):
    val = E.getEigenpair(i, vr, vi)
    # gather the distributed eigenvector onto process 0
    scatter, vec_full = PETSc.Scatter().toZero(vr)
    scatter.scatter(vr, vec_full, False, PETSc.ScatterMode.FORWARD)
    if rank == 0:
        vecs[:, i] = vec_full.getArray()
But when I take a look at the eigenvectors, some entries are strange. In
this example I computed 4 eigenvalues and vectors and get nconv=4, but when
I print the vectors I get lines with 4 entries that are all zero, and also
lines with corrupted entries, e.g.
6.386079897185445800e-05 3.675310683411162830e-04 9.068495707804999221e-06
7.43627600.000000000000000000e+00 0.000000000000000000e+00
0.000000000000000000e+00 0.000000000000000000e+00
which have more than 4 entries. Can somebody explain to me why this
happens, or what is wrong with the code? (Sorry for asking such beginner
questions, I just started to work with PETSc and SLEPc.)
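One guess I have: if every MPI rank executes the same np.savetxt/print on
its own copy of vecs (which stays all-zero outside rank 0), the output of
the different writers could interleave in the file and merge numbers. A
minimal sketch of a rank-0-only write, assuming vecs is filled as above
(the file name is only an illustration):

if rank == 0:
    # a single writer, so lines from other ranks cannot interleave
    np.savetxt("eigenvectors.txt", vecs)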

On Tue, Sep 18, 2018 at 09:26, Jan Grießer <
griesser.jan at googlemail.com> wrote:

> Now it works. Thank you very much!
>
> On Mon, Sep 17, 2018 at 10:54, Jose E. Roman <
> jroman at dsic.upv.es> wrote:
>
>> For this you have to set the number of partitions equal to the number of
>> MPI processes. If you get this error it is because these values are
>> different.
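>> A one-line sketch of this in slepc4py, matching the partitions to the
>> launched process count, would be:
>>
>>     E.setKrylovSchurPartitions(PETSc.COMM_WORLD.getSize())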
>> Jose
>>
>>
>> > On Sep 17, 2018, at 10:50, Jan Grießer <griesser.jan at googlemail.com>
>> wrote:
>> >
>> > Is this really necessary? In the last sentences of the chapter
>> it states that:
>> > An additional benefit of multi-communicator support is that it enables
>> parallel spectrum slicing runs without the need to install a parallel
>> direct solver (MUMPS). The following commandline example uses sequential
>> linear solves in 4 partitions, one process each:
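>> > (the example is along the lines of: mpiexec -n 4 ./program
>> -eps_interval <a>,<b> -eps_krylovschur_partitions 4)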
>> > Therefore I assumed that it is not necessary to compile petsc4py with
>> an external solver, e.g. MUMPS.
>> >
>> > On Mon, Sep 17, 2018 at 10:47, Jose E. Roman <
>> jroman at dsic.upv.es> wrote:
>> > You need a parallel direct solver such as MUMPS. This is explained in
>> section 3.4.5.
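>> > A typical way to get MUMPS is to configure PETSc with (a sketch; MUMPS
>> also needs ScaLAPACK):
>> >
>> >     ./configure --download-mumps --download-scalapack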
>> > Jose
>> >
>> >
>> > > On Sep 17, 2018, at 10:41, Jan Grießer <
>> griesser.jan at googlemail.com> wrote:
>> > >
>> > > def solve_eigensystem(DynMatrix_nn, Unity_nn, Dimension, LowerLimit,
>> UpperLimit):
>> > >       # Create the EPS solver
>> > >       E = SLEPc.EPS().create()
>> > >
>> > >       # Create the preconditioner and set it to Cholesky
>> > >       pc = PETSc.PC().create()
>> > >       pc.setType(pc.Type.CHOLESKY)
>> > >
>> > >       # Create the KSP object
>> > >       ksp = PETSc.KSP().create()
>> > >       ksp.setType(ksp.Type.PREONLY)
>> > >       ksp.setPC(pc)
>> > >
>> > >       # Set up the spectral transformations
>> > >       st = SLEPc.ST().create()
>> > >       st.setType("sinvert")
>> > >       st.setKSP(ksp)
>> > >       # Attach the spectral transformation to the eigensolver
>> > >       E.setST(st)
>> > >
>> > >       # Eigenvalues should be real, therefore we order them starting
>> from the smallest real value |l.real|
>> > >       E.setWhichEigenpairs(E.Which.ALL)
>> > >       # Set the interval of spectrum slicing
>> > >       E.setInterval(LowerLimit, UpperLimit)
>> > >       # Since the dynamical matrix is real and symmetric, it is
>> Hermitian. Use GHEP for the spectrum slicing; operator matrix B is just a
>> unit matrix
>> > >       E.setProblemType(SLEPc.EPS.ProblemType.GHEP)
>> > >       # Use the Krylov-Schur method to solve the eigenvalue problem
>> > >       E.setType(E.Type.KRYLOVSCHUR)
>> > >       # Partition the Krylov-Schur problem into npart processes
>> > >       E.setKrylovSchurPartitions(10)
>> > >       # Set the convergence criterion relative to the eigenvalue,
>> and the maximal number of iterations
>> > >       E.setConvergenceTest(E.Conv.REL)
>> > >       E.setTolerances(tol = 1e-7, max_it = 1000)
>> > >       # Set the operator matrices of the eigenproblem
>> > >       E.setOperators(DynMatrix_nn, Unity_nn)
>> > >       # Sets EPS options from the options database.
>> > >       E.setFromOptions()
>> > >       # Sets up all the internal data structures necessary for the
>> execution of the eigensolver.
>> > >       E.setUp()
>> > >
>> > >       # Solve eigenvalue problem
>> > >       startClock = time.clock()
>> > >       startTime = time.time()
>> > >       E.solve()
>> > >
>> > > Does maybe one of you have an idea why this happens and where the
>> problem is?
>> > >
>> > > On Mon, Sep 17, 2018 at 10:40, Jan Grießer <
>> griesser.jan at googlemail.com> wrote:
>> > > I am aware that SLEPc is not supposed to calculate all eigenvalues
>> and eigenvectors; my problem is simply that, for a physically large enough
>> system, I want all of them before I can make the transition to the
>> smallest ones.
>> > > Competitiveness is of secondary importance at the moment.
>> > > But I have a problem connected with spectrum slicing. I followed the
>> instructions of Chap. 3.4.5 Spectrum Slicing in the manual and converted
>> them to the Python package.
>> > > But now I get the following error. It appears to me that it is not
>> able to find the KSP object, but I actually do not know why this is the
>> case.
>> > > Traceback (most recent call last):
>> > >   File "Eigensolver_spectrum_slicing.py", line 216, in <module>
>> > >     solve_eigensystem(DynMatrix_nn, Unity_nn, D_nn.shape,
>> opt_dict.LowLimit, opt_dict.UpperLimit)
>> > >   File "Eigensolver_spectrum_slicing.py", line 121, in
>> solve_eigensystem
>> > >     E.setUp()
>> > >   File "SLEPc/EPS.pyx", line 1099, in slepc4py.SLEPc.EPS.setUp
>> > > petsc4py.PETSc.Error: error code 92
>> > > [14] EPSSetUp() line 165 in
>> /tmp/pip-install-golhudw7/slepc/src/eps/interface/epssetup.c
>> > > [14] EPSSetUp_KrylovSchur() line 146 in
>> /tmp/pip-install-golhudw7/slepc/src/eps/impls/krylov/krylovschur/krylovschur.c
>> > > [14] EPSSetUp_KrylovSchur_Slice() line 410 in
>> /tmp/pip-install-golhudw7/slepc/src/eps/impls/krylov/krylovschur/ks-slice.c
>> > > [14] EPSSliceGetEPS() line 300 in
>> /tmp/pip-install-golhudw7/slepc/src/eps/impls/krylov/krylovschur/ks-slice.c
>> > > [14] EPSSetUp() line 165 in
>> /tmp/pip-install-golhudw7/slepc/src/eps/interface/epssetup.c
>> > > [14] EPSSetUp_KrylovSchur() line 146 in
>> /tmp/pip-install-golhudw7/slepc/src/eps/impls/krylov/krylovschur/krylovschur.c
>> > > [14] EPSSetUp_KrylovSchur_Slice() line 461 in
>> /tmp/pip-install-golhudw7/slepc/src/eps/impls/krylov/krylovschur/ks-slice.c
>> > > [14] EPSSliceGetInertia() line 331 in
>> /tmp/pip-install-golhudw7/slepc/src/eps/impls/krylov/krylovschur/ks-slice.c
>> > > [14] STSetUp() line 271 in
>> /tmp/pip-install-golhudw7/slepc/src/sys/classes/st/interface/stsolve.c
>> > > [14] STSetUp_Sinvert() line 132 in
>> /tmp/pip-install-golhudw7/slepc/src/sys/classes/st/impls/sinvert/sinvert.c
>> > > [14] KSPSetUp() line 381 in
>> /tmp/pip-install-xmiaat2t/petsc/src/ksp/ksp/interface/itfunc.c
>> > > [14] PCSetUp() line 923 in
>> /tmp/pip-install-xmiaat2t/petsc/src/ksp/pc/interface/precon.c
>> > > [14] PCSetUp_Cholesky() line 86 in
>> /tmp/pip-install-xmiaat2t/petsc/src/ksp/pc/impls/factor/cholesky/cholesky.c
>> > > [14] MatGetFactor() line 4318 in
>> /tmp/pip-install-xmiaat2t/petsc/src/mat/interface/matrix.c
>> > > [14] See
>> http://www.mcs.anl.gov/petsc/documentation/linearsolvertable.html for
>> possible LU and Cholesky solvers
>> > > [14] Could not locate a solver package. Perhaps you must ./configure
>> with --download-<package>
>> > >
>> > > The code I used to solve the problem is the solve_eigensystem
>> function quoted above.
>> > >
>> > > On Fri, Sep 14, 2018 at 18:34, Matthew Knepley <
>> knepley at gmail.com> wrote:
>> > > On Fri, Sep 14, 2018 at 12:19 PM Jose E. Roman <jroman at dsic.upv.es>
>> wrote:
>> > > On Sep 14, 2018, at 17:45, Jan Grießer <
>> griesser.jan at googlemail.com> wrote:
>> > >
>> > >> Hey there,
>> > >> first I want to say thanks to Satish and Matt for helping with
>> my last problem with the MPI compilation. I have two questions related to
>> solving a big, Hermitian, standard eigenvalue problem using slepc4py,
>> compiled with Intel MKL and Intel MPI.
>> > >>
>> > >> - I am using slepc4py with MPI and run it with around -n 20 cores
>> at the moment, and I wanted to ask if there is an easy way to retrieve
>> the eigenvectors. When I run my code and print
>> > >>
>> > >>      for i in range(nconv):
>> > >>              val = E.getEigenpair(i, vr, vi)
>> > >>              Print(vr.getArray())
>> > >>
>> > >> I get the parts of the eigenvectors according to the partition of
>> the matrix. Is there any easy way to put them together in an array and
>> write them to a file? (I am struggling a little bit with putting them
>> together in the correct order.)
>> > >
>> > > You need VecScatterCreateToZero. There must be an equivalent in
>> Python.
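>> > > A sketch of the petsc4py equivalent, assuming vr is the distributed
>> eigenvector, could be:
>> > >
>> > >     scatter, vec0 = PETSc.Scatter.toZero(vr)
>> > >     scatter.scatter(vr, vec0, False, PETSc.ScatterMode.FORWARD)
>> > >     # vec0 holds the full vector on rank 0 (and is empty elsewhere)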
>> > >
>> > > An alternative to this which you should consider, because it is
>> simpler, is to write the vector to a file
>> > > using some format that PETSc understands. Then you just need
>> vr.view(viewer) for a viewer like
>> > > the binary viewer or some ASCII format you like.
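>> > > For the binary route, a short petsc4py sketch (the file name is just
>> an illustration) might be:
>> > >
>> > >     viewer = PETSc.Viewer().createBinary("eigenvector.dat", "w")
>> > >     vr.view(viewer)
>> > >     viewer.destroy()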
>> > >
>> > >   Thanks,
>> > >
>> > >     Matt
>> > >> - I need to solve eigenvalue problems up to a dimension of 100000
>> degrees of freedom, and I need all eigenvalues and eigenvectors. I think
>> solving all eigenvalues in one process is far too much, so I wondered
>> whether it is possible to apply the spectrum slicing described in Chap.
>> 3.4.5. Due to the nature of my problem, I am able to simulate smaller
>> systems of 10000 DOF and extract the biggest eigenvalue, which will be the
>> same for larger system sizes. Is this possible in general, since I have a
>> standard HEP problem, or is there a better and faster way to do this?
>> > >
>> > > In general, SLEPc is not intended for computing the whole spectrum.
>> You can try with spectrum slicing, but this will be competitive only if
>> computing just a fraction of the eigenvalues, say 50%.
>> > >
>> > > Jose
>> > >
>> > >>
>> > >> Thank you very much!
>> > >
>> > >
>> > > --
>> > > What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> > > -- Norbert Wiener
>> > >
>> > > https://www.cse.buffalo.edu/~knepley/
>> >
>>
>>

