[petsc-users] eigensolution error with slepc

Manav Bhatia bhatiamanav at gmail.com
Mon Apr 4 11:06:33 CDT 2016


Hi Jose, 

   I also read these matrices into MATLAB and computed the eigenvalues:

>> A = PetscBinaryRead('A.petsc');
>> B = PetscBinaryRead('B.petsc');
>> [v,d] = eigs(A,B)
(*** got a lot of output about poor-conditioning ***)
>> diag(d)

ans =

   1.0e-05 *

   -0.2219
    0.0229
    0.0229
    0.0025
    0.0019
    0.0014

So, one of the eigenvalues is turning out to be negative, but I am still getting numbers. 
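A negative eigenvalue is a hint that B (or the pencil) may not satisfy the assumptions of a generalized Hermitian eigenproblem, which requires B to be symmetric positive definite. A minimal sketch of such a check, using NumPy on small stand-in matrices (hypothetical, not the actual A.petsc/B.petsc data), is to attempt a Cholesky factorization of B, which succeeds exactly when B is SPD:

```python
import numpy as np

def is_spd(M, tol=1e-12):
    """Return True if M is symmetric positive definite."""
    if not np.allclose(M, M.T, atol=tol):
        return False
    try:
        np.linalg.cholesky(M)  # raises LinAlgError iff M is not positive definite
        return True
    except np.linalg.LinAlgError:
        return False

# Stand-in matrices: B_good is SPD, B_bad has a negative eigenvalue.
B_good = np.array([[2.0, 1.0], [1.0, 2.0]])
B_bad  = np.array([[1.0, 0.0], [0.0, -1.0]])

print(is_spd(B_good))  # True
print(is_spd(B_bad))   # False
```

If B fails such a check, solving as a GHEP (the -eps_gen_hermitian path) is not justified and the symmetric solvers can break down.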



Using these matrices with ex7 produces an error:

Dhcp-90-243:T_400_V_0 manav$ /Users/manav/Documents/codes/numerical_lib/slepc/slepc-3.6.2/src/eps/examples/tutorials/ex7 -f1 A.petsc -f2 B.petsc -eps_gen_hermitian -eps_view -eps_nev 1 

Generalized eigenproblem stored in file.

 Reading REAL matrices from binary files...
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Error in external library
[0]PETSC ERROR: Error in Lapack xSTEQR 15
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.6.3, Dec, 03, 2015 
[0]PETSC ERROR: /Users/manav/Documents/codes/numerical_lib/slepc/slepc-3.6.2/src/eps/examples/tutorials/ex7 on a arch-darwin-cxx-opt named ws243-49.walker.dynamic.msstate.edu by manav Mon Apr  4 11:05:30 2016
[0]PETSC ERROR: Configure options --prefix=/Users/manav/Documents/codes/numerical_lib/petsc/petsc-3.6.3/../ --CC=mpicc --CXX=mpicxx --FC=mpif90 --with-clanguage=c++ --with-fortran=0 --with-mpiexec=/opt/local/bin/mpiexec --with-shared-libraries=1 --with-x=1 --with-x-dir=/opt/X11 --with-debugging=0 --with-lapack-lib=/usr/lib/liblapack.dylib --with-blas-lib=/usr/lib/libblas.dylib --download-superlu=yes --download-superlu_dist=yes --download-suitesparse=yes --download-mumps=yes --download-scalapack=yes --download-parmetis=yes --download-parmetis-shared=1 --download-metis=yes --download-metis-shared=1 --download-hypre=yes --download-hypre-shared=1 --download-ml=yes --download-ml-shared=1 --download-sundials=yes --download-sundials-shared=1
[0]PETSC ERROR: #1 DSSolve_HEP_QR() line 495 in /Users/manav/Documents/codes/numerical_lib/slepc/slepc-3.6.2/src/sys/classes/ds/impls/hep/dshep.c
[0]PETSC ERROR: #2 DSSolve() line 543 in /Users/manav/Documents/codes/numerical_lib/slepc/slepc-3.6.2/src/sys/classes/ds/interface/dsops.c
[0]PETSC ERROR: #3 EPSSolve_KrylovSchur_Symm() line 68 in /Users/manav/Documents/codes/numerical_lib/slepc/slepc-3.6.2/src/eps/impls/krylov/krylovschur/ks-symm.c
[0]PETSC ERROR: #4 EPSSolve() line 101 in /Users/manav/Documents/codes/numerical_lib/slepc/slepc-3.6.2/src/eps/interface/epssolve.c
[0]PETSC ERROR: #5 main() line 147 in /Users/manav/Documents/codes/numerical_lib/slepc/slepc-3.6.2/src/eps/examples/tutorials/ex7.c
[0]PETSC ERROR: PETSc Option Table entries:
[0]PETSC ERROR: -eps_gen_hermitian
[0]PETSC ERROR: -eps_nev 1
[0]PETSC ERROR: -eps_view
[0]PETSC ERROR: -f1 A.petsc
[0]PETSC ERROR: -f2 B.petsc
[0]PETSC ERROR: -matload_block_size 1
[0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD 
with errorcode 76.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
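For context on the scaling workaround Jose mentions in the quoted message below (multiplying A by 1e7 because the eigenvalues are tiny): scaling A by a constant c scales every generalized eigenvalue of (A, B) by the same c, so the original spectrum is recovered exactly by dividing back. A small NumPy sketch with made-up matrices (again hypothetical, not the actual data) illustrates this:

```python
import numpy as np

# Made-up symmetric pencil with tiny eigenvalues, mimicking the situation here.
A = 1e-6 * np.array([[2.0, 1.0], [1.0, 3.0]])
B = np.array([[4.0, 1.0], [1.0, 5.0]])

# Generalized eigenvalues of (A, B) via the equivalent problem B^{-1} A x = k x.
w = np.sort(np.linalg.eigvals(np.linalg.solve(B, A)).real)
w_scaled = np.sort(np.linalg.eigvals(np.linalg.solve(B, 1e7 * A)).real)

# Scaling A by 1e7 scales each eigenvalue by 1e7; divide to recover the originals.
print(np.allclose(w_scaled, 1e7 * w))  # True
```

The scaling only improves the conditioning of the small projected problems passed to LAPACK; it does not change the eigenvectors.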




> On Apr 4, 2016, at 10:49 AM, Manav Bhatia <bhatiamanav at gmail.com> wrote:
> 
> I just sent you the updated matrices on slepc-main. 
> 
> Sorry about sending the wrong matrices earlier. 
> 
> Regards,
> Manav
> 
> 
>> On Apr 4, 2016, at 9:42 AM, Manav Bhatia <bhatiamanav at gmail.com> wrote:
>> 
>> Ok. So, I ran ex7 with the same command-line options in your email, and got a result. This is on my Mac OS X, without any changes to the lapack/blas/slepc library. 
>> 
>> I also ran my code on a Linux machine with LAPACK/BLAS built from source (obtained from netlib), and got the same error as on my Mac.
>> 
>> Not sure how to go about debugging this. 
>> 
>> -Manav
>> 
>> 
>>> On Apr 4, 2016, at 3:46 AM, Jose E. Roman <jroman at dsic.upv.es> wrote:
>>> 
>>> 
>>>> On Apr 3, 2016, at 22:17, Manav Bhatia <bhatiamanav at gmail.com> wrote:
>>>> 
>>>> I just sent you the matrices. 
>>>> 
>>>> Thanks,
>>>> Manav
>>> 
>>> I cannot reproduce the problem. I was able to compute the eigenvalues without problems with Krylov-Schur (note that I scaled matrix A by 1e7 because the eigenvalues are tiny):
>>> 
>>> $ ./ex7 -f1 A.petsc -f2 B.petsc -eps_gen_hermitian
>>> 
>>> Generalized eigenproblem stored in file.
>>> 
>>> Reading REAL matrices from binary files...
>>> Number of iterations of the method: 1
>>> Number of linear iterations of the method: 17
>>> Solution method: krylovschur
>>> 
>>> Number of requested eigenvalues: 1
>>> Stopping condition: tol=1e-08, maxit=768
>>> Linear eigensolve converged (6 eigenpairs) due to CONVERGED_TOL; iterations 1
>>> ---------------------- --------------------
>>>          k             ||Ax-kBx||/||kx||
>>> ---------------------- --------------------
>>>     11.410228            3.18851e-09
>>>      1.323631            2.33005e-09
>>>      1.323631            3.98884e-09
>>>      0.224497            1.24334e-08
>>>      0.168344            9.81685e-08
>>>      0.129229            4.85372e-08
>>> ---------------------- --------------------
>>> 
>>> Note that I am solving the problem as a GHEP, and this gives far better residuals than solving as a GNHEP. The Arnoldi solver did not fail either.
>>> 
>>> I don't know what caused your error. You can try configuring with the reference LAPACK built from source: --download-f2cblaslapack
>>> 
>>> Jose
>>> 
>> 
> 

