[petsc-users] [SLEPc] ex5 fails, error in lapack
Dave May
dave.mayhem23 at gmail.com
Sun Oct 28 04:31:06 CDT 2018
On Sun, 28 Oct 2018 at 09:37, Santiago Andres Triana <repepo at gmail.com>
wrote:
> Hi petsc-users,
>
> I am experiencing problems running ex5 and ex7 from the slepc tutorial.
> This is after upgrade to petsc-3.10.2 and slepc-3.10.1. Has anyone run into
> this problem? see the error message below. Any help or advice would be
> highly appreciated. Thanks in advance!
>
> Santiago
>
>
>
> trianas at hpcb-n02:/home/trianas/slepc-3.10.1/src/eps/examples/tutorials>
> ./ex5 -eps_nev 4
>
> Markov Model, N=120 (m=15)
>
> [0]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> [0]PETSC ERROR: Error in external library
> [0]PETSC ERROR: Error in LAPACK subroutine hseqr: info=0
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.10.2, Oct, 09, 2018
> [0]PETSC ERROR: ./ex5 on a arch-linux2-c-opt named hpcb-n02 by trianas Sun
> Oct 28 09:30:18 2018
> [0]PETSC ERROR: Configure options --known-level1-dcache-size=32768
> --known-level1-dcache-linesize=64 --known-level1-dcache-assoc=8
> --known-sizeof-char=1 --known-sizeof-void-p=8 --known-sizeof-short=2
> --known-sizeof-int=4 --known-sizeof-long=8 --known-sizeof-long-long=8
> --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=8
> --known-bits-per-byte=8 --known-memcmp-ok=1 --known-sizeof-MPI_Comm=4
> --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --known-mpi-int64_t=1
> --known-mpi-c-double-complex=1 --known-has-attribute-aligned=1
> --with-scalar-type=complex --download-mumps=1 --download-parmetis
> --download-metis --download-scalapack=1 --download-fblaslapack=1
> --with-debugging=0 --download-superlu_dist=1 --download-ptscotch=1
> CXXOPTFLAGS="-O3 -march=native" FOPTFLAGS="-O3 -march=native"
> COPTFLAGS="-O3 -march=native" --with-batch --known-64-bit-blas-indices=1
>
I think this last arg is wrong if you use --download-fblaslapack:
fblaslapack builds the reference BLAS/LAPACK with the default 32-bit
integer indices, which is inconsistent with --known-64-bit-blas-indices=1.
Did you explicitly add this option yourself?
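One way to confirm (a sketch; the macro name and the PETSC_DIR/PETSC_ARCH paths below are assumptions about a standard PETSc install, adjust to yours):

```shell
# Inspect the generated PETSc configuration header to see whether this
# build believes BLAS/LAPACK was compiled with 64-bit integer indices.
# If a 64BIT_BLAS_INDICES define shows up while fblaslapack (32-bit
# indices) was downloaded, the configure options are inconsistent.
grep 64BIT_BLAS_INDICES "$PETSC_DIR/$PETSC_ARCH/include/petscconf.h"
```

If they disagree, reconfiguring without --known-64-bit-blas-indices=1 should resolve the mismatch.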
> [0]PETSC ERROR: #1 DSSolve_NHEP() line 586 in
> /space/hpc-home/trianas/slepc-3.10.1/src/sys/classes/ds/impls/nhep/dsnhep.c
> [0]PETSC ERROR: #2 DSSolve() line 586 in
> /space/hpc-home/trianas/slepc-3.10.1/src/sys/classes/ds/interface/dsops.c
> [0]PETSC ERROR: #3 EPSSolve_KrylovSchur_Default() line 275 in
> /space/hpc-home/trianas/slepc-3.10.1/src/eps/impls/krylov/krylovschur/krylovschur.c
> [0]PETSC ERROR: #4 EPSSolve() line 148 in
> /space/hpc-home/trianas/slepc-3.10.1/src/eps/interface/epssolve.c
> [0]PETSC ERROR: #5 main() line 90 in
> /home/trianas/slepc-3.10.1/src/eps/examples/tutorials/ex5.c
> [0]PETSC ERROR: PETSc Option Table entries:
> [0]PETSC ERROR: -eps_nev 4
> [0]PETSC ERROR: ----------------End of Error Message -------send entire
> error message to petsc-maint at mcs.anl.gov----------
> application called MPI_Abort(MPI_COMM_WORLD, 76) - process 0
> [unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=76
> :
> system msg for write_line failure : Bad file descriptor
>
>