[petsc-users] Segmentation violation
Matthew Knepley
knepley at gmail.com
Tue Oct 30 12:58:57 CDT 2018
On Tue, Oct 30, 2018 at 1:18 PM Santiago Andres Triana via petsc-users <
petsc-users at mcs.anl.gov> wrote:
> Hi petsc-users,
>
> I am solving a generalized eigenvalue problem using ex7 in
> $SLEPC_DIR/src/eps/examples/tutorials/. I provide the A and B matrices.
> The program runs fine, with correct solutions, on a 12-core node and also
> on a Mac laptop.
>
> However, on a 16-core workstation running Debian testing (fresh install),
> with fresh installs of PETSc and SLEPc as well, I get the following error:
>
> $ mpiexec -n 2 ./ex7 -f1 A.petsc -f2 B.petsc -st_type sinvert -eps_nev 4
> -eps_target -2e-3+1.01i
>
> Generalized eigenproblem stored in file.
>
> Reading COMPLEX matrices from binary files...
> [1]PETSC ERROR:
> ------------------------------------------------------------------------
> [1]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
> probably memory access out of range
> [1]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> [1]PETSC ERROR: or see
> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> [1]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS
> X to find memory corruption errors
> [1]PETSC ERROR: likely location of problem given in stack below
> [1]PETSC ERROR: --------------------- Stack Frames
> ------------------------------------
> [1]PETSC ERROR: Note: The EXACT line numbers in the stack are not
> available,
> [1]PETSC ERROR: INSTEAD the line number of the start of the function
> [1]PETSC ERROR: is given.
> [1]PETSC ERROR: [1] MatFactorNumeric_MUMPS line 1205
> /home/spin2/petsc-3.10.2/src/mat/impls/aij/mpi/mumps/mumps.c
> [1]PETSC ERROR: [1] MatLUFactorNumeric line 3054
> /home/spin2/petsc-3.10.2/src/mat/interface/matrix.c
> [1]PETSC ERROR: [1] PCSetUp_LU line 59
> /home/spin2/petsc-3.10.2/src/ksp/pc/impls/factor/lu/lu.c
> [1]PETSC ERROR: [1] PCSetUp line 894
> /home/spin2/petsc-3.10.2/src/ksp/pc/interface/precon.c
> [1]PETSC ERROR: [1] KSPSetUp line 304
> /home/spin2/petsc-3.10.2/src/ksp/ksp/interface/itfunc.c
> [1]PETSC ERROR: [1] STSetUp_Sinvert line 96
> /home/spin2/slepc-3.10.1/src/sys/classes/st/impls/sinvert/sinvert.c
> [1]PETSC ERROR: [1] STSetUp line 233
> /home/spin2/slepc-3.10.1/src/sys/classes/st/interface/stsolve.c
> [1]PETSC ERROR: [1] EPSSetUp line 104
> /home/spin2/slepc-3.10.1/src/eps/interface/epssetup.c
> [1]PETSC ERROR: [1] EPSSolve line 129
> /home/spin2/slepc-3.10.1/src/eps/interface/epssolve.c
> [1]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> [1]PETSC ERROR: Signal received
> [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> [1]PETSC ERROR: Petsc Release Version 3.10.2, Oct, 09, 2018
> [1]PETSC ERROR: ./ex7 on a arch-linux2-c-opt named wobble-wkst-as by spin2
> Tue Oct 30 17:40:51 2018
> [1]PETSC ERROR: Configure options --download-mpich
> -with-scalar-type=complex --download-mumps --download-parmetis
> --download-metis --download-scalapack --download-fblaslapack
> --with-debugging=1 --download-superlu_dist --download-ptscotch
> [1]PETSC ERROR: #1 User provided function() line 0 in unknown file
> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 1
>
>
>
> The expected output is the following (obtained on a compute node running
> petsc-3.9.2 and also on a Mac laptop running petsc-3.10.2):
>
> $ mpiexec -n 2 ./ex7 -f1 A.petsc -f2 B.petsc -st_type sinvert -eps_nev 4
> -eps_target -2e-3+1.01i
>
> Generalized eigenproblem stored in file.
>
> Reading COMPLEX matrices from binary files...
> Number of iterations of the method: 2
> Number of linear iterations of the method: 27
> Solution method: krylovschur
>
> Number of requested eigenvalues: 4
> Stopping condition: tol=1e-08, maxit=63157
> Linear eigensolve converged (4 eigenpairs) due to CONVERGED_TOL;
> iterations 2
> ---------------------- --------------------
> k ||Ax-kBx||/||kx||
> ---------------------- --------------------
> -0.002806+1.009827i 2.00821e-19
> -0.002980+1.008417i 8.08359e-17
> -0.002676+1.011755i 9.49342e-18
> -0.003201+1.007367i 1.50869e-16
> ---------------------- --------------------
>
>
> Just in case, the matrices can be downloaded from here if anyone wants to
> give them a try:
> https://www.dropbox.com/s/ejpa9owkv8tjnwi/A.petsc?dl=0
> https://www.dropbox.com/s/urjtxaezl0cv3om/B.petsc?dl=0
>
It's not the matrices. Would you be willing to reconfigure and test with
SuperLU_dist? Since it consistently fails in the MUMPS factorization, it
seems to be either a bug in MUMPS or a bug in our interface to MUMPS.
Valgrind should show us which one (as Barry suggests), but running with
SuperLU_dist should get you going in the meantime.
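
For example (untested here, and assuming the usual -st_ options prefix is
forwarded to the factorization PC), your configure line already builds
SuperLU_dist, so you could try switching solvers at run time:

$ mpiexec -n 2 ./ex7 -f1 A.petsc -f2 B.petsc -st_type sinvert -eps_nev 4 \
    -eps_target -2e-3+1.01i -st_pc_factor_mat_solver_type superlu_dist

and, to narrow down the memory error in the MUMPS path, run under valgrind
along the lines of the PETSc FAQ, e.g.

$ mpiexec -n 2 valgrind --tool=memcheck -q --num-callers=20 \
    --log-file=valgrind.log.%p ./ex7 -f1 A.petsc -f2 B.petsc \
    -st_type sinvert -eps_nev 4 -eps_target -2e-3+1.01i

which writes one log per MPI rank (%p is the process id).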
Matt
> I tried different PETSc/SLEPc versions, and even a full OS reinstall, to no
> avail, so any help would be highly appreciated. Thanks in advance!
>
> Santiago
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/