[petsc-users] memory access out of range with MUMPS

Yongxiang Wu yongxiang27 at gmail.com
Mon Apr 16 10:42:03 CDT 2018


Hello everyone,
I am using slepc4py to solve a generalized eigenvalue problem with a large
sparse matrix. The following is my Python code. The matrix is first created
using PETSc.Mat().createAIJ() and saved to the file matrix-A.dat.
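
For reference, here is a minimal sketch of how the matrix could have been
assembled and written to matrix-A.dat (the size and CSR arrays below are
illustrative placeholders, not my actual assembly code):

    from petsc4py import PETSc

    # toy 2x2 AIJ (CSR) matrix; csr = (row pointers, column indices, values)
    A = PETSc.Mat().createAIJ([2, 2],
                              csr=([0, 1, 2], [0, 1], [1.0, 2.0]),
                              comm=PETSc.COMM_SELF)
    A.assemble()

    # write the assembled matrix to a PETSc binary file
    viewer = PETSc.Viewer().createBinary('matrix-A.dat', 'w', comm=PETSc.COMM_SELF)
    A.view(viewer)
    viewer.destroy()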


    import sys
    from time import time

    import slepc4py
    slepc4py.init(sys.argv)
    from petsc4py import PETSc
    from slepc4py import SLEPc

    Print = PETSc.Sys.Print   # assuming Print is the usual petsc4py print helper

    opts = PETSc.Options()
    opts.setValue('-st_pc_factor_mat_solver_package', 'mumps')

    viewer = PETSc.Viewer().createBinary('matrix-A.dat', 'r', comm=PETSc.COMM_SELF)
    A = PETSc.Mat().create(comm=PETSc.COMM_SELF)
    A.load(viewer)
    A.setFromOptions()
    A.setUp()

    Print('load petsc binary finished')

    t0 = time()

    E = SLEPc.EPS().create(comm=PETSc.COMM_SELF)
    E.setOperators(A, None)
    E.setDimensions(noEigs2solv, PETSc.DECIDE)       # number of eigenvalues to compute
    E.setWhichEigenpairs(E.Which.TARGET_MAGNITUDE)
    E.setProblemType(SLEPc.EPS.ProblemType.GNHEP)    # generalized non-Hermitian eigenvalue problem

    E.setTolerances(arnoldiTol, 1000)    # set tolerance and max number of iterations
    if SIGMA is not None:
        E.setTarget(SIGMA)               # set the desired eigenvalue

    if options_v0 is not None:
        E.setInitialSpace(options_v0)    # set the initial vector

    st = E.getST()
    st.setType('sinvert')

    E.setFromOptions()
    Print('Start solving ...')
    E.solve()
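
After E.solve() returns I query the results; that part is omitted above, but
it follows the standard slepc4py pattern, sketched here for completeness (the
loop mirrors the slepc4py examples; the variable names are illustrative):

    nconv = E.getConverged()
    Print('Number of converged eigenpairs: %d' % nconv)
    if nconv > 0:
        vr, wr = A.createVecs()   # real part of eigenvector (wr unused here)
        vi, wi = A.createVecs()   # imaginary part of eigenvector
        for i in range(nconv):
            k = E.getEigenpair(i, vr, vi)   # eigenvalue k, eigenvector in (vr, vi)
            error = E.computeError(i)       # relative residual error
            Print('lambda = %s, error = %g' % (k, error))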

My problem here is that when I run this code on the cluster, I end up with
the following error. The strange thing is that I use the same libraries on my
local PC, and there it works without any error. Could any one of you give me a
hint about what is wrong? Thanks.


 ** ERROR RETURN ** FROM ZMUMPS INFO(1)=   -1
 ** INFO(2)=               0
[0]PETSC ERROR:
------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation,
probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see
http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X
to find memory corruption errors
[0]PETSC ERROR: likely location of problem given in stack below
[0]PETSC ERROR: ---------------------  Stack Frames
------------------------------------
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[0]PETSC ERROR:       INSTEAD the line number of the start of the function
[0]PETSC ERROR:       is given.
[0]PETSC ERROR: [0] PetscInitializeMUMPS line 1188
/zhome/academic/HLRS/iag/iagyonwu/.local/lib/petsc/src/mat/impls/aij/mpi/mumps/mumps.c
[0]PETSC ERROR: [0] MatGetFactor_aij_mumps line 2125
/zhome/academic/HLRS/iag/iagyonwu/.local/lib/petsc/src/mat/impls/aij/mpi/mumps/mumps.c
[0]PETSC ERROR: [0] MatGetFactor line 4341
/zhome/academic/HLRS/iag/iagyonwu/.local/lib/petsc/src/mat/interface/matrix.c
[0]PETSC ERROR: [0] PCSetUp_LU line 59
/zhome/academic/HLRS/iag/iagyonwu/.local/lib/petsc/src/ksp/pc/impls/factor/lu/lu.c
[0]PETSC ERROR: [0] PCSetUp line 886
/zhome/academic/HLRS/iag/iagyonwu/.local/lib/petsc/src/ksp/pc/interface/precon.c
[0]PETSC ERROR: [0] KSPSetUp line 294
/zhome/academic/HLRS/iag/iagyonwu/.local/lib/petsc/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: [0] STSetUp_Sinvert line 96
/zhome/academic/HLRS/iag/iagyonwu/.local/lib/slepc-3.8.2/src/sys/classes/st/impls/sinvert/sinvert.c
[0]PETSC ERROR: [0] STSetUp line 233
/zhome/academic/HLRS/iag/iagyonwu/.local/lib/slepc-3.8.2/src/sys/classes/st/interface/stsolve.c
[0]PETSC ERROR: [0] EPSSetUp line 104
/zhome/academic/HLRS/iag/iagyonwu/.local/lib/slepc-3.8.2/src/eps/interface/epssetup.c
[0]PETSC ERROR: [0] EPSSolve line 129
/zhome/academic/HLRS/iag/iagyonwu/.local/lib/slepc-3.8.2/src/eps/interface/epssolve.c
application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0
[unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=59

with regards
Yong