[petsc-users] MUMPS and PARMETIS: Crashes

Tim Steinhoff kandanovian at gmail.com
Wed Oct 19 09:58:47 CDT 2016


Hi all,

I am having some problems with PETSc when using MUMPS together with ParMETIS.
In some cases it works fine, but in others it does not, so I am trying to
understand what is happening.

I just picked the following example:
http://www.mcs.anl.gov/petsc/petsc-current/src/ksp/ksp/examples/tutorials/ex53.c.html

Now, when I start it with fewer than 4 processes, it works as expected:
mpirun -n 3 ./ex53 -n 10000 -ksp_view -mat_mumps_icntl_28 1 -mat_mumps_icntl_29 2

With 4 or more processes, however, it crashes, but only when I use ParMETIS:
mpirun -n 4 ./ex53 -n 10000 -ksp_view -mat_mumps_icntl_28 1 -mat_mumps_icntl_29 2

METIS worked without any problems in every case I tried.
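
In case a smaller reproducer helps: below is a minimal sketch of my own (not
part of ex53.c) that sets the same two ICNTL values through the PETSc API
instead of the command line. The tridiagonal matrix is just a stand-in for
what ex53.c assembles, and it assumes a PETSc 3.7 build with MUMPS and
ParMETIS as configured above.

/* mumps_parmetis_test.c: minimal sketch, not from ex53.c.
 * Solves a 1-D Laplacian with MUMPS via PCLU and sets ICNTL(28)/ICNTL(29)
 * programmatically instead of via -mat_mumps_icntl_28/_29. */
#include <petscksp.h>

int main(int argc, char **args)
{
  Mat            A, F;
  Vec            x, b;
  KSP            ksp;
  PC             pc;
  PetscInt       i, Istart, Iend, n = 10000;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &args, NULL, NULL);if (ierr) return ierr;
  ierr = PetscOptionsGetInt(NULL, NULL, "-n", &n, NULL);CHKERRQ(ierr);

  /* Distributed tridiagonal stand-in for the matrix ex53.c builds */
  ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
  ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);CHKERRQ(ierr);
  ierr = MatSetFromOptions(A);CHKERRQ(ierr);
  ierr = MatSetUp(A);CHKERRQ(ierr);
  ierr = MatGetOwnershipRange(A, &Istart, &Iend);CHKERRQ(ierr);
  for (i = Istart; i < Iend; i++) {
    if (i > 0)   {ierr = MatSetValue(A, i, i-1, -1.0, INSERT_VALUES);CHKERRQ(ierr);}
    if (i < n-1) {ierr = MatSetValue(A, i, i+1, -1.0, INSERT_VALUES);CHKERRQ(ierr);}
    ierr = MatSetValue(A, i, i, 2.0, INSERT_VALUES);CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatCreateVecs(A, &x, &b);CHKERRQ(ierr);
  ierr = VecSet(b, 1.0);CHKERRQ(ierr);

  /* Direct solve with MUMPS; request the same ICNTL values as on the command line */
  ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp, A, A);CHKERRQ(ierr);
  ierr = KSPSetType(ksp, KSPPREONLY);CHKERRQ(ierr);
  ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
  ierr = PCSetType(pc, PCLU);CHKERRQ(ierr);
  ierr = PCFactorSetMatSolverPackage(pc, MATSOLVERMUMPS);CHKERRQ(ierr);
  ierr = PCFactorSetUpMatSolverPackage(pc);CHKERRQ(ierr);   /* create the MUMPS factor matrix */
  ierr = PCFactorGetMatrix(pc, &F);CHKERRQ(ierr);
  ierr = MatMumpsSetIcntl(F, 28, 1);CHKERRQ(ierr);          /* same as -mat_mumps_icntl_28 1 */
  ierr = MatMumpsSetIcntl(F, 29, 2);CHKERRQ(ierr);          /* same as -mat_mumps_icntl_29 2 (ParMETIS ordering) */

  ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);                 /* the crash below happens during the MUMPS analysis here */

  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  ierr = VecDestroy(&x);CHKERRQ(ierr);
  ierr = VecDestroy(&b);CHKERRQ(ierr);
  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}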

I wonder whether I am doing something wrong, or whether this is a general
problem or even a bug. Is ParMETIS supposed to work with that example on 4
processes?

Thanks a lot and kind regards.

Volker


Here is the error log of process 0:

Entering DMUMPS 5.0.1 driver with JOB, N =   1       10000
 =================================================
 MUMPS compiled with option -Dmetis
 MUMPS compiled with option -Dparmetis
 =================================================
L U Solver for unsymmetric matrices
Type of parallelism: Working host

 ****** ANALYSIS STEP ********

 ** Max-trans not allowed because matrix is distributed
Using ParMETIS for parallel ordering.
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[0]PETSC ERROR: likely location of problem given in stack below
[0]PETSC ERROR: ---------------------  Stack Frames ------------------------------------
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[0]PETSC ERROR:       INSTEAD the line number of the start of the function
[0]PETSC ERROR:       is given.
[0]PETSC ERROR: [0] MatLUFactorSymbolic_AIJMUMPS line 1395 /fsgarwinhpc/133/petsc/sources/petsc-3.7.4a/src/mat/impls/aij/mpi/mumps/mumps.c
[0]PETSC ERROR: [0] MatLUFactorSymbolic line 2927 /fsgarwinhpc/133/petsc/sources/petsc-3.7.4a/src/mat/interface/matrix.c
[0]PETSC ERROR: [0] PCSetUp_LU line 101 /fsgarwinhpc/133/petsc/sources/petsc-3.7.4a/src/ksp/pc/impls/factor/lu/lu.c
[0]PETSC ERROR: [0] PCSetUp line 930 /fsgarwinhpc/133/petsc/sources/petsc-3.7.4a/src/ksp/pc/interface/precon.c
[0]PETSC ERROR: [0] KSPSetUp line 305 /fsgarwinhpc/133/petsc/sources/petsc-3.7.4a/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: [0] KSPSolve line 563 /fsgarwinhpc/133/petsc/sources/petsc-3.7.4a/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Signal received
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.7.4, Oct, 02, 2016
[0]PETSC ERROR: ./ex53 on a linux-manni-mumps named manni by 133 Wed Oct 19 16:39:49 2016
[0]PETSC ERROR: Configure options --with-cc=mpiicc --with-cxx=mpiicpc --with-fc=mpiifort --with-shared-libraries=1 --with-valgrind-dir=~/usr/valgrind/ --with-mpi-dir=/home/software/intel/Intel-2016.4/compilers_and_libraries_2016.4.258/linux/mpi --download-scalapack --download-mumps --download-metis --download-metis-shared=0 --download-parmetis --download-parmetis-shared=0
[0]PETSC ERROR: #1 User provided function() line 0 in  unknown file
application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0
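
For reference, the kind of valgrind run that the error output suggests would
look roughly like this on my machine (the log-file pattern is just an example):

mpirun -n 4 valgrind -q --tool=memcheck --num-callers=20 --log-file=valgrind.%p.log ./ex53 -n 10000 -ksp_view -mat_mumps_icntl_28 1 -mat_mumps_icntl_29 2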

