[petsc-users] Error in PETSc with MUMPS on Windows.

Santos Teixeira Frederico fsantost at student.ethz.ch
Mon Jun 17 10:40:50 CDT 2013


>> [0]PETSC ERROR: [0] MatLUFactorSymbolic_AIJMUMPS line 880 src/mat/impls/aij/mpi/mumps/C:\PETSC-~1.1\src\mat\impls\aij\mpi\mumps\mumps.c

> So it's crashing somewhere in MatLUFactorSymbolic_AIJMUMPS() - which
> has calls to MUMPS routines.

I found out that the problem was the call PetscMUMPS_c(&mumps_id) (file mumps.c, line 959). So, a problem with MUMPS...
I contacted the MUMPS team and they suggested downgrading to METIS 4.x (and ParMETIS 3.x), since METIS 5.x is not supported yet. By the way, which METIS and ParMETIS versions does PETSc install with "--download-***=1"? I used this option on Linux and everything worked normally.
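
Until the rebuild with METIS 4.x, a possible workaround from the PETSc side (a sketch only, untested here) is to keep MUMPS but steer the analysis away from the METIS ordering via ICNTL(7). Assuming the PETSc 3.4 API and that A, b, x are already assembled:

    /* Select MUMPS as the LU solver and override its ordering choice:
       ICNTL(7)=2 (approximate minimum fill) avoids the METIS code path
       that crashes during the analysis step. PETSc 3.4 names. */
    KSP ksp;
    PC  pc;
    Mat F;

    ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
    ierr = KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN);CHKERRQ(ierr);
    ierr = KSPSetType(ksp,KSPPREONLY);CHKERRQ(ierr);
    ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
    ierr = PCSetType(pc,PCLU);CHKERRQ(ierr);
    ierr = PCFactorSetMatSolverPackage(pc,MATSOLVERMUMPS);CHKERRQ(ierr);
    ierr = PCFactorSetUpMatSolverPackage(pc);CHKERRQ(ierr);  /* creates the factor matrix F */
    ierr = PCFactorGetMatrix(pc,&F);CHKERRQ(ierr);
    ierr = MatMumpsSetIcntl(F,7,2);CHKERRQ(ierr);            /* ICNTL(7)=2: AMF ordering */
    ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);

The same should be reachable at runtime with -pc_type lu -pc_factor_mat_solver_package mumps -mat_mumps_icntl_7 2, which allows testing the orderings without recompiling.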

> > --with-mpi-dir=/cygdrive/d/Solvers-dev/Solvers/libs/MPI /cygdrive/d/Solvers-dev/Solvers/libs/MPI/lib/fmpich2.lib

> Looks like you are using mpich2 - but not a standard install?

Yes, this is the standard mpich2.

> And it's not clear how one would use blacs/scalapack from MKL. Is it
> supposed to be compatible with this version of mpich you have? [I
> have no idea]

The MKL website says that it is compatible with MPICH2 1.x.x.

> Such mixing of mpi compiled codes [where packages are compiled with
> different variants of mpi - but combined into a single binary] can
> potentially have issues.

> And you can do debugging on windows. For sequential you do:

> msdev binary.exe

> [or whatever the current name for the developer studio is. It's 'devenv' for VC2008, and something else for VC2012]
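
For reference, the devenv equivalent of that command appears to be:

    devenv /debugexe binary.exe [args]

where /debugexe loads the executable into a fresh debugging session (untested here with VC2012).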

> And if you wish to debug parallely - you would have to compile the
> application with the developer studio project files - and perhaps
> follow instructions from:
> http://www.mpich.org/static/downloads/1.4.1p1/mpich2-1.4.1p1-windevguide.pdf

Thanks for the tips! They were very useful!
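
In case it helps anyone else who hits this: a quick way to check the MUMPS/METIS pairing in isolation, before rebuilding anything, is a small driver modeled on the c_example.c shipped with MUMPS, with the ordering forced to METIS so the analysis step goes through the suspect code path. A rough sketch (field names per the 4.10 C interface; the 2x2 system is only a placeholder, and a real test should use a matrix large enough that MUMPS does not fall back to a simpler ordering):

    #include <stdio.h>
    #include "mpi.h"
    #include "dmumps_c.h"
    #define JOB_INIT       -1
    #define JOB_END        -2
    #define USE_COMM_WORLD -987654

    int main(int argc, char **argv)
    {
      DMUMPS_STRUC_C id;
      int    irn[] = {1, 2}, jcn[] = {1, 2};      /* 1-based coordinates */
      double a[]   = {1.0, 2.0}, rhs[] = {1.0, 4.0};

      MPI_Init(&argc, &argv);

      /* Initialize a MUMPS instance: host working, unsymmetric. */
      id.job = JOB_INIT; id.par = 1; id.sym = 0;
      id.comm_fortran = USE_COMM_WORLD;
      dmumps_c(&id);

      /* Tiny placeholder system in coordinate format. */
      id.n = 2; id.nz = 2;
      id.irn = irn; id.jcn = jcn; id.a = a; id.rhs = rhs;

      id.icntl[6] = 5;   /* ICNTL(7)=5: force the METIS ordering */
      id.job = 6;        /* analysis + factorization + solve */
      dmumps_c(&id);

      id.job = JOB_END;  /* terminate the instance */
      dmumps_c(&id);
      MPI_Finalize();
      printf("solution: %g %g\n", rhs[0], rhs[1]);
      return 0;
    }

If this segfaults during the analysis step the same way, that confirms the problem is in the MUMPS/METIS combination rather than in PETSc.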

> Satish

Regards,
 Frederico.

On Fri, 14 Jun 2013, Santos Teixeira Frederico wrote:

> Hi,
>
> I got the error below when I tried to execute PETSc with MUMPS. Some important facts:
>
> 1) the following libraries/versions were compiled separately: METIS 5.1.0, ParMETIS 4.0.3, and MUMPS 4.10.0, and added to PETSc 3.4.1 along with ScaLAPACK 2.0.2 (plus BLAS, LAPACK, BLACS, etc. from the latest MKL).
>
> 2) the same code works correctly on Windows with Pardiso (interfaced with PETSc).
>
> 3) the same code works correctly on Linux with Pardiso and the libraries provided by --download-***.
>
> 4) the MUMPS libraries and their dependencies (the same ones described below) were linked and correctly executed a test program provided by the library itself.
>
> Despite the lack of information and a debugger (Windows...), could you give me some tips and/or guide me to provide you with better information? I appreciate any tip!
>
> Thanks a lot!
>
> Regards,
>  Frederico.
>
> ======================================================================
>
> Starting KSPSolve with MUMPS...
> Entering DMUMPS driver with JOB, N, NZ =   1       74355        2782474
>
>  DMUMPS 4.10.0
> L U Solver for unsymmetric matrices
> Type of parallelism: Working host
>
>  ****** ANALYSIS STEP ********
>
>  Resetting candidate strategy to 0 because NSLAVES=1
>
>  ... Structural symmetry (in percent)=   93
>  Density: NBdense, Average, Median   =    0   39   29
>  ... No column permutation
>  Ordering based on METIS
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
> [0]PETSC ERROR: likely location of problem given in stack below
> [0]PETSC ERROR: ---------------------  Stack Frames ------------------------------------
> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
> [0]PETSC ERROR:       INSTEAD the line number of the start of the function
> [0]PETSC ERROR:       is given.
> [0]PETSC ERROR: [0] MatLUFactorSymbolic_AIJMUMPS line 880 src/mat/impls/aij/mpi/mumps/C:\PETSC-~1.1\src\mat\impls\aij\mpi\mumps\mumps.c
> [0]PETSC ERROR: [0] MatLUFactorSymbolic line 2820 src/mat/interface/C:\PETSC-~1.1\src\mat\INTERF~1\matrix.c
> [0]PETSC ERROR: [0] PCSetUp_LU line 99 src/ksp/pc/impls/factor/lu/C:\PETSC-~1.1\src\ksp\pc\impls\factor\lu\lu.c
> [0]PETSC ERROR: [0] PCSetUp line 868 src/ksp/pc/interface/C:\PETSC-~1.1\src\ksp\pc\INTERF~1\precon.c
> [0]PETSC ERROR: [0] KSPSetUp line 192 src/ksp/ksp/interface/C:\PETSC-~1.1\src\ksp\ksp\INTERF~1\itfunc.c
> [0]PETSC ERROR: [0] KSPSolve line 356 src/ksp/ksp/interface/C:\PETSC-~1.1\src\ksp\ksp\INTERF~1\itfunc.c
> [0]PETSC ERROR: [0] FluidSolverDirect::Solve line 1145 "unknowndirectory/"..\..\..\Source\FluidSolver\FluidSolverDirect\FluidSolverDirect.cpp
> [0]PETSC ERROR: --------------------- Error Message ------------------------------------
> [0]PETSC ERROR: Signal received!
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: Petsc Release Version 3.4.1, Jun, 10, 2013
> [0]PETSC ERROR: See docs/changes/index.html for recent updates.
> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
> [0]PETSC ERROR: See docs/index.html for manual pages.
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: D:\Solvers-dev\Solvers\builds\bin\Debug\FluidSolverDirectCavityDriver.exe on a win-x64-msvc-mkl-real-release named FREDERICO-PC by Frederico Teixeira Fri Jun 14 17:53:16 2013
> [0]PETSC ERROR: Libraries linked from /cygdrive/c/petsc-3.4.1/win-x64-msvc-mkl-real-release/lib
> [0]PETSC ERROR: Configure run at Fri Jun 14 15:34:42 2013
> [0]PETSC ERROR: Configure options PETSC_ARCH=win-x64-msvc-mkl-real-dbg --with-cc="win32fe cl" --with-cxx="win32fe cl" --with-fc=0 --with-x=0 --with-debugging=1 --with-mpi-dir=/cygdrive/d/Solvers-dev/Solvers/libs/MPI
> --with-blas-lapack-lib="[/cygdrive/d/Solvers-dev/Solvers/libs/MKL/x64/lib/mkl_blas95_lp64.lib,/cygdrive/d/Solvers-dev/Solvers/libs/MKL/x64/lib/mkl_lapack95_lp64.lib,/cygdrive/d/Solvers-dev/Solvers/libs/MKL/x64/lib/mkl_intel_lp64.lib,/cygdrive/d/Solvers-dev/Solvers/libs/MKL/x64/lib/mkl_intel_thread.lib,/cygdrive/d/Solvers-dev/Solvers/libs/MKL/x64/lib/mkl_core.lib]"
> --with-metis-lib=/cygdrive/d/Solvers-dev/Solvers/libs/METIS/win-x64-msvc/lib/metis.lib --with-metis-include=/cygdrive/d/Solvers-dev/Solvers/libs/METIS/include
> --with-parmetis-lib=/cygdrive/d/Solvers-dev/Solvers/libs/PARMETIS/win-x64-msvc/lib/parmetis.lib --with-parmetis-include=/cygdrive/d/Solvers-dev/Solvers/libs/PARMETIS/include
> --with-scalapack-lib="[/cygdrive/d/Solvers-dev/Solvers/libs/MKL/x64/lib/mkl_scalapack_lp64.lib,/cygdrive/d/Solvers-dev/Solvers/libs/MKL/x64/lib/mkl_blacs_mpich2_lp64.lib]" --with-scalapack-include=/cygdrive/d/Solvers-dev/Solvers/libs/MKL/include
> --with-mumps-lib="[/cygdrive/d/Solvers-dev/Solvers/libs/MUMPS/win-x64-msvc-mkl/lib/dmumps.lib,/cygdrive/d/Solvers-dev/Solvers/libs/MUMPS/win-x64-msvc-mkl/lib/mumps-common.lib,/cygdrive/d/Solvers-dev/Solvers/libs/MUMPS/win-x64-msvc-mkl/lib/pord.lib,/cygdrive/d/Solvers-dev/Solvers/libs/MUMPS/win-x64-msvc-mkl/lib/dmumps-f.lib]" --with-mumps-include=/cygdrive/d/Solvers-dev/Solvers/libs/MUMPS/include2
> --with-hypre-dir=/cygdrive/d/Solvers-dev/Solvers/libs/HYPRE -CFLAGS=-MD -CXXFLAGS=-MD
> --LIBS="/cygdrive/d/Solvers-dev/Solvers/libs/IntelCompiler/x64/lib/ifconsol.lib /cygdrive/d/Solvers-dev/Solvers/libs/IntelCompiler/x64/lib/libifcoremd.lib /cygdrive/d/Solvers-dev/Solvers/libs/IntelCompiler/x64/lib/libifportmd.lib /cygdrive/d/Solvers-dev/Solvers/libs/IntelCompiler/x64/lib/libiomp5md.lib /cygdrive/d/Solvers-dev/Solvers/libs/IntelCompiler/x64/lib/libirc.lib /cygdrive/d/Solvers-dev/Solvers/libs/IntelCompiler/x64/lib/libmmd.lib /cygdrive/d/Solvers-dev/Solvers/libs/IntelCompiler/x64/lib/svml_dispmd.lib /cygdrive/d/Solvers-dev/Solvers/libs/MPI/lib/fmpich2.lib"
> --useThreads=0
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: User provided function() line 0 in unknown directory unknown file
> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0
>
> job aborted:
> rank: node: exit code[: error message]
> 0: Frederico-PC: 59: process 0 exited without calling finalize
>


