<div dir="ltr">On Sat, Oct 5, 2013 at 10:49 PM, Jose David Bermeol <span dir="ltr"><<a href="mailto:jbermeol@purdue.edu" target="_blank">jbermeol@purdue.edu</a>></span> wrote:<br><div class="gmail_extra"><div class="gmail_quote">
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">Hi I'm runnig petsc trying to solve a linear system with superlu_dist. However i have a memory violation, atached is the code, and here is the output. Email me if you need something else to figured out what is happening.<br>

So it looks like SuperLU_Dist is bombing during an LAPACK operation. It could be an MKL problem, or a SuperLU_Dist problem, or our problem, or a mismatch between versions. I would try to simplify the configuration in order to cut down on the possibilities. Eliminate everything that is not necessary for SuperLU_dist first. Then change to --download-f-blas-lapack. If you still have a crash, send us the matrix, since that should be reproducible, and we can report a SuperLU_dist bug or fix our code.
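
For the configure step, something minimal along these lines (untested; keep your MPI compiler wrappers and adjust as needed) should be enough to exercise SuperLU_dist without MKL:

  ./configure --with-cc=mpiicc --with-cxx=mpiicpc --with-fc=mpiifort --with-scalar-type=complex --with-clanguage=C++ --with-debugging=1 --download-f-blas-lapack --download-metis --download-parmetis --download-superlu_dist

To get us the matrix, here is a rough sketch that writes it out in PETSc binary format right after assembly (the Mat name A and the file name are placeholders; rename to match your code):

  /* sketch: dump the assembled matrix so the crash can be reproduced elsewhere */
  PetscViewer viewer;
  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"failing_matrix.bin",FILE_MODE_WRITE,&viewer);CHKERRQ(ierr);
  ierr = MatView(A,viewer);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);

We can then read that file back with MatLoad() and try the same SuperLU_dist factorization and MatMatSolve() on our side.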

  Thanks,

      Matt

Thanks

mpiexec -n 2 ./test_solver -mat_superlu_dist_statprint -mat_superlu_dist_matinput distributed<br>
        Nonzeros in L       10<br>
        Nonzeros in U       10<br>
        nonzeros in L+U     10<br>
        nonzeros in LSUB    10<br>
        NUMfact space (MB) sum(procs):  L\U     0.00    all     0.03<br>
        Total highmark (MB):  All       0.03    Avg     0.02    Max     0.02<br>
        Mat conversion(PETSc->SuperLU_DIST) time (max/min/avg):<br>
                              0.000146866 / 0.000145912 / 0.000146389<br>
        EQUIL time             0.00<br>
        ROWPERM time           0.00<br>
        COLPERM time           0.00<br>
        SYMBFACT time          0.00<br>
        DISTRIBUTE time        0.00<br>
        FACTOR time            0.00<br>
        Factor flops    1.000000e+02    Mflops      0.31<br>
        SOLVE time             0.00<br>
[0]PETSC ERROR: ------------------------------------------------------------------------<br>
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range<br>
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger<br>
[0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[0]PETSC ERROR: [1]PETSC ERROR: ------------------------------------------------------------------------<br>

[1]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range<br>
[1]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors<br>
Try option -start_in_debugger or -on_error_attach_debugger<br>
[1]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind[1]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors<br>

[0]PETSC ERROR: likely location of problem given in stack below<br>
[1]PETSC ERROR: likely location of problem given in stack below<br>
[1]PETSC ERROR: ---------------------  Stack Frames ------------------------------------<br>
[1]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,<br>
[0]PETSC ERROR: ---------------------  Stack Frames ------------------------------------<br>
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,<br>
[0]PETSC ERROR: [1]PETSC ERROR:       INSTEAD the line number of the start of the function<br>
[1]PETSC ERROR:       is given.<br>
[1]PETSC ERROR: [1] SuperLU_DIST:pzgssvx line 234 /home/jbermeol/Nemo5/libs/petsc/build-cplx/src/mat/impls/aij/mpi/superlu_dist/superlu_dist.c<br>
[1]PETSC ERROR: [1] MatMatSolve_SuperLU_DIST line 198 /home/jbermeol/Nemo5/libs/petsc/build-cplx/src/mat/impls/aij/mpi/superlu_dist/superlu_dist.c<br>
[1]PETSC ERROR:       INSTEAD the line number of the start of the function<br>
[0]PETSC ERROR:       is given.<br>
[0]PETSC ERROR: [0] SuperLU_DIST:pzgssvx line 234 /home/jbermeol/Nemo5/libs/petsc/build-cplx/src/mat/impls/aij/mpi/superlu_dist/superlu_dist.c<br>
[0]PETSC ERROR: [1] MatMatSolve line 3207 /home/jbermeol/Nemo5/libs/petsc/build-cplx/src/mat/interface/matrix.c<br>
[1]PETSC ERROR: --------------------- Error Message ------------------------------------<br>
[1]PETSC ERROR: [0] MatMatSolve_SuperLU_DIST line 198 /home/jbermeol/Nemo5/libs/petsc/build-cplx/src/mat/impls/aij/mpi/superlu_dist/superlu_dist.c<br>
[0]PETSC ERROR: [0] MatMatSolve line 3207 /home/jbermeol/Nemo5/libs/petsc/build-cplx/src/mat/interface/matrix.c<br>
Signal received!<br>
[1]PETSC ERROR: ------------------------------------------------------------------------<br>
[1]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013<br>
[1]PETSC ERROR: See docs/changes/index.html for recent updates.<br>
[1]PETSC ERROR: See docs/faq.html for hints about trouble shooting.<br>
[1]PETSC ERROR: See docs/index.html for manual pages.<br>
[1]PETSC ERROR: ------------------------------------------------------------------------<br>
[1]PETSC ERROR: ./test_solver on a linux-complex named carter-fe02.rcac.purdue.edu by jbermeol Sat Oct  5 23:45:21 2013<br>
[1]PETSC ERROR: [0]PETSC ERROR: --------------------- Error Message ------------------------------------<br>
[0]PETSC ERROR: Libraries linked from /home/jbermeol/Nemo5/libs/petsc/build-cplx/linux-complex/lib<br>
[1]PETSC ERROR: Configure run at Sat Oct  5 11:19:36 2013<br>
[1]PETSC ERROR: Configure options --with-cc=mpiicc --with-cxx=mpiicpc --with-fc=mpiifort --with-scalar-type=complex --with-shared-libraries=1 --with-debugging=1 --with-pic=1 --with-clanguage=C++ --with-fortran=1 --with-fortran-kernels=0 --with-blas-lapack-dir=/apps/rhel6/intel/composer_xe_2013.3.163/mkl --with-blacs-lib=/apps/rhel6/intel/composer_xe_2013.3.163/mkl/lib/intel64/libmkl_blacs_intelmpi_lp64.so --with-blacs-include=/apps/rhel6/intel/composer_xe_2013.3.163/mkl/include --with-scalapack-lib="-L/apps/rhel6/intel/composer_xe_2013.3.163/mkl/lib/intel64 -lmkl_scalapack_lp64 -lmkl_blacs_intelmpi_lp64" --with-scalapack-include=/apps/rhel6/intel/composer_xe_2013.3.163/mkl/include --with-valgrind-dir=/apps/rhel6/valgrind/3.8.1 --COPTFLAGS=-O3 --CXXOPTFLAGS=-O3 --FOPTFLAGS=-O3 --with-mkl-include=/apps/rhel6/intel/composer_xe_2013.3.163/mkl/include --with-mkl-lib="[/apps/rhel6/intel/composer_xe_2013.3.163/mkl/lib/intel64/libmkl_intel_lp64.so,/apps/rhel6/intel/composer_xe_2013.3.163/mkl/lib/intel64/libmkl_intel_thread.so,/apps/rhel6/intel/composer_xe_2013.3.163/mkl/lib/intel64/libmkl_core.so,/apps/rhel6/intel/composer_xe_2013.3.163/mkl/../compiler/lib/intel64/libiomp5.so]" --with-cpardiso-dir=/home/jbermeol/testPetscSolvers/intel_mkl_cpardiso --with-hdf5 --download-hdf5=yes --download-metis=yes --download-parmetis=yes --download-superlu_dist=yes --download-superlu=yes --download-mumps=yes --download-spooles=yes --download-pastix=yes --download-ptscotch=yes --download-umfpack=yes --download-sowing<br>

Signal received!<br>
[0]PETSC ERROR: ------------------------------------------------------------------------<br>
[0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013<br>
[0]PETSC ERROR: See docs/changes/index.html for recent updates.<br>
[0]PETSC ERROR: [1]PETSC ERROR: ------------------------------------------------------------------------<br>
[1]PETSC ERROR: User provided function() line 0 in unknown directory unknown file<br>
See docs/faq.html for hints about trouble shooting.<br>
[0]PETSC ERROR: See docs/index.html for manual pages.<br>
[0]PETSC ERROR: ------------------------------------------------------------------------<br>
[0]PETSC ERROR: ./test_solver on a linux-complex named carter-fe02.rcac.purdue.edu by jbermeol Sat Oct  5 23:45:21 2013<br>
[0]PETSC ERROR: Libraries linked from /home/jbermeol/Nemo5/libs/petsc/build-cplx/linux-complex/lib<br>
[0]PETSC ERROR: Configure run at Sat Oct  5 11:19:36 2013<br>
[0]PETSC ERROR: application called MPI_Abort(MPI_COMM_WORLD, 59) - process 1<br>
Configure options --with-cc=mpiicc --with-cxx=mpiicpc --with-fc=mpiifort --with-scalar-type=complex --with-shared-libraries=1 --with-debugging=1 --with-pic=1 --with-clanguage=C++ --with-fortran=1 --with-fortran-kernels=0 --with-blas-lapack-dir=/apps/rhel6/intel/composer_xe_2013.3.163/mkl --with-blacs-lib=/apps/rhel6/intel/composer_xe_2013.3.163/mkl/lib/intel64/libmkl_blacs_intelmpi_lp64.so --with-blacs-include=/apps/rhel6/intel/composer_xe_2013.3.163/mkl/include --with-scalapack-lib="-L/apps/rhel6/intel/composer_xe_2013.3.163/mkl/lib/intel64 -lmkl_scalapack_lp64 -lmkl_blacs_intelmpi_lp64" --with-scalapack-include=/apps/rhel6/intel/composer_xe_2013.3.163/mkl/include --with-valgrind-dir=/apps/rhel6/valgrind/3.8.1 --COPTFLAGS=-O3 --CXXOPTFLAGS=-O3 --FOPTFLAGS=-O3 --with-mkl-include=/apps/rhel6/intel/composer_xe_2013.3.163/mkl/include --with-mkl-lib="[/apps/rhel6/intel/composer_xe_2013.3.163/mkl/lib/intel64/libmkl_intel_lp64.so,/apps/rhel6/intel/composer_xe_2013.3.163/mkl/lib/intel64/libmkl_intel_thread.so,/apps/rhel6/intel/composer_xe_2013.3.163/mkl/lib/intel64/libmkl_core.so,/apps/rhel6/intel/composer_xe_2013.3.163/mkl/../compiler/lib/intel64/libiomp5.so]" --with-cpardiso-dir=/home/jbermeol/testPetscSolvers/intel_mkl_cpardiso --with-hdf5 --download-hdf5=yes --download-metis=yes --download-parmetis=yes --download-superlu_dist=yes --download-superlu=yes --download-mumps=yes --download-spooles=yes --download-pastix=yes --download-ptscotch=yes --download-umfpack=yes --download-sowing<br>

[0]PETSC ERROR: ------------------------------------------------------------------------<br>
[0]PETSC ERROR: User provided function() line 0 in unknown directory unknown file<br>
application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener