<html>
  <head>
    <meta content="text/html; charset=UTF-8" http-equiv="Content-Type">
  </head>
  <body text="#000000" bgcolor="#FFFFFF">
    <div class="moz-cite-prefix">Hi Karl,<br>
      <br>
      Here is the modified ex30.c and a matrix. In the code, you will
      see<br>
a flag <b>use_vecset_to_avoid_crash_on_gpu</b> set to 0.<br>
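      Roughly, the flag just guards an extra VecSet on x before x is
      filled with VecSetValues (a minimal sketch only, with illustrative
      variable names; see the attached ex30.c for the exact code):<br>
      <br>
      if (use_vecset_to_avoid_crash_on_gpu) {<br>
        ierr = VecSet(x,0.0);CHKERRQ(ierr); // presumably sets up on the GPU whatever VecSetValues alone does not<br>
      }<br>
      ierr = VecSetValues(x,1,&row,&value,INSERT_VALUES);CHKERRQ(ierr); // crashes on GPU with np>1 if the VecSet above is skipped<br>
      ierr = VecAssemblyBegin(x);CHKERRQ(ierr);<br>
      ierr = VecAssemblyEnd(x);CHKERRQ(ierr);<br>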
      <br>
      mpiexec -np 1 ./ex30 -f0 Matrix_63_rows_1_cpus.petsc -pc_type none<br>
      Fill b with VecSetValues<br>
      Fill x with VecSetValues<br>
        Number of iterations = 169<br>
        Residual norm 0.00278758<br>
      <br>
      mpiexec -np 2 ./ex30 -f0 Matrix_63_rows_1_cpus.petsc -pc_type none<br>
      Fill b with VecSetValues<br>
      Fill x with VecSetValues<br>
        Number of iterations = 108<br>
        Residual norm 0.00133484<br>
      <br>
      mpiexec -np 1 ./ex30 -f0 Matrix_63_rows_1_cpus.petsc -pc_type none
      -mat_type aijcusp -vec_type cusp<br>
      Fill b with VecSetValues<br>
      Fill x with VecSetValues<br>
        Number of iterations = 169<br>
        Residual norm 0.00278986<br>
      <br>
      mpiexec -np 2 ./ex30 -f0 Matrix_63_rows_1_cpus.petsc -pc_type none
      -mat_type aijcusp -vec_type cusp<br>
      Fill b with VecSetValues<br>
      Fill x with VecSetValues<br>
      [0]PETSC ERROR:
      ------------------------------------------------------------------------<br>
      [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation
      Violation, probably memory access out of range<br>
      [0]PETSC ERROR: Try option -start_in_debugger or
      -on_error_attach_debugger<br>
      [0]PETSC ERROR: or see
      <a class="moz-txt-link-freetext" href="http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind">http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind</a>[0]PETSC
      ERROR: or try <a class="moz-txt-link-freetext" href="http://valgrind.org">http://valgrind.org</a> on GNU/linux and Apple Mac OS X
      to find memory corruption errors<br>
      ....<br>
      <br>
      If you set <b>use_vecset_to_avoid_crash_on_gpu</b> to 1:<br>
      mpiexec -np 2 ./ex30 -f0 Matrix_63_rows_1_cpus.petsc -pc_type none
      -mat_type aijcusp -vec_type cusp<br>
      Fill b with VecSetValues<br>
      Fill x with VecSet<br>
      Fill x with VecSetValues<br>
        Number of iterations = 108<br>
        Residual norm 0.00135365<br>
      <br>
      Thanks; ask me if you need more details. I am using petsc-dev:<br>
      <br>
      Compiled without FORTRAN kernels<br>
      Compiled with full precision matrices (default)<br>
      sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8
      sizeof(PetscScalar) 8 sizeof(PetscInt) 4<br>
      Configure options:
      --prefix=/work/triou/Version_test_laramon/Trio_U/lib/src/LIBPETSC/petsc/linux_opt
      --with-single-library --with-shared-libraries=0 --with-debugging=0
      --with-errorchecking=1 --COPTFLAGS="   -O3 -fPIC "
      --CXXOPTFLAGS="   -O3 -fPIC " --FOPTFLAGS="  -O3 -fPIC "
      --with-fortran=yes --with-clean=1
      --download-scalapack=../scalapack-2.0.2.tgz
      --download-mumps=../MUMPS_4.10.0-p3.tar.gz
      --download-superlu_dist=../superlu_dist_3.3.tar.gz
      --download-suitesparse=../SuiteSparse-4.2.1.tar.gz
      --download-pastix=../pastix_release_3725.tar.bz2
      --download-parmetis=../parmetis-4.0.2-p5.tar.gz
      --download-metis=../metis-5.0.2-p3.tar.gz
      --download-ptscotch=../ptscotch.tar.gz
      --download-hypre=../hypre-2.9.1a.tar.gz
      --with-valgrind-include=/work/triou/Version_test_laramon/Trio_U/exec/valgrind/include
      --with-blas-lapack-dir=/work/triou/Version_test_laramon/Trio_U/lib/src/LIBLAPACK
      --with-cuda=1
      --with-cuda-dir=/work/triou/Version_test_laramon/Trio_U/exec/cuda5.5
      --with-cusp=1
      --with-cusp-dir=/work/triou/Version_test_laramon/Trio_U/lib/src/LIBPETSC/cusplibrary-0.4.0
      --with-thrust=1 --with-cuda-arch=sm_21 --with-ssl=0
      --with-mpi-dir=/work/triou/Version_test_laramon/Trio_U/lib/src/LIBMPI/openmpi
      --with-x=1<br>
      -----------------------------------------<br>
      Libraries compiled on Tue Jul 29 19:21:34 2014 on laramon <br>
      Machine characteristics:
Linux-2.6.31.14-desktop-1mnb-x86_64-Intel-R-_Xeon-R-_CPU___________X5650__@_2.67GHz-with-mandrake-2010.0-Official<br>
      Using PETSc directory:
      /work/triou/Version_test_laramon/Trio_U/lib/src/LIBPETSC/petsc-dev<br>
      Using PETSc arch: linux_opt<br>
      -----------------------------------------<br>
      <br>
      Using C compiler:
      /work/triou/Version_test_laramon/Trio_U/lib/src/LIBMPI/openmpi/bin/mpicc 
      -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas
      -O3 -fPIC  ${COPTFLAGS} ${CFLAGS}<br>
      Using Fortran compiler:
      /work/triou/Version_test_laramon/Trio_U/lib/src/LIBMPI/openmpi/bin/mpif90 
      -Wall -Wno-unused-variable -ffree-line-length-0 -O3 -fPIC  
      ${FOPTFLAGS} ${FFLAGS} <br>
      -----------------------------------------<br>
      <br>
      Using include paths:
      -I/work/triou/Version_test_laramon/Trio_U/lib/src/LIBPETSC/petsc-dev/linux_opt/include
      -I/work/triou/Version_test_laramon/Trio_U/lib/src/LIBPETSC/petsc-dev/include
      -I/work/triou/Version_test_laramon/Trio_U/lib/src/LIBPETSC/petsc-dev/include
      -I/work/triou/Version_test_laramon/Trio_U/lib/src/LIBPETSC/petsc-dev/linux_opt/include
      -I/work/triou/Version_test_laramon/Trio_U/exec/cuda5.5/include
      -I/work/triou/Version_test_laramon/Trio_U/lib/src/LIBPETSC/cusplibrary-0.4.0/
      -I/work/triou/Version_test_laramon/Trio_U/lib/src/LIBPETSC/cusplibrary-0.4.0/include
-I/work/triou/Version_test_laramon/Trio_U/lib/src/LIBMPI/openmpi/include<br>
      -----------------------------------------<br>
      <br>
      Using C linker:
      /work/triou/Version_test_laramon/Trio_U/lib/src/LIBMPI/openmpi/bin/mpicc<br>
      Using Fortran linker:
/work/triou/Version_test_laramon/Trio_U/lib/src/LIBMPI/openmpi/bin/mpif90<br>
      Using libraries:
      -Wl,-rpath,/work/triou/Version_test_laramon/Trio_U/lib/src/LIBPETSC/petsc-dev/linux_opt/lib
      -L/work/triou/Version_test_laramon/Trio_U/lib/src/LIBPETSC/petsc-dev/linux_opt/lib
      -lpetsc
      -Wl,-rpath,/work/triou/Version_test_laramon/Trio_U/lib/src/LIBPETSC/petsc-dev/linux_opt/lib
      -L/work/triou/Version_test_laramon/Trio_U/lib/src/LIBPETSC/petsc-dev/linux_opt/lib
      -lcmumps -ldmumps -lsmumps -lzmumps -lmumps_common -lpord
      -lscalapack -lsuperlu_dist_3.3 -lHYPRE
      -L/work/triou/Version_test_laramon/Trio_U/lib/src/LIBMPI/openmpi/lib
      -L/usr/lib/gcc/x86_64-manbo-linux-gnu/4.4.1 -lstdc++ -lpastix
      -lumfpack -lklu -lcholmod -lbtf -lccolamd -lcolamd -lcamd -lamd
      -lsuitesparseconfig
      -Wl,-rpath,/work/triou/Version_test_laramon/Trio_U/lib/src/LIBLAPACK
      -L/work/triou/Version_test_laramon/Trio_U/lib/src/LIBLAPACK
      -llapack -lblas -lparmetis -lmetis -lX11 -lpthread -lptesmumps
      -lptscotch -lptscotcherr
      -Wl,-rpath,/work/triou/Version_test_laramon/Trio_U/exec/cuda5.5/lib64
      -L/work/triou/Version_test_laramon/Trio_U/exec/cuda5.5/lib64
      -lcufft -lcublas -lcudart -lcusparse -lmpi_f90 -lgfortran -lm
      -lstdc++ -lrt -lm -lrt -lm -lz -lstdc++ -ldl -lmpi_f77 -lmpi
      -lopen-rte -lopen-pal -lnsl -lutil -lgcc_s -lpthread -ldl <br>
      <br>
      Pierre<br>
      <br>
    </div>
    <blockquote cite="mid:53D801B0.2060504@iue.tuwien.ac.at" type="cite">Hi
      Pierre,
      <br>
      <br>
      Dominic Meiser created a couple of pull requests on GPU usage with
      multiple processes
      (<a class="moz-txt-link-freetext" href="https://bitbucket.org/petsc/petsc/pull-requests">https://bitbucket.org/petsc/petsc/pull-requests</a>). I'll work
      through them in the next few days, so chances are good that this
      fixes your problem as well. I'll ping you when it's ready. Can you
      send me your modified ex30.c nevertheless?
      <br>
      <br>
      Best regards,
      <br>
      Karli
      <br>
      <br>
      <br>
      On 07/29/2014 05:09 PM, Projet_TRIOU wrote:
      <br>
      <blockquote type="cite">Yes Matt, I am calling
        VecAssemblyBegin/End() after VecSetValues.
        <br>
        <br>
        What gave me this idea is the ex30 example in
        <br>
        src/ksp/ksp/examples/tests/ex30.c,
        <br>
        where a matrix is read, then VecSet is called, then eventually
        VecSetValues
        <br>
        (and it was working on GPU with mpiexec -np 2 ...). I played a lot with it
        <br>
        and discovered that VecSet was a workaround for the crash of VecSetValues.
        <br>
        <br>
        I can send my modified ex30.c test to reproduce the bug.
        <br>
        <br>
        Pierre
        <br>
        <blockquote type="cite">On Tue, Jul 29, 2014 at 9:58 AM,
          Projet_TRIOU <<a class="moz-txt-link-abbreviated" href="mailto:triou@cea.fr">triou@cea.fr</a>> wrote:
          <br>
          <br>
              Hello Karl,
          <br>
          <br>
              For your interest, I understood why my code crashes on GPU when
          <br>
              running on more than one CPU (np > 1).
          <br>
          <br>
              I am solving Ax=b with KSPSetInitialGuessNonzero set to true, so I
          <br>
              also need to fill x with VecSetValues before each solve.
          <br>
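              (i.e., something like this, where ksp is my KSP object):
          <br>
              KSPSetInitialGuessNonzero(ksp, PETSC_TRUE); // tell KSP to use the current content of x as the initial guess
          <br>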
          <br>
              VecSetValues works on GPU with np=1 but not for np>1. I noticed
          <br>
              that filling x with VecSet worked fine, so when I create x, I now also
          <br>
              initialize it with VecSet
          <br>
              (I suppose it copies/creates something on the GPU that
          VecSetValues doesn't):
          <br>
          <br>
              Vec x;
          <br>
              VecCreate(PETSC_COMM_WORLD,&x);
          <br>
              VecSetSizes(x, nb_rows_, PETSC_DECIDE);
          <br>
              VecSetType(x, VECMPICUSP);
          <br>
              VecSet(x,0.0); // Needed on GPU during parallel
          calculation, else
          <br>
              it crashes
          <br>
          <br>
              // Later:
          <br>
              for (int i=0; i<size; i++)
          <br>
              {
          <br>
                      VecSetValues(b, 1, &colonne_globale,
          &secmem(i),
          <br>
              INSERT_VALUES);
          <br>
                      VecSetValues(x, 1, &colonne_globale,
          &solution(i),
          <br>
              INSERT_VALUES);
          <br>
                      colonne_globale++;
          <br>
              }
          <br>
          <br>
          <br>
          Are you calling VecAssemblyBegin/End() after this? It is
          required for
          <br>
          VecSetValues().
          <br>
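          For b and x above, that is something like:
          <br>
          <br>
          VecAssemblyBegin(b); VecAssemblyEnd(b); // after the whole VecSetValues loop
          <br>
          VecAssemblyBegin(x); VecAssemblyEnd(x);
          <br>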
          <br>
             Matt
          <br>
          <br>
              Thanks for your help,
          <br>
          <br>
              Pierre
          <br>
          <blockquote type="cite">    Thanks Karl, and sorry for my
            mistake. Indeed, ex9 runs in parallel
            <br>
                with -pc_type none or -pc_type jacobi.
            <br>
                Good to know I now have an example with VecSetValues that
            I can try to
            <br>
                replicate in my code.
            <br>
                I will update the thread as soon as I return to some
            more tests
            <br>
                on it.
            <br>
            <br>
                Pierre
            <br>
            <blockquote type="cite">    Hi Pierre,
              <br>
              <br>
                  I could reproduce the problem, but I could also verify
              that the
              <br>
                  problem you ran into was due to a currently
              unsupported
              <br>
                  block-Jacobi preconditioner. That is, if you first
              comment out the
              <br>
                  KSPSolve for the ksp2 object in ex9, recompile ex9,
              and then run
              <br>
              <br>
                  $> mpiexec -np 2 ./ex9 -ksp_monitor -mat_type
              aijcusp -vec_type
              <br>
                  cusp -pc_type none
              <br>
              <br>
                  things should work out. This may also be applicable to
              your
              <br>
                  case: Give it a try with -pc_type none and let us know
              whether the
              <br>
                  problem with VecSetValues remains. (The use of no
              preconditioner
              <br>
                  is certainly not great, but at least it may give you a
              first
              <br>
                  result)
              <br>
              <br>
                  Best regards,
              <br>
                  Karli
              <br>
              <br>
              <br>
              <br>
                  On 03/26/2014 06:38 PM, Projet_TRIOU wrote:
              <br>
              <blockquote type="cite">    Hello all,
                <br>
                <br>
                    My parallel code is really close to running on GPU
                devices
                <br>
                    thanks to PETSc, but I am struggling with the best
                way
                <br>
                    to fill the PETSc vectors so that it runs on both CPU
                & GPU.
                <br>
                <br>
                    In the past, I was using VecCreateMPIWithArray to
                <br>
                    create vectors, but I had trouble with it on GPU.
                <br>
                <br>
                    So I followed the example in the
                <br>
                    src/ksp/ksp/examples/tutorials/ex2.c
                <br>
                    and now create the vectors like this:
                <br>
                <br>
                       // Build b (right-hand side)
                <br>
                       ierr =
                VecCreate(PETSC_COMM_WORLD,&SecondMembrePetsc_);
                <br>
                    check(ierr);
                <br>
                       ierr = VecSetSizes(SecondMembrePetsc_, nb_rows,
                nb_rows_tot);
                <br>
                    check(ierr);
                <br>
                       ierr = VecSetFromOptions(SecondMembrePetsc_);
                check(ierr);
                <br>
                       // Build x (solution)
                <br>
                       ierr =
                <br>
                   
                VecDuplicate(SecondMembrePetsc_,&SolutionPetsc_);check(ierr);
                <br>
                <br>
                    And I fill them with the VecSetValues function. It runs
                well on CPU and
                <br>
                    GPU but crashes only in parallel on GPU. If I use
                VecSet
                <br>
                    instead of
                <br>
                    VecSetValues, it doesn't crash (but of course VecSet
                is not enough
                <br>
                    for me :-)
                <br>
                <br>
                    I tried to find an example to reproduce the problem
                for you, and I
                <br>
                    think src/ksp/ksp/examples/tutorials/ex9 (it is
                <br>
                    using VecSetValues)
                <br>
                    is a good one.
                <br>
                <br>
                    Or did I miss something (I also tried
                VecPlaceArray/VecRestoreArray
                <br>
                    but without success on GPU)?
                <br>
                <br>
                    Thanks, and yes you are right,  "WARNING: Using GPUs
                <br>
                    effectively is
                <br>
                    difficult!" :-)
                <br>
                <br>
                    Pierre
                <br>
                <br>
                   
sitre.intra.cea.fr:/work/triou/git/petsc-dev/Trio_U/lib/src/LIBPETSC/petsc/linux_opt/src/ksp/ksp/examples/tutorials<br>
                <br>
                     > ./ex9 -ksp_monitor
                <br>
                       0 KSP Residual norm 6.612932697792e+00
                <br>
                       1 KSP Residual norm 4.261830032389e-01
                <br>
                       2 KSP Residual norm 2.121746090851e-02
                <br>
                       3 KSP Residual norm 1.233779841608e-03
                <br>
                       4 KSP Residual norm 1.265903168531e-05
                <br>
                       0 KSP Residual norm 1.309416176382e-05
                <br>
                       0 KSP Residual norm 1.404919664063e-05
                <br>
                <br>
                   
sitre.intra.cea.fr:/work/triou/git/petsc-dev/Trio_U/lib/src/LIBPETSC/petsc/linux_opt/src/ksp/ksp/examples/tutorials<br>
                <br>
                     > mpiexec -np 2 ./ex9 -ksp_monitor
                <br>
                       0 KSP Residual norm 2.496821857304e+02
                <br>
                       1 KSP Residual norm 4.522206074831e+01
                <br>
                       2 KSP Residual norm 1.959482408314e+01
                <br>
                       3 KSP Residual norm 7.002013703407e+00
                <br>
                       4 KSP Residual norm 2.144105201713e+00
                <br>
                       5 KSP Residual norm 1.780095080270e-01
                <br>
                       6 KSP Residual norm 5.642702243268e-02
                <br>
                       7 KSP Residual norm 6.439343992306e-03
                <br>
                       8 KSP Residual norm 3.012756374415e-04
                <br>
                    Norm of error 0.000249108, Iterations 8
                <br>
                    Norm of error 0.000715584, Iterations 6
                <br>
                       0 KSP Residual norm 3.422287562824e-04
                <br>
                    Norm of error 0.000249108, Iterations 0
                <br>
                    Norm of error 0.000192805, Iterations 7
                <br>
                       0 KSP Residual norm 4.140588954098e-04
                <br>
                    Norm of error 0.000249108, Iterations 0
                <br>
                    Norm of error 0.000109507, Iterations 7
                <br>
                <br>
                   
sitre.intra.cea.fr:/work/triou/git/petsc-dev/Trio_U/lib/src/LIBPETSC/petsc/linux_opt/src/ksp/ksp/examples/tutorials<br>
                <br>
                     > ./ex9 -ksp_monitor -mat_type aijcusp -vec_type
                cusp
                <br>
                       0 KSP Residual norm 6.612932697792e+00
                <br>
                       1 KSP Residual norm 4.261830032389e-01
                <br>
                       2 KSP Residual norm 2.121746090851e-02
                <br>
                       3 KSP Residual norm 1.233779841608e-03
                <br>
                       4 KSP Residual norm 1.265903168531e-05
                <br>
                       0 KSP Residual norm 1.309416176403e-05
                <br>
                       0 KSP Residual norm 1.404919664088e-05
                <br>
                <br>
                   
sitre.intra.cea.fr:/work/triou/git/petsc-dev/Trio_U/lib/src/LIBPETSC/petsc/linux_opt/src/ksp/ksp/examples/tutorials<br>
                <br>
                     > mpiexec -np 2 ./ex9 -ksp_monitor -mat_type
                aijcusp -vec_type
                <br>
                    cusp
                <br>
                    [0]PETSC ERROR:
                <br>
                   
                ------------------------------------------------------------------------
                <br>
                <br>
                    [0]PETSC ERROR: Caught signal number 11 SEGV:
                Segmentation
                <br>
                    Violation,
                <br>
                    probably memory access out of range
                <br>
                    [0]PETSC ERROR: Try option -start_in_debugger or
                <br>
                    -on_error_attach_debugger
                <br>
                    [0]PETSC ERROR: or see
                <br>
                   
                <a class="moz-txt-link-freetext" href="http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind">http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind</a>[0]PETSC
                <br>
                <br>
                    ERROR: or try <a class="moz-txt-link-freetext" href="http://valgrind.org">http://valgrind.org</a> on GNU/linux and
                Apple Mac OS
                <br>
                    X to
                <br>
                    find memory corruption errors
                <br>
                    [0]PETSC ERROR: configure using
                --with-debugging=yes,
                <br>
                    recompile, link,
                <br>
                    and run
                <br>
                    [0]PETSC ERROR: to get more information on the
                crash.
                <br>
                    [0]PETSC ERROR: --------------------- Error Message
                <br>
                   
                --------------------------------------------------------------
                <br>
                    [0]PETSC ERROR: Signal received
                <br>
                    [0]PETSC ERROR: See
                <br>
                   
                <a class="moz-txt-link-freetext" href="http://http://www.mcs.anl.gov/petsc/documentation/faq.html">http://http://www.mcs.anl.gov/petsc/documentation/faq.html</a>
                for
                <br>
                    trouble
                <br>
                    shooting.
                <br>
                    [0]PETSC ERROR: Petsc Development GIT revision:
                <br>
                    v3.4.4-3713-g576f62e
                <br>
                    GIT Date: 2014-03-23 15:59:15 -0500
                <br>
                    [0]PETSC ERROR: ./ex9 on a linux_opt named
                sitre.intra.cea.fr
                <br>
                    <a class="moz-txt-link-rfc2396E" href="http://sitre.intra.cea.fr"><http://sitre.intra.cea.fr></a> by triou
                <br>
                    Wed Mar 26 18:32:34 2014
                <br>
                    [0]PETSC ERROR: Configure options
                <br>
                   
--prefix=/work/triou/git/petsc-dev/Trio_U/lib/src/LIBPETSC/petsc/linux_opt<br>
                <br>
                    --with-single-library --with-shared-libraries=0
                --with-debugging=0
                <br>
                    --with-errorchecking=1 --COPTFLAGS="   -O3 -fPIC "
                <br>
                    --CXXOPTFLAGS=" -O3
                <br>
                    -fPIC " --FOPTFLAGS="  -O3 -fPIC "
                --with-fortran=yes
                <br>
                    --with-clean=1
                <br>
                    --download-scalapack=../scalapack-2.0.2.tgz
                <br>
                    --download-mumps=../MUMPS_4.10.0-p3.tar.gz
                <br>
                    --download-superlu_dist=yes
                <br>
                    --download-parmetis=../parmetis-4.0.2-p4.tar.gz
                <br>
                    --download-metis=../metis-5.0.2-p3.tar.gz
                <br>
                    --download-ptscotch=../ptscotch.tar.gz
                <br>
                    --download-hypre=../hypre-2.9.1a.tar.gz
                <br>
                   
--with-valgrind-include=/work/triou/git/petsc-dev/Trio_U/exec/valgrind/include<br>
                <br>
                   
--with-blas-lapack-dir=/work/triou/git/petsc-dev/Trio_U/lib/src/LIBLAPACK<br>
                    --with-cuda=1
                <br>
                   
                --with-cuda-dir=/work/triou/git/petsc-dev/Trio_U/exec/cuda5.5
                <br>
                    --with-cusp=1
                <br>
                   
--with-cusp-dir=/work/triou/git/petsc-dev/Trio_U/lib/src/LIBPETSC/cusplibrary-0.4.0<br>
                <br>
                    --with-thrust=1 --with-cuda-arch=sm_21 --with-ssl=0
                <br>
                   
                --with-mpi-dir=/work/triou/git/petsc-dev/Trio_U/lib/src/LIBMPI/mpich
                <br>
                <br>
                    --with-x=1
                <br>
                    [0]PETSC ERROR: #1 User provided function() line 0
                in  unknown
                <br>
                    file
                <br>
                    [1]PETSC ERROR:
                <br>
                   
                ------------------------------------------------------------------------
                <br>
                <br>
                    [1]PETSC ERROR: Caught signal number 11 SEGV:
                Segmentation
                <br>
                    Violation,
                <br>
                    probably memory access out of range
                <br>
                    [1]PETSC ERROR: Try option -start_in_debugger or
                <br>
                    -on_error_attach_debugger
                <br>
                    [1]PETSC ERROR: or see
                <br>
                   
                <a class="moz-txt-link-freetext" href="http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind">http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind</a>[1]PETSC
                <br>
                <br>
                    ERROR: or try <a class="moz-txt-link-freetext" href="http://valgrind.org">http://valgrind.org</a> on GNU/linux and
                Apple Mac OS
                <br>
                    X to
                <br>
                    find memory corruption errors
                <br>
                    [1]PETSC ERROR: configure using
                --with-debugging=yes,
                <br>
                    recompile, link,
                <br>
                    and run
                <br>
                    [1]PETSC ERROR: application called
                MPI_Abort(MPI_COMM_WORLD,
                <br>
                    59) - process 0
                <br>
                    to get more information on the crash.
                <br>
                    [1]PETSC ERROR: --------------------- Error Message
                <br>
                   
                --------------------------------------------------------------
                <br>
                    [1]PETSC ERROR: Signal received
                <br>
                    [1]PETSC ERROR: See
                <br>
                   
                <a class="moz-txt-link-freetext" href="http://http://www.mcs.anl.gov/petsc/documentation/faq.html">http://http://www.mcs.anl.gov/petsc/documentation/faq.html</a>
                for
                <br>
                    trouble
                <br>
                    shooting.
                <br>
                    [1]PETSC ERROR: Petsc Development GIT revision:
                <br>
                    v3.4.4-3713-g576f62e
                <br>
                    GIT Date: 2014-03-23 15:59:15 -0500
                <br>
                    [1]PETSC ERROR: ./ex9 on a linux_opt named
                sitre.intra.cea.fr
                <br>
                    <a class="moz-txt-link-rfc2396E" href="http://sitre.intra.cea.fr"><http://sitre.intra.cea.fr></a> by triou
                <br>
                    Wed Mar 26 18:32:34 2014
                <br>
                    [1]PETSC ERROR: Configure options
                <br>
                   
--prefix=/work/triou/git/petsc-dev/Trio_U/lib/src/LIBPETSC/petsc/linux_opt<br>
                <br>
                    --with-single-library --with-shared-libraries=0
                --with-debugging=0
                <br>
                    --with-errorchecking=1 --COPTFLAGS="   -O3 -fPIC "
                <br>
                    --CXXOPTFLAGS=" -O3
                <br>
                    -fPIC " --FOPTFLAGS="  -O3 -fPIC "
                --with-fortran=yes
                <br>
                    --with-clean=1
                <br>
                    --download-scalapack=../scalapack-2.0.2.tgz
                <br>
                    --download-mumps=../MUMPS_4.10.0-p3.tar.gz
                <br>
                    --download-superlu_dist=yes
                <br>
                    --download-parmetis=../parmetis-4.0.2-p4.tar.gz
                <br>
                    --download-metis=../metis-5.0.2-p3.tar.gz
                <br>
                    --download-ptscotch=../ptscotch.tar.gz
                <br>
                    --download-hypre=../hypre-2.9.1a.tar.gz
                <br>
                   
--with-valgrind-include=/work/triou/git/petsc-dev/Trio_U/exec/valgrind/include<br>
                <br>
                   
--with-blas-lapack-dir=/work/triou/git/petsc-dev/Trio_U/lib/src/LIBLAPACK<br>
                    --with-cuda=1
                <br>
                   
                --with-cuda-dir=/work/triou/git/petsc-dev/Trio_U/exec/cuda5.5
                <br>
                    --with-cusp=1
                <br>
                   
--with-cusp-dir=/work/triou/git/petsc-dev/Trio_U/lib/src/LIBPETSC/cusplibrary-0.4.0<br>
                <br>
                    --with-thrust=1 --with-cuda-arch=sm_21 --with-ssl=0
                <br>
                   
                --with-mpi-dir=/work/triou/git/petsc-dev/Trio_U/lib/src/LIBMPI/mpich
                <br>
                <br>
                    --with-x=1
                <br>
                    [1]PETSC ERROR: #1 User provided function() line 0
                in  unknown
                <br>
                    file
                <br>
                    application called MPI_Abort(MPI_COMM_WORLD, 59) -
                process 1
                <br>
                <br>
                <br>
                <br>
                    --
                <br>
                    *Trio_U support team*
                <br>
                    Marthe ROUX (01 69 08 00 02) Saclay
                <br>
                    Pierre LEDAC (04 38 78 91 49) Grenoble
                <br>
              </blockquote>
              <br>
            </blockquote>
            <br>
            <br>
                --
            <br>
                *Trio_U support team*
            <br>
                Marthe ROUX (01 69 08 00 02) Saclay
            <br>
                Pierre LEDAC (04 38 78 91 49) Grenoble
            <br>
          </blockquote>
          <br>
          <br>
              --
          <br>
              *Trio_U support team*
          <br>
              Marthe ROUX (01 69 08 00 02) Saclay
          <br>
              Pierre LEDAC (04 38 78 91 49) Grenoble
          <br>
          <br>
          <br>
          <br>
          <br>
          --
          <br>
          What most experimenters take for granted before they begin
          their
          <br>
          experiments is infinitely more interesting than any results to
          which
          <br>
          their experiments lead.
          <br>
          -- Norbert Wiener
          <br>
        </blockquote>
        <br>
        <br>
        --
        <br>
        *Trio_U support team*
        <br>
        Marthe ROUX (01 69 08 00 02) Saclay
        <br>
        Pierre LEDAC (04 38 78 91 49) Grenoble
        <br>
      </blockquote>
      <br>
    </blockquote>
    <br>
    <br>
    <div class="moz-signature">-- <br>
      <b>Trio_U support team</b>
      <br>
      Marthe ROUX (01 69 08 00 02) Saclay
      <br>
      Pierre LEDAC (04 38 78 91 49) Grenoble
    </div>
  </body>
</html>