<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">
<html>
  <head>
    <meta content="text/html; charset=ISO-8859-1"
      http-equiv="Content-Type">
  </head>
  <body bgcolor="#ffffff" text="#000000">
    On 06.05.2012 14:27, Matthew Knepley wrote:
    <blockquote
cite="mid:CAMYG4GmDdNUTaoFP0haxfq1nqVCDCOgVuyE9M5Q2-e+oNTAHLA@mail.gmail.com"
      type="cite">On Sun, May 6, 2012 at 7:28 AM, Alexander Grayver <span
        dir="ltr"><<a moz-do-not-send="true"
          href="mailto:agrayver@gfz-potsdam.de" target="_blank">agrayver@gfz-potsdam.de</a>></span>
      wrote:<br>
      <div class="gmail_quote">
        <blockquote class="gmail_quote" style="margin: 0pt 0pt 0pt
          0.8ex; border-left: 1px solid rgb(204, 204, 204);
          padding-left: 1ex;">
          Hello,<br>
          <br>
          I use KSP and random rhs to compute largest singular value:<br>
        </blockquote>
        <div><br>
        </div>
        <div>1) Is this the whole program? If not, this can be caused by
          memory corruption somewhere else. This is what I suspect.</div>
      </div>
    </blockquote>
    <br>
    Matt,<br>
    <br>
    I can reproduce the error using the attached test program and this
    matrix (7 MB):<br>
    <a class="moz-txt-link-freetext" href="http://dl.dropbox.com/u/60982984/A.dat">http://dl.dropbox.com/u/60982984/A.dat</a><br>
    <br>
    <blockquote
cite="mid:CAMYG4GmDdNUTaoFP0haxfq1nqVCDCOgVuyE9M5Q2-e+oNTAHLA@mail.gmail.com"
      type="cite">
      <div class="gmail_quote">
        <div>
          <br>
        </div>
        <div>2) You can put in CHKMEMQ; throughout the code to find
          exactly where the memory corruption happens.</div>
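        <div><br>
        </div>
        <div>A minimal sketch of that approach (not from the original
          message): interleaving CHKMEMQ between the existing calls so
          PetscMallocValidate() runs at each point and stops at the
          first call that corrupts the heap, e.g. around the solve:</div>
        <pre>   call KSPSolve(ksp,b,x,ierr);CHKERRQ(ierr)
   CHKMEMQ  ! validates the heap; errors out here if the solve wrote past an array
   call KSPComputeExtremeSingularValues(ksp, smax, smin, ierr);CHKERRQ(ierr)
   CHKMEMQ  ! if this one trips, the corruption happened in the preceding call
   call KSPDestroy(ksp,ierr);CHKERRQ(ierr)</pre>
        <div>(Requires a debug build of PETSc, since CHKMEMQ is a no-op
          when malloc debugging is off.)</div>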
        <div><br>
        </div>
        <div>   Matt</div>
        <div> </div>
        <blockquote class="gmail_quote" style="margin: 0pt 0pt 0pt
          0.8ex; border-left: 1px solid rgb(204, 204, 204);
          padding-left: 1ex;">  ! create solver and set options for
          singular value estimation<br>
           call KSPCreate(MPI_COMM_WORLD,ksp,ierr);CHKERRQ(ierr)<br>
           call KSPSetType(ksp,KSPGMRES,ierr);CHKERRQ(ierr)<br>
           call KSPSetTolerances(ksp,solvertol,PETSC_DEFAULT_DOUBLE_PRECISION,PETSC_DEFAULT_DOUBLE_PRECISION,its,ierr);CHKERRQ(ierr)<br>
           call KSPGMRESSetRestart(ksp, its, ierr);CHKERRQ(ierr)<br>
           call KSPSetComputeSingularValues(ksp, flg,
          ierr);CHKERRQ(ierr)<br>
           call KSPSetFromOptions(ksp,ierr);CHKERRQ(ierr)<br>
          <br>
           ! generate random RHS<br>
           call PetscRandomCreate(PETSC_COMM_WORLD,rctx,ierr)<br>
           call PetscRandomSetFromOptions(rctx,ierr)<br>
           call VecSetRandom(b,rctx,ierr)<br>
          <br>
           !no preconditioning<br>
           call KSPGetPC(ksp,pc,ierr);CHKERRQ(ierr)<br>
           call PCSetType(pc,PCNONE,ierr);CHKERRQ(ierr)<br>
           call KSPSetOperators(ksp,A,A,SAME_PRECONDITIONER,ierr);CHKERRQ(ierr)<br>
           !solve system<br>
           call KSPSolve(ksp,b,x,ierr);CHKERRQ(ierr)<br>
           call KSPComputeExtremeSingularValues(ksp, smax, smin,
          ierr);CHKERRQ(ierr)<br>
          <br>
           call KSPDestroy(ksp,ierr);CHKERRQ(ierr)<br>
          <br>
          However, it crashes:<br>
          <br>
          [1]PETSC ERROR: ------------------------------------------------------------------------<br>
          [1]PETSC ERROR: Caught signal number 11 SEGV: Segmentation
          Violation, probably memory access out of range<br>
          [1]PETSC ERROR: Try option -start_in_debugger or
          -on_error_attach_debugger<br>
          [1]PETSC ERROR: or see <a moz-do-not-send="true"
href="http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind"
            target="_blank">http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind</a><br>
          [1]PETSC ERROR: or try <a moz-do-not-send="true"
            href="http://valgrind.org" target="_blank">http://valgrind.org</a>
          on GNU/linux and Apple Mac OS X to find memory corruption
          errors<br>
          [1]PETSC ERROR: PetscMallocValidate: error detected at
           PetscDefaultSignalHandler() line 157 in
          /home/lib/petsc-dev1/src/sys/error/signal.c<br>
          [1]PETSC ERROR: Memory at address 0x4aa3f00 is corrupted<br>
          [1]PETSC ERROR: Probably write past beginning or end of array<br>
          [1]PETSC ERROR: Last intact block allocated in
          KSPSetUp_GMRES() line 73 in /home/lib/petsc-dev1/src/ksp/ksp/impls/gmres/gmres.c<br>
          [1]PETSC ERROR: --------------------- Error Message
          ------------------------------------<br>
          [1]PETSC ERROR: Memory corruption!<br>
          [1]PETSC ERROR:  !<br>
          [1]PETSC ERROR: ------------------------------------------------------------------------<br>
          [1]PETSC ERROR: Petsc Development HG revision:
          f3c119f7ddbfee243b51907a90acab15127ccb39  HG Date: Sun Apr 29
          21:37:29 2012 -0500<br>
          [1]PETSC ERROR: See docs/changes/index.html for recent
          updates.<br>
          [1]PETSC ERROR: See docs/faq.html for hints about trouble
          shooting.<br>
          [1]PETSC ERROR: See docs/index.html for manual pages.<br>
          [1]PETSC ERROR: ------------------------------------------------------------------------<br>
          [1]PETSC ERROR: /home/prog on a openmpi-i named node207 by
          user Sun May  6 12:58:24 2012<br>
          [1]PETSC ERROR: Libraries linked from
          /home/lib/petsc-dev1/openmpi-intel-complex-debug-f/lib<br>
          [1]PETSC ERROR: Configure run at Mon Apr 30 10:20:49 2012<br>
          [1]PETSC ERROR: Configure options --with-blacs-include=/opt/intel/Compiler/11.1/072/mkl/include
          --with-blacs-lib=/opt/intel/Compiler/11.1/072/mkl/lib/em64t/libmkl_blacs_openmpi_lp64.a
          --with-blas-lapack-lib="[/opt/intel/Compiler/11.1/072/mkl/lib/em64t/libmkl_intel_lp64.a,/opt/intel/Compiler/11.1/072/mkl/lib/em64t/libmkl_intel_thread.a,/opt/intel/Compiler/11.1/072/mkl/lib/em64t/libmkl_core.a,/opt/intel/Compiler/11.1/072/lib/intel64/libiomp5.a]"
          --with-fortran-interfaces=1 --with-mpi-dir=/opt/mpi/intel/openmpi-1.4.2
          --with-petsc-arch=openmpi-intel-complex-debug-f
          --with-precision=double --with-scalapack-include=/opt/intel/Compiler/11.1/072/mkl/include
          --with-scalapack-lib=/opt/intel/Compiler/11.1/072/mkl/lib/em64t/libmkl_scalapack_lp64.a
          --with-scalar-type=complex --with-x=0
          PETSC_ARCH=openmpi-intel-complex-debug-f<br>
          [1]PETSC ERROR: ------------------------------------------------------------------------<br>
          [1]PETSC ERROR: PetscMallocValidate() line 138 in
          /home/lib/petsc-dev1/src/sys/memory/mtr.c<br>
          [1]PETSC ERROR: PetscDefaultSignalHandler() line 157 in
          /home/lib/petsc-dev1/src/sys/error/signal.c<br>
          <br>
          <br>
          Call stack from debugger:<br>
          <br>
          opal_memory_ptmalloc2_int_free, FP=7fffd4765300<br>
          opal_memory_ptmalloc2_free_hook, FP=7fffd4765330<br>
          PetscFreeAlign,      FP=7fffd4765370<br>
          PetscTrFreeDefault,  FP=7fffd4765520<br>
          KSPReset_GMRES,      FP=7fffd4765740<br>
          KSPReset,            FP=7fffd4765840<br>
          KSPDestroy,          FP=7fffd47659a0<br>
          kspdestroy_,         FP=7fffd47659d0<br>
          <br>
          <br>
          Any ideas?<br>
          <br>
          Thanks.<span class="HOEnZb"><font color="#888888"><br>
              <br>
              -- <br>
              Regards,<br>
              Alexander<br>
              <br>
            </font></span></blockquote>
      </div>
      <br>
      <br clear="all">
      <div><br>
      </div>
      -- <br>
      What most experimenters take for granted before they begin their
      experiments is infinitely more interesting than any results to
      which their experiments lead.<br>
      -- Norbert Wiener<br>
    </blockquote>
    <br>
    <br>
    <pre class="moz-signature" cols="72">-- 
Regards,
Alexander</pre>
  </body>
</html>