On Tue, May 31, 2011 at 3:39 PM, fabien delalondre <delalf@scorec.rpi.edu> wrote:

> On Tue, May 31, 2011 at 4:35 PM, Matthew Knepley <knepley@gmail.com> wrote:
>
>> On Tue, May 31, 2011 at 3:31 PM, fabien delalondre <delalf@scorec.rpi.edu> wrote:
>>
>>> Hi All,
>>>
>>> I am trying to solve a problem for which we have a couple of 2D planes located along the toroidal direction. We are trying to use the block Jacobi preconditioner, where a block is computed for each 2D plane.
>>>
>>> When we use a number of blocks equal to the number of planes, the execution fails with the message "glibc detected", both on the PPPL machine (a large-memory symmetric multiprocessing (SMP) system with 80 CPUs and 440 GB of memory) and on NERSC/Hopper. If we run the same test case with number of blocks = 0.5
>>
>> Did you run this in debug mode? Sometimes the simple PETSc memory tracing can catch things.
>
> Yes, but I did not get anything useful so far.

It sounds like valgrind is the next logical step.
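
For what it's worth, the usual recipe (the exact command line depends on the MPI installation) is to use a --with-debugging build and run something like "mpiexec -n <np> valgrind --tool=memcheck --num-callers=20 ./your_app <your options>", then chase the first invalid read or write that memcheck reports. The "simple PETSc memory tracing" mentioned above is what the -malloc_debug and -malloc_dump options control in a debugging build.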

   Matt

> Fabien
>
>>    Matt
>>
>>> (number of planes), it seems to run fine (although it is obviously slow). I ran TotalView/MemoryScape on it and did not find anything useful so far (no memory leak or memory corruption detected). At this point I am not sure whether the problem is on the PETSc side or not. I am now trying to recompile everything on our local machines at SCOREC so that I can use valgrind.
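
(For reference, the per-plane block layout described above would usually be set up along the lines of the sketch below. This is only an illustration, not the actual application code: the function name SetupPlaneBlocks and the nplanes/lens arguments are invented here for the example.)

  #include <petscksp.h>

  /* Sketch only: one block-Jacobi block per 2D plane.
     nplanes = number of planes, lens[i] = number of matrix rows belonging to plane i. */
  PetscErrorCode SetupPlaneBlocks(KSP ksp, PetscInt nplanes, const PetscInt lens[])
  {
    PC             pc;
    PetscErrorCode ierr;

    PetscFunctionBegin;
    ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
    ierr = PCSetType(pc, PCBJACOBI);CHKERRQ(ierr);
    /* Pass PETSC_NULL for lens to let PETSc split the rows evenly instead. */
    ierr = PCBJacobiSetTotalBlocks(pc, nplanes, lens);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }

The same block count can also be set from the command line with -pc_bjacobi_blocks <n>, and -ksp_view will show how PETSc actually split the matrix into blocks, which is worth comparing against the intended per-plane sizes.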
>>>
>>> The version I used on Hopper is petsc-dev as of 031011, compiled with MUMPS, Hypre, ScaLAPACK, SuperLU_DIST, and ParMETIS. The compiler is the default compiler on Hopper (PGI).
>>>
>>> Please let me know if you have any idea that could help me solve this problem.
>>> Thank you.
>>>
>>> Fabien
>>>
>>> --
>>> Fabien Delalondre, PhD.
>>> Senior Research Associate, Scientific Computation Research Center (SCOREC).
>>> Rensselaer Polytechnic Institute (RPI), Troy, NY.
>>> Email: delalf@scorec.rpi.edu, Phone: (518)-276-8045

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener