[petsc-dev] Potential memory issue with block jacobi preconditioner
fabien delalondre
delalf at scorec.rpi.edu
Tue May 31 15:31:14 CDT 2011
Hi All,
I am trying to solve a problem in which we have a number of 2D planes
located along the toroidal direction. We are trying to use the block Jacobi
preconditioner, with one block computed for each 2D plane.
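For reference, this kind of setup is usually requested through PETSc run-time options; the sketch below uses standard option names, but `./myapp`, the rank count, and `<nplanes>` are placeholders, and the LU subsolver is just one possible choice:

```shell
# Hypothetical invocation: ./myapp and <nplanes> are placeholders.
# -pc_bjacobi_blocks sets the total number of blocks across all ranks
# (here, one block per 2D plane); -sub_ksp_type/-sub_pc_type select the
# solver applied within each block.
mpiexec -n 8 ./myapp -ksp_type gmres \
    -pc_type bjacobi -pc_bjacobi_blocks <nplanes> \
    -sub_ksp_type preonly -sub_pc_type lu
```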
When we use a number of blocks equal to the number of planes, the
execution fails with the message "glibc detected" (both on PPPL's
large-memory Symmetrical MultiProcessing (SMP) system with 80 CPUs and
440 GB of memory, and on NERSC's Hopper machine). If we run the same test
case with the number of blocks set to half the number of planes, it seems
to run fine (although it is obviously slow). I ran TotalView/MemoryScape
on it and have not found anything useful so far (no memory leak or memory
corruption detected). At this point I am not sure whether the problem is
on the PETSc side or not. I am now recompiling everything on our local
machines at SCOREC so that I can use valgrind.
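For what it's worth, the PETSc FAQ recommends running each MPI rank under valgrind roughly as follows (`./myapp` is a placeholder for the application):

```shell
# Run each MPI rank under memcheck, writing one log file per process
# (%p expands to the pid). -malloc off disables PETSc's own malloc
# wrapper so that valgrind observes the underlying allocations directly.
mpiexec -n 2 valgrind --tool=memcheck -q --num-callers=20 \
    --log-file=valgrind.log.%p ./myapp -malloc off
```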
The version I used on Hopper is petsc-dev as of 031011, compiled with
MUMPS, Hypre, ScaLAPACK, ParMETIS, and SuperLU_DIST. The compiler is
Hopper's default (PGI).
Please let me know if you have any ideas that could help me solve this
problem.
Thank you.
Fabien
--
Fabien Delalondre, PhD.
Senior Research Associate, Scientific Computation Research Center (SCOREC).
Rensselaer Polytechnic Institute (RPI), Troy, NY.
Email: delalf at scorec.rpi.edu, Phone: (518)-276-8045