[petsc-dev] Potential memory issue with block jacobi preconditioner
Matthew Knepley
knepley at gmail.com
Tue May 31 15:35:58 CDT 2011
On Tue, May 31, 2011 at 3:31 PM, Fabien Delalondre <delalf at scorec.rpi.edu> wrote:
> Hi All,
>
> I am trying to solve a problem for which we have a couple of 2D planes
> located along the toroidal direction. We are trying to use the block
> Jacobi preconditioner, where one block is computed for each 2D plane.
>
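A minimal sketch of that setup in PETSc (assuming the assembled system
already lives in a KSP; the function name and nplanes are illustrative):

  #include <petscksp.h>

  /* One block-Jacobi block per 2D plane. PETSC_NULL lets PETSc size
     the blocks evenly; pass an explicit lengths array instead if the
     planes own different numbers of rows. */
  PetscErrorCode SetPlaneBlocks(KSP ksp, PetscInt nplanes)
  {
    PC             pc;
    PetscErrorCode ierr;

    ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
    ierr = PCSetType(pc, PCBJACOBI);CHKERRQ(ierr);
    ierr = PCBJacobiSetTotalBlocks(pc, nplanes, PETSC_NULL);CHKERRQ(ierr);
    return 0;
  }

The same choice can be made at run time with -pc_type bjacobi
-pc_bjacobi_blocks <nplanes>.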
> When we use a number of blocks equal to the number of planes, the
> execution fails with a "glibc detected" error (both on PPPL's large-memory
> Symmetric Multiprocessing (SMP) system with 80 CPUs and 440 GB of memory
> and on the NERSC/Hopper machine). If we run the same test case with
> number of blocks = 0.5 * (number of planes), it seems to run fine
> (although it is obviously slow).
>
Did you run this in debug mode? Sometimes the simple PETSc memory tracing
can catch things.
Matt
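A minimal sketch of the debug build and tracing options Matt refers to
(the executable name and process count are placeholders):

  ./configure --with-debugging=1 [usual options]
  mpiexec -n 4 ./myapp -malloc_debug -malloc_dump

Here -malloc_debug enables extra checking of the memory PETSc allocates,
and -malloc_dump reports any PETSc memory still allocated at
PetscFinalize().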
> I ran Totalview/MemoryScape on it and did not find anything useful so far
> (no memory leak or memory corruption detected). At this point I am not
> sure whether the problem is on the PETSc side or not. I am now trying to
> recompile everything on our local machines at SCOREC to use valgrind.
>
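For the valgrind run, a typical invocation with a PETSc executable looks
something like this (a sketch; names and process count are placeholders):

  mpiexec -n 4 valgrind --tool=memcheck -q --num-callers=20 \
      --log-file=valgrind.log.%p ./myapp -malloc off

Running with -malloc off turns off PETSc's own malloc wrapper so that
valgrind sees the raw allocations.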
> The version I used on Hopper is the petsc-dev version as of 031011,
> compiled with MUMPS, hypre, ScaLAPACK, ParMETIS, and SuperLU_DIST. The
> compiler used is the default compiler on Hopper (PGI).
>
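For reference, a representative configure line for such a build (the
download flags are assumed from standard PETSc configure usage):

  ./configure --download-mumps --download-hypre --download-scalapack \
      --download-parmetis --download-superlu_dist

Adding --with-debugging=1 would give the debug build suggested above.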
> Please let me know if you have any ideas that could help me solve this
> problem.
> Thank you.
>
> Fabien
>
> --
> Fabien Delalondre, PhD.
> Senior Research Associate, Scientific Computation Research Center (SCOREC).
> Rensselaer Polytechnic Institute (RPI), Troy, NY.
> Email: delalf at scorec.rpi.edu, Phone: (518)-276-8045
>
--
What most experimenters take for granted before they begin their experiments
is infinitely more interesting than any results to which their experiments
lead.
-- Norbert Wiener