Hi,

I encountered a similar problem a couple of weeks ago.

I tried using BoomerAMG to accelerate convergence, but the memory it required seemed to grow dramatically with the size of the problem; as a result, BoomerAMG was not really usable for me.
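In case it helps, this is the kind of command line I have in mind, together with the BoomerAMG options that the Hypre documentation usually suggests for 3D problems to keep the grid and operator complexity (and with it the memory) down. The specific values are only a starting guess on my side, not settings I have verified, and the option names assume a reasonably recent PETSc:

  -ksp_type cg -pc_type hypre -pc_hypre_type boomeramg
  -pc_hypre_boomeramg_strong_threshold 0.5
  -pc_hypre_boomeramg_coarsen_type HMIS
  -pc_hypre_boomeramg_interp_type ext+i
  -pc_hypre_boomeramg_agg_nl 1
  -pc_hypre_boomeramg_print_statistics

If I remember the last option correctly, it makes Hypre print its grid and operator complexities, which give a rough idea of how much memory the AMG hierarchy needs on top of the original matrix.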
I did not test how the memory required relates to the number of MPI tasks used. Since the numbers in the quoted message come from top, I have also put a small sketch of how PETSc's own memory reporting could be used below the quoted message.

Best Regards,
Kai
> Date: Wed, 11 Jul 2012 08:48:08 +0000
> From: "Klaij, Christiaan" <C.Klaij@marin.nl>
> To: "petsc-users@mcs.anl.gov" <petsc-users@mcs.anl.gov>
> Subject: [petsc-users] boomeramg memory usage
>
> I'm solving a 3D Poisson equation on a 200x200x100 grid using
> CG and algebraic multigrid as preconditioner. I noticed that with
> boomeramg, the memory increases as the number of procs increases:
>
> 1 proc:  2.8G
> 2 procs: 2.8G
> 4 procs: 5.2G
> 8 procs: >12G (max reached, swapping)
>
> The memory usage is obtained from top. When using ml or gamg (all
> with PETSc defaults), the memory usage remains more or less
> constant. Something wrong with my install? Some Hypre option I
> should set?
>
> dr. ir. Christiaan Klaij
> CFD Researcher
> Research & Development
> E mailto:C.Klaij@marin.nl
> T +31 317 49 33 44
>
> MARIN
> 2, Haagsteeg, P.O. Box 28, 6700 AA Wageningen, The Netherlands
> T +31 317 49 39 11, F +31 317 49 32 45, I www.marin.nl
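Regarding measuring the memory: PETSc can report the resident set size of each process, which also covers Hypre's allocations (PETSc's own malloc logging does not see those). A minimal, untested sketch of what I mean, with the Poisson setup and the CG/BoomerAMG solve left out and error checking omitted:

  #include <petscsys.h>

  int main(int argc, char **argv)
  {
    PetscLogDouble rss, rss_max;

    PetscInitialize(&argc, &argv, NULL, NULL);

    /* ... build the Poisson matrix and solve with CG + BoomerAMG here ... */

    /* Resident set size of this process, in bytes; being a process-level
       number it also counts memory allocated inside Hypre. */
    PetscMemoryGetCurrentUsage(&rss);
    MPI_Allreduce(&rss, &rss_max, 1, MPI_DOUBLE, MPI_MAX, PETSC_COMM_WORLD);
    PetscPrintf(PETSC_COMM_WORLD, "max resident set size per rank: %g MiB\n",
                rss_max / 1048576.0);

    PetscFinalize();
    return 0;
  }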