[petsc-users] boomeramg memory usage

Bao Kai paeanball at gmail.com
Wed Jul 11 08:48:02 CDT 2012


Hi,

I encountered a similar problem a couple of weeks ago.

I tried to use BoomerAMG to accelerate the convergence, but the memory
required seemed to increase dramatically with the size of the problem. As a
result, BoomerAMG was not usable for me.
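
I did not try tuning the coarsening myself, but my understanding from the
PETSc manual pages is that the usual suggestions for reducing BoomerAMG's
operator complexity (and so its memory) on 3D problems look like this (a
sketch; please double-check the option names and defaults against your
PETSc version):

    -pc_type hypre -pc_hypre_type boomeramg
    -pc_hypre_boomeramg_strong_threshold 0.5   (0.5 is suggested for 3D; default 0.25)
    -pc_hypre_boomeramg_agg_nl 1               (one level of aggressive coarsening)
    -pc_hypre_boomeramg_coarsen_type HMIS
    -pc_hypre_boomeramg_interp_type ext+i
    -pc_hypre_boomeramg_print_statistics       (prints grid/operator complexities)

The print_statistics output would be the first thing I look at: if the
operator complexity is much larger than about 1.5, the coarse levels are
where the memory is going.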

I did not test the relation between the memory required and the number of
MPI tasks used.
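
If someone wants to check this, a minimal sketch of how I would measure it
(PetscMemoryGetCurrentUsage reports the resident memory of the calling
process; the reduction over ranks is my own addition):

    #include <petscsys.h>

    /* total resident memory over all ranks, e.g. right after KSPSolve */
    PetscLogDouble local_mem, total_mem;
    PetscMemoryGetCurrentUsage(&local_mem);   /* bytes used by this rank */
    MPI_Allreduce(&local_mem, &total_mem, 1, MPI_DOUBLE, MPI_SUM,
                  PETSC_COMM_WORLD);
    PetscPrintf(PETSC_COMM_WORLD, "total memory: %.1f MB\n",
                total_mem/1048576.0);

Running this for a fixed problem size at 1, 2, 4, 8 tasks would show
directly whether the growth comes from the number of ranks.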

Best Regards,
Kai

>
> Date: Wed, 11 Jul 2012 08:48:08 +0000
> From: "Klaij, Christiaan" <C.Klaij at marin.nl>
> To: "petsc-users at mcs.anl.gov" <petsc-users at mcs.anl.gov>
> Subject: [petsc-users] boomeramg memory usage
> Message-ID:
>         <A9AE64C525438E45B1D60F4D09AEAA9E43E3C028 at MAR160N1.marin.local>
> Content-Type: text/plain; charset="us-ascii"
>
> I'm solving a 3D Poisson equation on a 200x200x100 grid using
> CG and algebraic multigrid as preconditioner. I noticed that with
> boomeramg, the memory increases as the number of procs increases:
>
> 1 proc:    2.8G
> 2 procs:   2.8G
> 4 procs:   5.2G
> 8 procs:   >12G (max reached, swapping)
>
> The memory usage is obtained from top. When using ml or gamg (all
> with PETSc defaults), the memory usage remains more or less
> constant. Something wrong with my install? Some Hypre option I
> should set?
>
>
> dr. ir. Christiaan Klaij
> CFD Researcher
> Research & Development
> E mailto:C.Klaij at marin.nl
> T +31 317 49 33 44
>
> MARIN
> 2, Haagsteeg, P.O. Box 28, 6700 AA Wageningen, The Netherlands
> T +31 317 49 39 11, F +31 317 49 32 45, I www.marin.nl
>
>
>
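
Regarding the quoted question about which Hypre option to set: the
-pc_hypre_boomeramg_* options above only take effect if the code calls
KSPSetFromOptions(). A minimal sketch of the setup I am assuming
(petsc-3.3-era signatures; the matrix A and vectors b, x are assumed to be
assembled already):

    #include <petscksp.h>

    KSP ksp;
    PC  pc;
    KSPCreate(PETSC_COMM_WORLD, &ksp);
    KSPSetOperators(ksp, A, A, SAME_NONZERO_PATTERN); /* 4-arg form in petsc-3.3 */
    KSPSetType(ksp, KSPCG);
    KSPGetPC(ksp, &pc);
    PCSetType(pc, PCHYPRE);                 /* needs PETSc built --with-hypre */
    PCHYPRESetType(pc, "boomeramg");
    KSPSetFromOptions(ksp);                 /* picks up -pc_hypre_boomeramg_* */
    KSPSolve(ksp, b, x);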