[petsc-users] Out of memory and parallel issues

Mark Adams mfadams at lbl.gov
Fri Jan 2 12:18:33 CST 2015


On Thu, Jan 1, 2015 at 12:20 AM, TAY wee-beng <zonexo at gmail.com> wrote:

> Hi,
>
> I used to run my CFD code with 96 procs, with a grid size of 231 x 461 x
> 368.
>
> I use MPI and partition my grid in the z direction. Hence with 96 procs
> (8 nodes, 12 procs each), each proc has a local size of 231 x 461 x 3 or
> 231 x 461 x 4.
>
> It worked fine.
>
> Now I have modified the code and added some routines which increase the
> fixed memory requirement per proc. The grid size is still the same, but
> the code aborts while solving the Poisson eqn, saying:
>
> Out of memory trying to allocate XXX bytes
>
> I'm using PETSc with HYPRE BoomerAMG to solve the linear Poisson eqn. I am
> guessing that the amount of memory per proc is now smaller because the added
> routines use some memory, leaving less available for solving the Poisson eqn.
>
> I'm now changing to KSPBCGS but it seems to take forever. When I abort it,
> the error msg is:
>
> Out of memory. This could be due to allocating
> [10]PETSC ERROR: too large an object or bleeding by not properly
> [10]PETSC ERROR: destroying unneeded objects.
> [10]PETSC ERROR: Memory allocated 0 Memory used by process 4028370944
> [10]PETSC ERROR: Try running with -malloc_dump or -malloc_log for info.
>
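
As an aside on the message above: the -malloc_dump / -malloc_log options it mentions
will report PETSc's own allocations, and you can also print what each rank is actually
using right before the Poisson solve. A minimal sketch, assuming a C driver (the PETSc
calls are real; everything around them is illustrative):

  #include <petscsys.h>

  int main(int argc, char **argv)
  {
    PetscErrorCode ierr;
    PetscMPIInt    rank;
    PetscLogDouble rss, mal;

    ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
    ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRQ(ierr);

    /* ... grid setup, extra routines, matrix assembly ... */

    ierr = PetscMemoryGetCurrentUsage(&rss);CHKERRQ(ierr);  /* resident set size of this rank (bytes) */
    ierr = PetscMallocGetCurrentUsage(&mal);CHKERRQ(ierr);  /* bytes currently obtained via PetscMalloc */
    ierr = PetscSynchronizedPrintf(PETSC_COMM_WORLD, "[%d] RSS %.1f MB, PetscMalloc %.1f MB\n",
                                   rank, rss/1.e6, mal/1.e6);CHKERRQ(ierr);
    ierr = PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT);CHKERRQ(ierr);
    /* (older PETSc releases take only the communicator in PetscSynchronizedFlush) */

    /* ... KSPSolve for the Poisson eqn ... */

    ierr = PetscFinalize();
    return ierr;
  }
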
> I can't use more procs because some procs would then have a local size of
> 231 x 461 x 2 (or even 1). This gives an error since I need to reference
> neighbouring values along the z direction.
>
> So what options do I have? I'm thinking of these at the moment:
>
> 1. Remove as much fixed overhead memory per proc as possible so that there's
> enough memory for each proc.
>
> 2. Re-partition my grid in the x,y directions, or in all of x,y,z, so that I
> do not end up with extremely skewed grid dimensions per proc.


You will probably have to do this at some point anyway, so I'd do this.
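
If the grid lives in (or can be moved to) a DMDA, PETSc will pick the x,y,z processor
decomposition for you when you pass PETSC_DECIDE for the per-direction process counts.
A minimal sketch for the 231 x 461 x 368 grid, assuming one unknown per grid point and
a star stencil of width 1 (adjust both to the actual discretization; the enum names
differ slightly between PETSc versions):

  #include <petscdmda.h>

  PetscErrorCode CreatePartitionedGrid(DM *da)
  {
    PetscErrorCode ierr;

    /* Let PETSc choose how many procs to use in each of x, y, z instead of
       slicing only along z. */
    ierr = DMDACreate3d(PETSC_COMM_WORLD,
                        DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                        DMDA_STENCIL_STAR,
                        231, 461, 368,                            /* global grid size      */
                        PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE, /* procs in x, y, z      */
                        1,                                        /* dof per grid point    */
                        1,                                        /* stencil (ghost) width */
                        NULL, NULL, NULL, da);CHKERRQ(ierr);
    /* PETSc >= 3.8 also needs DMSetFromOptions(*da) and DMSetUp(*da) here. */
    return 0;
  }

The DMDA's local vectors and DMGlobalToLocalBegin/End would then take care of the halo
exchange that the z-only partitioning currently does by hand.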


> Btw, do extremely skewed grid dimensions affect the performance of solving
> the linear eqn?
>

GAMG and HYPRE are not affected much mathematically by funny partitionings.
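
Both are run-time choices (assuming the code calls KSPSetFromOptions()), so you can
compare them without recompiling, e.g.:

  -ksp_type bcgs -pc_type hypre -pc_hypre_type boomeramg
  -ksp_type cg   -pc_type gamg

Adding -ksp_view confirms which solver and preconditioner are actually being used, and
-log_summary includes a per-object memory summary.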

Mark


>
> Are there other feasible options?
>
> --
> Thank you.
>
> Yours sincerely,
>
> TAY wee-beng
>
>

