[petsc-users] Out of memory and parallel issues
TAY wee-beng
zonexo at gmail.com
Thu Jan 1 02:15:44 CST 2015
On 1/1/2015 2:06 PM, Matthew Knepley wrote:
> On Wed, Dec 31, 2014 at 11:20 PM, TAY wee-beng <zonexo at gmail.com> wrote:
>
> Hi,
>
> I used to run my CFD code with 96 procs, with a grid size of 231 x
> 461 x 368.
>
> I used MPI and partitioned my grid in the z direction. Hence, with 96
> procs (8 nodes, 12 procs each), each proc has a size of 231 x 461
> x 3 or 231 x 461 x 4.
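>
> (The arithmetic: 368 = 96 x 3 + 80, so 80 procs get 4 z-planes and
> the remaining 16 get 3.)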
>
> It worked fine.
>
> Now I have modified the code and added some more routines, which
> increase the fixed memory requirement per proc. The grid size is
> still the same, but the code aborts while solving the Poisson eqn,
> saying:
>
> Out of memory trying to allocate XXX bytes
>
> I'm using PETSc with HYPRE BoomerAMG to solve the linear Poisson
> eqn. I am guessing that the amount of memory per proc is now lower
> because I added some routines that use some memory. The result
> is less memory available for solving the Poisson eqn.
>
>
> I would try GAMG instead of Hypre. It tends to be lighter on memory
> than Hypre.
>
> -pc_type gamg
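>
> For example (the launcher, executable name and proc count here are
> just illustrative):
>
>     mpiexec -n 96 ./cfd_solver -pc_type gamg -ksp_monitor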
>
> Thanks,
>
> Matt
Hi Matt,
To use gamg, must I use DMDA to partition the grid? (A sketch of a
DMDA-free setup follows after these questions.)
Also, does MPI partitioning in only the z direction hurt parallel
performance? Each partition is then almost a 2D plane, only 3-4 cells
thick.
Lastly, using 10 x 10 = 100 procs seems to work for now, although 20
procs are wasted since each node has 12 procs.
Thanks!
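
For reference, here is a minimal sketch of a GAMG solve on a plain
assembled matrix, with no DMDA attached (PETSc 3.5-style C API; the
matrix A and vectors b, x are assumed to be assembled elsewhere by the
application, and error checking is omitted for brevity):

    #include <petscksp.h>

    /* Solve A x = b with GAMG; A is any assembled parallel matrix and
       no DM needs to be attached to the KSP. */
    KSP ksp;
    PC  pc;
    KSPCreate(PETSC_COMM_WORLD, &ksp);
    KSPSetOperators(ksp, A, A);   /* operator and preconditioning matrix */
    KSPGetPC(ksp, &pc);
    PCSetType(pc, PCGAMG);        /* equivalent to -pc_type gamg */
    KSPSetFromOptions(ksp);       /* allow command-line overrides */
    KSPSolve(ksp, b, x);
    KSPDestroy(&ksp);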
>
> I'm now changing to KSPBCGS, but it seems to take forever. When I
> abort it, the error message is:
>
> Out of memory. This could be due to allocating
> [10]PETSC ERROR: too large an object or bleeding by not properly
> [10]PETSC ERROR: destroying unneeded objects.
> [10]PETSC ERROR: Memory allocated 0 Memory used by process 4028370944
> [10]PETSC ERROR: Try running with -malloc_dump or -malloc_log for
> info.
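>
> As I understand it, adding "-malloc_log" to the run command (shown
> here with an illustrative executable name) makes PETSc print a
> summary of its memory usage at PetscFinalize():
>
>     mpiexec -n 96 ./cfd_solver -malloc_log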
>
> I can't use more procs because some procs would then have a size of
> 231 x 461 x 2 (or even 1). This would give an error since I need to
> reference nearby values along the z direction.
>
> So what options do I have? I'm thinking of these at the moment:
>
> 1. Remove as much fixed overhead memory per proc as possible so
> that there's enough memory for each proc.
>
> 2. Re-partition my grid in the x,y directions or the x,y,z directions
> so I will not end up with extremely skewed grid dimensions per proc
> (see the sketch after the next question). Btw, does having extremely
> skewed grid dimensions affect the performance of solving the linear eqn?
>
> Are there other feasible options?
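>
> For option 2, a minimal sketch of a 3D DMDA that lets PETSc partition
> in all three directions (grid sizes from above; the boundary and
> stencil settings are illustrative, PETSc 3.5-style C API):
>
>     #include <petscdmda.h>
>
>     DM da;
>     /* 231 x 461 x 368 global grid, 1 dof, stencil width 1;
>        PETSC_DECIDE lets PETSc pick the proc layout in x, y and z */
>     DMDACreate3d(PETSC_COMM_WORLD,
>                  DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
>                  DMDA_STENCIL_STAR,
>                  231, 461, 368,                            /* global size */
>                  PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE, /* procs/dir */
>                  1, 1, NULL, NULL, NULL, &da);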
>
> --
> Thank you.
>
> Yours sincerely,
>
> TAY wee-beng
>
>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which
> their experiments lead.
> -- Norbert Wiener