<div dir="ltr"><br><div class="gmail_extra"><br><div class="gmail_quote">On Thu, Jan 1, 2015 at 12:20 AM, TAY wee-beng <span dir="ltr"><<a href="mailto:zonexo@gmail.com" target="_blank">zonexo@gmail.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">Hi,<br>
<br>
> I used to run my CFD code with 96 procs, on a grid of size
> 231 x 461 x 368.
>
> I use MPI and partition my grid in the z direction. Hence, with 96
> procs (8 nodes, 12 procs each), each proc gets a slab of size
> 231 x 461 x 3 or 231 x 461 x 4.
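(As a check on those slab sizes: 368 = 80 x 4 + 16 x 3, so 80 of the 96
ranks hold 4 planes and 16 hold 3.) If the grid were managed with a
PETSc DMDA, such a pure-z decomposition could be set up as below. This
is only a sketch: the DMDA usage and the omitted error checking are
assumptions, not the poster's actual code.

    #include <petscdmda.h>

    /* Sketch: 231 x 461 x 368 grid cut into z-slabs across 96 ranks.
       DMDA usage is assumed; the poster's code may partition by hand. */
    int main(int argc, char **argv)
    {
      DM da;
      PetscInitialize(&argc, &argv, NULL, NULL);
      DMDACreate3d(PETSC_COMM_WORLD,
                   DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                   DMDA_STENCIL_STAR,
                   231, 461, 368,   /* global grid dimensions          */
                   1, 1, 96,        /* process grid: every cut along z */
                   1, 1,            /* dof per node, stencil width     */
                   NULL, NULL, NULL, &da);
      /* ... build the Poisson matrix and vectors from da, solve ... */
      DMDestroy(&da);
      PetscFinalize();
      return 0;
    }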
>
> It worked fine.
>
> Now I have modified the code and added some more routines, which
> increase the fixed memory requirement per proc. The grid size is still
> the same, but the code aborts while solving the Poisson eqn, saying:
>
> Out of memory trying to allocate XXX bytes
>
> I'm using PETSc with HYPRE BoomerAMG to solve the linear Poisson eqn.
> I am guessing that the amount of memory per proc is now smaller
> because the added routines use some memory, leaving less memory
> available for solving the Poisson eqn.
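For reference, BoomerAMG is usually selected with the runtime options
-pc_type hypre -pc_hypre_type boomeramg, or equivalently in code as in
the sketch below (the matrix A and vectors b, x are assumed to be
assembled elsewhere; KSPCG is an assumption, natural since the Poisson
operator is symmetric positive definite):

    /* Sketch: configure a KSP to use HYPRE BoomerAMG. Requires a PETSc
       built with hypre (e.g. --download-hypre). A, b, x assumed. */
    KSP ksp;
    PC  pc;
    KSPCreate(PETSC_COMM_WORLD, &ksp);
    KSPSetOperators(ksp, A, A);
    KSPSetType(ksp, KSPCG);          /* assumption: CG for SPD Poisson */
    KSPGetPC(ksp, &pc);
    PCSetType(pc, PCHYPRE);
    PCHYPRESetType(pc, "boomeramg");
    KSPSetFromOptions(ksp);          /* let -ksp_type bcgs etc. override */
    KSPSolve(ksp, b, x);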
>
> I'm now changing to KSPBCGS, but it seems to take forever. When I
> abort it, the error msg is:
>
> Out of memory. This could be due to allocating
> [10]PETSC ERROR: too large an object or bleeding by not properly
> [10]PETSC ERROR: destroying unneeded objects.
> [10]PETSC ERROR: Memory allocated 0 Memory used by process 4028370944
> [10]PETSC ERROR: Try running with -malloc_dump or -malloc_log for info.
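Following that hint, -malloc_dump prints PETSc allocations still alive
at PetscFinalize. To see how much memory the new routines actually cost
per rank, the usage can also be queried directly; a sketch, placed e.g.
just before the solve (both calls are standard PETSc; the print format
is arbitrary):

    /* Sketch: report per-rank memory at a point of interest. */
    PetscLogDouble rss, mal;
    PetscMemoryGetCurrentUsage(&rss);   /* resident set size of process */
    PetscMallocGetCurrentUsage(&mal);   /* bytes from PetscMalloc only  */
    PetscPrintf(PETSC_COMM_SELF, "rss %g bytes, PetscMalloc %g bytes\n",
                rss, mal);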
>
> I can't use more procs because some procs would then have a size of
> 231 x 461 x 2 (or even 1). This will give an error, since I need to
> reference the nearby values along the z direction.
>
> So what options do I have? I'm thinking of these at the moment:
>
> 1. Remove as much fixed overhead memory per proc as possible so that
> there's enough memory for each proc.
>
> 2. Re-partition my grid in the x,y directions, or in all of x,y,z, so
> that I do not end up with extremely skewed grid dimensions per proc.

You will probably have to do this at some point anyway, so I'd do this.

> Btw, does having extremely skewed grid dimensions affect the
> performance in solving the linear eqn?

GAMG and HYPRE are not affected much mathematically by funny
partitionings.

Mark
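A sketch of what option 2 could look like with a DMDA: pass
PETSC_DECIDE so PETSc chooses the process grid in all three directions.
One possible factorization of 96 is 4 x 4 x 6, giving subdomains of
roughly 58 x 116 x 62 instead of 231 x 461 x 4 (the exact split is up
to PETSc):

    /* Sketch: same grid, but the 96 ranks form a 3D process grid
       instead of a 1D stack of z-slabs. */
    DMDACreate3d(PETSC_COMM_WORLD,
                 DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                 DMDA_STENCIL_STAR,
                 231, 461, 368,
                 PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE,
                 1, 1, NULL, NULL, NULL, &da);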
> Are there other feasible options?
>
> --
> Thank you.
>
> Yours sincerely,
>
> TAY wee-beng