On Tue, Jul 10, 2012 at 8:30 AM, Barry Smith <bsmith@mcs.anl.gov> wrote:
>
>    http://www.mcs.anl.gov/petsc/documentation/faq.html#computers
>
>    likely you will not benefit from using more than 4, or at most 6, of the 8 cores on each node. This is a memory hardware limitation.

The easiest way to get an idea of the limit is to look at the scaling of VecAXPY. The performance will scale beautifully until it hits the memory bandwidth bottleneck.
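A minimal sketch of such a test, assuming a reasonably recent PETSc (the vector length and repetition count are arbitrary illustrative choices, and error checking is omitted for brevity):

/* Time VecAXPY on vectors large enough to fall out of cache.
 * Build against PETSc, then run on a single node with an increasing
 * number of ranks: mpiexec -n 1 ./axpy, mpiexec -n 2 ./axpy, ... */
#include <petscvec.h>
#include <petsctime.h>

int main(int argc, char **argv)
{
  Vec            x, y;
  PetscInt       i, reps = 50;
  PetscInt       N = 10000000;  /* global length; illustrative choice */
  PetscLogDouble t0, t1;

  PetscInitialize(&argc, &argv, NULL, NULL);
  VecCreateMPI(PETSC_COMM_WORLD, PETSC_DECIDE, N, &x);
  VecDuplicate(x, &y);
  VecSet(x, 1.0);
  VecSet(y, 2.0);

  PetscTime(&t0);
  for (i = 0; i < reps; i++) VecAXPY(y, 3.14, x);  /* y <- y + alpha*x */
  PetscTime(&t1);

  PetscPrintf(PETSC_COMM_WORLD, "VecAXPY average: %g s per call\n",
              (double)((t1 - t0) / reps));

  VecDestroy(&x);
  VecDestroy(&y);
  PetscFinalize();
  return 0;
}

The per-call time should drop nearly linearly at first and then flatten once the node's memory bandwidth is saturated; PETSc also ships a STREAMS-style benchmark that measures the same effect.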
   Matt

>    Barry
>
> On Jul 10, 2012, at 8:22 AM, Benjamin Sanderse wrote:
>
>> Hello all,
>>
>> I am solving a Poisson equation with Neumann BCs on a structured grid (arising from an incompressible Navier-Stokes problem). Although my mesh is structured, the matrix is 'given', so for the moment I am using AMG instead of geometric multigrid.
>> To solve the Poisson equation I use CG, preconditioned by BoomerAMG with its default options. I have run my problem for different grid sizes and numbers of processors, but I am confused by the parallel scalability. Attached are some timing results giving the average time spent on solving the Poisson equation. As you can see, when going from 1 to 2 processors the scaling is very good, even for the case of 200^3 grid points (8 million). For larger numbers of processors it quickly deteriorates. The cluster I am running on has 8 cores and 24 GB of memory per node.
>> Can someone comment on these results? Is this what I should expect?
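For reference, the CG + BoomerAMG setup described above usually amounts to only a few calls; the sketch below is a plausible reconstruction rather than the actual code from this thread (A, b, and x are assumed to be an assembled Mat and Vecs, and error checking is omitted):

/* Sketch of a CG + BoomerAMG Poisson solve with default hypre options. */
#include <petscksp.h>

PetscErrorCode solve_poisson(Mat A, Vec b, Vec x)
{
  KSP ksp;
  PC  pc;

  KSPCreate(PETSC_COMM_WORLD, &ksp);
  KSPSetOperators(ksp, A, A);     /* older PETSc also takes a MatStructure flag */
  KSPSetType(ksp, KSPCG);
  KSPGetPC(ksp, &pc);
  PCSetType(pc, PCHYPRE);         /* requires PETSc built --with-hypre */
  PCHYPRESetType(pc, "boomeramg");
  KSPSetFromOptions(ksp);         /* honors -ksp_monitor, -pc_hypre_boomeramg_* */
  KSPSolve(ksp, b, x);
  KSPDestroy(&ksp);
  return 0;
}

Equivalently, -ksp_type cg -pc_type hypre -pc_hypre_type boomeramg on the command line selects the same solver.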
>>
>> Some additional information:
>> - I set the null space of the matrix explicitly with MatNullSpaceCreate (see the sketch after this list).
>> - The set-up of the problem is not included in the timing results. The set-up is not yet efficient (use of MatRow, for example), and there is some code cleanup to do (too many matrices and vectors), but I think this should not affect the performance of the Poisson solve.
>> - log_summary and ksp_monitor are attached.
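On the null-space point above: a pure-Neumann Poisson operator has the constant vector in its null space, and registering it keeps the Krylov iterates consistent. A minimal sketch, assuming the Mat-attached API (current PETSc uses MatSetNullSpace; older releases attached the null space to the KSP instead):

/* Attach the constant null space of a pure-Neumann Poisson operator.
 * Assumes an assembled Mat A; error checking omitted for brevity. */
#include <petscmat.h>

PetscErrorCode set_constant_nullspace(Mat A)
{
  MatNullSpace nullsp;

  /* PETSC_TRUE: the null space contains the constant vector */
  MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0, NULL, &nullsp);
  MatSetNullSpace(A, nullsp);   /* KSP projects it out during the solve */
  MatNullSpaceDestroy(&nullsp);
  return 0;
}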
>>
>> Thanks a lot,
>>
>> Benjamin
>>
>> <log_summary>
>> <timing>
>> <ksp_view>
>>
<span class="HOEnZb"><font color="#888888">><br>
> --<br>
> Ir. B. Sanderse<br>
><br>
> Centrum Wiskunde en Informatica<br>
> Science Park 123<br>
> 1098 XG Amsterdam<br>
><br>
> t: <a href="tel:%2B31%2020%20592%204161" value="+31205924161">+31 20 592 4161</a><br>
> e: <a href="mailto:sanderse@cwi.nl">sanderse@cwi.nl</a><br>
><br>
<br>
--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener