[petsc-users] Poisson equation with CG and BoomerAMG

Matthew Knepley knepley at gmail.com
Tue Jul 10 08:56:16 CDT 2012


On Tue, Jul 10, 2012 at 8:30 AM, Barry Smith <bsmith at mcs.anl.gov> wrote:

>
>   http://www.mcs.anl.gov/petsc/documentation/faq.html#computers
>
> likely you will not benefit from using more than 4, or at most 6, of the 8
> cores on each node. This is a memory hardware limitation.
>

The easiest way to get an idea of the limit is to look at the scaling of
VecAXPY. The performance will scale
beautifully until it hits the memory bandwidth bottleneck.
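To make the point concrete (this is a stand-alone sketch, not PETSc code): VecAXPY computes y <- alpha*x + y, which does two loads, one store, and one multiply-add per element, so on large vectors its rate is set by memory bandwidth rather than floating-point throughput.

```python
# Plain-Python sketch of the AXPY operation (y <- alpha*x + y),
# the kernel whose parallel scaling is being suggested as a probe.
# Per element: 2 loads, 1 store, 1 multiply-add -- memory-bound
# once the vectors no longer fit in cache.
def axpy(alpha, x, y):
    """Update y in place: y[i] += alpha * x[i]."""
    for i in range(len(x)):
        y[i] += alpha * x[i]
    return y

if __name__ == "__main__":
    x = [1.0] * 8
    y = [2.0] * 8
    axpy(3.0, x, y)
    print(y)  # every entry is 2 + 3*1 = 5.0
```

Once all the cores sharing a memory bus saturate its bandwidth, adding more cores to a kernel like this gains nothing, which is why per-node speedup flattens well before 8 cores.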

    Matt


>    Barry
>
> On Jul 10, 2012, at 8:22 AM, Benjamin Sanderse wrote:
>
> > Hello all,
> >
> > I am solving a Poisson equation with Neumann BC on a structured grid
> (arising from an incompressible Navier-Stokes problem). Although my mesh is
> structured, the matrix is 'given' so I am using AMG instead of geometric
> multigrid, for the moment.
> > To solve the Poisson equation I use CG with preconditioning provided by
> BoomerAMG, using standard options. I have run my problem for different grid
> sizes and number of processors, but I am confused regarding the parallel
> scalability. Attached are some timing results that give the average time
> spent on solving the Poisson equation. As you can see, when going from 1 to
> 2 processors, the scaling is very good, even for the case of 200^3 grid
> points (8 million). For larger numbers of processors this quickly
> deteriorates. The cluster I am running on has 8 cores per node and 24GB
> memory per node.
> > Can someone comment on these results? Is this what I should expect?
> >
> > Some additional information:
> > - I set the NullSpace of the matrix explicitly with MatNullSpaceCreate
> > - The set-up of the problem is not included in the timing results. The
> set-up is not efficient yet (use of MatRow, for example), and there is some
> code cleanup to do (too many matrices and vectors), but I think this should
> not affect the performance of the Poisson solve.
> > - log_summary and ksp_monitor are attached.
> >
> > Thanks a lot,
> >
> > Benjamin
> >
> > <log_summary>
> > <timing>
> > <ksp_view>
> >
> >
> >
> > --
> > Ir. B. Sanderse
> >
> > Centrum Wiskunde en Informatica
> > Science Park 123
> > 1098 XG Amsterdam
> >
> > t: +31 20 592 4161
> > e: sanderse at cwi.nl
> >
>
>

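For reference, a CG solve preconditioned by hypre's BoomerAMG, as described above, is typically selected entirely through run-time options along these lines (a sketch; `./poisson` is a placeholder for the application binary, and PETSc must be configured with hypre support):

```shell
# CG with BoomerAMG preconditioning, monitoring, and performance logging
mpiexec -n 8 ./poisson \
    -ksp_type cg \
    -pc_type hypre -pc_hypre_type boomeramg \
    -ksp_monitor -log_summary
```

The `-log_summary` output is what shows the per-event timings (including VecAXPY) needed to judge where the memory-bandwidth limit is hit.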

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

