[petsc-dev] boomerAmg scalability

Matthew Knepley knepley at gmail.com
Wed Jan 4 11:29:46 CST 2012


On Wed, Jan 4, 2012 at 11:14 AM, Ravi Kannan <rxk at cfdrc.com> wrote:

> Hi Mark, Matt,
>
> We recently downloaded the PETSc development version to test the gamg
> package.
>
> This works in serial: we tried it on small cases. The parallel case (even
> with 2 partitions) just hangs. As of now, we have not set any parameters,
> so I guess the default parameters are being used.
>
> Do we need to explicitly set the block size with MatSetBlockSize(mat,1)
> for a PARALLEL run? Our solver solves U, V, W, and P separately.
>
> Any input on this would be great.
>

Can you try running ex56 in parallel?

   Matt
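
For reference, here is a minimal sketch of the kind of parallel setup being
discussed: an MPIAIJ matrix with the block size explicitly set to 1 for a
segregated (scalar) solve, preconditioned with GAMG. The 1D Laplacian test
matrix, the problem size, and the executable name in the comment are
illustrative assumptions, not taken from this thread; error checking is
omitted for brevity.

#include <petscksp.h>

/* Hypothetical illustration: assemble a small 1D Laplacian as an MPIAIJ
   matrix, explicitly set a block size of 1 (one dof per point, as in a
   segregated U/V/W/P solve), and solve it in parallel with GAMG.
   Run with, e.g.:  mpiexec -n 2 ./gamg_test -ksp_converged_reason        */
int main(int argc,char **argv)
{
  Mat      A;
  Vec      x,b;
  KSP      ksp;
  PC       pc;
  PetscInt i,Istart,Iend,n = 100;

  PetscInitialize(&argc,&argv,NULL,NULL);

  MatCreate(PETSC_COMM_WORLD,&A);
  MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,n,n);
  MatSetType(A,MATMPIAIJ);                    /* parallel AIJ storage */
  MatSetBlockSize(A,1);                       /* scalar problem: block size 1 */
  MatMPIAIJSetPreallocation(A,3,NULL,2,NULL);
  MatGetOwnershipRange(A,&Istart,&Iend);
  for (i=Istart; i<Iend; i++) {
    if (i>0)   MatSetValue(A,i,i-1,-1.0,INSERT_VALUES);
    if (i<n-1) MatSetValue(A,i,i+1,-1.0,INSERT_VALUES);
    MatSetValue(A,i,i,2.0,INSERT_VALUES);
  }
  MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);

  MatCreateVecs(A,&x,&b);                     /* MatGetVecs() in older PETSc */
  VecSet(b,1.0);

  KSPCreate(PETSC_COMM_WORLD,&ksp);
  KSPSetOperators(ksp,A,A);                   /* older PETSc adds a MatStructure flag */
  KSPGetPC(ksp,&pc);
  PCSetType(pc,PCGAMG);
  KSPSetFromOptions(ksp);
  KSPSolve(ksp,b,x);

  KSPDestroy(&ksp); MatDestroy(&A); VecDestroy(&x); VecDestroy(&b);
  PetscFinalize();
  return 0;
}

If a stand-alone case like this (or ex56) runs fine on 2 processes, the hang
is more likely related to how the application's matrix is assembled or
distributed than to GAMG's defaults.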


> Thanks,
>
> Ravi.
>
> From: petsc-dev-bounces at mcs.anl.gov [mailto:petsc-dev-bounces at mcs.anl.gov] On Behalf Of Mark F. Adams
> Sent: Thursday, December 15, 2011 1:18 PM
> To: For users of the development version of PETSc
> Subject: Re: [petsc-dev] boomerAmg scalability
>
> On Dec 15, 2011, at 1:56 PM, Matthew Knepley wrote:
>
> On Thu, Dec 15, 2011 at 10:23 AM, Ravi Kannan <rxk at cfdrc.com> wrote:
>
> Dear All,
>
> This is Ravi Kannan from CFD Research Corporation. Recently, we have been
> experimenting with the BoomerAMG preconditioner for some “stiff” CFD
> problems. All the other standard solver-preconditioner combinations failed
> for the current CFD problem; BoomerAMG is the only one able to provide
> “converged” solutions.
>
> We noticed that the scalability of this BoomerAMG preconditioner is really
> poor. For instance, even with a mesh of 2 million cells, we cannot scale
> to even 16 partitions (in contrast, other solver-preconditioner
> combinations such as BiCGS/BJacobi gave good enough scalability).
>
> Are we missing something? Do we need to use a more recent version of
> BoomerAMG?
>
> Have you tried -pc_type gamg in petsc-dev?
>
> For gamg you also want to use MPIAIJ matrices and set the block size with
> MatSetBlockSize(mat,3) for a 3D velocity field, for instance. You can also
> try '-pc_gamg_type pa' or '-pc_gamg_type sa'. "pa", for plain aggregation,
> might be better for CFD problems.
>
> Mark
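
To make that concrete, a sketch of the setup Mark describes might look like
the following. The function name, the preallocation counts, and the 'nlocal'
argument (the number of local rows, which must be a multiple of the block
size) are illustrative assumptions, not from the thread.

#include <petscksp.h>

/* Hypothetical helper applying the suggestions above: MPIAIJ storage,
   block size 3 for a coupled 3D velocity block, and GAMG, with the
   aggregation flavor left to the options database, e.g.
     -pc_gamg_type pa   (plain aggregation)   or   -pc_gamg_type sa      */
PetscErrorCode SetupVelocityGAMG(MPI_Comm comm,PetscInt nlocal,Mat *vmat,KSP *ksp)
{
  PC pc;

  MatCreate(comm,vmat);
  MatSetSizes(*vmat,nlocal,nlocal,PETSC_DETERMINE,PETSC_DETERMINE);
  MatSetType(*vmat,MATMPIAIJ);                /* parallel AIJ matrix */
  MatSetBlockSize(*vmat,3);                   /* 3 velocity dofs per mesh point */
  MatMPIAIJSetPreallocation(*vmat,81,NULL,27,NULL);  /* counts are placeholders */
  /* the caller inserts values and assembles *vmat before KSPSolve() */

  KSPCreate(comm,ksp);
  KSPSetOperators(*ksp,*vmat,*vmat);          /* older PETSc adds a MatStructure flag */
  KSPGetPC(*ksp,&pc);
  PCSetType(pc,PCGAMG);
  KSPSetFromOptions(*ksp);                    /* picks up -pc_gamg_type pa or sa */
  return 0;
}
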
>
>   Matt
>
> Thanks,
>
> Ravi.
>



-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

