[petsc-users] Help with ML/BoomerAMG

Gaetan Kenway gaetank at gmail.com
Mon Apr 29 08:51:31 CDT 2013


Hi Jed

This problem was external flow, transonic Euler (M=0.85), conserved
variables. As I stated in my email, the additive Schwarz method + (block)
ILU on the subdomains works extremely well for this problem. The real
problem I am interested in, however, is preconditioning for the RANS
equations. For the most part, ASM+ILU works fine for these problems as
well, but I am investigating other methods that may potentially increase
robustness, reduce memory, or reduce computational cost.
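
For reference, the ASM+ILU setup I'm describing amounts to roughly the
following (a minimal sketch only; the GMRES choice, the overlap, and the
ILU fill level below are placeholders rather than the exact settings I use):

#include <petscksp.h>

/* Minimal sketch of the ASM + (block) ILU setup described above.
 * A is the assembled adjoint system matrix, b the adjoint RHS.
 * The overlap and ILU fill level are illustrative values only. */
PetscErrorCode SolveWithASMILU(Mat A, Vec b, Vec x)
{
  KSP            ksp;
  PC             pc;
  PetscErrorCode ierr;

  ierr = KSPCreate(PETSC_COMM_WORLD, &ksp); CHKERRQ(ierr);
  /* petsc-3.2 signature of KSPSetOperators */
  ierr = KSPSetOperators(ksp, A, A, SAME_NONZERO_PATTERN); CHKERRQ(ierr);
  ierr = KSPSetType(ksp, KSPGMRES); CHKERRQ(ierr);

  ierr = KSPGetPC(ksp, &pc); CHKERRQ(ierr);
  ierr = PCSetType(pc, PCASM); CHKERRQ(ierr);
  ierr = PCASMSetOverlap(pc, 1); CHKERRQ(ierr);   /* one layer of overlap */

  /* Subdomain solver: ILU(1), set through the options database so it can
   * still be overridden on the command line. */
  ierr = PetscOptionsSetValue("-sub_pc_type", "ilu"); CHKERRQ(ierr);
  ierr = PetscOptionsSetValue("-sub_pc_factor_levels", "1"); CHKERRQ(ierr);

  ierr = KSPSetFromOptions(ksp); CHKERRQ(ierr);
  ierr = KSPSolve(ksp, b, x); CHKERRQ(ierr);
  ierr = KSPDestroy(&ksp); CHKERRQ(ierr);
  return 0;
}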

Since the solver I'm using is a structured multiblock solver that uses
multigrid for the primal problem, I can use geometric multigrid,
provided I construct the restriction and prolongation operators myself.

I guess geometric multigrid is the best approach here.
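
To make that concrete, I would expect the PCMG wiring to look roughly like
this (a sketch only: the arrays of level operators and interpolation
matrices are assumed to come from my structured multiblock grid hierarchy,
and the names are made up):

#include <petscksp.h>

/* Sketch: geometric multigrid through PCMG with user-supplied transfer
 * operators.  Alevel[l] is the operator on level l (0 = coarsest) and
 * Interp[l] interpolates from level l-1 to level l; both are assumed to
 * be built from the structured multiblock grid hierarchy. */
PetscErrorCode SetupGMG(KSP ksp, PetscInt nlevels, Mat Alevel[], Mat Interp[])
{
  PC             pc;
  PetscInt       l;
  PetscErrorCode ierr;

  ierr = KSPGetPC(ksp, &pc); CHKERRQ(ierr);
  ierr = PCSetType(pc, PCMG); CHKERRQ(ierr);
  ierr = PCMGSetLevels(pc, nlevels, PETSC_NULL); CHKERRQ(ierr);
  ierr = PCMGSetType(pc, PC_MG_MULTIPLICATIVE); CHKERRQ(ierr);

  for (l = 1; l < nlevels; l++) {
    /* If no restriction is set explicitly, PETSc applies the transpose
     * of the interpolation as the restriction. */
    ierr = PCMGSetInterpolation(pc, l, Interp[l]); CHKERRQ(ierr);
  }
  for (l = 0; l < nlevels; l++) {
    KSP smoother;
    ierr = PCMGGetSmoother(pc, l, &smoother); CHKERRQ(ierr);
    ierr = KSPSetOperators(smoother, Alevel[l], Alevel[l],
                           SAME_NONZERO_PATTERN); CHKERRQ(ierr);
  }
  return 0;
}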

Thank you

Gaetan

On Mon, Apr 29, 2013 at 9:40 AM, Jed Brown <jedbrown at mcs.anl.gov> wrote:

> Gaetan Kenway <gaetank at gmail.com> writes:
>
> > Hello
> >
> > I am in the process of trying out some of the multigrid functionality
> > in PETSc and not having much luck. The simple system I am trying to
> > solve is the adjoint system of equations resulting from the finite
> > volume discretization of the Euler equations on a 147,456-cell mesh,
> > giving a linear system of equations of size 5*147,456 = 737,280. All
> > of the tests are done on a single processor and use petsc-3.2-p7.
>
> Is this steady-state Euler?  Exterior or recirculating flow?
> Conservative variables?  What Mach number?
>
> The heuristics used in algebraic multigrid do not work for hyperbolic
> systems like Euler.  There has been some research, but the multigrid
> efficiency that we enjoy for elliptic problems continues to elude us.
>
> For low Mach number, we can build preconditioners based on splitting,
> reducing to an elliptic solve in the pressure space (changing variables
> in the preconditioner if you use conservative variables for the full
> problem).  Otherwise, we're currently stuck with geometric multigrid if
> we want significant coarse-grid acceleration.  With finite volume
> methods, this is done by agglomeration, leading to large cells with many
> faces that nevertheless exactly preserve the conservation statement of
> the fine-grid problem.
>
> The implementation effort required for such methods is why it's still
> popular to use one-level domain decomposition.
>
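
PS: In case I do end up trying the splitting you describe, I assume the
basic PETSc plumbing would look something like the following (purely
illustrative: it treats the 5th unknown of each cell as the "pressure-like"
field, and it ignores the change of variables you mention for conservative
variables):

#include <petscksp.h>

/* Plumbing-only sketch of a field-split preconditioner, set through the
 * options database.  Assumes an interlaced ordering with 5 unknowns per
 * cell and (purely for illustration) treats the 5th unknown as the
 * "pressure-like" field.  No change of variables is performed. */
PetscErrorCode SetSplitOptions(void)
{
  PetscErrorCode ierr;
  ierr = PetscOptionsSetValue("-pc_type", "fieldsplit");             CHKERRQ(ierr);
  ierr = PetscOptionsSetValue("-pc_fieldsplit_block_size", "5");     CHKERRQ(ierr);
  ierr = PetscOptionsSetValue("-pc_fieldsplit_0_fields", "0,1,2,3"); CHKERRQ(ierr);
  ierr = PetscOptionsSetValue("-pc_fieldsplit_1_fields", "4");       CHKERRQ(ierr);
  ierr = PetscOptionsSetValue("-pc_fieldsplit_type", "schur");       CHKERRQ(ierr);
  /* Elliptic-style solve on the "pressure" block, e.g. BoomerAMG: */
  ierr = PetscOptionsSetValue("-fieldsplit_1_pc_type", "hypre");     CHKERRQ(ierr);
  return 0;
}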