[petsc-dev] Are odd grid numbers intentionally required for structured interpolation/refinement in 3.2.2?

Jed Brown jedbrown at mcs.anl.gov
Sun Oct 9 08:00:41 CDT 2011


On Sun, Oct 9, 2011 at 05:23, Aron Ahmadia <aron.ahmadia at kaust.edu.sa> wrote:

> Jed pointed out that I should be demo-ing ex50, not ex19, so I tried to
> port my stuff over.  Oddly enough, if I am using 2^n grid spacing,
> everything falls over. 2^n+1 works, though, which seems non-obvious to me.
>
> $ mpirun -n 4 ./ex50 -da_grid_x 65 -da_grid_y 65 -pc_type mg -pc_mg_levels 3
> lid velocity = 0.000236686, prandtl # = 1, grashof # = 1
> Number of Newton iterations = 6
>
> ./ex50 -da_grid_x 64 -da_grid_y 64 -pc_type mg -pc_mg_levels 2
>
> lid velocity = 0.000244141, prandtl # = 1, grashof # = 1
> [0]PETSC ERROR: --------------------- Error Message
> ------------------------------------
> [0]PETSC ERROR: Arguments are incompatible!
> [0]PETSC ERROR: Ratio between levels: (mx - 1)/(Mx - 1) must be integer: mx
> 64 Mx 32!
>

DMMG always started with a coarse problem and refined a given number of
times, regardless of whether linear or nonlinear multigrid was being used.

Now we have

-snes_grid_sequence NLEVELS

which starts with whatever grid you give it and refines NLEVELS times. If you
use -snes_grid_sequence N -pc_type mg, then the number of levels in PCMG is
adjusted automatically at each stage of grid sequencing so that it always
references the same coarse level.
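As a sketch of the arithmetic behind that refinement (Python for illustration only; the 17-point starting grid is a hypothetical choice, not from the example above): each uniform refinement of a vertex-centered grid maps m points to 2*(m-1)+1.

```python
def refine(m, levels):
    """Vertex-centered uniform refinement: each level doubles the number
    of intervals, so m points become 2*(m-1)+1 points per level."""
    sizes = [m]
    for _ in range(levels):
        m = 2 * (m - 1) + 1
        sizes.append(m)
    return sizes

# Starting from a 17-point grid, refining twice (as -snes_grid_sequence 2
# would, one dimension shown) visits 17-, 33-, and 65-point grids.
print(refine(17, 2))  # [17, 33, 65]
```

Note that every grid in the sequence shares the same coarse grid, which is why PCMG can keep pointing at the same coarse level as the sequence proceeds.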

If instead, you only do linear multigrid, then you pass

-pc_type mg -pc_mg_levels N

and your specified problem is _coarsened_ to produce the number of levels.
That coarsening requires some divisibility in the grid size. The reason it is
2^n+1 instead of 2^n is that this example uses a vertex-centered rather than
a cell-centered discretization. Vertex centering is natural for finite
differences and cell centering for finite volumes, but either method could be
used with either centering strategy.