Putting a peculiar multilevel scheme into DMMG?

Matthew Knepley knepley at gmail.com
Sun Dec 13 10:11:07 CST 2009


Not sure if you would need API additions. Can't everything you describe be
handled through defining the grid, operator, residual, and projections?
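
For a purely linear/preconditioning setup that would be roughly the usual
DMMG wiring, sketched below from memory; nlevels, user_ctx, da, ComputeRHS,
and ComputeMatrix are placeholders for your own sizes and callbacks:

  DMMG *dmmg;
  Vec   x;
  DMMGCreate(PETSC_COMM_WORLD, nlevels, user_ctx, &dmmg);
  DMMGSetDM(dmmg, (DM)da);                     /* coarse grid; DMMG refines it to build the hierarchy */
  DMMGSetKSP(dmmg, ComputeRHS, ComputeMatrix); /* right-hand side and operator defined per level */
  DMMGSolve(dmmg);
  x = DMMGGetx(dmmg);                          /* solution on the finest level */
  DMMGDestroy(dmmg);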

  Matt

On Sun, Dec 13, 2009 at 8:59 AM, Jed Brown <jed at 59a2.org> wrote:

> I can think of a few ways to implement the following multilevel scheme,
> but I'm not sure whether it's possible/desirable to have DMMG manage it.
>
> Finest level       (A)  : DMComposite[DA_2D(dof=1),DA_3D(dof=4+)]
> Intermediate level (B)  : DM_2D(dof=3)
> Coarsest levels    (C1) : DM_2D(dof=1)
>  OR               (C2) : DM_2D(dof=3)
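>
> Concretely, level A would be packed something like this (just a sketch
> from memory; comm, mx/my/mz and the stencil choices are placeholders,
> and the exact DA/DMComposite signatures may be slightly off):
>
>   DA          da2, da3;
>   DMComposite pack;
>   DACreate2d(comm, DA_NONPERIODIC, DA_STENCIL_BOX, mx, my,
>              PETSC_DECIDE, PETSC_DECIDE, 1, 1,
>              PETSC_NULL, PETSC_NULL, &da2);
>   DACreate3d(comm, DA_NONPERIODIC, DA_STENCIL_BOX, mx, my, mz,
>              PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE, 4, 1,
>              PETSC_NULL, PETSC_NULL, PETSC_NULL, &da3);
>   DMCompositeCreate(comm, &pack);
>   DMCompositeAddDM(pack, (DM)da2);   /* 2D scalar field */
>   DMCompositeAddDM(pack, (DM)da3);   /* 3D field with dof >= 4 */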
>
> No matrices are assembled on level A, but the smoother involves 1D
> solves within columns.  Restriction A->B involves integration and
> discarding "slow" quantities.  Assembly is available for Level B and
> higher matrices.  There are two coarsening strategies after B, one
> involves a fieldsplit where the coarsest grids are only for a scalar
> problem, other coarsens B directly.
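>
> Wired up by hand, the A->B restriction would be matrix-free and the
> hierarchy a PCMG, roughly like this (sketch only; RestrictAtoB, ctx,
> and the local/global sizes nB, nA, NB, NA are placeholders, and ksp is
> the outer Krylov solver):
>
>   Mat R;   /* restriction A->B: integrate columns, discard slow quantities */
>   PC  pc;
>   MatCreateShell(comm, nB, nA, NB, NA, ctx, &R);
>   MatShellSetOperation(R, MATOP_MULT, (void(*)(void))RestrictAtoB);
>
>   KSPGetPC(ksp, &pc);
>   PCSetType(pc, PCMG);
>   PCMGSetLevels(pc, nlevels, PETSC_NULL);
>   PCMGSetRestriction(pc, nlevels-1, R);   /* finest (A) -> B */
>   /* levels below B get assembled matrices; C1 would switch to a scalar
>      problem via fieldsplit, C2 coarsens B directly */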
>
> I think grid sequencing and nonlinear multigrid are not important here
> because level B does not contain the long time scales for which
> globalization would be challenging.  So this hierarchy is strictly for
> preconditioning.  The DMMG interface is rather different from other
> components and I recall Barry saying he would like to eventually get rid
> of it.  Is there an advantage to using it here?  Unless I'm missing
> something, it would require some additions to the API (which I can do if
> it is the right thing to do).
>
> Jed
>



-- 
What most experimenters take for granted before they begin their experiments
is infinitely more interesting than any results to which their experiments
lead.
-- Norbert Wiener