[petsc-dev] What do people want to have working before a petsc-3.2 release?
Jed Brown
jed at 59A2.org
Tue Dec 21 15:54:43 CST 2010
On Mon, Dec 20, 2010 at 01:31, Barry Smith <bsmith at mcs.anl.gov> wrote:
> > 2. Is it as easy as PCSetDM? Or provide a coarse DM and get a
> hierarchy? Is there an example? I agree about having FieldSplit forward the
> pieces. I recall starting on that. Who is responsible for assembling
> rediscretized coarse operators?
>
> src/ksp/ksp/examples/tutorials/ex45.c
Neat, this does more than I thought. We currently pick up the DMComposite and
forward the index sets and field names into the splits. Two issues remain:
1. Forward the DMs into the splits. This is superficial; it pretty much just
requires a non-vararg DMComposite accessor. It would be a massive
simplification to create a DMRedundant instead of all the Array
specialization in DMComposite.
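As a sketch of what such an accessor and its use from PCFieldSplit might look like (the accessor signature and the calling code are hypothetical, just to make the point; DMCompositeGetNumberDM and KSPSetDM are the existing interfaces I have in mind):

```c
/* Hypothetical non-vararg accessor: fill a caller-provided array with the
   sub-DMs of a DMComposite, so PCFieldSplit can forward them without
   knowing the number of splits at compile time. */
PetscErrorCode DMCompositeGetEntriesArray(DM pack,DM dms[]);

/* Sketch of the forwarding logic inside PCSetUp_FieldSplit (variable
   names assumed): */
PetscInt i,n;
DM       *dms;
ierr = DMCompositeGetNumberDM(pack,&n);CHKERRQ(ierr);
ierr = PetscMalloc(n*sizeof(DM),&dms);CHKERRQ(ierr);
ierr = DMCompositeGetEntriesArray(pack,dms);CHKERRQ(ierr);
for (i=0; i<n; i++) {
  ierr = KSPSetDM(jac->head[i].ksp,dms[i]);CHKERRQ(ierr); /* forward into split i */
}
ierr = PetscFree(dms);CHKERRQ(ierr);
```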
2. What if the user wants to run PCMG in only one split? I think this one
is actually hard. Suppose we have
FormFunction1(dm1,X1,X2,F1,user1);
FormFunction2(dm2,X1,X2,F2,user2);
If we do monolithic multigrid, then we call both of these functions with
(dm1,dm2) on each level and there is no conceptual difficulty relative to
the single-level method. But if we only do multigrid on physics 2, then we
still need a way to get X1 (interpolated into whatever form FormFunction2
needs) on coarse levels. If we want to call FormFunction2 from
PCSetUp_FieldSplit:PCSetUp_MG, then we have lost all reference to X1. But
handling all the MG setup from within PCSetUp_FieldSplit would break the
abstraction. I'm not seeing an elegant way to handle this; perhaps we
should have a way to cache coupling information on the DM. Any ideas?
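One possible shape for such caching (only a sketch, using the generic PetscObjectCompose/PetscObjectQuery mechanism; the key name "coupling_X1" and the variable names are made up):

```c
/* In PCSetUp_FieldSplit, before setting up MG on split 2: attach the
   current X1 (restricted or interpolated per level elsewhere) to dm2 so
   that FormFunction2 can recover it on coarse levels. */
ierr = PetscObjectCompose((PetscObject)dm2,"coupling_X1",(PetscObject)X1);CHKERRQ(ierr);

/* In FormFunction2, when invoked on a coarse level without X1 in hand: */
Vec X1cached;
ierr = PetscObjectQuery((PetscObject)dm2,"coupling_X1",(PetscObject*)&X1cached);CHKERRQ(ierr);
if (X1cached) {
  /* use the cached representation of the coupled field */
}
```

This keeps the MG machinery ignorant of the coupling, at the cost of a stringly-typed contract between the preconditioner setup and the user's residual function.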