On Tue, Jan 31, 2012 at 5:12 AM, Stefano Zampini <stefano.zampini@gmail.com> wrote:

> 2012/1/31 Jed Brown <jedbrown@mcs.anl.gov>:
>
>> On Thu, Jan 26, 2012 at 04:04, Stefano Zampini <stefano.zampini@gmail.com> wrote:
>>
>>> I'm partially refactoring the BDDC code I contributed to petsc-dev, trying to limit the amount of hand-written code and to exploit existing PETSc code. I would like to know whether there are DM objects in PETSc suitable for storing the information related to the local-to-global map of MATIS objects, which I could then use to manage communication between different levels (through DMCoarsen, DMGetInterpolation, and DMGetMatrix). Can DMMesh do the job?
>>
>> Sorry for the slow reply. The question, I think, is how to produce a Galerkin coarse grid operator of type MATIS. I think we can write MatPtAP_IS without too much trouble, although I don't know how it would perform.
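>>
>> At the user level the Galerkin product is just a MatPtAP call; here is a minimal sketch of what a (hypothetical) MatPtAP_IS would have to provide, assuming the interpolation P is already available as an assembled Mat (the function and variable names are placeholders):
>>
>>   #include <petscmat.h>
>>
>>   /* Form the Galerkin coarse operator Ac = P^T * A * P.  A is the fine-grid
>>      operator; with a MATIS A this call would have to dispatch to the
>>      (not yet written) MatPtAP_IS implementation. */
>>   PetscErrorCode FormGalerkinCoarseOperator(Mat A, Mat P, Mat *Ac)
>>   {
>>     PetscErrorCode ierr;
>>
>>     ierr = MatPtAP(A, P, MAT_INITIAL_MATRIX, 2.0, Ac);CHKERRQ(ierr);
>>     return 0;
>>   }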
>>
>> I don't see how there is enough information in a MATIS to define coarsening and interpolation algebraically.
>>
>> But maybe you are thinking of making PCBDDC construct a grid hierarchy algebraically? I would not be inclined to put that into DM; instead, I would call PCMG functions directly.
>
> About DMMesh, here is my idea: you can attach the adjacency matrix of the subdomain connectivity through faces to a DMMesh object. Then the DM should be able to manage one level of coarsening using the MatPartitioning interface. Everything needed to manage communication between levels would then be hidden behind the DM interface.

One thing: use DMComplex instead of DMMesh. It is a pure C reimplementation without a lot of the cruft.
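
For the one-level coarsening you describe, a minimal sketch of driving the MatPartitioning interface on the subdomain face-connectivity graph would look like the following (the function name, Adj, and ncoarse are just placeholders, not existing PCBDDC code):

  #include <petscmat.h>

  /* Coarsen the subdomain graph: Adj is the parallel adjacency matrix of
     subdomain connectivity through faces (e.g. a MATMPIADJ with one row per
     subdomain) and ncoarse is the requested number of coarse subdomains.  */
  PetscErrorCode CoarsenSubdomainGraph(Mat Adj, PetscInt ncoarse, IS *coarsening)
  {
    MatPartitioning part;
    MPI_Comm        comm;
    PetscErrorCode  ierr;

    ierr = PetscObjectGetComm((PetscObject)Adj, &comm);CHKERRQ(ierr);
    ierr = MatPartitioningCreate(comm, &part);CHKERRQ(ierr);
    ierr = MatPartitioningSetAdjacency(part, Adj);CHKERRQ(ierr);
    ierr = MatPartitioningSetNParts(part, ncoarse);CHKERRQ(ierr);
    ierr = MatPartitioningSetFromOptions(part);CHKERRQ(ierr);
    /* coarsening[i] = index of the coarse subdomain owning fine subdomain i */
    ierr = MatPartitioningApply(part, coarsening);CHKERRQ(ierr);
    ierr = MatPartitioningDestroy(&part);CHKERRQ(ierr);
    return 0;
  }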

   Thanks,

      Matt

> I thought about wrapping PCBDDC into PCMG: it can be done, but you would need to apply PCMG inside a PCIS preconditioner. The problem is that BDDC per se (see PCBDDCApplyInterfacePreconditioner) is a two-level additive method (and that part can be wrapped easily into PCMG) applied to a statically condensed, corrected residual (see PCApply_BDDC); the same correction is also applied by PCNN in PETSc. To wrap PCBDDC into PCMG properly, I think you would first need to call PCApply_IS (with the same code as the current PCApply_BDDC, which is very similar to PCApply_NN), and inside that call PCApplyInterfacePreconditioner_BDDC (or PCApplyInterfacePreconditioner_NN). This would also imply that you could select BDDC (or NN) only with calls of the form
>
>   KSPGetPC(ksp,&pc);
>   PCSetType(pc,PCIS);
>   PCISSetType(pc,PCBDDC);   /* or PCNN */
>
> Am I wrong?
>
> --
> Stefano
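
If you do end up driving PCMG directly, as Jed suggests, a minimal two-level skeleton would look like the sketch below (SetupTwoLevelMG, Ac, and P are placeholders for whatever PCBDDC would actually build; none of this is existing code):

  #include <petscksp.h>

  /* Minimal two-level additive PCMG setup: ksp is the outer solver, Ac the
     coarse operator, and P the coarse-to-fine interpolation, both assumed
     to have been built already.                                           */
  PetscErrorCode SetupTwoLevelMG(KSP ksp, Mat Ac, Mat P)
  {
    PC             pc;
    KSP            coarse;
    PetscErrorCode ierr;

    ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
    ierr = PCSetType(pc, PCMG);CHKERRQ(ierr);
    ierr = PCMGSetLevels(pc, 2, PETSC_NULL);CHKERRQ(ierr);  /* level 0 coarse, level 1 fine */
    ierr = PCMGSetType(pc, PC_MG_ADDITIVE);CHKERRQ(ierr);   /* BDDC-like additive coupling  */
    ierr = PCMGSetInterpolation(pc, 1, P);CHKERRQ(ierr);
    ierr = PCMGGetCoarseSolve(pc, &coarse);CHKERRQ(ierr);
    ierr = KSPSetOperators(coarse, Ac, Ac, SAME_NONZERO_PATTERN);CHKERRQ(ierr);
    return 0;
  }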

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener