On Sat, Aug 18, 2012 at 4:11 PM, Chris Eldred <chris.eldred@gmail.com> wrote:

> Yes, I think that makes sense. How do you access the associated PetscSF
> for the DMComplex? Is it possible to define multiple PetscSFs for a
> given DMComplex, each associated with a different PetscSection?

1) DMGetDefaultSF() gives you the one for the default Section.

2) Yes, but you need to create them yourself using DMCreateDefaultSF().
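As a concrete illustration (not part of the original message), here is a minimal sketch of pulling out that default SF. It assumes the 2012-era petsc-dev names used in this thread (DMComplex, DMGetDefaultSF), which later releases renamed (DMPlex, DMGetSectionSF); the header name and argument lists are assumptions and may differ by version.

    /* Sketch only: DMGetDefaultSF() and PetscSFView() are assumed to take the
       arguments shown. DMCreateDefaultSF(), for an SF tied to a non-default
       section, is mentioned in the reply above but not shown here. */
    #include <petscdmcomplex.h>

    PetscErrorCode InspectDefaultSF(DM dm)
    {
      PetscSF        sf;
      PetscErrorCode ierr;

      PetscFunctionBegin;
      /* The SF that moves dofs between the global and local vectors
         of the default section */
      ierr = DMGetDefaultSF(dm, &sf);CHKERRQ(ierr);
      ierr = PetscSFView(sf, PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);
      PetscFunctionReturn(0);
    }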
> My current understanding of using DMComplex is (for the serial case):
>
> 1) Create a DMComplex object, set its cones/supports, etc.

A good guide is to look at complexcreate.c, where I create a few default meshes.

> 2) Create PetscSections that describe the layout of data across the DMComplex
> 3) Set a PetscSection as the default section
> 4) Use DMCreateLocalVector to create the Vec objects needed for the computation
>
> Is this correct?

Yes.
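A minimal serial sketch of steps 1)-4), added for illustration rather than taken from the original reply. It assumes the 2012-era DMComplex API (later DMPlex); DMComplexCreateBoxMesh() stands in for the complexcreate.c helpers mentioned above, and names such as DMSetDefaultSection() and the exact argument lists may differ between versions.

    #include <petscdmcomplex.h>

    int main(int argc, char **argv)
    {
      DM             dm;
      PetscSection   s;
      Vec            u;
      PetscInt       pStart, pEnd, vStart, vEnd, p;
      PetscErrorCode ierr;

      ierr = PetscInitialize(&argc, &argv, NULL, NULL);CHKERRQ(ierr);
      /* 1) A small 2D simplicial box mesh with interpolated edges (cones/supports
            are set for you; a hand-built mesh would use DMComplexSetConeSize(),
            DMComplexSetCone(), DMComplexSymmetrize(), DMComplexStratify()) */
      ierr = DMComplexCreateBoxMesh(PETSC_COMM_WORLD, 2, PETSC_TRUE, &dm);CHKERRQ(ierr);
      /* 2) A section laying out one dof on every vertex (a P1-like layout) */
      ierr = PetscSectionCreate(PETSC_COMM_WORLD, &s);CHKERRQ(ierr);
      ierr = DMComplexGetChart(dm, &pStart, &pEnd);CHKERRQ(ierr);
      ierr = PetscSectionSetChart(s, pStart, pEnd);CHKERRQ(ierr);
      ierr = DMComplexGetDepthStratum(dm, 0, &vStart, &vEnd);CHKERRQ(ierr); /* vertices */
      for (p = vStart; p < vEnd; ++p) {ierr = PetscSectionSetDof(s, p, 1);CHKERRQ(ierr);}
      ierr = PetscSectionSetUp(s);CHKERRQ(ierr);
      /* 3) Make it the default section of the DM */
      ierr = DMSetDefaultSection(dm, s);CHKERRQ(ierr);
      /* 4) The local vector is now sized by the section */
      ierr = DMCreateLocalVector(dm, &u);CHKERRQ(ierr);
      ierr = VecDestroy(&u);CHKERRQ(ierr);
      ierr = PetscSectionDestroy(&s);CHKERRQ(ierr);
      ierr = DMDestroy(&dm);CHKERRQ(ierr);
      ierr = PetscFinalize();
      return 0;
    }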
> In the parallel case, what else needs to be done?

In parallel,

 a) I call DMComplexDistribute(). The alternative is to create your mesh directly
    in parallel, but this does not work with any mesh generator I know, and you
    would have to create the PetscSF for points, which this function makes
    automatically.

 b) Now the section you create is local.

 c) You can use the local section and the pointSF to create a global section,
    which knows about the overlap.

 d) You can use the local and global sections to make the dofSF.

c) and d) are done automatically for the default section (data layout).

   Matt
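A hedged sketch of a)-d) for the default section, again added for illustration: the user only calls DMComplexDistribute(), and the global section and dofSF are then derived automatically, so the usual ghost update works. The partitioner name, the overlap argument, and the exact 2012-era signatures are assumptions.

    #include <petscdmcomplex.h>

    PetscErrorCode DistributeAndGhost(DM dm, DM *dmDist)
    {
      PetscSection   s;
      Vec            gv, lv;
      PetscInt       pStart, pEnd, vStart, vEnd, p;
      PetscErrorCode ierr;

      PetscFunctionBegin;
      /* a) Distribute the serial mesh; this also builds the pointSF.
            The "chaco" partitioner and the overlap argument (0) are assumptions. */
      ierr = DMComplexDistribute(dm, "chaco", 0, dmDist);CHKERRQ(ierr);
      if (!*dmDist) PetscFunctionReturn(0); /* one process: nothing was distributed */
      /* b) The section is created over the LOCAL points of the distributed mesh
            (one dof per vertex, as in the serial sketch above) */
      ierr = PetscSectionCreate(PETSC_COMM_WORLD, &s);CHKERRQ(ierr);
      ierr = DMComplexGetChart(*dmDist, &pStart, &pEnd);CHKERRQ(ierr);
      ierr = PetscSectionSetChart(s, pStart, pEnd);CHKERRQ(ierr);
      ierr = DMComplexGetDepthStratum(*dmDist, 0, &vStart, &vEnd);CHKERRQ(ierr);
      for (p = vStart; p < vEnd; ++p) {ierr = PetscSectionSetDof(s, p, 1);CHKERRQ(ierr);}
      ierr = PetscSectionSetUp(s);CHKERRQ(ierr);
      ierr = DMSetDefaultSection(*dmDist, s);CHKERRQ(ierr);
      /* c), d) The global section and dofSF are made automatically for the
            default section, so the vanilla DM vectors and scatters just work */
      ierr = DMCreateGlobalVector(*dmDist, &gv);CHKERRQ(ierr);
      ierr = DMCreateLocalVector(*dmDist, &lv);CHKERRQ(ierr);
      /* ... fill gv ..., then update ghost (overlap) values in the local vector */
      ierr = DMGlobalToLocalBegin(*dmDist, gv, INSERT_VALUES, lv);CHKERRQ(ierr);
      ierr = DMGlobalToLocalEnd(*dmDist, gv, INSERT_VALUES, lv);CHKERRQ(ierr);
      ierr = VecDestroy(&lv);CHKERRQ(ierr);
      ierr = VecDestroy(&gv);CHKERRQ(ierr);
      ierr = PetscSectionDestroy(&s);CHKERRQ(ierr);
      PetscFunctionReturn(0);
    }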
> -Chris
>
> On Fri, Aug 17, 2012 at 1:24 PM, Matthew Knepley <knepley@gmail.com> wrote:
> > On Fri, Aug 17, 2012 at 1:21 PM, Chris Eldred <chris.eldred@gmail.com> wrote:
> >>
> >> I was wondering how ghost values are handled when creating a
> >> distributed (global) vector from a specific PetscSection over a
> >> DMComplex object, i.e., what defines the stencils used to figure out
> >> which values need to be ghosted? And what about local vs. global indices?
> >
> > This answer will have to be long because the question mixes levels of
> > the interface.
> >
> > A PetscSection is only an offset structure, so it has no designated fields
> > for overlap information. What I do for DM is to create two sections: one
> > describing the layout of the local vector, and one describing the global
> > vector and its ghosts. The global section can be created automatically given
> > the local section and a PetscSF (the successor to VecScatter).
> >
> > The PetscSF describes the parallel overlap, and also has scatters and
> > reductions. It is created by the DMComplex instead of a scatter since it can
> > handle arbitrary types and has a better query interface. DMComplex creates a
> > (non-overlapping) partition of some stratum of points, usually cells for FEM.
> > Then closure(p) U star(p) for each point p is added to that partition, and
> > the dofs on any shared point are ghosted.
> >
> > This is all now done using the vanilla DM interface, so DMGlobalToLocal(),
> > DMLocalToGlobal(), and their cousins work as normal.
> >
> > Does that make sense?
> >
> >    Matt
> >
> >>
> >> -Chris
> >>
> >> --
> >> Chris Eldred
> >> DOE Computational Science Graduate Fellow
> >> Graduate Student, Atmospheric Science, Colorado State University
> >> B.S. Applied Computational Physics, Carnegie Mellon University, 2009
> >> chris.eldred@gmail.com
> >
> > --
> > What most experimenters take for granted before they begin their experiments
> > is infinitely more interesting than any results to which their experiments
> > lead.
> > -- Norbert Wiener
>
> --
> Chris Eldred
> DOE Computational Science Graduate Fellow
> Graduate Student, Atmospheric Science, Colorado State University
> B.S. Applied Computational Physics, Carnegie Mellon University, 2009
> chris.eldred@gmail.com

--
What most experimenters take for granted before they begin their experiments
is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener