<div dir="ltr"><div class="gmail_extra"><div class="gmail_quote">On Wed, Nov 27, 2013 at 12:43 PM, Geoffrey Irving <span dir="ltr"><<a href="mailto:irving@naml.us" target="_blank">irving@naml.us</a>></span> wrote:<br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">Currently, FE fields on a DMPlex are created with the function<br>
<br>
DMPlexProjectFunction<br>
<br>
This function has three issues:<br>
<br>
1. It doesn't take a void* argument, and therefore requires global<br>
variables if you need to pass in additional data.<br></blockquote><div><br></div><div>I am not opposed to DMPlex functions taking contexts. However, my</div><div>question is: how is intended to receive this context? So far, I am</div>
<div>unconvinced that PetscFE needs an outside context. It just seems like</div><div>bad design to me.</div><div><br></div><div>The original intention is to pass evaluation data in as a mesh field. How</div><div>does this break down?</div>
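For reference, I assume the kind of signature you have in mind is
something like the sketch below; the names and argument lists are only
illustrative, not an existing PETSc interface:

  #include <petscsys.h>

  /* Hypothetical point-evaluation callback that carries a user context. */
  typedef void (*PointFunc)(const PetscReal x[], PetscScalar *u, void *ctx);

  /* The user function then reaches its data through ctx instead of globals. */
  typedef struct { PetscReal amplitude; } MyData;

  static void quadratic(const PetscReal x[], PetscScalar *u, void *ctx)
  {
    MyData *data = (MyData *) ctx;
    u[0] = data->amplitude * (x[0]*x[0] + x[1]*x[1]);
  }

A projection routine accepting such a callback would presumably just take a
matching void* per field and forward it, so the real question is where that
context should live.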
> 2. It operates one quadrature point at a time, so is inefficient in
> situations where evaluation overhead can be amortized across multiple
> points (especially from scripts). For the workhorse FE routines this
> is inconvenient but acceptable since the goal is probably OpenCL, but
> it'd be nice to have access to python and numpy in the setup phase.

I think this is oversimplified. DMPlex does operate in batches of quadrature
points, in fact in batches of cells (although the batch size is 1 now). It is
PetscDualSpace that calls things one quad point at a time. I would think that
the Python interface would come in at the level of the PetscDualSpace
evaluator. How does this differ from what you had planned?
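To make the comparison concrete, a batched evaluator at that level could look
roughly like this; again, the names are invented for illustration:

  #include <petscsys.h>

  /* Hypothetical batched callback: all npoints quadrature points are handed
     over at once, so a Python/numpy implementation pays its per-call overhead
     once per batch rather than once per point. */
  typedef void (*BatchFunc)(PetscInt npoints, PetscInt dim,
                            const PetscReal x[],   /* npoints*dim coordinates */
                            PetscScalar u[],       /* npoints values out */
                            void *ctx);

  static void linear_batch(PetscInt npoints, PetscInt dim, const PetscReal x[],
                           PetscScalar u[], void *ctx)
  {
    PetscInt p;
    for (p = 0; p < npoints; ++p) u[p] = x[p*dim + 0] + x[p*dim + 1];  /* f = x + y */
  }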
> 3. It doesn't pass a cell argument, so data in an existing mesh field
> requires a hierarchy traversal to access.

I think I do not understand something fundamental about how you are using this
function. This is intended to be an orthogonal projection of the function f
onto the finite element space V. Why would we need a cell index or additional
information? Is it because the function f you have is not analytic, but
another mesh field? I will think about optimizing this case.
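If the answer is that f is another mesh field, I can at least see the shape of
what you want: something like the sketch below (hypothetical names), where
knowing the cell lets the callback index the source data directly instead of
doing a point location for every quadrature point:

  #include <petscsys.h>

  /* Hypothetical cell-aware callback: the caller already knows which cell a
     quadrature point lives in, so the user can index the source field by cell
     instead of traversing the mesh hierarchy to find it. */
  typedef void (*CellFunc)(PetscInt cell, const PetscReal x[],
                           PetscScalar *u, void *ctx);

  typedef struct { const PetscScalar *cell_values; } SourceField;  /* made-up container */

  static void copy_cellwise(PetscInt cell, const PetscReal x[], PetscScalar *u, void *ctx)
  {
    SourceField *src = (SourceField *) ctx;
    u[0] = src->cell_values[cell];   /* e.g. a piecewise-constant source field */
  }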
   Matt

> I'm going to make an alternate version of DMPlexProjectFunction that
> addresses these issues. If you think it's too early to know the best
> way to do this in core petsc, I can put the routine in my code for now
> and we can revisit migrating it into petsc later. I'm fine either
> way. Concretely, these issues are solved by
>
> 1. Adding a void* argument.
> 2. Batching quadrature point evaluation.
> 3. Passing the cell index.
>
> so there aren't a lot of choices to get wrong. Really the only choice
> is who chooses the batch size.
>
> Geoffrey

-- 
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener