[MOAB-dev] Getting ent set data when tag_size > 1
Jed Brown
jed at 59A2.org
Mon Apr 19 13:59:13 CDT 2010
On Mon, 19 Apr 2010 13:15:25 -0500, Tim Tautges <tautges at mcs.anl.gov> wrote:
> Well, maybe one is trying out different discretization schemes before
> settling on one
That's a significant part of my point. Additionally, as the physics
changes (which it *always* does), you usually end up needing to combine
somewhat different discretizations to respect the natural structure of
the different processes. That is one more reason to have a uniform yet
flexible interface that makes such combinations easy.
> Support for parallel IO (eventually, doesn't work well now), viz,
> handling of metadata associated with the mesh, etc.
I don't see any reason why the data needs to *be stored on the mesh* in
order to offer all of the above services. I believe in storing metadata
on the mesh, but generally not field values.
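To make that concrete, here is a minimal sketch of what I mean, with
entirely hypothetical names (this is not MOAB's or any library's actual
API): the mesh hands out entity handles, and the field owns its values
in a flat array reached through a handle-to-offset map.

    // Sketch: field values owned by the discretization, not stored as mesh tags.
    // EntityHandle is just an integer id here; all names are hypothetical.
    #include <cstddef>
    #include <cstdint>
    #include <unordered_map>
    #include <vector>

    typedef std::uint64_t EntityHandle;

    struct Field {
      int ncomp;                                             // components per entity
      std::unordered_map<EntityHandle, std::size_t> offset;  // handle -> start index
      std::vector<double> values;                            // contiguous storage

      double *at(EntityHandle h) { return &values[offset.at(h)]; }
    };

    int main() {
      Field pressure;
      pressure.ncomp = 1;
      // The mesh hands out handles; the field attaches data without touching the mesh.
      EntityHandle verts[] = {101, 102, 103};
      for (int i = 0; i < 3; ++i) {
        pressure.offset[verts[i]] = pressure.values.size();
        pressure.values.resize(pressure.values.size() + pressure.ncomp, 0.0);
      }
      pressure.at(verts[1])[0] = 42.0;  // services traverse the mesh, pull values through the map
      return 0;
    }

IO and viz can still walk the mesh and reach values through that
indirection; none of those services requires the values to live in
tags.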
> That kind of indirection has to be possible, but I'm asserting others
> also need to be possible.
Why? I don't see others offering any additional capability, just a way
to tempt service-writers to write code that doesn't actually
interoperate because it only works for some special case like Q1 with
data attached to the mesh. The "default" indirection is pure array
lookup, so it should not be considered "expensive", at least not
without a use case in which array lookup is demonstrated to be
expensive and cannot trivially be rearranged to work in blocks.
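A hedged sketch of what I mean by "pure array lookup", again with
hypothetical names: the element-to-dof indirection is a flat integer
table, and gathering values for a block of elements is a tight loop.

    // Sketch of the "default" indirection: element -> global dof indices is a
    // flat integer table, and gathering a block of elements is array lookup.
    // Names are hypothetical; this is not any particular library's API.
    #include <cstddef>
    #include <vector>

    void gather_block(const std::vector<int> &elem_dofs,  // nelem * dofs_per_elem indices
                      int dofs_per_elem,
                      const std::vector<double> &global,  // global field values
                      int elem_start, int elem_count,
                      std::vector<double> &local)         // out: elem_count * dofs_per_elem
    {
      local.resize((std::size_t)elem_count * dofs_per_elem);
      for (int e = 0; e < elem_count; ++e)
        for (int i = 0; i < dofs_per_elem; ++i)
          local[(std::size_t)e * dofs_per_elem + i] =
              global[elem_dofs[(std::size_t)(elem_start + e) * dofs_per_elem + i]];
    }

Working a block of elements at a time amortizes whatever overhead the
lookup has, which is why I don't buy the "expensive" argument without a
demonstration.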
> Some people are very sensitive to library dependencies, and you want
> to be careful about introducing ones that may not be absolutely
> necessary.
Then you can make a metalibrary -lstuff that contains mesh and geometry
and discretization and solvers and IO and visualization and an email
client.
> A more concrete example is point location in a mesh, which depends on
> the shape functions. You're certainly going to store the vertex
> locations with the mesh; whatever library you use for field evaluation
> should be able to evaluate those directly from the mesh. From there
> it's a short way to interpolating solution fields, for coupling.
It's common to use parametrically mapped spaces, in which case the true
element shape lies in a similar function space to the fields you are
solving for (and with implicit ALE the element shape is a bona fide
field you are solving for). Now, I realize that node location is
important for lots of spatial queries, but I'm not convinced that it
should even be the final word on the geometry of the mesh. This is most
fundamental if the element shape is non-polynomial, in which case it
cannot be expressed in a nodal basis at all. iMesh also currently has
no support for moving meshes, so every change in mesh geometry (not
topology) requires a whole new mesh.
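To illustrate the parametric-mapping point, a minimal sketch (Q1 in 2D,
hypothetical names): the geometry is evaluated with exactly the same
basis machinery as any solution field, so updating the geometry is just
updating field coefficients.

    // Sketch: isoparametric Q1 geometry evaluation. The element "shape" is a
    // nodal field (coordinates) evaluated with the same basis as solution
    // fields, so geometry updates are field updates, not new meshes.
    // Names are hypothetical.
    #include <cstdio>

    // Bilinear (Q1) basis on the reference square [-1,1]^2.
    static void q1_basis(double xi, double eta, double N[4]) {
      N[0] = 0.25 * (1 - xi) * (1 - eta);
      N[1] = 0.25 * (1 + xi) * (1 - eta);
      N[2] = 0.25 * (1 + xi) * (1 + eta);
      N[3] = 0.25 * (1 - xi) * (1 + eta);
    }

    // Evaluate any nodal field (geometry or solution) at a reference point.
    static void eval_field(const double coeff[4][2], double xi, double eta,
                           double out[2]) {
      double N[4];
      q1_basis(xi, eta, N);
      out[0] = out[1] = 0.0;
      for (int i = 0; i < 4; ++i) {
        out[0] += N[i] * coeff[i][0];
        out[1] += N[i] * coeff[i][1];
      }
    }

    int main() {
      double coords[4][2] = {{0, 0}, {2, 0}, {2, 1}, {0, 1}};  // geometry coefficients
      double x[2];
      eval_field(coords, 0.0, 0.0, x);  // physical location of the element center
      std::printf("center = (%g, %g)\n", x[0], x[1]);
      return 0;
    }

With implicit ALE those coordinate coefficients are literally part of
the solution vector, which is why I don't want the mesh to be the final
word on geometry.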
On the other hand, mesh geometry will most commonly live in function
spaces which can be represented in a conforming (perhaps with hanging
nodes, but not with arbitrary overlap and gaps) nodal space. This is
not true of more general spaces (for "other" solution data). In other
words, physical variables, in addition to being "heavier" and typically
changing more rapidly, tend to live in more exotic spaces. XFEM for
large-deformation crack propagation is probably an exception, depending
on what the geometric queries need to return.
It is also worth recognizing that an application/discretization library
may well be able to represent a mixed space with much less metadata than
would be required to represent such a space in a general manner (because
it knows about inherent relationships between different fields).
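As a sketch of that metadata point (hypothetical names, assuming
something like Q2 velocity with Q1 pressure): the application can
describe its mixed space with a couple of small structs, whereas a
fully general mesh-resident representation would need per-entity or
per-dof tags to recover the same information.

    // Sketch: a mixed space (e.g. Q2 velocity + Q1 pressure) described
    // compactly by the application, exploiting the known relationship between
    // fields instead of generic per-dof metadata. Names are hypothetical.
    #include <string>
    #include <vector>

    struct FieldDesc {
      std::string name;
      int components;    // e.g. velocity = dim, pressure = 1
      int basis_degree;  // polynomial degree of the nodal basis
    };

    struct MixedSpace {
      std::vector<FieldDesc> fields;  // ordering/interleaving implied by the
                                      // application's own conventions
    };

    int main() {
      MixedSpace stokes;
      stokes.fields.push_back(FieldDesc{"velocity", 2, 2});  // Q2
      stokes.fields.push_back(FieldDesc{"pressure", 1, 1});  // Q1
      // A general mesh-resident representation would need per-entity or
      // per-dof tags to recover the same information.
      return 0;
    }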
Jed