[petsc-dev] Fwd: setting up a model funky system

Barry Smith bsmith at mcs.anl.gov
Mon Feb 16 17:34:24 CST 2015


  This is why the MPI programming model sucks and needs to be replaced ASAP: it makes people think that things we should have been doing for years are "really difficult" when they should not be.


> Begin forwarded message:
> 
> From: Dmitry Karpeyev <karpeev at uchicago.edu>
> Date: February 16, 2015 at 4:04:47 PM CST
> Subject: Re: setting up a model funky system
> To: Barry Smith <bsmith at mcs.anl.gov>
> Cc: "Zhang, Hong" <hongzhang at anl.gov>, Emil Constantinescu <emconsta at mcs.anl.gov>, Jed Brown <jed at jedbrown.org>
> 
> 
> 
> On Mon Feb 16 2015 at 2:57:04 PM Barry Smith <bsmith at mcs.anl.gov <mailto:bsmith at mcs.anl.gov>> wrote:
> 
> > Say we resign to not supporting mesh adaptation.  I only brought that example up to illustrate why you might need steps that aren't part of the ODE.  In the case of moving particles this would mean we do not support particle redistribution.
> 
>   You are really hung up on the particle redistribution; it doesn't change the model in any way, nor the numerics, so I view it as just a minor bookkeeping thing, and that bookkeeping happens through the ODE solver process.
> Where in TS is this bookkeeping done or can be done? 
> How could I tell TS to redistribute for me?  More importantly, how can I write this redistribution into my F(t,u,u_t) = G(t,u) formulation?
> It may be a minor thing, but this is what started this whole discussion of whether everything can be hidden inside TSSolve() or not, and what implications this has for the computation of sensitivities.
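The algebraic core of the redistribution question above is small: if redistribution only permutes which process owns which entry of the global state, then "writing it into F(t,u,u_t) = G(t,u)" amounts to composing with a permutation, and differentiating through it applies the transpose of that permutation. A minimal plain-Python sketch (illustrative names, not PETSc API):

```python
# Sketch (not PETSc API): particle redistribution as a permutation of the
# global state; "differentiating through it" just applies P^T.
def permute(state, perm):
    # perm[i] = old index of the entry that lands in slot i after redistribution
    return [state[j] for j in perm]

def permute_transpose(vec, perm):
    # Adjoint of the permutation: scatters entries back to the old ordering.
    out = [0.0] * len(vec)
    for i, j in enumerate(perm):
        out[j] = vec[i]
    return out

state = [1.0, 2.0, 3.0, 4.0]
perm = [2, 0, 3, 1]                         # hypothetical new ownership order
redistributed = permute(state, perm)
restored = permute_transpose(redistributed, perm)
assert restored == state                    # P^T P = I: numerics are unchanged
```

This is consistent with Barry's "minor bookkeeping" reading: the map is orthogonal, so it contributes only a permutation factor to any Jacobian chain.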
> 
> 
> > Then we simply cannot do particle transport and have to say that we do not support these "funky" systems.
> >
> >
> >
> >
> > > On Feb 16, 2015, at 3:25 PM, Dmitry Karpeyev <karpeev at uchicago.edu <mailto:karpeev at uchicago.edu>> wrote:
> > >
> > >
> > >
> > > On Mon Feb 16 2015 at 2:21:18 PM Barry Smith <bsmith at mcs.anl.gov <mailto:bsmith at mcs.anl.gov>> wrote:
> > >
> > > > On Feb 16, 2015, at 3:12 PM, Dmitry Karpeyev <karpeev at uchicago.edu <mailto:karpeev at uchicago.edu>> wrote:
> > > >
> > > >
> > > >
> > > > On Mon Feb 16 2015 at 12:42:55 PM Dmitry Karpeyev <karpeev at uchicago.edu <mailto:karpeev at uchicago.edu>> wrote:
> > > > On Mon Feb 16 2015 at 11:52:29 AM Barry Smith <bsmith at mcs.anl.gov <mailto:bsmith at mcs.anl.gov>> wrote:
> > > >
> > > > > On Feb 16, 2015, at 12:34 PM, Dmitry Karpeyev <karpeev at uchicago.edu <mailto:karpeev at uchicago.edu>> wrote:
> > > > >
> > > > > I dropped the ball on this last week, but now I'm talking to the libMesh guys about the best way to integrate TS (and later sensitivity) into libMesh.
> > > > > Here's one issue that came up: adjoint sensitivity computation with mesh adaptivity is going to be hard, because checkpointing just the state isn't enough.
> > > >
> > > >   If you are adapting the mesh then the mesh is part of the state.
> > > > >
> > > > > This is along the same lines as moving particles, but harder. It might require serializing the DM at each checkpoint, but even that might not be sufficient -- we would have to somehow record the history of refinements/coarsenings, since the intervening interpolation/projection operators have to be differentiated through.
> > > > >
> > > > > Coming back to the simpler case of particles, do we need to record the particle movements between the checkpoints?  In any event TSTrajectory might have to be more sophisticated or at least extensible to store this auxiliary information.
> > > >
> > > >    The particles' positions at each time step are part of the solution, are they not? Hence they are stored in TSTrajectory. It's just a big ol' set of ODEs; why make it harder by "making it easier"?
> > > > I guess in the particle-in-mesh case checkpointing can be made tractable. Each particle belongs to the processor that owns the mesh element containing the particle, because the particle drives the right-hand side associated with that element and is affected by the field in that element. So I have an element-to-particle map on each processor.
> > > > Say the mesh isn't changing, so I only have to dump the field-and-particle-positions vector (and maybe the auxiliary data like particle charges, etc.)  When I read the state back in, the current PetscLayout associated with a particle-mesh system tells me which rank each vector entry goes to.  I also know the number of the mesh dofs, so the extra local dofs must be the particle positions. I do particle location and rebuild my element-to-particle map. In this case TSTrajectorySet() can be simple, but TSTrajectoryGet() needs to invoke a user callback that restores the state of the system (wrapped up as a DM, perhaps).
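The restore step described above — the mesh dof count is known, so the trailing local entries must be particle coordinates — can be sketched in plain Python (illustrative names, not the TSTrajectory API):

```python
# Sketch: splitting a checkpointed field+particle state vector when the mesh
# layout is fixed. n_field is known from the mesh; whatever trails it must be
# flattened particle coordinates.
def split_checkpoint(vec, n_field, dim=2):
    field = vec[:n_field]
    coords = vec[n_field:]
    assert len(coords) % dim == 0           # must be whole particles
    # group the flat coordinates into per-particle (x, y) tuples
    particles = [tuple(coords[i:i + dim]) for i in range(0, len(coords), dim)]
    return field, particles

vec = [0.1, 0.2, 0.3, 5.0, 6.0, 7.0, 8.0]   # 3 field dofs + 2 particles in 2D
field, particles = split_checkpoint(vec, n_field=3)
```

The user callback Dmitry mentions would then run point location on `particles` to rebuild the element-to-particle map.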
> > > >
> > > > Now I'm marching (back) in time and I need to relate the particle degrees of freedom at one time to another.  Using a canonical particle numbering (assuming the number of particles doesn't change) I can do that with an IS, which also needs to be dumped.  Anything else I'm missing?  What do I do with this IS?  A pair of ISs at two different times allows me to build a scatter, which is equivalent to a permutation matrix that has to be used somewhere in the Jacobian.
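The IS-pair-to-scatter construction above is a few lines when sketched with plain lists standing in for the two index sets (hypothetical data, not the PETSc IS/VecScatter API):

```python
# Sketch: given the canonical particle numbers stored at two times (two "IS"es),
# build the permutation that maps local storage order at time a to time b.
def build_perm(is_a, is_b):
    pos_a = {p: i for i, p in enumerate(is_a)}  # pos_a[p] = slot of particle p at time a
    # perm[i] = slot at time a of the particle stored in slot i at time b
    return [pos_a[p] for p in is_b]

is_a = [10, 11, 12, 13]        # canonical ids, storage order at time a
is_b = [12, 10, 13, 11]        # same particles, reordered at time b
perm = build_perm(is_a, is_b)
vals_a = [1.0, 2.0, 3.0, 4.0]
vals_b = [vals_a[j] for j in perm]   # apply the scatter (the permutation matrix)
```

The resulting `perm` is exactly the permutation matrix that would appear wherever the Jacobian crosses the redistribution.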
> > > >
> > > > This actually suggests the general pattern: we need to be able to differentiate through the TSPostStep events.  In the case of a particle system, at the end of a time step we have updated the particle coordinate vector,
> > >
> > >   Why the heck are you updating it there?  It is an unknown in the coupled set of ODEs; it should be completely handled within the ODE solver. For example, here is a crude ODE solver that Emil can fix: integrate the field variables implicitly, then integrate the particle positions explicitly, all inside TSStep().
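Barry's "crude ODE solver" can be sketched on a toy scalar model (the model problem u' = -k(u - x), x' = c·u is hypothetical, chosen only so the implicit solve is a one-liner):

```python
# Sketch of the crude splitting: one step that treats the (scalar) "field" u
# implicitly and the "particle position" x explicitly, all inside the step.
def imex_step(u, x, dt, k=10.0, c=1.0):
    # explicit update of the particle position using the old field value
    x_new = x + dt * c * u
    # backward Euler for the stiff field equation u' = -k*(u - x_new);
    # linear, so u_new = u + dt*(-k*(u_new - x_new)) solves directly:
    u_new = (u + dt * k * x_new) / (1.0 + dt * k)
    return u_new, x_new

u, x = 1.0, 0.0
for _ in range(100):                # march 100 steps of the split scheme
    u, x = imex_step(u, x, dt=0.01)
```

A real IMEX method would, as Barry says, be Emil's department; the point is only that both updates live inside the step, so nothing needs a post-step hook.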
> > > I think this issue is more obvious in the case of mesh adaptation: how can it be written as part of the ODE?  It amounts to applying an interpolation operator between the time steps, whether we move dofs from rank to rank or not.
> > >
> > >   Barry
> > >
> > >
> > >
> > > > but not its parallel layout.  That's done outside of TSStep by moving particles to the new owning processors.  That's a mapping (permutation) between the two versions of the state at the same time n -- immediately following timestep n that resulted in state n and immediately preceding timestep n+1.  This mapping is part of the trajectory and has to be differentiated, too.  Should the user be required to provide this TSPostStepJacobian?  Mesh adaptivity can be handled the same way, except this TSPostStepJacobian might be a full-blown interpolation between the two meshes, rather than a MatScatter resulting from particle redistribution.
> > > >
> > > >
> > > >
> > > > Dmitry.
> > > >
> > > >
> > > >
> > > > >
> > > > > Dmitry.
> > > > >
> > > > > On Mon Feb 09 2015 at 8:58:07 PM Dmitry Karpeyev <karpeev at uchicago.edu <mailto:karpeev at uchicago.edu>> wrote:
> > > > > Yes, I can do that.
> > > > > Rather, I'll ask Xujun to help me with it, since otherwise it won't get done till next Christmas.
> > > > > I"m still diddling around with moving particles from rank to rank, but if we ignore that, it should
> > > > > be relatively easy to do it even now.  It might be kinda slow, unless we spend some time on it,
> > > > > though: each time particles move we have to apply PointLocator on them, which is expensive.
> > > > > We ought to be able to do some kind of "ray tracing" to optimize that, but that's later.
> > > > >
> > > > > Dmitry.
> > > > >
> > > > > On Mon Feb 09 2015 at 9:43:40 PM Barry Smith <bsmith at mcs.anl.gov <mailto:bsmith at mcs.anl.gov>> wrote:
> > > > >
> > > > >   So, I ask again, can you cook up a coupled particle-field ODE system that we can apply "proper" ODE methods to (as opposed to ad hoc, splitting, lagging nonsense)?
> > > > >
> > > > >   Barry
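The "one coupled set of ODEs, solved with a proper method" that Barry asks for can be sketched on the same kind of toy linear model (hypothetical model u' = -k(u - x), x' = c·u; solved exactly here since it is linear, where a real problem would use Newton):

```python
# Sketch: a single monolithic backward Euler step on the coupled state (u, x)
# for u' = -k*(u - x), x' = c*u, i.e. solve (I - dt*A) z_new = z_old with
# A = [[-k, k], [c, 0]]. No splitting, no lagging.
def be_step(u, x, dt, k=10.0, c=1.0):
    #  (1 + dt*k)*u_new - dt*k*x_new = u
    #  -dt*c*u_new     +       x_new = x
    a11, a12 = 1.0 + dt * k, -dt * k
    a21, a22 = -dt * c, 1.0
    det = a11 * a22 - a12 * a21
    u_new = (u * a22 - a12 * x) / det   # Cramer's rule on the 2x2 system
    x_new = (a11 * x - a21 * u) / det
    return u_new, x_new

u1, x1 = be_step(1.0, 0.0, dt=0.01)
```

Contrast with the split scheme earlier in the thread: here field and particle are advanced simultaneously from one linear solve, which is what makes adjoint sensitivities straightforward.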
> > > > >
> > > > > > On Feb 9, 2015, at 9:36 PM, Dmitry Karpeyev <karpeev at uchicago.edu <mailto:karpeev at uchicago.edu>> wrote:
> > > > > >
> > > > > >
> > > > > >
> > > > > > On Mon Feb 09 2015 at 9:19:49 PM Barry Smith <bsmith at mcs.anl.gov <mailto:bsmith at mcs.anl.gov>> wrote:
> > > > > >
> > > > > > > On Feb 9, 2015, at 9:13 PM, Dmitry Karpeyev <karpeev at uchicago.edu <mailto:karpeev at uchicago.edu>> wrote:
> > > > > > >
> > > > > > > Guys,
> > > > > > > If the particles don't travel from processor to processor, we don't really have anything funky going on.
> > > > > >
> > > > > >   So you are saying you can write down the one set of coupled ODEs and solve them with fully implicit or IMEX methods? Great, let's do that then in a simple sequential example. If this is the case, then why does no one do it this way?
> > > > > >
> > > > > > So, fully implicit methods aren't necessarily always the goal -- sometimes the physics has natural timescales that are really short (recall the discussion of "topological changes" at the CHiMaD workshop).  This is frequently the case for particle systems so using explicit methods is more appropriate.  I still want to be able to do sensitivity analysis to see how the shape of the channel or the strength of hydrodynamic interactions or the boundary conditions affect the output.  For that reason I have to fit the system into the TS paradigm.
> > > > > >
> > > > > > Even if I'm using an explicit time integrator for the particles, the fields are still implicit, since they are the algebraic part of this ODE: in low-Reynolds-number flows (as is frequently the case in soft matter) the flow field is very similar to the electrostatic potential (also present here), which satisfies an elliptic PDE without any time derivatives.  Perhaps IMEX would be useful here.  Why doesn't everybody do it this way?  I'm not sure.
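The DAE structure Dmitry describes — a field with no time derivative, constrained by an elliptic equation driven by the particle positions — can be sketched with scalars (the constraint u = x/2 and dynamics x' = -u are a made-up stand-in for the elliptic solve):

```python
# Sketch of the semi-explicit (index-1) DAE structure: 0 = g(u, x) for the
# field, x' = f(u) for the particle. Each step first satisfies the algebraic
# constraint (the "elliptic solve"), then advances the differential variable.
def dae_step(x, dt):
    u = x / 2.0             # algebraic constraint: stand-in for the field solve
    x_new = x + dt * (-u)   # explicit step for the particle, x' = -u = -x/2
    return u, x_new

x = 1.0
for _ in range(10):
    u, x = dae_step(x, dt=0.1)   # x contracts by (1 - dt/2) each step
```

An IMEX or DAE-aware TS method would do the constraint solve implicitly inside the stage equations rather than as a separate pre-step, but the ordering of the two pieces is the same.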
> > > > > >
> > > > > > I think partly the problem is that once you have a mesh, you bring in your favorite shape function library (libMesh, Deal.II), so unless you do everything in something like DMPlex you don't have a great integration with TS.  I actually don't know about Deal.II, but libMesh kinda designed themselves into a corner by insisting that it be in the driver's seat.  It's partly because they wanted to support both PETSc and Trilinos.
> > > > > >
> > > > > > I'm having a conference call with Derek and Roy Stogner (the de facto leader of libMesh development) to talk about how to improve DMlibMesh and integrate better with TS.
> > > > > >
> > > > > > One idea is to have Xujun help with this, since he is the "particle-mesh postdoc".  Maybe we could put him and Hong together so they could help each other: Xujun knows FEM and meshes, Hong is great at the TS stuff.
> > > > > >
> > > > > > Finally, particles in a mesh is but the simplest example of the "moving meshes" problem: one mesh moving past another.  In fact, even our particles in reality are connected by a 1D mesh making them into a polymer.  Thus, when they move from rank to rank, they have to drag their connectivity, ghosts, spring constants, etc. with them. That complexity is largely on the libMesh side, though.
> > > > > >
> > > > > > Dmitry.
> > > > > >
> > > > > >
> > > > > >
> > > > > >
> > > > > >
> > > > > >
> > > > > > > If we only allow local coupling (particle to the element(s) containing it) the only structural thing that changes is the
> > > > > > > Jacobian sparsity pattern, and we don't have to do anything to deal with it, except reallocate.  The rest is
> > > > > > > automatically taken care of.
> > > > > > >
> > > > > > > Our Mat/Vec paradigm falls apart only when the local Vec sizes change and we can't easily hide it from the
> > > > > > > rest of the solver.  At least that seemed to be the conclusion earlier.
> > > > > >
> > > > > >   We should be able to hide it; we just need to improve the vector class with an implementation that allows entries to change processes. This seems largely independent of the solvers (except for Mat and PC).
> > > > > >
> > > > > >   Barry
> > > > > >
> > > > > > >
> > > > > > > Also, on a technical note, periodic boundary conditions actually make things more difficult: in order for the particles
> > > > > > > to reenter on the other side we have to set up these nonlocal ghosts. I guess it's not so hard using DMDA, but I
> > > > > > > have been building the ParticleMesh system in libMesh.
> > > > > > >
> > > > > > > By the way, integrating libMesh with the TS system is a very useful thing: it would open it up to many more meaningful
> > > > > > > examples. If only we had more postdocs to work on this :-)
> > > > > > >
> > > > > > > Dmitry.
> > > > > > >
> > > > > > > On Mon Feb 09 2015 at 9:03:38 PM Barry Smith <bsmith at mcs.anl.gov <mailto:bsmith at mcs.anl.gov>> wrote:
> > > > > > >
> > > > > > > > On Feb 9, 2015, at 8:54 PM, Emil Constantinescu <emconsta at mcs.anl.gov <mailto:emconsta at mcs.anl.gov>> wrote:
> > > > > > > >
> > > > > > > > On 2/9/15 7:24 PM, Barry Smith wrote:
> > > > > > > >>
> > > > > > > >>    We should make a plan and maybe meet for how to solve a funky system of particles plus fields properly with TS. This means creating a single coupled set of ODEs/PDEs and deciding how to integrate them. IMEX with implicit field and explicit in particle positions?? Something else?  Dmitry can you cook a simple 2d model of particles in a field (periodic boundary conditions or whatever to make it easy)?  We won't worry about particles moving between processors etc, the goal is simply how to use TS properly for it (and no the user will not be calling TSStep() themselves in the application code they will call TSSolve()).
> > > > > > > >
> > > > > > > > If the particles don't affect the field, then we can solve the field first and then the particles (or in parallel, but particles are one step back). This is easy to do with TSStep and an example is listed below.
> > > > > > > >
> > > > > > > > If there is a two-way interaction, they obviously need to be solved at the same time for accuracy. Splittings such as IMEX may be considered, but that is really problem dependent.
> > > > > > >
> > > > > > >   The particles have charges and thus affect the field.  (Of course many people "decouple" them in the integrator; we need your help to do it correctly.)
> > > > > > >
> > > > > > > >
> > > > > > > > I am familiar with atmospheric dynamics with chemical tracers (or reactive flows). Here the practice is to solve the fields (some form of NS), save fields to disk, then run the simulation for the particles (advection-diff-reaction) with the offline generated fields. We can cook something with Debo here. But this example may be boring and/or out of the realm of cool apps. Dmitry has cooler examples.
> > > > > > > >
> > > > > > > > That being said, I like having TSStep around for trying different things and playing with integrators. I've always hated the way packages like Sundials hide everything and give you only the highest-level functions, but that's just me.
> > > > > > >
> > > > > > >   No one is advocating taking away TSStep, just making it so only people who really know what they are doing start using it on the first day they start using PETSc :-)
> > > > > > >
> > > > > > > >
> > > > > > > > Emil
> > > > > > > >
> > > > > > > >>   Barry
> > > > > > > >>
> > > > > > >
> > > > >
> > > >
> > >
> 


