On Thu, Apr 9, 2015 at 3:17 AM, Patrick Farrell <patrick.farrell@maths.ox.ac.uk> wrote:

> Hi Matt, Jason, and everyone else,
>
> I have a big system (for two variables, u and alpha) and I'd like to solve this
> by "alternate minimisation": block Gauss-Seidel-Newton, first solving for u and
> then for alpha ("smaller systems"). I'd like to write the code in a generic way,
> like a nonlinear fieldsplit, and in an ideal world contribute this back
> upstream. It would be very useful for lots of things (like PDE-constrained
> optimisation, where you cycle through forward - adjoint - gradient a few times
> before starting Newton).
>
> I have the first draft (in Python) written and working; it takes in a list of
> index sets, creates sub-SNESes for each subproblem, and wires them up with
> residual and Jacobian functions that take the relevant subsets. I've only
> implemented the multiplicative version so far, although implementing the
> additive variant would be straightforward.

So your code is more like Fieldsplit than the SNESMultiblock?
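
For concreteness, a multiplicative loop of the kind being described might look
roughly like the petsc4py sketch below. This is illustrative only, not Patrick's
actual draft: F is assumed to be a callback evaluating the full coupled residual,
index_sets is the list of ISes defining the blocks, and each block solve still
needs its own Jacobian (e.g. a submatrix of the big Jacobian) or -snes_mf.

    from petsc4py import PETSc

    def gauss_seidel_newton(F, x, index_sets, sweeps=5):
        for _ in range(sweeps):
            for iset in index_sets:             # multiplicative: one block at a time
                def sub_residual(snes, xsub, fsub, iset=iset):
                    # push the block iterate into the global state, evaluate
                    # the full residual, and keep only this block's rows
                    xb = x.getSubVector(iset)
                    xsub.copy(xb)
                    x.restoreSubVector(iset, xb)
                    f = x.duplicate()
                    F(x, f)
                    fb = f.getSubVector(iset)
                    fb.copy(fsub)
                    f.restoreSubVector(iset, fb)

                # block initial guess = current global values on this index set
                xb = x.getSubVector(iset)
                xsub = xb.copy()
                x.restoreSubVector(iset, xb)

                snes = PETSc.SNES().create(comm=x.getComm())
                snes.setFunction(sub_residual, xsub.duplicate())
                snes.setFromOptions()           # supply a Jacobian or -snes_mf here
                snes.solve(None, xsub)          # Newton on the sub-problem

                xb = x.getSubVector(iset)       # accept the update so the next
                xsub.copy(xb)                   # block sees the new values
                x.restoreSubVector(iset, xb)
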
> I'm stuck on one point: I'd like to do this matrix-free, where the Jacobian of
> the big problem is a shell matrix (for deflation, but also for time-dependent
> PDE-constrained optimisation). It's straightforward (or will be when Jason's
> commits land) to set up the sub-Jacobians with MatCreateSubMatrix. But I'm not
> sure how to precondition the small Jacobian arising in applying Newton to each
> subproblem. Given a preconditioner object for the big coupled system, is there a
> way to "notify" the PC to say: here is your operator, which is the big Jacobian
> with an index set attached? In my case with deflation, if the PC object could
> know the right index set, I can construct an effective preconditioner for the
> problem; I just don't know how to communicate that information from the solver
> to the PC.
>
> The same question arises in matrix-free preconditioning of variational
> inequalities with active set methods, so I thought you might have thought about
> this before.

Just so I understand, you are talking about a method that projects the problem onto
a subspace, so

  F(u) = 0  ==>  P^T F(u) = 0

which I assume has Jacobian

  P^T F' P

and you would do something like

  P^T \hat J P

so you want to know P when calculating a MF PC. Is this right? I can see two
related ways to do this. First, the information about P should be available from
the solver, which you get passed in the PC construction. Second, we could have
a mode for these projected solvers where the Jacobian PC is also projected.

  Thanks,

    Matt
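
One concrete way to get the index set from the solver side to the PC is to let the
operator itself carry it: attach the IS to the sub-Jacobian when it is created, and
have the PC pull it back off its operator during setup. The sketch below is
illustrative only, uses current petsc4py names, and assumes petsc4py's
Object.compose/query wrappers around PetscObjectCompose/Query; the key
"__block_is__" is arbitrary.

    from petsc4py import PETSc

    def make_block_jacobian(Jbig, iset):
        Jsub = Jbig.createSubMatrix(iset, iset)  # or a MatShell wrapping Jbig
        Jsub.compose("__block_is__", iset)       # attach the IS to the operator
        return Jsub

    class BlockAwarePC(object):
        def setUp(self, pc):
            A, P = pc.getOperators()
            self.iset = P.query("__block_is__")  # the IS travels with the operator
        def apply(self, pc, x, y):
            x.copy(y)                            # placeholder: build the real block
                                                 # preconditioner from self.iset

    # wiring it up on the sub-solver's KSP:
    # pc = sub_snes.getKSP().getPC()
    # pc.setType(PETSc.PC.Type.PYTHON)
    # pc.setPythonContext(BlockAwarePC())

A PCShell written in C could do the same thing by calling PetscObjectQuery() on its
operator inside the shell's setup routine.
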
> Thanks for your advice!
>
> Patrick

-- 
What most experimenters take for granted before they begin their experiments is
infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener