<div dir="ltr">Thanks Barry - I'll try it and get back to you.<div><br></div><div>Matt: There are lots of cases where this could be a large savings. Here are a few examples:</div><div><br></div><div>1. If you have automatic differentiation. With my newest code, just computing a residual computes the Jacobian as a side effect. If you throw away that Jacobian, that's obviously a waste. If you compute one residual without computing the Jacobian (which isn't always possible, depending on how you have your automatic differentiation set up) then you still have to compute _another_ residual to compute the Jacobian... so you're directly doing twice as many residual computations as necessary.</div><div><br></div><div>2. Any time you have extremely expensive work to do "per element" that would need to be done for both the residual and Jacobian. A few examples:</div><div> - Extremely complex, heavy shape function evaluation (think super high order with first and second derivatives needing to be computed)</div><div> - Extremely heavy material property computations that need to happen at each quadrature point. Think: multiscale. Maybe you have an expensive lower-length-scale solve to do at every quadrature point (yes, we've actually done this).</div><div> - MANY coupled variables (we've run thousands). Each of those variables needs to have value, gradient, and (possibly) second derivatives computed at every quadrature point. These values are exactly the same for the residual and Jacobian.</div><div><br></div><div>These cases could be so extreme that these heavy "element" calculations actually dominate your residual/Jacobian assembly time. That would mean that by computing the residual and Jacobian simultaneously you could directly cut your assembly time in _half_. That could be significant for many applications. 
In my current application that essentially cuts the whole runtime in half (runtime is very much assembly dominated).</div><div><br></div><div>Derek</div></div><br><div class="gmail_quote"><div dir="ltr">On Fri, Dec 9, 2016 at 3:11 PM Matthew Knepley <<a href="mailto:knepley@gmail.com">knepley@gmail.com</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr" class="gmail_msg"><div class="gmail_extra gmail_msg"><div class="gmail_quote gmail_msg">On Fri, Dec 9, 2016 at 2:10 PM, Barry Smith <span dir="ltr" class="gmail_msg"><<a href="mailto:bsmith@mcs.anl.gov" class="gmail_msg" target="_blank">bsmith@mcs.anl.gov</a>></span> wrote:<br class="gmail_msg"><blockquote class="gmail_quote gmail_msg" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><span class="gmail_msg"><br class="gmail_msg">
> On Dec 9, 2016, at 1:50 PM, Derek Gaston <<a href="mailto:friedmud@gmail.com" class="gmail_msg" target="_blank">friedmud@gmail.com</a>> wrote:<br class="gmail_msg">
><br class="gmail_msg">
> Oh man! Sorry Barry! I swear I looked around before I sent the email. I should have checked the FAQ a little more closely!<br class="gmail_msg">
><br class="gmail_msg">
> I can understand the reasoning in the FAQ... but I still wonder if it might not be useful to provide all three options (Function, Jacobian, FunctionJacobian). In my case I could fill in each one to do the right thing. That way PETSc could call the "FunctionJacobian" one when it knew it needed both<br class="gmail_msg">
<br class="gmail_msg">
</span> Derek,<br class="gmail_msg">
<br class="gmail_msg">
The code literally never knows if it will need a Jacobian following the function evaluation. Yes, at the first function evaluation it will need the Jacobian unless the function norm is sufficiently small, but after that it is only a question of probabilities (which it can't know) whether it will need the Jacobian.<br class="gmail_msg">
<span class="gmail_msg"><br class="gmail_msg">
> (by default that could just farm out to the individual calls). But you guys have definitely thought a lot more about this than I have.<br class="gmail_msg">
><br class="gmail_msg">
> So, do you still recommend what's suggested in the FAQ? Save off the Jacobian computation during the residual computation and then use that when SNES asks for a Jacobian?<br class="gmail_msg">
<br class="gmail_msg">
</span> Yes, try it. I think you can get away with simply putting the new Jacobian matrix values into the same Jacobian matrix that is regularly used, so there is no need to "stash the values" somewhere else and copy them over later.<br class="gmail_msg">
<br class="gmail_msg">
I'd be interested in hearing how the performance works out: compute always versus compute only when requested.</blockquote><div class="gmail_msg"><br class="gmail_msg"></div></div></div></div><div dir="ltr" class="gmail_msg"><div class="gmail_extra gmail_msg"><div class="gmail_quote gmail_msg"><div class="gmail_msg">Can anyone write down a simple model for a concrete algorithm where this is more efficient? I would like to see the high-level reasoning.</div><div class="gmail_msg"><br class="gmail_msg"></div><div class="gmail_msg"> Thanks,</div><div class="gmail_msg"><br class="gmail_msg"></div><div class="gmail_msg"> Matt</div></div></div></div><div dir="ltr" class="gmail_msg"><div class="gmail_extra gmail_msg"><div class="gmail_quote gmail_msg"><div class="gmail_msg"> </div><blockquote class="gmail_quote gmail_msg" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><span class="m_-2227800430047974988HOEnZb gmail_msg"><font color="#888888" class="gmail_msg"><br class="gmail_msg">
Barry<br class="gmail_msg">
</font></span><div class="m_-2227800430047974988HOEnZb gmail_msg"><div class="m_-2227800430047974988h5 gmail_msg"><br class="gmail_msg">
> In the case of automatic differentiation this could make a pretty huge difference in time...<br class="gmail_msg">
><br class="gmail_msg">
> Derek<br class="gmail_msg">
><br class="gmail_msg">
> On Fri, Dec 9, 2016 at 1:49 PM Barry Smith <<a href="mailto:bsmith@mcs.anl.gov" class="gmail_msg" target="_blank">bsmith@mcs.anl.gov</a>> wrote:<br class="gmail_msg">
><br class="gmail_msg">
> Sorry, the title in the FAQ is a bit tongue-in-cheek.<br class="gmail_msg">
><br class="gmail_msg">
> <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html#functionjacobian" rel="noreferrer" class="gmail_msg" target="_blank">http://www.mcs.anl.gov/petsc/documentation/faq.html#functionjacobian</a><br class="gmail_msg">
><br class="gmail_msg">
><br class="gmail_msg">
> > On Dec 9, 2016, at 12:45 PM, Derek Gaston <<a href="mailto:friedmud@gmail.com" class="gmail_msg" target="_blank">friedmud@gmail.com</a>> wrote:<br class="gmail_msg">
> ><br class="gmail_msg">
> > Is there a way to tell SNES to simultaneously compute both the residual and the Jacobian in one callback?<br class="gmail_msg">
> ><br class="gmail_msg">
> > My code can compute both simultaneously and it will be more efficient (think FE where you can reuse the shape-functions, variables, material properties, etc. for both residual and Jacobian computation). In addition, I also have automatic differentiation as an option which _definitely_ computes both efficiently (and actually computes residuals, by themselves, much slower).<br class="gmail_msg">
> ><br class="gmail_msg">
> > I was thinking that I may just save off the Jacobian whenever the initial residual computation is asked for by SNES... and then just return that Jacobian when SNES asks for it. This may be a bit dicey though as SNES can ask for residual computations at many different points during the solve.<br class="gmail_msg">
> ><br class="gmail_msg">
> > Thanks for any help!<br class="gmail_msg">
> ><br class="gmail_msg">
> > Derek<br class="gmail_msg">
><br class="gmail_msg">
<br class="gmail_msg">
</div></div></blockquote></div></div></div><div dir="ltr" class="gmail_msg"><div class="gmail_extra gmail_msg"><br class="gmail_msg"><br clear="all" class="gmail_msg"><div class="gmail_msg"><br class="gmail_msg"></div>-- <br class="gmail_msg"><div class="m_-2227800430047974988gmail_signature gmail_msg" data-smartmail="gmail_signature">What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br class="gmail_msg">-- Norbert Wiener</div>
</div></div></blockquote></div>
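[Editor's note] The stash-and-reuse scheme discussed in this thread (compute the Jacobian as a side effect of the residual evaluation, then hand the cached matrix back when the solver asks for it) can be sketched in a few lines. This is a hypothetical illustration in plain Python/NumPy, not PETSc's actual SNES API; the class, function, and variable names here are made up for illustration:

```python
import numpy as np

class ResidualJacobianCache:
    """Compute residual and Jacobian together; serve the Jacobian from cache."""

    def __init__(self, f, J):
        self._f, self._J = f, J
        self._x_cached = None
        self._J_cached = None

    def residual(self, x):
        # One pass over the "elements" produces both F(x) and J(x); in a real
        # FE code this is where shape functions, material properties, coupled
        # variable values, etc. would be reused for both computations.
        r = self._f(x)
        self._J_cached = self._J(x)
        self._x_cached = np.array(x, copy=True)
        return r

    def jacobian(self, x):
        # Serve the stashed Jacobian if it was computed at this same point;
        # otherwise fall back to a fresh (expensive) assembly.
        if self._x_cached is not None and np.array_equal(x, self._x_cached):
            return self._J_cached
        return self._J(x)

def newton(cache, x0, tol=1e-10, maxit=25):
    """Basic Newton iteration using the residual/Jacobian cache."""
    x = np.array(x0, dtype=float)
    for _ in range(maxit):
        r = cache.residual(x)      # also computes and stashes J(x)
        if np.linalg.norm(r) < tol:
            break
        J = cache.jacobian(x)      # cache hit: no second assembly pass
        x = x - np.linalg.solve(J, r)
    return x

# Tiny demo system: x0^2 + x1^2 = 2 and x0 = x1, with root (1, 1).
f = lambda x: np.array([x[0]**2 + x[1]**2 - 2.0, x[0] - x[1]])
J = lambda x: np.array([[2.0 * x[0], 2.0 * x[1]], [1.0, -1.0]])
root = newton(ResidualJacobianCache(f, J), [2.0, 0.5])
```

As a rough cost model for Matt's question: if the shared per-element setup costs s and the residual- and Jacobian-specific work cost r and j, separate callbacks pay roughly 2s + r + j per Newton step while the combined evaluation pays s + r + j, so the savings approaches half as s dominates (which matches the extreme cases Derek lists above).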