On Fri, Nov 11, 2011 at 3:52 PM, Juha Jäykkä <juhaj@iki.fi> wrote:
<div class="im">> What exactly are you using TAO to do?<br>
<br>
</div>I have a large scale minimisation problem related to the problem I am using<br>
PETSc to solve. The PETSc code is independent, so I could have two PETSc's<br>
around and use the older one for TAO, but that seems like a lot of hassle<br>
without significant benefit.<br>
<br>
> I know the SNES module could be used as a substitute for TAO. At the moment,
> I solve min(\int f(x)), where TAO needs the gradient of f(x); I could also
> solve the discrete grad(f(x)) = 0 with SNES, but as I then need to compute
> the Jacobian (= the Hessian of f(x)) too, this has seemed too memory
> intensive to be
> useful. I am not aware of a way around this.

There are plenty of ways to use SNES without a Mat. You can use -snes_mf, which
uses a finite-difference approximation to the action of the Jacobian. You can
use -snes_type qn, which uses a quasi-Newton approximation, or even
-snes_type nrichardson, which just uses successive substitutions with your
residual function.
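For concreteness, here is a minimal sketch of that kind of setup (not from the
original thread), written in the usual ierr/CHKERRQ style: only a residual
callback is registered with SNES, SNESSetJacobian() is never called, and the
method is picked from the options database at run time. The FormFunction body
is a hypothetical toy gradient, for f(x) = sum_i (x_i^2 - 1)^2 / 4, standing in
for the real discrete grad(f(x)).

#include <petscsnes.h>

/* Hypothetical residual: F(x) = grad f(x) for the toy energy
   f(x) = sum_i (x_i^2 - 1)^2 / 4, i.e. F_i = x_i^3 - x_i.
   Replace this with the real discrete gradient of the functional. */
static PetscErrorCode FormFunction(SNES snes, Vec X, Vec F, void *ctx)
{
  PetscErrorCode     ierr;
  const PetscScalar *x;
  PetscScalar       *f;
  PetscInt           i, n;

  ierr = VecGetLocalSize(X, &n);CHKERRQ(ierr);
  ierr = VecGetArrayRead(X, &x);CHKERRQ(ierr);
  ierr = VecGetArray(F, &f);CHKERRQ(ierr);
  for (i = 0; i < n; i++) f[i] = x[i]*x[i]*x[i] - x[i];
  ierr = VecRestoreArrayRead(X, &x);CHKERRQ(ierr);
  ierr = VecRestoreArray(F, &f);CHKERRQ(ierr);
  return 0;
}

int main(int argc, char **argv)
{
  PetscErrorCode ierr;
  SNES           snes;
  Vec            x, r;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);CHKERRQ(ierr);

  ierr = VecCreate(PETSC_COMM_WORLD, &x);CHKERRQ(ierr);
  ierr = VecSetSizes(x, PETSC_DECIDE, 100);CHKERRQ(ierr);
  ierr = VecSetFromOptions(x);CHKERRQ(ierr);
  ierr = VecDuplicate(x, &r);CHKERRQ(ierr);
  ierr = VecSet(x, 0.5);CHKERRQ(ierr);              /* initial guess */

  ierr = SNESCreate(PETSC_COMM_WORLD, &snes);CHKERRQ(ierr);
  ierr = SNESSetFunction(snes, r, FormFunction, NULL);CHKERRQ(ierr);
  /* No SNESSetJacobian(): the Jacobian (the Hessian of f) is never stored.
     Select the method on the command line, e.g.
       -snes_mf [-pc_type none]     FD approximation of the Jacobian action
       -snes_type qn                quasi-Newton approximation
       -snes_type nrichardson       successive substitution of the residual */
  ierr = SNESSetFromOptions(snes);CHKERRQ(ierr);
  ierr = SNESSolve(snes, NULL, x);CHKERRQ(ierr);

  ierr = VecDestroy(&x);CHKERRQ(ierr);
  ierr = VecDestroy(&r);CHKERRQ(ierr);
  ierr = SNESDestroy(&snes);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

Memory-wise, only a handful of work vectors are kept (qn additionally stores a
small history of correction pairs), so the Hessian never has to be assembled.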
  Thanks,

     Matt
> -Juha

-- 
What most experimenters take for granted before they begin their experiments is
infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener