<div dir="ltr"><div class="gmail_extra"><div class="gmail_quote">On Wed, May 13, 2015 at 2:54 PM, Justin Chang <span dir="ltr"><<a href="mailto:jychang48@gmail.com" target="_blank">jychang48@gmail.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div><div><div>Hello everyone,<br><br></div>From what I read online, one can use a matrix-free method when solving a symmetric positive definite problem with the conjugate gradient method. Some of the PETSc SNES examples like ex12 compute the Jacobian A explicitly, but I was wondering if it's possible to convert this problem into matrix-free form. That is, to compute the action A*x directly as a vector, without first assembling A and then invoking MatMult().<br></div></div></div></blockquote><div><br></div><div>Yes, Jed advocates this all the time, but there is some infrastructure that you want for this</div><div>to be really effective. For quadratic and higher orders, it makes sense to compute the action</div><div>of your operator matrix-free (MF). This is done by defining something that looks a lot like your residual</div><div>evaluation (element vec in and out) but computes the linearization. However, you still usually</div><div>need a low-order explicit matrix for preconditioning. I have not spent time hooking that all together.</div><div><br></div><div> Thanks,</div><div><br></div><div> Matt</div><div> </div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div><div></div>Thanks,<br></div>Justin<br></div>
</blockquote></div><br><br clear="all"><div><br></div>-- <br><div class="gmail_signature">What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>-- Norbert Wiener</div>
</div></div>
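[Editor's note] In PETSc, the mechanism for what the thread describes is a shell matrix: MatCreateShell() plus MatShellSetOperation() with MATOP_MULT register a user routine that KSP's CG calls in place of MatMult() on an assembled matrix. As a library-agnostic sketch of the same idea, here is matrix-free CG in plain NumPy; the 1D Laplacian stencil, sizes, and function names are illustrative assumptions, not taken from the thread:

```python
import numpy as np

def cg_matfree(apply_A, b, rtol=1e-10, maxit=1000):
    """Conjugate gradients needing only the action x -> A*x (A SPD)."""
    x = np.zeros_like(b)
    r = b - apply_A(x)          # initial residual
    p = r.copy()                # initial search direction
    rr = r @ r
    bnorm = np.linalg.norm(b)
    for _ in range(maxit):
        Ap = apply_A(p)         # the only place A is "used"
        alpha = rr / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rr_new = r @ r
        if np.sqrt(rr_new) <= rtol * bnorm:
            break
        p = r + (rr_new / rr) * p
        rr = rr_new
    return x

n = 100

def apply_laplace(x):
    # y = A*x for the SPD tridiagonal [-1, 2, -1] stencil,
    # computed stencil-wise without ever forming A.
    y = 2.0 * x
    y[1:] -= x[:-1]
    y[:-1] -= x[1:]
    return y

b = np.ones(n)
x = cg_matfree(apply_laplace, b)
```

As Matt notes, this covers the Krylov side only; most preconditioners (e.g. ILU or AMG) still need an explicitly assembled matrix, which is why a low-order assembled operator is typically kept alongside the matrix-free high-order one.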