On Fri, Dec 2, 2011 at 6:20 PM, Dave Nystrom <span dir="ltr"><<a href="mailto:Dave.Nystrom@tachyonlogic.com">Dave.Nystrom@tachyonlogic.com</a>></span> wrote:<br><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex;">
I never received any reply to this question but would very much appreciate<br>
one. Not sure if it fell through the cracks.<br></blockquote><div><br></div><div>You can do whatever you want in PCSHELL. I would look at sacusp, since it does about</div><div>what you want.</div><div><br></div><div> Matt</div>
<div> </div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex;">
Thanks,<br>
<br>
Dave<br>
<br>
Dave Nystrom writes:<br>
> I have a 2D resistive MHD code interfaced to PETSc. The code performs seven<br>
> different linear solves per timestep, and these solves consume around 95<br>
> percent of the run time for a reasonably small grid of 100x301. The run time<br>
> is dominated by one particularly difficult linear solve, which by itself<br>
> takes about 80 percent of the total run time. All of these linear systems<br>
> are symmetric. I am currently able to run a simulation using Jacobi<br>
> preconditioning with conjugate gradient on a GPU; this is the fastest<br>
> solution I get so far, but the iteration count for the difficult system<br>
> ranges from a minimum of 771 to a maximum of 47300.<br>
><br>
> We also have a native CG solver in the code that uses a full Cholesky band<br>
> solve of the inner set of bands. That solver has a much better iteration<br>
> count, although it takes a bit over 2x the run time of PETSc with Jacobi<br>
> and CUSP. So I have been interested in writing a custom PETSc<br>
> preconditioner that performs this Cholesky solve, using the PCSHELL<br>
> capability of PETSc, and in doing the Cholesky solve on the GPU with a<br>
> CULA Sparse band solver. However, I'm wondering whether the PETSc PCSHELL<br>
> capability only runs on the CPU and whether I would need a GPU analog such<br>
> as a PCGPUSHELL capability. If so, is there a possibility that PCGPUSHELL<br>
> might be added to PETSc in the near term?<br>
><br>
> Thanks,<br>
><br>
> Dave<br>
><br>
> --<br>
> Dave Nystrom<br>
><br>
> phone: <a href="tel:505-661-9943" value="+15056619943">505-661-9943</a> (home office)<br>
> <a href="tel:505-662-6893" value="+15056626893">505-662-6893</a> (home)<br>
> skype: dave.nystrom76<br>
> email: <a href="mailto:dnystrom1@comcast.net">dnystrom1@comcast.net</a><br>
> smail: 219 Loma del Escolar<br>
> Los Alamos, NM 87544<br>
</blockquote></div><br><br clear="all"><div><br></div>-- <br>What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>
-- Norbert Wiener<br>