<div dir="ltr">Thanks Jed. </div><div class="gmail_extra"><br><br><div class="gmail_quote">On Fri, Jan 3, 2014 at 4:14 PM, Jed Brown <span dir="ltr"><<a href="mailto:jedbrown@mcs.anl.gov" target="_blank">jedbrown@mcs.anl.gov</a>></span> wrote:<br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div class="HOEnZb"><div class="h5">John Mousel <<a href="mailto:john.mousel@gmail.com">john.mousel@gmail.com</a>> writes:<br>
<br>
> I would like to investigate using pipelined GMRES and IBCGS on Kraken,
> but it seems from the FAQ page that Cray MPT-5.6 is required for the
> pipelined methods to be effective. Checking module avail, I see that
> they only have up to MPT-5.3.6. Is there any benefit at all to these
> methods if I can't configure with MPT-5.6?

IBCGS does not use pipelining, so the MPT version is irrelevant. For
pipelined GMRES/CG/CR, you need MPT-5.6 or later to use the asynchronous
interface (MPI_Iallreduce). Without that, the pipelined methods will
still do fewer reductions, but those reductions will be synchronous
instead of asynchronous, so they are unlikely to pay off.
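
Schematically, the pattern the pipelined methods rely on looks like the
sketch below. This is generic MPI code, not PETSc's actual
implementation; the buffer names and the "local work" in the middle are
placeholders for the solver's matrix-vector product and other local
kernels.

    #include <mpi.h>

    /* Generic sketch of the split-phase reduction used by pipelined
       Krylov methods; names and the local work are placeholders. */
    static void pipelined_step(double local_dot, MPI_Comm comm)
    {
      double      global_dot;
      MPI_Request req;

      /* Start the reduction without waiting (MPI-3, i.e. MPT >= 5.6). */
      MPI_Iallreduce(&local_dot, &global_dot, 1, MPI_DOUBLE, MPI_SUM,
                     comm, &req);

      /* ... overlap: apply the matrix and do other local work here
         while the reduction (ideally) progresses in the background ... */

      /* Block for the reduction result only when it is actually needed. */
      MPI_Wait(&req, MPI_STATUS_IGNORE);
      (void)global_dot;  /* the solver would use global_dot here */
    }

With a blocking MPI_Allreduce in the same place, the reductions are
fewer but each one still stalls every rank, which is why the nonblocking
interface in MPT-5.6 matters.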

Unfortunately, even MPT-5.6 does not manage to make significant
asynchronous progress, so when you use MPI_Iallreduce, almost nothing
happens until you call MPI_Wait. The Cray MPI team says they think this
is a software/implementation problem rather than a hardware problem, and
they are hopeful that they'll be able to release a new MPT that does a
good job of making asynchronous progress.
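
As a purely generic illustration of the progress problem (not a
description of what PETSc or MPT actually does), a common workaround
when an implementation makes no asynchronous progress is to poll the
outstanding request with MPI_Test between chunks of local work, which
gives the MPI library a chance to advance the operation:

    #include <mpi.h>

    /* Illustration only: if the library makes no asynchronous progress,
       a reduction started with MPI_Iallreduce can sit idle until
       MPI_Wait.  Polling with MPI_Test between chunks of local work
       lets the library advance it, at the cost of extra MPI calls. */
    static void poll_until_done(MPI_Request *req)
    {
      int done = 0;
      while (!done) {
        /* ... do a small chunk of local work ... */
        MPI_Test(req, &done, MPI_STATUS_IGNORE);
      }
    }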

This probably means that PGMRES won't provide large benefits until
Cray's MPI team writes some code and you switch to a machine that uses
that new code.