Yes, very strange. I tested it with Intel MPI and ParastationMPI, both available on the cluster.
The output log I sent may show something interesting(?)

Best,
Damian
<div class="gmail_quote" >W dniu 19 cze 2017, o 19:53, użytkownik Satish Balay <<a href="mailto:balay@mcs.anl.gov" target="_blank">balay@mcs.anl.gov</a>> napisał:
<blockquote class="gmail_quote" style="margin: 0pt 0pt 0pt 0.8ex; border-left: 1px solid rgb(204, 204, 204); padding-left: 1ex;">
<pre class="blue">MPI=16 OMP=1 time=45.62.<br><br>This timing [without OpenMP] looks out of place. Perhaps something<br>else [wierd MPI behavior?] is going on here..<br><br>Satish<br><br>On Fri, 16 Jun 2017, Damian Kaliszan wrote:<br><br><blockquote class="gmail_quote" style="margin: 0pt 0pt 1ex 0.8ex; border-left: 1px solid #729fcf; padding-left: 1ex;"> Hi,<br> <br> For several days I've been trying to figure out what is going wrong<br> with my python app timings solving Ax=b with KSP (GMRES) solver when trying to run on Intel's KNL 7210/7230.<br> <br> I downsized the problem to 1000x1000 A matrix and a single node and<br> observed the following:<br> <br> <br> I'm attaching 2 extreme timings where configurations differ only by 1 OMP thread (64MPI/1 OMP vs 64/2 OMPs),<br> 23321 vs 23325 slurm task ids.<br> <br> Any help will be appreciated....<br> <br> Best,<br> Damian<br> <br></blockquote></pre></blockquote></div></body></html>