[petsc-users] strange PETSc/KSP GMRES timings for MPI+OMP configuration on KNLs

Damian Kaliszan damian at man.poznan.pl
Fri Jun 16 07:57:10 CDT 2017


Hi,

For several days I've been trying to figure out what is going wrong
with the timings of my Python app, which solves Ax=b with the KSP (GMRES) solver, when running on Intel KNL 7210/7230 nodes.
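For context, the solver part of the app follows the usual petsc4py pattern. A minimal sketch of that pattern is below; it is simplified, with a placeholder tridiagonal matrix instead of my real A, and it just honours whatever -ksp_* options are passed on the command line:

    from petsc4py import PETSc

    n = 1000
    A = PETSc.Mat().createAIJ([n, n])   # distributed AIJ matrix
    A.setUp()
    rstart, rend = A.getOwnershipRange()
    for i in range(rstart, rend):
        # placeholder diagonally dominant tridiagonal entries
        if i > 0:
            A[i, i - 1] = -1.0
        A[i, i] = 2.5
        if i < n - 1:
            A[i, i + 1] = -1.0
    A.assemble()

    b = A.createVecLeft()
    b.set(1.0)
    x = A.createVecRight()

    ksp = PETSc.KSP().create()
    ksp.setOperators(A)
    ksp.setType(PETSc.KSP.Type.GMRES)
    ksp.setFromOptions()          # pick up -ksp_* / -log_view options
    ksp.solve(b, x)

    PETSc.Sys.Print("iterations: %d" % ksp.getIterationNumber())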

I downsized the problem to a 1000x1000 A matrix on a single node and
observed the following:


I'm attaching two extreme timings whose configurations differ by only one OpenMP thread (64 MPI ranks / 1 OMP thread vs. 64 MPI ranks / 2 OMP threads),
SLURM job ids 23321 vs. 23325.

Any help will be appreciated.

Best,
Damian
Attachments:
- int_1.jpg (timing plot, image/jpeg, 92539 bytes): <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20170616/7ddbf65b/attachment-0001.jpg>
- slurm-23321.out (39057 bytes): <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20170616/7ddbf65b/attachment-0002.obj>
- slurm-23325.out (39054 bytes): <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20170616/7ddbf65b/attachment-0003.obj>

