[petsc-users] strange PETSc/KSP GMRES timings for MPI+OMP configuration on KNLs
Satish Balay
balay at mcs.anl.gov
Mon Jun 19 12:53:36 CDT 2017
> MPI=16 OMP=1 time=45.62
This timing [without OpenMP] looks out of place. Perhaps something
else [weird MPI behavior?] is going on here..
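[Not part of the original thread, just a suggestion:] one way to see where the time differs between the two configurations is to have the Python app pass PETSc's -log_view option, so each run prints a per-event timing and communication summary at PetscFinalize. A minimal petsc4py sketch, assuming the app initializes PETSc itself:

    # Sketch: enable -log_view from a petsc4py application so the
    # MPI-only and MPI+OMP runs can be compared event by event.
    import sys
    import petsc4py
    petsc4py.init(sys.argv + ['-log_view'])  # must run before importing PETSc below
    from petsc4py import PETSc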
Satish
On Fri, 16 Jun 2017, Damian Kaliszan wrote:
> Hi,
>
> For several days I've been trying to figure out what is going wrong
> with the timings of my Python app, which solves Ax=b with the KSP (GMRES) solver, when running on Intel KNL 7210/7230 nodes (see the sketch after this message).
>
> I downsized the problem to a 1000x1000 A matrix and a single node and
> observed the following:
>
>
> I'm attaching two extreme timings where the configurations differ only by one OMP thread (64 MPI/1 OMP vs 64 MPI/2 OMP),
> Slurm task IDs 23321 vs 23325.
>
> Any help will be appreciated....
>
> Best,
> Damian
>
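For reference, here is a minimal petsc4py sketch of the kind of solve described above (a 1000x1000 system solved with KSP set to GMRES, with the solve timed). This is not Damian's application; the tridiagonal test matrix, right-hand side, and default options are placeholder assumptions:

    import sys
    import time
    import petsc4py
    petsc4py.init(sys.argv)
    from petsc4py import PETSc

    n = 1000
    A = PETSc.Mat().createAIJ([n, n])     # sparse AIJ matrix, default parallel layout
    A.setUp()
    rstart, rend = A.getOwnershipRange()
    for i in range(rstart, rend):         # placeholder tridiagonal test matrix
        A.setValue(i, i, 2.0)
        if i > 0:
            A.setValue(i, i - 1, -1.0)
        if i < n - 1:
            A.setValue(i, i + 1, -1.0)
    A.assemblyBegin()
    A.assemblyEnd()

    x, b = A.createVecs()
    b.set(1.0)

    ksp = PETSc.KSP().create()
    ksp.setOperators(A)
    ksp.setType('gmres')
    ksp.setFromOptions()                  # honour -ksp_* / -pc_* command-line options

    t0 = time.time()
    ksp.solve(b, x)
    PETSc.Sys.Print('solve time: %.2f s, iterations: %d'
                    % (time.time() - t0, ksp.getIterationNumber()))

Running this under both layouts (e.g. 64 MPI ranks with OMP_NUM_THREADS=1 vs 2) gives a direct comparison of the solve times.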