[petsc-users] Direct PETSc to use MCDRAM on KNL and other optimizations for KNL

Sajid Ali sajidsyed2021 at u.northwestern.edu
Wed Feb 27 16:15:13 CST 2019


Hi,

I ran a TS integrator for 25 steps on a Broadwell Xeon and on a Xeon Phi (KNL).
The problem size is 5000x5000 and I'm using complex scalars (scalar=complex).

The program takes 125 seconds on the Xeon but 451 seconds on the KNL!

The first thing I want to change is to have the program on KNL allocate its
memory in MCDRAM instead of DRAM. I ran the problem in an interactive SLURM
job and specified -C quad,flat, yet I see that DRAM is still being used.
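From what I've read, in quad,flat mode the MCDRAM shows up as a separate NUMA
node (often node 1), so allocations have to be directed there explicitly, e.g.
with numactl. A rough sketch of what I think the launch should look like (the
node number and the executable name below are just placeholders, not from my
actual job script):

    # Check the NUMA layout on the KNL node; in flat mode MCDRAM is usually node 1
    numactl -H

    # Bind all allocations to MCDRAM (strict: fails once the 16 GB of MCDRAM is exhausted)
    srun numactl --membind=1 ./my_app <petsc options>

    # Or prefer MCDRAM and fall back to DDR4 when it fills up
    srun numactl --preferred=1 ./my_app <petsc options>

Is that the right way to do it, or is there a PETSc- or SLURM-level option I'm
missing?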

I'm attaching the PETSc log files and the Intel APS reports as well. Any help
on how I should change my runtime parameters on KNL would be highly
appreciated. Thanks in advance.

-- 
Sajid Ali
Applied Physics
Northwestern University
-------------- next part --------------
A non-text attachment was scrubbed...
Name: bdw_petsc
Type: application/octet-stream
Size: 24909 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20190227/b37859ea/attachment-0002.obj>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: knl_petsc
Type: application/octet-stream
Size: 24296 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20190227/b37859ea/attachment-0003.obj>

