[petsc-users] Assistance Needed with PETSc KSPSolve Performance Issue

Yongzhong Li yongzhong.li at mail.utoronto.ca
Wed Jun 12 17:36:23 CDT 2024


Dear PETSc developers,
I hope this email finds you well.
I am currently working on a project using PETSc and have encountered a performance issue with the KSPSolve function. Specifically, I have noticed that the time taken by KSPSolve is almost twice the CPU time of a single matrix-vector product multiplied by the number of iteration steps. I use C++ chrono to record the CPU times.
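Roughly, the measurement looks like the simplified sketch below; UserMatMult (declared elsewhere) and the vectors stand in for my actual shell operator and data, and error handling is abbreviated:

#include <petscksp.h>
#include <chrono>

/* Compare the wall time of one shell mat-vec, scaled by the iteration
   count, against the wall time of the full KSPSolve. */
static PetscErrorCode CompareTimings(KSP ksp, Mat A, Vec b, Vec x, Vec y)
{
  using clock = std::chrono::steady_clock;
  PetscInt its;

  PetscFunctionBeginUser;
  auto t0 = clock::now();
  PetscCall(MatMult(A, x, y)); /* one application of the shell operator */
  double t_matvec = std::chrono::duration<double>(clock::now() - t0).count();

  t0 = clock::now();
  PetscCall(KSPSolve(ksp, b, x));
  double t_solve = std::chrono::duration<double>(clock::now() - t0).count();

  PetscCall(KSPGetIterationNumber(ksp, &its));
  PetscCall(PetscPrintf(PETSC_COMM_WORLD,
      "KSPSolve: %g s  vs  %" PetscInt_FMT " iterations x %g s/mat-vec = %g s\n",
      t_solve, its, t_matvec, its * t_matvec));
  PetscFunctionReturn(0);
}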
For context, I am using a shell system matrix A. Despite my efforts to parallelize the matrix-vector product (Ax), when multiple threads are used the overall solve time remains higher than the per-iteration matrix-vector product time would suggest. Here are a few details of my setup (a condensed setup sketch follows the list):

  *   Matrix Type: Shell system matrix
  *   Preconditioner: Shell PC
  *   Parallel Environment: Intel MKL as PETSc’s BLAS/LAPACK library, with multithreading enabled
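For reference, this is roughly how the shell operator and preconditioner are wired up; UserMatMult, UserPCApply, and the problem sizes are placeholders standing in for my actual code:

#include <petscksp.h>

/* Placeholder callbacks standing in for my actual matrix-free operator and
   preconditioner; the real ones call into multithreaded MKL kernels. */
PetscErrorCode UserMatMult(Mat A, Vec x, Vec y);
PetscErrorCode UserPCApply(PC pc, Vec x, Vec y);

static PetscErrorCode SetupSolver(MPI_Comm comm, PetscInt n, void *userctx, KSP *ksp)
{
  Mat A;
  PC  pc;

  PetscFunctionBeginUser;
  /* Shell system matrix: PETSc only ever sees the apply callback. */
  PetscCall(MatCreateShell(comm, n, n, PETSC_DETERMINE, PETSC_DETERMINE, userctx, &A));
  PetscCall(MatShellSetOperation(A, MATOP_MULT, (void (*)(void))UserMatMult));

  PetscCall(KSPCreate(comm, ksp));
  PetscCall(KSPSetOperators(*ksp, A, A));

  /* Shell preconditioner with a user-supplied apply routine. */
  PetscCall(KSPGetPC(*ksp, &pc));
  PetscCall(PCSetType(pc, PCSHELL));
  PetscCall(PCShellSetApply(pc, UserPCApply));

  PetscCall(KSPSetFromOptions(*ksp));
  PetscFunctionReturn(0);
}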
I have considered several potential reasons, such as preconditioner setup, additional solver operations, and the inherent overhead of using a shell system matrix. However, since KSPSolve is a high-level API, I have been unable to pinpoint the exact cause of the increased solve time.
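Would wrapping the shell callbacks in a registered log event and running with -log_view be the right way to attribute the time spent inside KSPSolve? Something like the sketch below, where UserMatVec and RegisterUserEvents are just illustrative names:

#include <petscksp.h>

static PetscLogEvent USER_MATVEC_EVENT;

/* Call once after PetscInitialize(), then run the program with -log_view so
   the shell mat-vec appears as its own line next to KSPSolve and PCApply. */
PetscErrorCode RegisterUserEvents(void)
{
  PetscFunctionBeginUser;
  PetscCall(PetscLogEventRegister("UserMatVec", MAT_CLASSID, &USER_MATVEC_EVENT));
  PetscFunctionReturn(0);
}

PetscErrorCode UserMatMult(Mat A, Vec x, Vec y)
{
  PetscFunctionBeginUser;
  PetscCall(PetscLogEventBegin(USER_MATVEC_EVENT, 0, 0, 0, 0));
  /* ... actual matrix-free apply goes here ... */
  PetscCall(PetscLogEventEnd(USER_MATVEC_EVENT, 0, 0, 0, 0));
  PetscFunctionReturn(0);
}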
Have you observed the same issue? Could you please share some guidance on how to diagnose and address this performance discrepancy? Any insights or recommendations you could offer would be greatly appreciated.
Thank you for your time and assistance.
Best regards,
Yongzhong
-----------------------------------------------------------
Yongzhong Li
PhD student | Electromagnetics Group
Department of Electrical & Computer Engineering
University of Toronto
http://www.modelics.org
