[petsc-users] About parallel performance

Barry Smith bsmith at mcs.anl.gov
Thu May 29 14:12:00 CDT 2014


   You need to determine where the other 80% of the time is spent. My guess is that it is in setting the values into the matrix each time. Use PetscLogEventRegister() and put a PetscLogEventBegin/End() pair around the code that computes all the entries in the matrix and calls MatSetValues() and MatAssemblyBegin/End().
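
   For reference, a minimal sketch of that instrumentation (the event name "MatFill" and the variable fill_event are placeholders, not names from your code):

     PetscLogEvent fill_event;                        /* hypothetical event handle */
     PetscLogEventRegister("MatFill", MAT_CLASSID, &fill_event);

     PetscLogEventBegin(fill_event, 0, 0, 0, 0);
     /* ... compute the matrix entries, call MatSetValues(), then
            MatAssemblyBegin()/MatAssemblyEnd() ... */
     PetscLogEventEnd(fill_event, 0, 0, 0, 0);

   The event then shows up as its own line in the -log_summary output, so its share of the total time can be read off directly.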

   Likely the reason the linear solver does not scale better is that you have a machine with multiple cores that share the same memory bandwidth, and the first core is already using well over half of that bandwidth, so the second core cannot be fully utilized since both cores have to wait for data to arrive from memory.  If you are using the development version of PETSc you can run make streams NPMAX=2 from the PETSc root directory and send us the output so we can confirm this.

   Barry


On May 29, 2014, at 1:23 PM, Qin Lu <lu_qin_2000 at yahoo.com> wrote:

> Hello,
> 
> I implemented the PETSc parallel linear solver in a program; the implementation is basically the same as /src/ksp/ksp/examples/tutorials/ex2.c, i.e., I preallocated the MatMPIAIJ and let PETSc partition the matrix through MatGetOwnershipRange. However, a few tests show the parallel solver is always a little slower than the serial solver (I have excluded the matrix generation CPU time).
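> 
> (For context, a minimal sketch of this setup along the lines of ex2.c; the global size n, the preallocation counts, and the loop body below are illustrative placeholders rather than the actual code.)
> 
>   Mat      A;
>   PetscInt n = 200000, i, Istart, Iend;
> 
>   /* Parallel AIJ matrix with rough preallocation: ~5 nonzeros per row in the
>      diagonal block and ~2 in the off-diagonal block (illustrative numbers). */
>   MatCreate(PETSC_COMM_WORLD, &A);
>   MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);
>   MatSetType(A, MATMPIAIJ);
>   MatMPIAIJSetPreallocation(A, 5, NULL, 2, NULL);
> 
>   /* PETSc chooses the row partitioning; each process sets only its own rows. */
>   MatGetOwnershipRange(A, &Istart, &Iend);
>   for (i = Istart; i < Iend; i++) {
>     /* ... MatSetValues() for row i ... */
>   }
>   MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
>   MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);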
> 
> For the serial run I used PCILU as the preconditioner; for the parallel run, I used ASM with ILU(0) on each subblock (-sub_pc_type ilu -sub_ksp_type preonly -ksp_type bcgs -pc_type asm). The number of unknowns is around 200,000.
>  
> I have used -log_summary to print out the performance summary as attached (log_summary_p1 for the serial run and log_summary_p2 for the run with 2 processes). It seems that KSPSolve accounts for less than 20% of Global %T.
> My questions are:
>  
> 1. What is the bottleneck of the parallel run according to the summary?
> 2. Do you have any suggestions to improve the parallel performance?
>  
> Thanks a lot for your suggestions!
>  
> Regards,
> Qin    <log_summary_p1.txt><log_summary_p2.txt>


