[petsc-users] ISDestroy performance

Barry Smith bsmith at mcs.anl.gov
Wed Feb 24 13:45:37 CST 2016

  Yikes, you are good at finding the bad hidden backwaters of PETSc that kill performance.

  Edit src/sys/dll/reg.c and look for the lines

#if defined(PETSC_USE_LOG)
    /* add this new list to list of all lists */
    if (!dlallhead) {
      dlallhead        = *fl;
      (*fl)->next_list = 0;
    } else {
      ne               = dlallhead;
      dlallhead        = *fl;
      (*fl)->next_list = ne;
    }
#endif

   Remove them. Let us know if this works.

   This logging of functions is completely non-scalable for large numbers of PETSc objects.

   Thanks for sending the trace; it made it easy for me to determine the problem.


> On Feb 24, 2016, at 12:48 PM, Bhalla, Amneet Pal S <amneetb at live.unc.edu> wrote:
> Hi Folks,
> In our MG algorithm for Stokes like system we are using PETSc solvers at each level for smoothers. In particular, we are using PCASM and PCFIELDSPLIT as smoothers (with 1/2 iterations of Chebyshev or Richardson solvers).
> We are also defining our own domains (IS'es) for these two PCs. The solvers at each level are initialized at the beginning of the timestep and destroyed at the end of the
> timestep (it's a dynamic problem). HPCToolKit shows that about 74% of the time is spent in ISDestroy, whereas we expect it to be spent in the SNES solve. This seems very weird. I also tried Instruments (OS X) just to
> be sure that it was not an anomaly in HPCToolKit's reporting, but the same performance sink shows up there too. Attached is the profiling from both profilers. Can anything be done to mitigate
> this time sink for ISDestroy?
> <Instruments_FieldsplitRun.trace.zip>
> <hpctoolkit-Fielsplit-database.zip>
> Thanks, 
> — Amneet 
> =====================================================
> Amneet Bhalla
> Postdoctoral Research Associate
> Department of Mathematics and McAllister Heart Institute
> University of North Carolina at Chapel Hill
> Email: amneet at unc.edu
> Web:  https://abhalla.web.unc.edu
> =====================================================