[petsc-users] performance issue solving multiple linear systems of the same size with different preconditioning methods

Алексей Рязанов ram at ibrae.ac.ru
Sat Aug 27 08:10:50 CDT 2011


I've figured out that the number of undestroyed objects depends on the type
of the linear system matrix. At the moment I'm assembling a 3D Laplacian with
Neumann boundary conditions, which is singular, and I get a huge number of
undestroyed vectors, mappings, and scatters. If I use a zero matrix, the number
of destroyed objects equals the number created. But even if I use the identity
matrix, which is not singular, I still have some undestroyed objects.
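
Since the operator is a Neumann Laplacian, it may also be worth making sure the
solver knows about its constant null space. A minimal sketch, assuming the
system matrix is called A (current PETSc attaches the null space with
MatSetNullSpace; releases from that era used KSPSetNullSpace instead):

  MatNullSpace   nsp;
  PetscErrorCode ierr;

  /* PETSC_TRUE: the constant vector spans the null space */
  ierr = MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0, NULL, &nsp);CHKERRQ(ierr);
  ierr = MatSetNullSpace(A, nsp);CHKERRQ(ierr);    /* or KSPSetNullSpace(dKSP, nsp) on older PETSc */
  ierr = MatNullSpaceDestroy(&nsp);CHKERRQ(ierr);  /* the matrix keeps its own reference */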

On 27 August 2011 at 1:29, Алексей Рязанов <ram at ibrae.ac.ru> wrote:

> I have also checked the KSPSolve behavior in my other PETSc programs and
> found the same memory leak.
>
> On 27 August 2011 at 0:57, Алексей Рязанов <ram at ibrae.ac.ru> wrote:
>
>
>> Thank you for your response!
>>
>> I have the memory leak in both of my programs, but I don't create that many
>> vectors.
>>
>> My code looks like:
>>
>>   ***INIT_ALL***
>>   PetscLogStageRegister("Iteration  :", &StageNum1);
>>   PetscLogStagePush(StageNum1);
>>   KSPSolve(dKSP, dvec_origRHS, dvec_Solution);
>>   PetscLogStagePop();
>>   ***DESTROY_ALL***
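>>
>> A sketch of what ***DESTROY_ALL*** needs to cover so that every creation is
>> balanced by a destruction; names other than dKSP, dvec_origRHS, and
>> dvec_Solution are hypothetical (destroy calls take &obj in PETSc >= 3.2 and
>> the bare object in earlier releases):
>>
>>   ierr = KSPDestroy(&dKSP);CHKERRQ(ierr);          /* also releases the PC and its work vectors */
>>   ierr = VecDestroy(&dvec_origRHS);CHKERRQ(ierr);
>>   ierr = VecDestroy(&dvec_Solution);CHKERRQ(ierr);
>>   ierr = MatDestroy(&dmat_A);CHKERRQ(ierr);        /* hypothetical: the system matrix */
>>   ierr = DMDestroy(&dda);CHKERRQ(ierr);            /* hypothetical: the distributed array */
>>   ierr = PetscFinalize();                          /* with -malloc_dump, leftovers are reported here */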
>>
>>
>> And when I comment out (or delete) the KSPSolve call, the -log_summary output is:
>>
>> =====================================================
>> Memory usage is given in bytes:
>>
>> Creations   Destructions     Memory   Descendants' Mem   Object Type
>>
>> Reports information only for process 0.
>>
>> --- Event Stage 0: Main Stage
>> 1           1                729472   0                  Application Order
>> 1           1                225452   0                  Distributed array
>> 8           8               1533424   0                  Vec
>> 3           3                  2604   0                  Vec Scatter
>> 8           8                613852   0                  Index Set
>> 1           1                221304   0                  IS L to G Mapping
>> 3           3              16603440   0                  Matrix
>> 1           1                   832   0                  Krylov Solver
>> 1           1                   688   0                  Preconditioner
>> 1           1                   448   0                  PetscRandom
>>
>> --- Event Stage 1: Iteration  :
>> =====================================================
>>
>> When I run the code with the KSPSolve call in place, it gives me:
>>
>> =====================================================
>> Memory usage is given in bytes:
>>
>> Creations   Destructions     Memory   Descendants' Mem   Object Type
>>
>> Reports information only for process 0.
>>
>> --- Event Stage 0: Main Stage
>> 1           0                     0   0                  Application Order
>> 1           0                     0   0                  Distributed array
>> 8           17              4963592   0                  Vec
>> 3           2                  1736   0                  Vec Scatter
>> 8           12              1425932   0                  Index Set
>> 1           0                     0   0                  IS L to G Mapping
>> 3           5              50158132   0                  Matrix
>> 1           2                  1664   0                  Krylov Solver
>> 1           2                  1440   0                  Preconditioner
>> 1           1                   448   0                  PetscRandom
>> 0           1                   544   0                  Viewer
>>
>> --- Event Stage 1: Iteration  :
>> 355         173            64692312   0                  Vec
>> 1           0                     0   0                  Vec Scatter
>> 6           2                  1024   0                  Index Set
>> 2           0                     0   0                  Matrix
>> 1           0                     0   0                  Krylov Solver
>> 1           0                     0   0                  Preconditioner
>> 2           1                   544   0                  Viewer
>> =====================================================
>>
>>
>>
>>
>> 2011/8/25 Jed Brown <jedbrown at mcs.anl.gov>
>>
>>> On Tue, Aug 23, 2011 at 02:37, Алексей Рязанов <ram at ibrae.ac.ru> wrote:
>>>
>>>> When I delete the 4-5-6 part of the 2nd, 1-2-3 works great, with exactly
>>>> the same results as the 1st.
>>>> When I delete the 1-2-3 part of the 2nd, 4-5-6 works great, with exactly
>>>> the same results as the 1st.
>>>> The whole program (1-2-3-4-5-6) works badly.
>>>>
>>>
>>> From the -log_summary, you have a memory leak (many more vector creations
>>> than destructions). Try running with -malloc_dump to debug it. Perhaps you
>>> are creating a vector every time one of your functions is called? You should
>>> also build with --with-debugging=0 when looking at timing results. (You can
>>> keep it in PETSC_ARCH=linux-gnu-opt.)
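>>>
>>> For illustration, the usual pattern behind such a leak (everything here is
>>> a made-up example, not the code from this thread; VecDestroy takes &v in
>>> PETSc >= 3.2 and the bare Vec in earlier releases):
>>>
>>> #include <petscvec.h>
>>>
>>> /* Leaks one Vec per call: Creations grow while Destructions do not. */
>>> PetscErrorCode ApplyShift(Vec x, Vec y)
>>> {
>>>   Vec            tmp;
>>>   PetscErrorCode ierr;
>>>
>>>   ierr = VecDuplicate(x, &tmp);CHKERRQ(ierr); /* new work Vec on every call */
>>>   ierr = VecCopy(x, tmp);CHKERRQ(ierr);
>>>   ierr = VecAXPY(y, 1.0, tmp);CHKERRQ(ierr);  /* y += tmp */
>>>   /* missing: ierr = VecDestroy(&tmp);CHKERRQ(ierr); */
>>>   return 0;
>>> }
>>>
>>> The fix is either to destroy the work vector before returning, or to create
>>> it once and reuse it across calls.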
>>>
>>
>>
>>
>> --
>> Best regards,
>> Alexey Ryazanov
>> ______________________________________
>> Nuclear Safety Institute of Russian Academy of Sciences
>>
>>
>>
>
>
> --
> Best regards,
> Alexey Ryazanov
> ______________________________________
> Nuclear Safety Institute of Russian Academy of Sciences
>
>
>


-- 
Best regards,
Alexey Ryazanov
______________________________________
Nuclear Safety Institute of Russian Academy of Sciences