[petsc-users] performance issue solving multiple linear systems of the same size with different preconditioning methods
Barry Smith
bsmith at mcs.anl.gov
Sat Aug 27 16:02:46 CDT 2011
You can use -ksp_monitor_true_residual from the command line to always monitor the unpreconditioned residual; it has no memory leaks.
But this would be an unfair test for timing because
1) ASCII I/O screws up timing results, and
2) some methods, such as left-preconditioned GMRES, get the preconditioned residual "for free" since the algorithm needs it anyway, whereas monitoring the true residual introduces many additional operations (not needed for the linear solve itself), making the linear solver look slower than it really is.
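For reference, an invocation might look like this (the executable name and the solver options are illustrative only, not taken from this thread); the option prints both the preconditioned and the true residual norm at each iteration:

    mpiexec -n 4 ./mysolver -ksp_type gmres -pc_type asm -ksp_monitor_true_residual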
Barry
On Aug 27, 2011, at 3:58 PM, Alexey Ryazanov wrote:
> So: all the memory and time issues appeared because of my KSPConvergedTest function, which calls KSPBuildResidual to monitor the true residual. KSPBuildResidual works with memory incorrectly. As soon as I commented out the line with KSPBuildResidual, all the trouble was gone. So now I use the default monitor and have to watch the preconditioned residual. This is inconvenient for me, but I can't figure out how to set up true-residual monitoring for all methods without any memory or performance leaks. I'll try to get used to dealing with the preconditioned norm.
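>
> For reference, a minimal sketch of such a convergence test that avoids the leak is below. When no result vector is supplied, KSPBuildResidual creates one itself and hands ownership to the caller, so it must be destroyed; the function and variable names here are hypothetical, and the simple tolerance check stands in for whatever logic the real test uses:
>
> #include <petscksp.h>
>
> /* Hypothetical convergence test that monitors the true residual norm.
>    rnorm is whatever norm the Krylov method tracks internally; we
>    recompute the unpreconditioned norm ourselves. */
> PetscErrorCode MyKSPConvergedTest(KSP ksp, PetscInt n, PetscReal rnorm,
>                                   KSPConvergedReason *reason, void *ctx)
> {
>   Vec            r;
>   PetscReal      truenorm, rtol, abstol, dtol;
>   PetscInt       maxits;
>   PetscErrorCode ierr;
>
>   /* Build the true residual b - A x; with PETSC_NULL for the optional
>      vectors, KSPBuildResidual allocates the result vector itself. */
>   ierr = KSPBuildResidual(ksp, PETSC_NULL, PETSC_NULL, &r); CHKERRQ(ierr);
>   ierr = VecNorm(r, NORM_2, &truenorm); CHKERRQ(ierr);
>   /* The caller owns r: skipping this VecDestroy leaks one Vec per
>      iteration, which is what the 355-vs-173 Vec counts below show. */
>   ierr = VecDestroy(&r); CHKERRQ(ierr);
>
>   /* Deliberately simple check; a real test would also record the
>      initial norm in order to apply the relative tolerance. */
>   ierr = KSPGetTolerances(ksp, &rtol, &abstol, &dtol, &maxits); CHKERRQ(ierr);
>   *reason = KSP_CONVERGED_ITERATING;
>   if (truenorm < abstol) *reason = KSP_CONVERGED_ATOL;
>   else if (n >= maxits)  *reason = KSP_DIVERGED_ITS;
>   PetscFunctionReturn(0);
> }
>
> It would be installed with KSPSetConvergenceTest(dKSP, MyKSPConvergedTest, PETSC_NULL, PETSC_NULL); before the call to KSPSolve.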
>
> Thank you very much for taking the time to reply to me.
>
> --
> Best regards,
> Alexey Ryazanov
> ______________________________________
> Nuclear Safety Institute of Russian Academy of Sciences
>
>
> On 27 August 2011 at 17:32, Alexey Ryazanov <ram at ibrae.ac.ru> wrote:
> Oh BOY! It was all the fault of my MyKSPConvergedTest.
>
> On 27 August 2011 at 1:29, Alexey Ryazanov <ram at ibrae.ac.ru> wrote:
>
> I have also checked the KSPSolve behavior in my other PETSc programs and found the same memory leak.
>
> On 27 August 2011 at 0:57, Alexey Ryazanov <ram at ibrae.ac.ru> wrote:
>
>
> Thank you for your response!
>
> I have the memory leak in both of my programs, but I don't create lots of vectors.
>
> My code looks like:
>
> ***INIT_ALL***                                      /* create the DA, matrix, vectors, KSP, ... */
> PetscLogStageRegister("Iteration :", &StageNum1);   /* register a separate logging stage */
> PetscLogStagePush(StageNum1);                       /* everything until the Pop is logged there */
> KSPSolve(dKSP, dvec_origRHS, dvec_Solution);
> PetscLogStagePop();                                 /* return to the main stage */
> ***DESTROY_ALL***                                   /* destroy all objects */
>
> And when I comment out (or delete) the KSPSolve call, the -log_summary output is:
>
> =====================================================
> Memory usage is given in bytes:
>
> Creations   Destructions      Memory  Descendants' Mem.   Object Type
> Reports information only for process 0.
>
> --- Event Stage 0: Main Stage
>         1              1      729472                  0   Application Order
>         1              1      225452                  0   Distributed array
>         8              8     1533424                  0   Vec
>         3              3        2604                  0   Vec Scatter
>         8              8      613852                  0   Index Set
>         1              1      221304                  0   IS L to G Mapping
>         3              3    16603440                  0   Matrix
>         1              1         832                  0   Krylov Solver
>         1              1         688                  0   Preconditioner
>         1              1         448                  0   PetscRandom
>
> --- Event Stage 1: Iteration :
> =====================================================
>
> When I run the code with the KSPSolve call in place, it gives me:
>
> =====================================================
> Memory usage is given in bytes:
> Creations   Destructions      Memory  Descendants' Mem.   Object Type
> Reports information only for process 0.
>
> --- Event Stage 0: Main Stage
>         1              0           0                  0   Application Order
>         1              0           0                  0   Distributed array
>         8             17     4963592                  0   Vec
>         3              2        1736                  0   Vec Scatter
>         8             12     1425932                  0   Index Set
>         1              0           0                  0   IS L to G Mapping
>         3              5    50158132                  0   Matrix
>         1              2        1664                  0   Krylov Solver
>         1              2        1440                  0   Preconditioner
>         1              1         448                  0   PetscRandom
>         0              1         544                  0   Viewer
>
> --- Event Stage 1: Iteration :
>       355            173    64692312                  0   Vec
>         1              0           0                  0   Vec Scatter
>         6              2        1024                  0   Index Set
>         2              0           0                  0   Matrix
>         1              0           0                  0   Krylov Solver
>         1              0           0                  0   Preconditioner
>         2              1         544                  0   Viewer
> =====================================================
>
>
>
> 2011/8/25 Jed Brown <jedbrown at mcs.anl.gov>
> On Tue, Aug 23, 2011 at 02:37, Alexey Ryazanov <ram at ibrae.ac.ru> wrote:
> When I delete the 4-5-6 part of the 2nd, 1-2-3 works great, with results exactly like the 1st.
> When I delete the 1-2-3 part of the 2nd, 4-5-6 works great, with results exactly like the 1st.
> The whole program (1-2-3-4-5-6) works badly.
>
> From the -log_summary, you have a memory leak (many more vector creations than destructions). Try running with -malloc_dump to debug it. Perhaps you are creating a vector every time one of your functions is called? You should also build with --with-debugging=0 when looking at timing results. (You can keep it in PETSC_ARCH=linux-gnu-opt.)
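>
> For instance, with a debugging build one could run (the executable name is illustrative):
>
>   ./myprog -malloc_dump -log_summary
>
> At PetscFinalize, -malloc_dump lists every PETSc allocation that was never freed, along with where it was allocated, which points straight at leaks like this one.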