[petsc-users] How to measure the memory usage of the application built on the Petsc?

Barry Smith bsmith at mcs.anl.gov
Mon May 27 22:48:48 CDT 2013


   There are several ways to monitor memory usage. They fall into two categories: those that report how much memory PETSc itself has malloced, and those that report how much memory is used in total by the process.

PetscMallocGetCurrentUsage() and PetscMallocGetMaximumUsage() report how much memory PETSc has malloced; they only work when you run with the command line option -malloc.

PetscMemoryGetCurrentUsage() and PetscMemoryGetMaximumUsage() report total memory usage (for the latter to work, call PetscMemorySetGetMaximumUsage() immediately after PetscInitialize()).
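A minimal sketch of how these four queries might be combined in a program (the solver itself is elided; error checking with CHKERRQ is the usual PETSc idiom):

```c
#include <petscsys.h>

int main(int argc, char **argv)
{
  PetscLogDouble mallocCur, mallocMax, memCur, memMax;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);CHKERRQ(ierr);
  /* Must be called immediately after PetscInitialize() so that
     PetscMemoryGetMaximumUsage() records a meaningful high-water mark */
  ierr = PetscMemorySetGetMaximumUsage();CHKERRQ(ierr);

  /* ... set up the mesh and run the solver here ... */

  /* Memory malloced by PETSc (requires running with -malloc) */
  ierr = PetscMallocGetCurrentUsage(&mallocCur);CHKERRQ(ierr);
  ierr = PetscMallocGetMaximumUsage(&mallocMax);CHKERRQ(ierr);

  /* Total memory used by this process */
  ierr = PetscMemoryGetCurrentUsage(&memCur);CHKERRQ(ierr);
  ierr = PetscMemoryGetMaximumUsage(&memMax);CHKERRQ(ierr);

  ierr = PetscPrintf(PETSC_COMM_SELF,
                     "PETSc malloc: current %g bytes, max %g bytes\n"
                     "Process total: current %g bytes, max %g bytes\n",
                     mallocCur, mallocMax, memCur, memMax);CHKERRQ(ierr);

  ierr = PetscFinalize();
  return 0;
}
```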

These routines report per-process values, so use an MPI_Reduce() to sum the memory across all processes onto process 0 and print it there. I suggest calling them after the mesh has been set up, again immediately before XXXSolve() is called, and again after XXXSolve() returns.
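A small helper along these lines could be called at each of those three points; the function name ReportMemoryUsage is ours for illustration, not part of PETSc:

```c
#include <petscsys.h>

/* Hypothetical helper: sums the current per-process memory usage onto
   process 0 of comm and prints it with a caller-supplied stage label.
   PetscPrintf() only prints on the first process of the communicator,
   which is also where the reduced total is valid. */
static PetscErrorCode ReportMemoryUsage(MPI_Comm comm, const char *stage)
{
  PetscLogDouble local, total = 0;
  PetscErrorCode ierr;

  ierr = PetscMemoryGetCurrentUsage(&local);CHKERRQ(ierr);
  ierr = MPI_Reduce(&local, &total, 1, MPI_DOUBLE, MPI_SUM, 0, comm);CHKERRQ(ierr);
  ierr = PetscPrintf(comm, "[%s] total memory across all processes: %g bytes\n",
                     stage, total);CHKERRQ(ierr);
  return 0;
}
```

Usage would look like ReportMemoryUsage(PETSC_COMM_WORLD, "after mesh setup"); followed by calls just before and just after the solve.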

   Please let us know if you have any difficulties.

    As always, we recommend you upgrade to PETSc 3.4.

    Barry



On May 27, 2013, at 10:22 PM, Fande Kong <fande.kong at colorado.edu> wrote:

> Hi all,
> 
> How do I measure the memory usage of an application built on PETSc?  I am solving linear elasticity equations with FGMRES preconditioned by a two-level method: a multigrid method where the additive Schwarz method is used on each level.  More than 1000 cores are used to solve this problem on a supercomputer. When the problem has about 60M total degrees of freedom, the application runs correctly and produces correct results. But when the total increases to 600M degrees of freedom, the application aborts with an out-of-memory error (the system administrator of the supercomputer told me that my application ran out of memory).
> 
> Thus, I want to monitor the memory usage dynamically while the application is running. Are there any functions or strategies that could be used for this purpose?
> 
> The error information is attached.
> 
> Regards,
> -- 
> Fande Kong
> Department of Computer Science
> University of Colorado at Boulder
> <solid3dcube2.o1603352><configure and make log.zip>


