[petsc-users] Memory Usage in Matrix Assembly.

Dave May dave.mayhem23 at gmail.com
Tue Mar 14 09:40:46 CDT 2023


On Tue 14. Mar 2023 at 07:15, Pantelis Moschopoulos <
pmoschopoulos at outlook.com> wrote:

> Hi everyone,
>
> I am a new PETSc user incorporating PETSc for FEM in a Fortran code.
> My question concerns the sudden increase in the memory that PETSc needs
> during assembly of the Jacobian matrix. After this point, the memory is
> freed. It looks as if PETSc performs memory allocations and
> deallocations during assembly.
> I have used the following commands with no success:
> CALL MatSetOption(petsc_A, MAT_NEW_NONZERO_LOCATIONS, PETSC_FALSE, ier)
> CALL MatSetOption(petsc_A, MAT_NEW_NONZERO_LOCATION_ERR, PETSC_TRUE, ier)
> CALL MatSetOption(petsc_A, MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_TRUE, ier)
> CALL MatSetOption(petsc_A, MAT_KEEP_NONZERO_PATTERN, PETSC_TRUE, ier)
>
> The structure of the matrix does not change during my simulation, just
> the values. I expect this behavior the first time I assemble the matrix,
> because the preallocation instructions that I use are not very accurate,
> but it continues every time I assemble the matrix.
> What am I missing here?
>

I am guessing you observe this when you run a parallel job.

MatSetValues() will cache values in a temporary memory buffer if the
values are to be sent to a different MPI rank.
Hence, if the parallel layout of your matrix doesn't closely match the
layout of the DOFs on each mesh sub-domain, a huge number of values can
potentially be cached. This cache is freed after you call
MatAssemblyBegin() / MatAssemblyEnd().
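
If the layout is hard to change, you can at least bound the size of that
buffer by flushing the stash periodically during assembly with
MAT_FLUSH_ASSEMBLY. A rough Fortran sketch follows (nchunk, estart/eend
and the element loop are placeholders, not your code); note that the
flush calls are collective, so every rank must perform the same number
of them:

#include <petsc/finclude/petscmat.h>
      use petscmat
      PetscInt :: chunk, e
      PetscErrorCode :: ier

      ! nchunk must be identical on all ranks: flush assembly is collective.
      DO chunk = 1, nchunk
         DO e = estart(chunk), eend(chunk)
            ! ... build the element matrix and CALL MatSetValues(...) ...
         END DO
         ! Send the currently stashed off-rank values now, instead of
         ! letting the stash grow until final assembly.
         CALL MatAssemblyBegin(petsc_A, MAT_FLUSH_ASSEMBLY, ier)
         CALL MatAssemblyEnd(petsc_A, MAT_FLUSH_ASSEMBLY, ier)
      END DO
      ! Usual final assembly once all values have been set.
      CALL MatAssemblyBegin(petsc_A, MAT_FINAL_ASSEMBLY, ier)
      CALL MatAssemblyEnd(petsc_A, MAT_FINAL_ASSEMBLY, ier)

You can also query each rank's row range with MatGetOwnershipRange() to
see how well the matrix layout matches your mesh partition.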

Thanks,
Dave



> Thank you very much,
> Pantelis
>