[petsc-users] Memory usage scaling with number of processors

Matthew Thomas matthew.thomas1 at anu.edu.au
Wed Jul 24 20:32:42 CDT 2024


Hi Matt,

I have attached the configure.log below.

Thanks,
Matt




On 25 Jul 2024, at 11:26 AM, Matthew Knepley <knepley at gmail.com> wrote:

On Wed, Jul 24, 2024 at 8:37 PM Matthew Thomas <matthew.thomas1 at anu.edu.au> wrote:
Hello Matt,

Thanks for the help. I believe the problem comes from a mismatch between the MPI my program is linked against and the MPI PETSc was built with.

I tried running with petscmpiexec ($PETSC_DIR/lib/petsc/bin/petscmpiexec), which gave me the error

Error build location not found! Please set PETSC_DIR and PETSC_ARCH correctly for this build.


Naturally, I have set both of these, and echo $PETSC_DIR prints the path I expect, so it seems I am running my programs with a different MPI than the one PETSc expects, which could explain the memory usage.

Do you have any ideas how to fix this?

Yes. First we determine what MPI you configured with. Send configure.log, which has this information.
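
In the meantime, a quick cross-check (my sketch, not anything from your code) is to print which MPI library your executable actually resolves at run time and compare it with the MPI recorded in configure.log:

  /* Sketch: report the MPI implementation the binary is really using.
   * Build it with the same mpicc/mpiicc and run it under the same launcher
   * as your application. */
  #include <mpi.h>
  #include <stdio.h>

  int main(int argc, char **argv)
  {
    char version[MPI_MAX_LIBRARY_VERSION_STRING];
    int  len, rank;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Get_library_version(version, &len); /* MPI-3: identifies vendor and version */
    if (rank == 0) printf("MPI library: %s\n", version);
    MPI_Finalize();
    return 0;
  }

If the vendor or version printed there differs from what configure.log reports, that mismatch would explain a lot.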

  Thanks,

      Matt

Thanks,
Matt

On 24 Jul 2024, at 8:41 PM, Matthew Knepley <knepley at gmail.com> wrote:

On Tue, Jul 23, 2024 at 8:02 PM Matthew Thomas <matthew.thomas1 at anu.edu.au> wrote:
Hello Matt,

I have attached the output with mat_view for 8 and 40 processors.

I am unsure what is meant by the matrix communicator and the partitioning. I am using the default behaviour in every case. How can I find this information?

This shows that the matrix is taking the same amount of memory for 8 and 40 procs, so that is not your problem. Also,
it is a very small amount of memory:

  100K rows x 3 nz/row x 8 bytes/nz = 2.4 MB

and with ~50% overhead for indexing, still something under 4 MB. I am not sure what is taking up the rest of the memory, but from the log you included I do not think it is PETSc.
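
If you want to see where it is going, one option (a sketch, assuming you can add a few lines to your code) is to print PETSc's own allocation counter next to the total process size at a few points; running with -memory_view prints a similar summary at PetscFinalize:

  /* Sketch: compare what PETSc has allocated itself (PetscMalloc) with the
   * total resident size of the process. Both numbers are per rank. */
  #include <petscsys.h>

  static PetscErrorCode ReportMemory(const char *label)
  {
    PetscLogDouble rss, petscmem;

    PetscFunctionBeginUser;
    PetscCall(PetscMemoryGetCurrentUsage(&rss));      /* total process memory, in bytes */
    PetscCall(PetscMallocGetCurrentUsage(&petscmem)); /* memory obtained through PetscMalloc, in bytes */
    PetscCall(PetscSynchronizedPrintf(PETSC_COMM_WORLD, "%s: process %.1f MB, PetscMalloc %.1f MB\n",
                                      label, rss / 1.0e6, petscmem / 1.0e6));
    PetscCall(PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT));
    PetscFunctionReturn(PETSC_SUCCESS);
  }

If the PetscMalloc number stays near the few-MB estimate while the process size grows with the number of ranks, the extra memory is coming from outside PETSc (the MPI library itself is one candidate).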

  Thanks,

     Matt

I have attached the -log_view output as well, in case that helps.

Thanks,
Matt




On 23 Jul 2024, at 9:24 PM, Matthew Knepley <knepley at gmail.com> wrote:

Also, you could run with

  -mat_view ::ascii_info_detail

and send the output for both cases. The storage of matrix values is not redundant, so something else is
going on. First, what communicator do you use for the matrix, and what partitioning?
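
For reference, the default case (my assumption about your setup, modeled on the SLEPc examples) puts the matrix on PETSC_COMM_WORLD and lets PETSc pick a contiguous block of rows for each rank; a small sketch that just reports that layout:

  /* Sketch: create the matrix with the default communicator and partitioning,
   * then print which rows each rank owns. */
  #include <petscmat.h>

  int main(int argc, char **argv)
  {
    Mat         A;
    PetscInt    n = 100000, rstart, rend;
    PetscMPIInt rank, size;

    PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
    PetscCallMPI(MPI_Comm_rank(PETSC_COMM_WORLD, &rank));
    PetscCallMPI(MPI_Comm_size(PETSC_COMM_WORLD, &size));

    PetscCall(MatCreate(PETSC_COMM_WORLD, &A));                  /* communicator: all ranks */
    PetscCall(MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n)); /* PETSc chooses ~n/size rows per rank */
    PetscCall(MatSetFromOptions(A));
    PetscCall(MatSetUp(A));
    PetscCall(MatGetOwnershipRange(A, &rstart, &rend));
    PetscCall(PetscSynchronizedPrintf(PETSC_COMM_WORLD, "[%d/%d] owns rows %" PetscInt_FMT " to %" PetscInt_FMT "\n",
                                      rank, size, rstart, rend));
    PetscCall(PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT));

    PetscCall(MatDestroy(&A));
    PetscCall(PetscFinalize());
    return 0;
  }

If your code deviates from this, for example by creating the matrix on MPI_COMM_SELF on every rank, the storage would indeed be replicated.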

  Thanks,

     Matt

On Mon, Jul 22, 2024 at 10:27 PM Barry Smith <bsmith at petsc.dev> wrote:


  Send the code.

On Jul 22, 2024, at 9:18 PM, Matthew Thomas via petsc-users <petsc-users at mcs.anl.gov> wrote:


Hello,

I am using PETSc and SLEPc to solve an eigenvalue problem for sparse matrices. When I run my code with double the number of processors, the memory usage also doubles.

I am able to reproduce this behaviour with ex1 of SLEPc's hands-on exercises.

The issue is with PETSc, not with SLEPc, as it still occurs when I remove the solve step and just create and assemble the PETSc matrix.
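
The stripped-down version looks roughly like this (a sketch; the real ex1 code differs in details):

  /* Sketch: create and assemble the 1-D Laplacian matrix only, no solve. */
  #include <petscmat.h>

  int main(int argc, char **argv)
  {
    Mat      A;
    PetscInt n = 100000, i, Istart, Iend;

    PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
    PetscCall(PetscOptionsGetInt(NULL, NULL, "-n", &n, NULL));

    PetscCall(MatCreate(PETSC_COMM_WORLD, &A));
    PetscCall(MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n));
    PetscCall(MatSetFromOptions(A));
    PetscCall(MatSetUp(A));

    PetscCall(MatGetOwnershipRange(A, &Istart, &Iend));
    for (i = Istart; i < Iend; i++) {
      if (i > 0)     PetscCall(MatSetValue(A, i, i - 1, -1.0, INSERT_VALUES));
      if (i < n - 1) PetscCall(MatSetValue(A, i, i + 1, -1.0, INSERT_VALUES));
      PetscCall(MatSetValue(A, i, i, 2.0, INSERT_VALUES));
    }
    PetscCall(MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY));
    PetscCall(MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY));

    PetscCall(MatDestroy(&A));
    PetscCall(PetscFinalize());
    return 0;
  }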

With n=100000, this uses ~1 GB with 8 processors but ~5 GB with 40 processors.

This was done with PETSc 3.21.3 on Linux, compiled with the Intel compilers and Intel MPI.

Is this the expected behaviour? If not, how can I debug this?


Thanks,
Matt



--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/



--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/



--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/

-------------- next part --------------
A non-text attachment was scrubbed...
Name: configure.log
Type: application/octet-stream
Size: 1136496 bytes
Desc: configure.log
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20240725/c5e4c536/attachment-0001.obj>

