[petsc-users] Memory usage scaling with number of processors
Matthew Thomas
matthew.thomas1 at anu.edu.au
Tue Jul 23 19:02:33 CDT 2024
Hello Matt,
I have attached the output with mat_view for 8 and 40 processors.
I am unsure what is meant by the matrix communicator and the partitioning. I am using the default behaviour in every case. How can I find this information?
I have attached the log view as well if that helps.
Thanks,
Matt
On 23 Jul 2024, at 9:24 PM, Matthew Knepley <knepley at gmail.com> wrote:
Also, you could run with
-mat_view ::ascii_info_detail
and send the output for both cases. The storage of matrix values is not redundant, so something else is
going on. First, what communicator do you use for the matrix, and what partitioning?
Thanks,
Matt
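[Editorial note: PETSc's default layout (PETSC_DECIDE) assigns each rank a contiguous block of rows, with any remainder spread over the first ranks. Assuming that default, a small pure-Python sketch of the resulting ownership ranges for n = 100000 on 8 vs. 40 ranks; `ownership_ranges` is a hypothetical helper written here for illustration, mirroring what PetscSplitOwnership does:]

```python
# Sketch of PETSc's default contiguous row partition (assumed PETSC_DECIDE
# behaviour): rank r owns n // p rows, plus one extra row if r < n % p.
def ownership_ranges(n, p):
    sizes = [n // p + (1 if r < n % p else 0) for r in range(p)]
    starts = [sum(sizes[:r]) for r in range(p)]
    return [(s, s + sz) for s, sz in zip(starts, sizes)]

for p in (8, 40):
    ranges = ownership_ranges(100_000, p)
    print(p, "first:", ranges[0], "last:", ranges[-1])
```

With either process count the rows split evenly (12500 or 2500 rows per rank), so the local matrix data per rank should shrink, not grow, as ranks are added.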
On Mon, Jul 22, 2024 at 10:27 PM Barry Smith <bsmith at petsc.dev> wrote:
Send the code.
On Jul 22, 2024, at 9:18 PM, Matthew Thomas via petsc-users <petsc-users at mcs.anl.gov> wrote:
Hello,
I am using PETSc and SLEPc to solve an eigenvalue problem for sparse matrices. When I run my code with double the number of processors, the memory usage also doubles.
I am able to reproduce this behaviour with ex1 of SLEPc's hands-on exercises.
The issue lies with PETSc rather than SLEPc, as it still occurs when I remove the solve step and only create and assemble the PETSc matrix.
With n=100000, this uses ~1 GB with 8 processors, but ~5 GB with 40 processors.
This was done with PETSc 3.21.3 on Linux, compiled with the Intel compilers and Intel MPI.
Is this the expected behaviour? If not, how can I debug this?
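[Editorial note: for scale, a back-of-envelope estimate of the matrix storage itself. Assuming the tridiagonal operator from SLEPc's ex1 in AIJ (CSR) format, 8-byte scalars, and 4-byte PetscInt indices (all assumptions, not taken from the thread), the whole matrix needs only a few MB, nowhere near the gigabytes observed, which points at something other than the matrix values:]

```python
# Rough AIJ (CSR) storage estimate for a tridiagonal matrix of size n:
# one 8-byte value and one 4-byte column index per nonzero,
# plus one 4-byte row pointer per row (and one extra to terminate).
n = 100_000
nnz = 3 * n - 2                          # 3 entries per row, minus the 2 boundary rows
bytes_total = nnz * (8 + 4) + (n + 1) * 4
print(f"{bytes_total / 1e6:.1f} MB")     # ~4.0 MB for the entire matrix
```

So even with MPI halo duplication the matrix data cannot account for GB-scale usage; per-rank overheads (buffers, library/runtime footprint) multiplied by the rank count are a more plausible source.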
Thanks,
Matt
--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20240724/03d54771/attachment-0001.html>
-------------- next part --------------
An embedded and charset-unspecified text was scrubbed...
Name: mat_view_8.txt
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20240724/03d54771/attachment-0003.txt>
-------------- next part --------------
An embedded and charset-unspecified text was scrubbed...
Name: mat_view_40.txt
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20240724/03d54771/attachment-0004.txt>
-------------- next part --------------
An embedded and charset-unspecified text was scrubbed...
Name: log_view.txt
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20240724/03d54771/attachment-0005.txt>