[petsc-users] memory usage for dense vs sparse matrices

D D driver.dan12 at yahoo.com
Mon Apr 24 11:24:53 CDT 2017


Unless, of course, my assumption is incorrect. But why should my assumption be incorrect?

I think I'm constructing my sparse matrix properly by calling MatSetFromOptions. The assembly loop at lines 52-57 of example1.cpp may be what's incorrect.

How do you think I should measure the effect of the size of the sparse vs dense matrix structure to make sure I'm effectively using the PETSc sparse matrix structure in my example code? 
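One way to measure the matrix itself, rather than the whole process, is PETSc's MatGetInfo, which reports the memory PETSc attributes to a given Mat. A minimal sketch (not a standalone program; it assumes a working PETSc build and an already-assembled Mat named A):

```c
#include <petscmat.h>

/* Sketch: query the memory PETSc attributes to the matrix itself,
   instead of relying on process-wide peak RSS. Assumes `A` is an
   assembled Mat on PETSC_COMM_WORLD. */
MatInfo        info;
PetscErrorCode ierr;

ierr = MatGetInfo(A, MAT_GLOBAL_SUM, &info); CHKERRQ(ierr);
ierr = PetscPrintf(PETSC_COMM_WORLD,
                   "matrix memory: %g bytes, nonzeros used: %g\n",
                   (double)info.memory, (double)info.nz_used); CHKERRQ(ierr);
```

Comparing info.memory between the mpidense and mpiaij runs should isolate the matrix storage from the rest of the process footprint.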

    On Monday, April 24, 2017 12:10 PM, D D <driver.dan12 at yahoo.com> wrote:
 

 You are correct, and that is why I'm using the peak RSS: the total memory should still be noticeably lower for the sparse structure than for the dense one.
 

    On Monday, April 24, 2017 11:28 AM, "Zhang, Hong" <hongzhang at anl.gov> wrote:
 

 The peak RSS does not tell you how much memory the matrix takes. It includes everything the process maps, such as the program binary, the libraries linked to it, and all stack and heap memory.
Hong (Mr.)


On Apr 24, 2017, at 9:46 AM, D D <driver.dan12 at yahoo.com> wrote:
Hello,
I see memory usage that confuses me:
me at blah:src$ ./example1 -n_row 200 -n_col 2000 -sparsity 0.03 -mat_type mpidense
Initialize
Got options
Create and assemble matrix
Assembled
Peak RSS 21 Mb
me at blah:~/src$ ./example1 -n_row 200 -n_col 2000 -sparsity 0.03 -mat_type mpiaij
Initialize
Got options
Create and assemble matrix
Assembled
Peak RSS 19 Mb

I put my example code on GitHub so I can communicate my question more effectively. Here is my question: why does the program as written use so much memory in the sparse case (matrix type mpiaij)? Note that I'm filling the matrix with random values at a density of at most 3% non-zero entries, since that matches my use case.

I have read the relevant portions of the user's manual and searched for answers. Have I missed a resource that can answer my question?
dtsmith2001/hpc - High Performance Computing Explorations using PETSc and SLEPc



Dale




   

   