[petsc-users] memory usage for dense vs sparse matrices

D D driver.dan12 at yahoo.com
Mon Apr 24 09:46:26 CDT 2017


Hello,
I see memory usage that confuses me:
me at blah:src$ ./example1 -n_row 200 -n_col 2000 -sparsity 0.03 -mat_type mpidense
Initialize
Got options
Create and assemble matrix
Assembled
Peak RSS 21 Mb
me at blah:~/src$ ./example1 -n_row 200 -n_col 2000 -sparsity 0.03 -mat_type mpiaij
Initialize
Got options
Create and assemble matrix
Assembled
Peak RSS 19 Mb

I put my example code on GitHub so I can communicate my question more effectively. My question: why does the program as written use so much memory in the sparse case, matrix type mpiaij? Note that I'm creating a random matrix with at most 3% non-zero entries, since that matches my use case.
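For scale, here is a back-of-envelope estimate of the raw matrix storage in each case (a sketch, not my actual PETSc code; it assumes 8-byte scalars and 4-byte integer indices, and that mpiaij stores CSR-style arrays: one value plus one column index per nonzero, plus row offsets):

```python
# Rough storage estimate for a 200 x 2000 matrix at 3% density.
# Assumptions: 8-byte PetscScalar, 4-byte PetscInt, CSR-like AIJ layout.
n_row, n_col, sparsity = 200, 2000, 0.03

dense_bytes = n_row * n_col * 8                 # one scalar per entry
nnz = int(n_row * n_col * sparsity)             # expected nonzeros
aij_bytes = nnz * (8 + 4) + (n_row + 1) * 4     # values + col indices + row offsets

print(f"dense:  {dense_bytes / 2**20:.2f} MiB")
print(f"mpiaij: {aij_bytes / 2**20:.2f} MiB")
```

Both figures are far below the ~20 Mb peak RSS I observe, which is part of what confuses me: the matrix data itself seems to account for only a small fraction of the footprint.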

I have read the relevant portions of the user's manual and searched for answers. Have I missed a resource that can answer my question?
dtsmith2001/hpc - High Performance Computing Explorations using PETSc and SLEPc

Dale