[petsc-users] memory usage for dense vs sparse matrices
Andrew McRae
A.T.T.McRae at bath.ac.uk
Mon Apr 24 11:44:33 CDT 2017
That matrix is only 200 x 2000? That's tiny. Even the dense version takes
only (200*2000 entries)*(8 bytes per entry) to store, about 3.2 MB. The AIJ
version at 3% fill stores roughly 12,000 nonzeros at ~12 bytes each (value
plus column index), about 150 KB. Your two peak-RSS figures differ by 2 MB,
which is consistent with that: nearly all of the 19-21 MB is the PETSc and
MPI libraries and runtime, not the matrix.
Try a 20,000 x 20,000 matrix in both formats. At that size the dense
storage is 20,000*20,000*8 bytes = 3.2 GB, while 3%-filled AIJ is roughly
140 MB, so the peak memory usage will be dominated by the matrix storage
and the difference will be unmistakable.
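
If you want to see what the matrix itself costs, rather than the whole
process, you can ask PETSc directly. Here is a minimal sketch; it assumes
an already-assembled Mat A, and the helper name and printed estimate are
mine, not taken from your example:

#include <petscmat.h>

/* Print an estimate of the storage used by the matrix itself,
   summed over all MPI ranks. */
static PetscErrorCode ReportMatStorage(Mat A)
{
  MatInfo        info;
  PetscErrorCode ierr;

  ierr = MatGetInfo(A, MAT_GLOBAL_SUM, &info); CHKERRQ(ierr);
  /* For AIJ, each stored entry costs one PetscScalar plus one PetscInt
     column index; the row pointers add a little more on top. */
  ierr = PetscPrintf(PETSC_COMM_WORLD,
                     "nonzeros used %g, allocated %g, ~%g bytes for values+indices\n",
                     info.nz_used, info.nz_allocated,
                     info.nz_used * (sizeof(PetscScalar) + sizeof(PetscInt)));
  CHKERRQ(ierr);
  return 0;
}

Running with -mat_view ::ascii_info should print similar per-matrix
information without touching the code.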
On 24 April 2017 at 17:24, D D <driver.dan12 at yahoo.com> wrote:
> Unless, of course, my assumption is incorrect. But why should my
> assumption be incorrect?
>
> I think I'm constructing my sparse matrix properly: I call
> MatSetFromOptions so the -mat_type option is honored. The loop at lines
> 52-57 of example1.cpp may be incorrect, though.
>
> How would you measure the memory taken by the sparse versus the dense
> matrix structure, so I can confirm my example code is actually using
> PETSc's sparse (AIJ) format effectively?
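
For reference, here is a generic sketch of how a 200 x 2000 AIJ matrix with
roughly 3% fill can be assembled. This is not your example1.cpp; nrow, ncol,
nnz_per_row and the column pattern are invented for illustration. The point
is that mpiaij wants the preallocation calls before the MatSetValues loop,
otherwise assembly mallocs repeatedly:

#include <petscmat.h>

/* Hypothetical sketch (not example1.cpp): assemble a 200 x 2000 matrix
   with ~3% of its entries set, preallocating before the fill loop. */
static PetscErrorCode BuildSparseExample(Mat *Aout)
{
  Mat            A;
  PetscInt       nrow = 200, ncol = 2000, Istart, Iend, i, k;
  PetscInt       nnz_per_row = (PetscInt)(0.03 * ncol);   /* ~60 entries per row */
  PetscErrorCode ierr;

  ierr = MatCreate(PETSC_COMM_WORLD, &A); CHKERRQ(ierr);
  ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, nrow, ncol); CHKERRQ(ierr);
  ierr = MatSetFromOptions(A); CHKERRQ(ierr);             /* honors -mat_type aij/dense/... */
  /* Preallocation: ignored by dense types, essential for (mpi)aij. */
  ierr = MatSeqAIJSetPreallocation(A, nnz_per_row, NULL); CHKERRQ(ierr);
  ierr = MatMPIAIJSetPreallocation(A, nnz_per_row, NULL, nnz_per_row, NULL); CHKERRQ(ierr);
  ierr = MatSetUp(A); CHKERRQ(ierr);

  ierr = MatGetOwnershipRange(A, &Istart, &Iend); CHKERRQ(ierr);
  for (i = Istart; i < Iend; i++) {
    for (k = 0; k < nnz_per_row; k++) {
      PetscInt    j = (i * 7919 + k * 729) % ncol;        /* arbitrary deterministic pattern */
      PetscScalar v = 1.0;
      ierr = MatSetValues(A, 1, &i, 1, &j, &v, INSERT_VALUES); CHKERRQ(ierr);
    }
  }
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
  *Aout = A;
  return 0;
}

Comparing the dense and aij runs with the MatGetInfo report above shows the
structural difference directly, regardless of what the process RSS does.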
>
>
> On Monday, April 24, 2017 12:10 PM, D D <driver.dan12 at yahoo.com> wrote:
>
>
> You are correct, and that is why I'm comparing peak RSS between the two
> runs: I expected the sparse run's total to be noticeably lower than the
> dense run's, reflecting the difference in matrix storage.
>
>
> On Monday, April 24, 2017 11:28 AM, "Zhang, Hong" <hongzhang at anl.gov>
> wrote:
>
>
> The peak RSS does not tell you how much memory the matrix takes. It also
> includes many other things, such as the executable itself, the shared
> libraries linked into it, and stack and heap memory used elsewhere.
>
> Hong (Mr.)
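
To make that concrete, PETSc can report both numbers separately. A small
sketch; the print text is illustrative, and PetscMallocGetCurrentUsage only
reports meaningful values when PETSc's malloc tracking is active (a debug
build or, I believe, the -malloc option):

#include <petscsys.h>

/* Compare whole-process memory with the memory PETSc itself allocated. */
static PetscErrorCode ReportProcessMemory(void)
{
  PetscLogDouble rss, petsc_alloc;
  PetscErrorCode ierr;

  ierr = PetscMemoryGetCurrentUsage(&rss); CHKERRQ(ierr);         /* resident set size, bytes */
  ierr = PetscMallocGetCurrentUsage(&petsc_alloc); CHKERRQ(ierr); /* bytes malloc'ed through PETSc */
  ierr = PetscPrintf(PETSC_COMM_WORLD,
                     "process RSS: %g bytes, PETSc-allocated: %g bytes\n",
                     rss, petsc_alloc); CHKERRQ(ierr);
  return 0;
}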
>
> On Apr 24, 2017, at 9:46 AM, D D <driver.dan12 at yahoo.com> wrote:
>
> Hello,
>
> I see memory usage that confuses me:
>
> me at blah:src$ ./example1 -n_row 200 -n_col 2000 -sparsity 0.03 -mat_type mpidense
> Initialize
> Got options
> Create and assemble matrix
> Assembled
> Peak RSS 21 Mb
> me at blah:~/src$ ./example1 -n_row 200 -n_col 2000 -sparsity 0.03 -mat_type
> mpiaij
> Initialize
> Got options
> Create and assemble matrix
> Assembled
> Peak RSS 19 Mb
>
> I put my example code on GitHub so I can communicate my question more
> effectively. Here is the question: why does the program as written use so
> much memory in the sparse case (matrix type mpiaij)? Note that I'm filling
> the matrix with random entries at a density of at most 3% non-zeros, since
> that matches my use case.
>
> I have read the relevant portions of the user's manual and searched for
> answers. Have I missed a resource that can answer my question?
>
> dtsmith2001/hpc - High Performance Computing Explorations using PETSc and SLEPc
> <https://github.com/dtsmith2001/hpc>
>
>
> Dale
>