[petsc-users] Read/Write large dense matrix

Sreeram R Venkat srvenkat at utexas.edu
Mon Aug 5 20:19:51 CDT 2024


Here's an example that should reproduce the error:
https://github.com/s769/petsc-test/tree/master

I tried using the PETSC_VIEWER_NATIVE format, but I still get the error. I have a
situation where the matrix is created on PETSC_COMM_WORLD but only has
entries on the first process due to some layout constraints elsewhere in
the program. The nodes I'm running on should have more than enough memory
to hold the entire matrix on one process, and the error I get is not an
out-of-memory error anyway.
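
For reference, this is roughly the save path I'm using (just a sketch; the
viewer setup follows Barry's suggestion and "mat.dat" is a placeholder filename):

    PetscViewer viewer;
    PetscCall(PetscViewerBinaryOpen(PETSC_COMM_WORLD, "mat.dat", FILE_MODE_WRITE, &viewer));
    PetscCall(PetscViewerPushFormat(viewer, PETSC_VIEWER_NATIVE)); /* dense on disk, no CSR conversion */
    PetscCall(MatView(A, viewer));
    PetscCall(PetscViewerPopFormat(viewer));
    PetscCall(PetscViewerDestroy(&viewer));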

Let me know if you aren't able to build the example.

I noticed that if I fully distributed the matrix over all processes, the save
worked fine. Is there some way to redistribute it after I create the matrix
but before saving it?
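
In case it helps to make the question concrete, below is a rough sketch of the
kind of redistribution I have in mind, assuming a dense A whose rows all sit on
rank 0; the helper name and chunk size are made up, and I haven't verified this
is the recommended approach:

    /* Hypothetical helper: copy a dense Mat A whose rows all live on rank 0
       into a new dense Mat B with the default PETSc row distribution. */
    static PetscErrorCode RedistributeDense(Mat A, Mat *B)
    {
      MPI_Comm           comm;
      PetscInt           M, N, rstart, rend, lda;
      const PetscScalar *a;
      PetscScalar       *rowbuf;
      PetscInt          *cols;
      const PetscInt     chunk = 1024; /* rows between flushes; arbitrary */

      PetscFunctionBeginUser;
      PetscCall(PetscObjectGetComm((PetscObject)A, &comm));
      PetscCall(MatGetSize(A, &M, &N));
      /* PETSC_DECIDE local sizes: PETSc spreads the rows over all ranks */
      PetscCall(MatCreateDense(comm, PETSC_DECIDE, PETSC_DECIDE, M, N, NULL, B));

      PetscCall(MatGetOwnershipRange(A, &rstart, &rend)); /* [0,M) on rank 0, empty elsewhere */
      PetscCall(MatDenseGetArrayRead(A, &a));
      PetscCall(MatDenseGetLDA(A, &lda));

      PetscCall(PetscMalloc2(N, &rowbuf, N, &cols));
      for (PetscInt j = 0; j < N; j++) cols[j] = j;

      /* Loop over global row chunks so every rank makes the (collective) flush
         calls, and the off-process stash never holds the whole matrix at once. */
      for (PetscInt start = 0; start < M; start += chunk) {
        PetscInt end = PetscMin(start + chunk, M);
        for (PetscInt i = PetscMax(start, rstart); i < PetscMin(end, rend); i++) {
          /* local dense storage is column-major: entry (i,j) is a[(i-rstart) + j*lda] */
          for (PetscInt j = 0; j < N; j++) rowbuf[j] = a[(i - rstart) + j * lda];
          PetscCall(MatSetValues(*B, 1, &i, N, cols, rowbuf, INSERT_VALUES));
        }
        PetscCall(MatAssemblyBegin(*B, MAT_FLUSH_ASSEMBLY));
        PetscCall(MatAssemblyEnd(*B, MAT_FLUSH_ASSEMBLY));
      }

      PetscCall(PetscFree2(rowbuf, cols));
      PetscCall(MatDenseRestoreArrayRead(A, &a));
      PetscCall(MatAssemblyBegin(*B, MAT_FINAL_ASSEMBLY));
      PetscCall(MatAssemblyEnd(*B, MAT_FINAL_ASSEMBLY));
      PetscFunctionReturn(PETSC_SUCCESS);
    }

After B is assembled, MatView(B, viewer) with PETSC_VIEWER_NATIVE as above
would presumably write it with the distributed layout.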

On Mon, Aug 5, 2024 at 1:19 PM Barry Smith <bsmith at petsc.dev> wrote:

>
>    By default PETSc MatView() to a binary viewer uses the "standard"
> compressed sparse storage format. This is not efficient (or reasonable) for
> dense matrices and
> produces issues with integer overflow.
>
>    To store a dense matrix as dense on disk, use the PetscViewerFormat
> of PETSC_VIEWER_NATIVE. So for example
>
>    PetscViewerPushFormat(viewer,PETSC_VIEWER_NATIVE);
>    MatView(mat, viewer);
>    PetscViewerPopFormat(viewer);
>
>
> On Aug 5, 2024, at 1:10 PM, Sreeram R Venkat <srvenkat at utexas.edu> wrote:
>
> I have a large dense matrix (size ranging from 5e4 to 1e5) that arises as
> a result of doing MatComputeOperator() on a MatShell. When the total number
> of nonzeros exceeds the 32-bit integer limit, I get an error (MPI buffer
> size too big) when trying to do MatView() on this to save to binary. Is
> there a way I can save this matrix to load again for later use?
>
> The other thing I tried was to save each column as a separate dataset in
> an HDF5 file. Then, I tried to load this in Python, combine them into a
> NumPy array, and create/save a dense matrix with petsc4py. I was able to
> create the dense Mat, but the MatView() once again resulted in an error
> (out of memory).
>
> Thanks,
> Sreeram
>
>
>

