[petsc-users] Read/Write large dense matrix

Barry Smith bsmith at petsc.dev
Mon Aug 5 13:19:37 CDT 2024


   By default, PETSc's MatView() to a binary viewer uses the "standard" compressed sparse storage format. This is not efficient (or reasonable) for dense matrices and produces integer overflow issues.

   To store a dense matrix as dense on disk, use the PetscViewerFormat PETSC_VIEWER_NATIVE. For example:

   PetscCall(PetscViewerPushFormat(viewer, PETSC_VIEWER_NATIVE));
   PetscCall(MatView(mat, viewer));
   PetscCall(PetscViewerPopFormat(viewer));
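
   For completeness, here is a minimal sketch of the full write/read round trip. It assumes a recent PETSc with the PetscCall() error-checking macro; the file name dense.dat and the small random matrix standing in for your MatComputeOperator() result are just placeholders.

   #include <petscmat.h>

   int main(int argc, char **argv)
   {
     Mat         A, B;
     PetscViewer viewer;

     PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));

     /* Placeholder for the dense matrix produced by MatComputeOperator() */
     PetscCall(MatCreateDense(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, 8, 8, NULL, &A));
     PetscCall(MatSetRandom(A, NULL));

     /* Write A in the native dense layout instead of the default sparse format */
     PetscCall(PetscViewerBinaryOpen(PETSC_COMM_WORLD, "dense.dat", FILE_MODE_WRITE, &viewer));
     PetscCall(PetscViewerPushFormat(viewer, PETSC_VIEWER_NATIVE));
     PetscCall(MatView(A, viewer));
     PetscCall(PetscViewerPopFormat(viewer));
     PetscCall(PetscViewerDestroy(&viewer));

     /* Later: read the file back into a MATDENSE matrix */
     PetscCall(PetscViewerBinaryOpen(PETSC_COMM_WORLD, "dense.dat", FILE_MODE_READ, &viewer));
     PetscCall(MatCreate(PETSC_COMM_WORLD, &B));
     PetscCall(MatSetType(B, MATDENSE));
     PetscCall(MatLoad(B, viewer));
     PetscCall(PetscViewerDestroy(&viewer));

     PetscCall(MatDestroy(&A));
     PetscCall(MatDestroy(&B));
     PetscCall(PetscFinalize());
     return 0;
   }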


> On Aug 5, 2024, at 1:10 PM, Sreeram R Venkat <srvenkat at utexas.edu> wrote:
> 
> I have a large dense matrix (size ranging from 5e4 to 1e5) that arises as a result of doing MatComputeOperator() on a MatShell. When the total number of nonzeros exceeds the 32-bit integer limit, I get an error (MPI buffer size too big) when trying to do MatView() on this to save to binary. Is there a way I can save this matrix to load again for later use?
> 
> The other thing I tried was to save each column as a separate dataset in an HDF5 file. Then, I tried to load this in Python, combine the columns into a NumPy array, and then create/save a dense matrix with petsc4py. I was able to create the dense Mat, but MatView() once again resulted in an error (out of memory).
> 
> Thanks,
> Sreeram

