[petsc-users] Read/Write large dense matrix
Matthew Knepley
knepley at gmail.com
Mon Aug 5 12:25:27 CDT 2024
On Mon, Aug 5, 2024 at 1:10 PM Sreeram R Venkat <srvenkat at utexas.edu> wrote:
> I have a large dense matrix (size ranging from 5e4 to 1e5) that arises as
> a result of doing MatComputeOperator() on a MatShell. When the total number
> of nonzeros exceeds the 32 bit integer value, I get an error (MPI buffer
> size too big) when trying to do MatView() on this to save to binary. Is
> there a way I can save this matrix to load again for later use?
>
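[For context, the overflow is easy to see with quick arithmetic: at the sizes mentioned, a dense matrix has far more entries than a signed 32-bit integer can count. This check is illustrative and not part of the original thread:

```python
# A dense n x n matrix stores n*n values; PETSc counts them with PetscInt,
# which is a 32-bit signed integer by default.
n = 100_000              # upper end of the sizes mentioned (1e5)
nnz = n * n              # 10^10 entries for a dense matrix
int32_max = 2**31 - 1    # largest value a 32-bit signed integer can hold
print(nnz > int32_max)   # True: the default 32-bit indices overflow
```

Even the lower end (5e4 x 5e4 = 2.5e9 entries) already exceeds 2^31 - 1, so the failure is expected for the whole size range.]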
I think you need to reconfigure with --with-64-bit-indices.
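A reconfigure might look like the following; the flags besides --with-64-bit-indices are illustrative placeholders, and you would normally reuse your existing configure options:

```shell
# Rebuild PETSc with 64-bit PetscInt so index/nonzero counts can exceed 2^31-1.
cd "$PETSC_DIR"
./configure --with-64-bit-indices \
            --with-debugging=0        # example extra option; keep your own set
make all
```

Note that petsc4py (and any application code) must be rebuilt against the 64-bit-index PETSc as well.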
Thanks,
Matt
> The other thing I tried was to save each column as a separate dataset in
> an hdf5 file. Then, I tried to load this in python, combine them to an np
> array, and then create/save a dense matrix with petsc4py. I was able to
> create the dense Mat, but the MatView() once again resulted in an error
> (out of memory).
>
> Thanks,
> Sreeram
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/