[petsc-users] Read/Write large dense matrix

Matthew Knepley knepley at gmail.com
Mon Aug 5 12:40:43 CDT 2024


On Mon, Aug 5, 2024 at 1:26 PM Sreeram R Venkat <srvenkat at utexas.edu> wrote:

> I do have 64-bit indices turned on. The problem, I think, is that
> PetscMPIInt is always a 32-bit int, and that's what's overflowing.
>
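The overflow is easy to see from the sizes in the thread: a square dense matrix with side length 5e4 already holds 2.5e9 entries, past what a 32-bit signed int (the width of PetscMPIInt) can represent. A quick plain-Python sketch of the arithmetic, using the matrix sizes mentioned above:

```python
# Entry counts for the dense matrix sizes discussed in the thread.
INT32_MAX = 2**31 - 1  # largest value a 32-bit signed int (PetscMPIInt) can hold

for n in (50_000, 100_000):  # side lengths from 5e4 to 1e5
    nnz = n * n  # a dense n x n matrix stores n^2 values
    print(f"n = {n}: nnz = {nnz}, overflows 32-bit int: {nnz > INT32_MAX}")
```

Every size in the stated range overflows, so any code path that stuffs the total count into a PetscMPIInt will fail.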

We should be using the large-count support from MPI. However, it appears we
missed it somewhere. Would it be possible to construct a simple example that
I can run to find the error? You should be able to just create a dense
matrix of zeros with the correct size.

  Thanks,

      Matt
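A minimal reproducer along these lines might look like the following. This is an untested sketch, assuming a PETSc build configured with --with-64-bit-indices and a recent enough PETSc for PetscCall(); the filename "dense.bin" and the side length are arbitrary:

```c
/* Sketch of a reproducer: create a dense matrix of zeros whose total
 * entry count (n*n = 2.5e9) exceeds 32-bit range, then MatView() it to
 * a binary viewer, which is where the "MPI buffer size too big" error
 * was reported. Untested; assumes --with-64-bit-indices. */
#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat         A;
  PetscViewer viewer;
  PetscInt    n = 50000; /* n*n = 2.5e9 > 2^31 - 1 */

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  /* Dense matrix of zeros: pass NULL so PETSc allocates (zeroed) storage */
  PetscCall(MatCreateDense(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, n, n, NULL, &A));
  PetscCall(MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY));
  /* Write to binary; this is the call that should trigger the overflow */
  PetscCall(PetscViewerBinaryOpen(PETSC_COMM_WORLD, "dense.bin", FILE_MODE_WRITE, &viewer));
  PetscCall(MatView(A, viewer));
  PetscCall(PetscViewerDestroy(&viewer));
  PetscCall(MatDestroy(&A));
  PetscCall(PetscFinalize());
  return 0;
}
```

Note the run will need enough aggregate memory for the 2.5e9 scalars (about 20 GB in double precision), so it is best tried on several nodes.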


> On Mon, Aug 5, 2024 at 12:25 PM Matthew Knepley <knepley at gmail.com> wrote:
>
>> On Mon, Aug 5, 2024 at 1:10 PM Sreeram R Venkat <srvenkat at utexas.edu>
>> wrote:
>>
>>> I have a large dense matrix (side length ranging from 5e4 to 1e5) that
>>> arises as a result of doing MatComputeOperator() on a MatShell. When the
>>> total number of nonzeros exceeds the 32-bit integer limit, I get an error
>>> (MPI buffer size too big) when trying to do MatView() on this to save to
>>> binary. Is there a way I can save this matrix to load again for later use?
>>>
>>
>> I think you need to reconfigure with --with-64-bit-indices.
>>
>>   Thanks,
>>
>>      Matt
>>
>>
>>> The other thing I tried was to save each column as a separate dataset in
>>> an HDF5 file. Then I tried to load this in Python, combine the columns
>>> into a NumPy array, and create/save a dense matrix with petsc4py. I was
>>> able to create the dense Mat, but the MatView() once again resulted in an
>>> error (out of memory).
>>>
>>> Thanks,
>>> Sreeram
>>>
>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/
>>
>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/

