[petsc-users] MatView to file
Mark Adams
mfadams at lbl.gov
Fri Aug 12 14:42:23 CDT 2022
With 4 million nonzero elements you are nowhere near the 32-bit integer limit of 2B,
or 32 GB of memory.
See https://petsc.org/main/docs/manualpages/Mat/MatView
You should use the binary format when writing out large matrices.
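For example, something along these lines (an untested sketch; "mat-par-aux.dat" is
just a placeholder filename) writes the matrix in PETSc binary format, and you can
read it back later with MatLoad():

PetscViewer viewer;
PetscCall(PetscViewerBinaryOpen(PETSC_COMM_WORLD, "mat-par-aux.dat", FILE_MODE_WRITE, &viewer));
PetscCall(MatView(A, viewer));            /* dump A in PETSc binary format */
PetscCall(PetscViewerDestroy(&viewer));   /* flush and close the file */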
Mark
On Fri, Aug 12, 2022 at 1:00 PM Alfredo Jaramillo <ajaramillopalma at gmail.com>
wrote:
> Hello Mark,
> Thank you, I added the lines that you sent.
> This only happens when running the code with more than 1 process. With
> only 1 MPI process the matrix is printed out.
> With 2 or more processes the program keeps allocating RAM until it exceeds
> the machine's capacity (32 GB), so I wasn't able to get a stack trace.
>
> However, I was able to reproduce the problem by compiling
> src/ksp/ksp/tutorials/ex54.c
> <https://petsc.org/release/src/ksp/ksp/tutorials/ex54.c.html> (modifying
> line 144) and running it with
>
> mpirun -np 2 ex54 -ne 1000
>
> This gives a sparse matrix of order ~1 million. When running ex54 with
> only one MPI process I don't observe this excessive allocation and the
> matrix is printed out.
>
> Thanks,
> Alfredo
>
>
>
> On Fri, Aug 12, 2022 at 10:02 AM Mark Adams <mfadams at lbl.gov> wrote:
>
>> You also want:
>>
>> PetscCall(PetscViewerPopFormat(viewer));
>> PetscCall(PetscViewerDestroy(&viewer));
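>>
>> Putting it together, the whole dump block would look something like this
>> (a rough sketch, keeping your filename):
>>
>> if (dump_mat) {
>>   PetscViewer viewer;
>>   PetscCall(PetscViewerASCIIOpen(PETSC_COMM_WORLD, "mat-par-aux.m", &viewer));
>>   PetscCall(PetscViewerPushFormat(viewer, PETSC_VIEWER_ASCII_MATLAB));
>>   PetscCall(MatView(A, viewer));
>>   PetscCall(PetscViewerPopFormat(viewer));   /* undo the format push */
>>   PetscCall(PetscViewerDestroy(&viewer));    /* flush and close the file */
>> }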
>>
>> This should not be a problem.
>> If this is a segv and you configured with '--with-debugging=1', you
>> should get a stack trace, which would help immensely.
>> Or run in a debugger to get a stack trace.
>>
>> Thanks,
>> Mark
>>
>>
>>
>> On Fri, Aug 12, 2022 at 11:26 AM Alfredo Jaramillo <
>> ajaramillopalma at gmail.com> wrote:
>>
>>> Dear developers,
>>>
>>> I'm writing a sparse matrix into a file by doing
>>>
>>> if (dump_mat) {
>>>   PetscViewer viewer;
>>>   PetscViewerASCIIOpen(PETSC_COMM_WORLD, "mat-par-aux.m", &viewer);
>>>   PetscViewerPushFormat(viewer, PETSC_VIEWER_ASCII_MATLAB);
>>>   MatView(A, viewer);
>>> }
>>>
>>> This works perfectly for small cases.
>>> The program crashes for a case where the matrix A is of order 1 million
>>> but with only 4 million non-zero elements.
>>>
>>> Could it be that at some point PETSc is expanding A into a full (dense) matrix?
>>>
>>> Thank you,
>>> Alfredo
>>>
>>