[petsc-users] MatView to file

Barry Smith bsmith at petsc.dev
Fri Aug 12 17:13:49 CDT 2022



> On Aug 12, 2022, at 3:50 PM, Pierre Jolivet <pierre at joliv.et> wrote:
> 
> 
>> On 12 Aug 2022, at 9:47 PM, Alfredo Jaramillo <ajaramillopalma at gmail.com> wrote:
>> 
>> Hello Mark,
>> But why should this depend on the number of processes?
> 
> Because with non-binary formats, the matrix is centralized on the first process, which can become very costly.

Correct. We expect one to use a binary format for anything but trivially sized matrices; we don't consider ASCII a reasonable format for large matrices.
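
For example, a minimal sketch of the binary equivalent of the ASCII snippet below (the file name is a placeholder; error checking via PetscCall as used elsewhere in this thread):

    PetscViewer viewer;
    PetscCall(PetscViewerBinaryOpen(PETSC_COMM_WORLD, "mat-par-aux.dat", FILE_MODE_WRITE, &viewer));
    PetscCall(MatView(A, viewer));           /* binary MatView does not gather the whole matrix on one rank */
    PetscCall(PetscViewerDestroy(&viewer));  /* flush and close the file */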


> 
> Thanks,
> Pierre
> 
>> thanks
>> Alfredo
>> 
>> On Fri, Aug 12, 2022 at 1:42 PM Mark Adams <mfadams at lbl.gov> wrote:
>> With 4 million elements you are nowhere near the 32-bit integer limit of ~2B, or 32GB of memory.
>> 
>> See the MatView manual page: https://petsc.org/main/docs/manualpages/Mat/MatView
>> You should go to binary format when doing large matrices.
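>> 
>> For completeness, a minimal sketch of loading such a binary file back into a Mat (assuming the file was written by MatView with a binary viewer):
>> 
>>     Mat B;
>>     PetscViewer viewer;
>>     PetscCall(PetscViewerBinaryOpen(PETSC_COMM_WORLD, "mat-par-aux.dat", FILE_MODE_READ, &viewer));
>>     PetscCall(MatCreate(PETSC_COMM_WORLD, &B));
>>     PetscCall(MatLoad(B, viewer));   /* distributes the rows across the ranks of the communicator */
>>     PetscCall(PetscViewerDestroy(&viewer));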
>> 
>> Mark
>> 
>> On Fri, Aug 12, 2022 at 1:00 PM Alfredo Jaramillo <ajaramillopalma at gmail.com> wrote:
>> Hello Mark,
>> Thank you, I added the lines that you sent.
>> This only happens when running the code with more than 1 process. With only 1 MPI process the matrix is printed out.
>> With 2 or more processes, the program allocates RAM until it exceeds the machine's capacity (32GB), so I wasn't able to get a stack trace.
>> 
>> However, I was able to reproduce the problem by compiling src/ksp/ksp/tutorials/ex54.c (https://petsc.org/release/src/ksp/ksp/tutorials/ex54.c.html), modifying line 144, and running it with
>> 
>> mpirun -np 2 ex54 -ne 1000
>> 
>> This gives a sparse matrix of order ~1 million. When running ex54 with only one MPI process I don't observe this excessive allocation and the matrix is printed out.
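>> 
>> (Presumably the same dump could also be requested in binary from the command line, without modifying the source, via the standard viewer options, assuming they apply to this matrix, e.g.:
>> 
>>     mpirun -np 2 ex54 -ne 1000 -mat_view binary:mat-ex54.dat
>> 
>> but I modified the source directly.)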
>> 
>> Thanks,
>> Alfredo
>> 
>> 
>> 
>> On Fri, Aug 12, 2022 at 10:02 AM Mark Adams <mfadams at lbl.gov> wrote:
>> You also want:
>> 
>>     PetscCall(PetscViewerPopFormat(viewer));
>>     PetscCall(PetscViewerDestroy(&viewer));
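>> 
>> i.e., the full sequence would look something like this (a sketch, with PetscCall error checking added):
>> 
>>     if (dump_mat) {
>>       PetscViewer viewer;
>>       PetscCall(PetscViewerASCIIOpen(PETSC_COMM_WORLD, "mat-par-aux.m", &viewer));
>>       PetscCall(PetscViewerPushFormat(viewer, PETSC_VIEWER_ASCII_MATLAB));
>>       PetscCall(MatView(A, viewer));
>>       PetscCall(PetscViewerPopFormat(viewer));   /* undo the push */
>>       PetscCall(PetscViewerDestroy(&viewer));    /* flush and close the file */
>>     }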
>> 
>> This should not be a problem.
>> If this is a segv and you configured PETSc with '--with-debugging=1', you should get a stack trace, which would help immensely.
>> Or run in a debugger to get a stack trace.
>> 
>> Thanks,
>> Mark
>> 
>> 
>> On Fri, Aug 12, 2022 at 11:26 AM Alfredo Jaramillo <ajaramillopalma at gmail.com> wrote:
>> Dear developers,
>> 
>> I'm writing a sparse matrix into a file by doing 
>> 
>>     if (dump_mat) {
>>         PetscViewer viewer;
>>         PetscViewerASCIIOpen(PETSC_COMM_WORLD,"mat-par-aux.m",&viewer);
>>         PetscViewerPushFormat(viewer, PETSC_VIEWER_ASCII_MATLAB);
>>         MatView(A,viewer);
>>     }
>> 
>> This works perfectly for small cases.
>> The program crashes for a case where the matrix A is of order 1 million but with only 4 million non-zero elements.
>> 
>> Maybe at some point PETSc is expanding A to full (dense) size?
>> 
>> Thank you,
>> Alfredo
> 
