<div dir="ltr"><div>Hello Mark,</div><div>But why should this depend on the number of processes?</div><div>thanks</div><div>Alfredo<br></div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Fri, Aug 12, 2022 at 1:42 PM Mark Adams <<a href="mailto:mfadams@lbl.gov">mfadams@lbl.gov</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr">With 4 million elements you are nowhere near the 32-bit integer limit of ~2 billion entries, or 32 GB of memory.<div><br></div><div>See <a href="https://petsc.org/main/docs/manualpages/Mat/MatView" target="_blank">https://petsc.org/main/docs/manualpages/Mat/MatView</a></div><div>You should switch to the binary format for large matrices.</div><div><br></div><div>Mark</div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Fri, Aug 12, 2022 at 1:00 PM Alfredo Jaramillo <<a href="mailto:ajaramillopalma@gmail.com" target="_blank">ajaramillopalma@gmail.com</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div>Hello Mark,</div><div>Thank you, I added the lines that you sent.<br></div><div>This only happens when running the code with more than 1 process. With only 1 MPI process the matrix is printed out.</div><div>With 2 or more processes, I observed the program begin to allocate RAM until it exceeded the machine's capacity (32 GB), so I wasn't able to get a stack trace.</div><div><br></div><div>However, I was able to reproduce the problem by compiling <a href="https://petsc.org/release/src/ksp/ksp/tutorials/ex54.c.html" target="_blank">src/ksp/ksp/tutorials/ex54.c.html</a> (modifying line 144) and running it with</div><div><br></div><div>mpirun -np 2 ex54 -ne 1000</div><div><br></div><div>This gives a sparse matrix of order ~1 million.
When running ex54 with only one MPI process I don't observe this excessive allocation, and the matrix is printed out.</div><div><br></div><div>Thanks,<br></div><div>Alfredo<br></div><div><br></div><div><br></div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Fri, Aug 12, 2022 at 10:02 AM Mark Adams <<a href="mailto:mfadams@lbl.gov" target="_blank">mfadams@lbl.gov</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr">You also want:<br><br> PetscCall(PetscViewerPopFormat(viewer));<br> PetscCall(PetscViewerDestroy(&viewer));<br><br>This should not be a problem.<div>If this is a segv and you configured with '--with-debugging=1', you should get a stack trace, which would help immensely.</div><div>Or run in a debugger to get a stack trace.</div><div><br></div><div>Thanks,</div><div>Mark<br></div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Fri, Aug 12, 2022 at 11:26 AM Alfredo Jaramillo <<a href="mailto:ajaramillopalma@gmail.com" target="_blank">ajaramillopalma@gmail.com</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div>Dear developers,</div><div><br></div><div>I'm writing a sparse matrix to a file by doing<br></div><div><br></div><div> if (dump_mat) {<br> PetscViewer viewer;<br> PetscViewerASCIIOpen(PETSC_COMM_WORLD,"mat-par-aux.m",&viewer);<br> PetscViewerPushFormat(viewer, PETSC_VIEWER_ASCII_MATLAB);<br> MatView(A,viewer);<br> }</div><div><br></div><div>This works perfectly for small cases.<br></div><div>The program crashes for a case where the matrix A is of order 1 million, with only 4 million non-zero elements.</div><div><br></div><div>Maybe at some point PETSc is expanding A to its full (dense) size?</div><div><br></div><div>Thank you,</div><div>Alfredo<br></div></div>
</blockquote></div>
</blockquote></div>
</blockquote></div>
</blockquote></div>
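<div><br></div><div>For reference, a minimal sketch of the binary-format dump that Mark recommends above. This is not code from the thread: the helper name <code>DumpMatBinary</code> and the filename are illustrative, the matrix <code>A</code> is assumed to be already assembled, and PETSc must be installed. Unlike the ASCII MATLAB viewer, which serializes the whole matrix as text through one rank, the binary viewer writes in parallel-friendly binary form and scales to large matrices.</div>

```c
/* Sketch only: dump an assembled Mat in PETSc binary format.
 * Helper name and filename are illustrative, not from the thread.
 * Requires PETSc headers and libraries. */
#include <petsc.h>

static PetscErrorCode DumpMatBinary(Mat A, const char *fname)
{
  PetscViewer viewer;

  PetscFunctionBeginUser;
  /* Open a binary viewer instead of PetscViewerASCIIOpen; no
   * PushFormat is needed for the default binary representation. */
  PetscCall(PetscViewerBinaryOpen(PETSC_COMM_WORLD, fname, FILE_MODE_WRITE, &viewer));
  PetscCall(MatView(A, viewer));
  /* Always destroy the viewer to flush and close the file. */
  PetscCall(PetscViewerDestroy(&viewer));
  PetscFunctionReturn(0);
}

/* Usage (inside a program where A is assembled):
 *   if (dump_mat) PetscCall(DumpMatBinary(A, "mat-par-aux.dat"));
 */
```

<div>The resulting file can be read back into a Mat with <code>MatLoad</code>, and PETSc ships reader scripts for MATLAB and Python under <code>share/petsc/</code> for inspecting binary files offline.</div>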