Q0) Does -memory_view trace GPU memory as well, or is there another way to query the peak device memory allocation?

Q1) I'm loading an aijcusparse matrix with MatLoad and running with -ksp_type fgmres -pc_type gamg -mg_levels_pc_type asm. The matrix has 27,142,948 rows and columns, block size 4, and 759,709,392 total nonzeros. I'm using 8 MPI ranks on 8 x 80 GB GPUs, and during the setup phase, before the run crashes with CUSPARSE_STATUS_INSUFFICIENT_RESOURCES, nvidia-smi shows the output pasted below.

GPU memory usage on most ranks spans roughly 36 GB to 50 GB, but one rank is at 77 GB. Is this expected? Do I need to repartition the matrix manually somehow? A minimal sketch of how the matrix is loaded and the solver is set up follows the nvidia-smi output.

Thanks,
Mark


+-----------------------------------------------------------------------------+
| Processes:                                                                  |
|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
|        ID   ID                                                   Usage      |
|=============================================================================|
|    0   N/A  N/A   1630309      C   nvidia-cuda-mps-server             27MiB |
|    0   N/A  N/A   1696543      C   ./petsc_solver_test             38407MiB |
|    0   N/A  N/A   1696544      C   ./petsc_solver_test               467MiB |
|    0   N/A  N/A   1696545      C   ./petsc_solver_test               467MiB |
|    0   N/A  N/A   1696546      C   ./petsc_solver_test               467MiB |
|    0   N/A  N/A   1696548      C   ./petsc_solver_test               467MiB |
|    0   N/A  N/A   1696550      C   ./petsc_solver_test               471MiB |
|    0   N/A  N/A   1696551      C   ./petsc_solver_test               467MiB |
|    0   N/A  N/A   1696552      C   ./petsc_solver_test               467MiB |
|    1   N/A  N/A   1630309      C   nvidia-cuda-mps-server             27MiB |
|    1   N/A  N/A   1696544      C   ./petsc_solver_test             35849MiB |
|    2   N/A  N/A   1630309      C   nvidia-cuda-mps-server             27MiB |
|    2   N/A  N/A   1696545      C   ./petsc_solver_test             36719MiB |
|    3   N/A  N/A   1630309      C   nvidia-cuda-mps-server             27MiB |
|    3   N/A  N/A   1696546      C   ./petsc_solver_test             37343MiB |
|    4   N/A  N/A   1630309      C   nvidia-cuda-mps-server             27MiB |
|    4   N/A  N/A   1696548      C   ./petsc_solver_test             36935MiB |
|    5   N/A  N/A   1630309      C   nvidia-cuda-mps-server             27MiB |
|    5   N/A  N/A   1696550      C   ./petsc_solver_test             49953MiB |
|    6   N/A  N/A   1630309      C   nvidia-cuda-mps-server             27MiB |
|    6   N/A  N/A   1696551      C   ./petsc_solver_test             47693MiB |
|    7   N/A  N/A   1630309      C   nvidia-cuda-mps-server             27MiB |
|    7   N/A  N/A   1696552      C   ./petsc_solver_test             77331MiB |
+-----------------------------------------------------------------------------+
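
For reference, the relevant part of the test driver is roughly equivalent to the sketch below. It is simplified: the binary file name "matrix.dat" and the all-ones right-hand side are placeholders, and the cudaMemGetInfo call only reports a per-rank snapshot of current device usage after setup, not the peak (which is what Q0 is really asking about).

#include <petscksp.h>
#include <cuda_runtime.h>

int main(int argc, char **argv)
{
  Mat         A;
  Vec         x, b;
  KSP         ksp;
  PetscViewer viewer;
  size_t      free_b, total_b;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));

  /* Load the matrix from a PETSc binary file directly as AIJCUSPARSE.
     "matrix.dat" is a placeholder for the actual file name. */
  PetscCall(PetscViewerBinaryOpen(PETSC_COMM_WORLD, "matrix.dat", FILE_MODE_READ, &viewer));
  PetscCall(MatCreate(PETSC_COMM_WORLD, &A));
  PetscCall(MatSetType(A, MATAIJCUSPARSE));
  PetscCall(MatLoad(A, viewer));
  PetscCall(PetscViewerDestroy(&viewer));

  /* Placeholder right-hand side; the real driver reads its own RHS. */
  PetscCall(MatCreateVecs(A, &x, &b));
  PetscCall(VecSet(b, 1.0));

  /* -ksp_type fgmres -pc_type gamg -mg_levels_pc_type asm are picked up
     from the command line via KSPSetFromOptions. */
  PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
  PetscCall(KSPSetOperators(ksp, A, A));
  PetscCall(KSPSetFromOptions(ksp));
  PetscCall(KSPSetUp(ksp)); /* the crash happens during this setup phase */

  /* Instantaneous per-rank device memory snapshot (current usage, not peak). */
  cudaMemGetInfo(&free_b, &total_b);
  PetscCall(PetscSynchronizedPrintf(PETSC_COMM_WORLD, "device memory in use: %.1f GiB of %.1f GiB\n",
                                    (double)(total_b - free_b) / 1073741824.0,
                                    (double)total_b / 1073741824.0));
  PetscCall(PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT));

  PetscCall(KSPSolve(ksp, b, x));

  PetscCall(KSPDestroy(&ksp));
  PetscCall(VecDestroy(&x));
  PetscCall(VecDestroy(&b));
  PetscCall(MatDestroy(&A));
  PetscCall(PetscFinalize());
  return 0;
}

It is launched as:

  mpiexec -n 8 ./petsc_solver_test -ksp_type fgmres -pc_type gamg -mg_levels_pc_type asm -memory_view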