[petsc-users] Status of PETScSF failures with GPU-aware MPI on Perlmutter

Sajid Ali sajidsyed2021 at u.northwestern.edu
Thu Nov 2 15:36:59 CDT 2023


Hi PETSc-developers,

A while ago I posted about crashes within PETScSF when using GPU-aware
MPI on Perlmutter (
https://lists.mcs.anl.gov/mailman/htdig/petsc-users/2022-February/045585.html).
Now that the software stacks have stabilized, I was wondering whether a
fix is available, as I am still observing similar crashes.
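For context, here is a minimal sketch of the kind of communication
involved (a hypothetical illustration, not the actual reproducer behind
the attached trace; it assumes a CUDA-enabled PETSc build). A VecScatter
on a VECCUDA vector routes through PetscSF, and with GPU-aware MPI
enabled the buffers handed to MPI are device pointers:

#include <petsc.h>

int main(int argc, char **argv)
{
  Vec        x, y;
  VecScatter sct;
  PetscInt   n = 1 << 20; /* local length per rank (arbitrary) */

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCall(VecCreate(PETSC_COMM_WORLD, &x));
  PetscCall(VecSetSizes(x, n, PETSC_DECIDE));
  PetscCall(VecSetType(x, VECCUDA)); /* device-resident vector */
  PetscCall(VecSet(x, 1.0));
  /* Gather the distributed vector to every rank; internally this is a
     PetscSF broadcast, so with GPU-aware MPI the send/receive buffers
     passed to MPI are device pointers. */
  PetscCall(VecScatterCreateToAll(x, &sct, &y));
  PetscCall(VecScatterBegin(sct, x, y, INSERT_VALUES, SCATTER_FORWARD));
  PetscCall(VecScatterEnd(sct, x, y, INSERT_VALUES, SCATTER_FORWARD));
  PetscCall(VecScatterDestroy(&sct));
  PetscCall(VecDestroy(&x));
  PetscCall(VecDestroy(&y));
  PetscCall(PetscFinalize());
  return 0;
}

Runs like this exercise the PetscSF device-buffer path when GPU-aware
MPI is enabled (-use_gpu_aware_mpi 1); running with -use_gpu_aware_mpi 0
stages the data through host memory instead, which avoids that path.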

I am attaching the trace of the latest crash (with PETSc-3.20.0) for
reference.

Thank You,
Sajid Ali (he/him) | Research Associate
Data Science, Simulation, and Learning Division
Fermi National Accelerator Laboratory
s-sajid-ali.github.io
Attachment: 2_gpu_crash (application/octet-stream, 11301 bytes)
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20231102/59fb8108/attachment-0001.obj>

