[petsc-users] MatAssemblyBegin freezes during MPI communication

袁煕 yuanxi at advancesoft.jp
Thu Jan 18 01:46:47 CST 2024


Dear PETSc Experts,

My FEM program generally works well, but in some specific cases where
multiple CPUs are used, it freezes when calling MatAssemblyBegin, inside
which PMPI_Allreduce is called (see attached file).

After some investigation, I found that it is most probably because

・ MatSetValue is not called from all CPUs before MatAssemblyBegin

For example, when 4 CPUs are used, if there are elements on CPUs 0, 1, and 2
but no elements on CPU 3, then all CPUs other than CPU 3 call the
MatSetValue function. I would like to know:

1. Could my conjecture be right? And if so,
2. Is there a convenient way to avoid this problem?
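For reference, the pattern I believe is required can be sketched as follows. This is a minimal hypothetical example, not my actual program: the matrix size, the ADD_VALUES insertion, and the PetscCall error-checking macros (available in recent PETSc releases) are assumptions for illustration. The point is that MatSetValue is not collective, so a rank with no elements may skip it, but MatAssemblyBegin/MatAssemblyEnd are collective and must be reached by every rank.

```c
/* Hypothetical standalone sketch: ranks with no elements skip
 * MatSetValue, but ALL ranks still call MatAssemblyBegin/End. */
#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat         A;
  PetscMPIInt rank;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCallMPI(MPI_Comm_rank(PETSC_COMM_WORLD, &rank));

  PetscCall(MatCreate(PETSC_COMM_WORLD, &A));
  PetscCall(MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, 8, 8)); /* size is arbitrary */
  PetscCall(MatSetFromOptions(A));
  PetscCall(MatSetUp(A));

  /* MatSetValue is NOT collective: it is fine for a rank with no
   * elements (e.g. rank 3 in the 4-CPU case) to never call it. */
  if (rank < 3) PetscCall(MatSetValue(A, rank, rank, 1.0, ADD_VALUES));

  /* MatAssemblyBegin/End ARE collective: if any rank skips them,
   * the MPI_Allreduce inside MatAssemblyBegin blocks forever. */
  PetscCall(MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY));

  PetscCall(MatDestroy(&A));
  PetscCall(PetscFinalize());
  return 0;
}
```

Run with, e.g., `mpiexec -n 4 ./a.out`; the guard on rank shows that skipping MatSetValue on one rank is harmless as long as the assembly calls remain collective.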

Thanks,
Xi YUAN, PhD Solid Mechanics
-------------- next part --------------
A non-text attachment was scrubbed...
Name: aa.PNG
Type: image/png
Size: 112520 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20240118/5bbcceff/attachment-0001.png>
