[petsc-users] MatAssemblyBegin freezes during MPI communication
Matthew Knepley
knepley at gmail.com
Thu Jan 18 06:46:44 CST 2024
On Thu, Jan 18, 2024 at 2:47 AM 袁煕 <yuanxi at advancesoft.jp> wrote:
> Dear PETSc Experts,
>
> My FEM program generally works well, but in some specific cases where
> multiple CPUs are used, it freezes when calling MatAssemblyBegin, inside
> which PMPI_Allreduce is called (see the attached file).
>
> After some investigation, I found that it is most probably due to
>
> ・ MatSetValue is not called from all CPUs before MatAssemblyBegin
>
> For example, when 4 CPUs are used, if there are elements on CPUs 0, 1, and 2
> but no elements on CPU 3, then all CPUs other than CPU 3 would call the
> MatSetValue function. I want to know:
>
> 1. Could my conjecture be right? And if so,
>
No, you do not have to call MatSetValue() from all processes.
> 2. Are there any convenient means to avoid this problem?
>
Are you calling MatAssemblyBegin() (and MatAssemblyEnd()) from all processes? This is necessary, since these are collective operations.
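To illustrate the point, here is a minimal sketch (the matrix size, the
inserted values, and the "rank 3 owns no elements" condition are all
illustrative assumptions, not taken from the original program): ranks with no
elements simply skip MatSetValue(), but every rank still makes the collective
MatAssemblyBegin()/MatAssemblyEnd() calls.

```c
/* Sketch: MatSetValue() may be skipped on ranks with nothing to insert,
 * but assembly is collective and must run on every rank. Requires PETSc. */
#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat         A;
  PetscMPIInt rank;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCallMPI(MPI_Comm_rank(PETSC_COMM_WORLD, &rank));

  PetscCall(MatCreate(PETSC_COMM_WORLD, &A));
  PetscCall(MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, 100, 100));
  PetscCall(MatSetFromOptions(A));
  PetscCall(MatSetUp(A));

  if (rank != 3) { /* pretend rank 3 owns no elements */
    PetscCall(MatSetValue(A, rank, rank, 1.0, ADD_VALUES));
  }

  /* Collective: called on ALL ranks, including those that set nothing.
   * Omitting this on any rank hangs the others inside the assembly's
   * internal MPI communication. */
  PetscCall(MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY));

  PetscCall(MatDestroy(&A));
  PetscCall(PetscFinalize());
  return 0;
}
```

The described freeze matches the symptom of a rank skipping the collective
assembly calls: the remaining ranks block in the Allreduce inside
MatAssemblyBegin() waiting for the missing participant.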
Thanks,
Matt
> Thanks,
> Xi YUAN, PhD Solid Mechanics
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/