[petsc-users] MatAssemblyBegin freezes during MPI communication

袁煕 yuanxi at advancesoft.jp
Thu Jan 18 20:17:41 CST 2024


Thanks for your explanation.

It turns out that it was caused by my calling MatDiagonalSet() before
MatAssemblyBegin(). The problem is resolved by moving MatDiagonalSet()
to after MatAssemblyBegin().
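
For the record, here is a minimal sketch of the ordering that now works for me
(a toy 100x100 matrix, not my actual FEM code): every rank calls the collective
MatAssemblyBegin()/MatAssemblyEnd() pair even if it inserted no values, and
MatDiagonalSet() is applied only to the assembled matrix.

/* Sketch only: toy problem, error handling via PetscCall(). */
#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat      A;
  Vec      D;
  PetscInt rstart, rend, i;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));

  PetscCall(MatCreate(PETSC_COMM_WORLD, &A));
  PetscCall(MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, 100, 100));
  PetscCall(MatSetFromOptions(A));
  PetscCall(MatSetUp(A));
  PetscCall(MatGetOwnershipRange(A, &rstart, &rend));

  /* Each rank inserts only the values it owns; a rank may insert nothing. */
  for (i = rstart; i < rend; i++) {
    PetscCall(MatSetValue(A, i, i, 1.0, ADD_VALUES));
  }

  /* Collective: all ranks must call these, even ranks that set no values. */
  PetscCall(MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY));

  /* Modify the diagonal only after the matrix is assembled. */
  PetscCall(MatCreateVecs(A, &D, NULL));
  PetscCall(VecSet(D, 10.0));
  PetscCall(MatDiagonalSet(A, D, ADD_VALUES));

  PetscCall(VecDestroy(&D));
  PetscCall(MatDestroy(&A));
  PetscCall(PetscFinalize());
  return 0;
}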

Many thanks for your help.
Xi YUAN, PhD Solid Mechanics

On Thu, Jan 18, 2024 at 22:20, Junchao Zhang <junchao.zhang at gmail.com> wrote:

>
> On Thu, Jan 18, 2024 at 1:47 AM 袁煕 <yuanxi at advancesoft.jp> wrote:
>
>> Dear PETSc Experts,
>>
>> My FEM program generally works well, but in some specific cases where
>> multiple CPUs are used, it freezes when calling MatAssemblyBegin, where
>> PMPI_Allreduce is called (see attached file).
>>
>> After some investigation, I found that it is most probably due to
>>
>> - MatSetValue is not called from all CPUs before MatAssemblyBegin
>>
>> For example, when 4 CPUs are used, if there are elements on CPUs 0, 1, and 2
>> but no elements on CPU 3, then all CPUs other than CPU 3 would call the
>> MatSetValue function. I want to know:
>>
>> 1. Could my conjecture be right? And if so,
>>
> No.  All processes do an MPI_Allreduce to learn whether there are incoming
> values set by other processes.  To find out why it hangs, you can attach gdb
> to all the MPI processes and see where they are.
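>
> For example, when the run hangs you can find each rank's PID (e.g. with ps),
> run "gdb -p <pid>" on it, and type "bt" to see where that rank is stuck; the
> -start_in_debugger runtime option can also launch a debugger on every rank.
> As a generic, non-PETSc-specific sketch, each rank can print its PID at
> startup so you know which process to attach to:
>
> /* Sketch: print the rank -> PID mapping so hung ranks are easy to find
>    and attach to with "gdb -p <pid>". */
> #include <mpi.h>
> #include <stdio.h>
> #include <unistd.h>
>
> int main(int argc, char **argv)
> {
>   int rank;
>   MPI_Init(&argc, &argv);
>   MPI_Comm_rank(MPI_COMM_WORLD, &rank);
>   printf("rank %d is pid %d\n", rank, (int)getpid());
>   fflush(stdout);
>   /* ... real work (e.g. the MatAssemblyBegin that hangs) goes here ... */
>   MPI_Finalize();
>   return 0;
> }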
>
>>
>> 2. Are there any convenient means to avoid this problem?
>>
>> Thanks,
>> Xi YUAN, PhD Solid Mechanics
>>
>