[petsc-users] Parallel Matrix Causes a Deadlock
Ali Berk Kahraman
aliberkkahraman at yahoo.com
Sun Jan 28 14:16:33 CST 2018
Dear Barry,
I see what you are talking about; I have oversimplified my problem. The
real issue is that I have to call MatGetValues on the Wavelets matrix for
every row inside a loop. If I do not call the assembly routines in the
child function, I get a "not for unassembled matrix" error after the
first loop iteration. The relevant part of the code is attached to this
e-mail. You can see the MatGetValues calls on line 81, and also in the
function GetWaveletPolyFit2DSingleRow_x1 called on line 94.
Ali
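
For reference, the constraint behind that error: MatGetValues requires an assembled matrix and can only return entries from rows stored on the calling process. Below is a minimal sketch (the names are assumed, not taken from the attached file) of a read-only child routine that relies on the caller having assembled the matrix once, so it needs no assembly calls of its own.

#include <petscmat.h>

/* Sketch, not the attached code: read part of one locally owned row of an
   already-assembled matrix.  No MatAssemblyBegin/End is needed here
   because nothing is being inserted. */
PetscErrorCode ReadOwnedRow(Mat Wavelets, PetscInt row, PetscInt ncols,
                            const PetscInt cols[], PetscScalar vals[])
{
  PetscErrorCode ierr;
  PetscInt       rstart, rend;

  ierr = MatGetOwnershipRange(Wavelets, &rstart, &rend);CHKERRQ(ierr);
  if (row < rstart || row >= rend)
    SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_OUTOFRANGE,
            "MatGetValues can only access locally owned rows");
  ierr = MatGetValues(Wavelets, 1, &row, ncols, cols, vals);CHKERRQ(ierr);
  return 0;
}

If a process also needs values from rows owned elsewhere, MatGetValues alone is not enough and those entries have to be communicated by other means.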
On 28-01-2018 22:24, Smith, Barry F. wrote:
> In your code there is NO reason to call MatAssemblyBegin/End where you do. Just pull it out of the loop and call it once; I submit the same holds for any other code. Please explain enough about your code (or send it) to show why it has to call the assembly routines a different number of times on different processes. You can simply pull the assembly out so it is called once, after all the calls to MatSetValues().
>
> Barry
>
>
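
A sketch of the structure suggested above, with placeholder names rather than the ones in the attached file: every rank writes only the rows it owns, and the collective assembly pair is hoisted out of the loop so each rank calls it exactly once.

#include <petscmat.h>

/* Sketch: fill all locally owned rows, then assemble once, collectively. */
PetscErrorCode FillThenAssembleOnce(Mat Wavelets)
{
  PetscErrorCode ierr;
  PetscInt       rstart, rend, row, col = 0;
  PetscScalar    one = 1.0;

  ierr = MatGetOwnershipRange(Wavelets, &rstart, &rend);CHKERRQ(ierr);
  /* All the writes first: each rank touches only the rows it owns. */
  for (row = rstart; row < rend; row++) {
    ierr = MatSetValues(Wavelets, 1, &row, 1, &col, &one, INSERT_VALUES);CHKERRQ(ierr);
  }
  /* Hoisted out of the loop (and out of any child function): every rank
     reaches this collective pair exactly once, so the calls match. */
  ierr = MatAssemblyBegin(Wavelets, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(Wavelets, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  return 0;
}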
>> On Jan 28, 2018, at 1:15 PM, Ali Berk Kahraman <aliberkkahraman at yahoo.com> wrote:
>>
>> Hello All,
>>
>>
>> My apologies; right after closing this e-mail window, the first thing I read in the manual was "ALL processes that share a matrix MUST call MatAssemblyBegin() and MatAssemblyEnd() the SAME NUMBER of times". So I understand that PETSc simply does not support an unequal number of assembly calls across processes.
>>
>>
>> My question then becomes: I have a problem at hand where I do not know in advance how many calls each process will make to the MatAssembly routines. Any suggestions on how to make this work?
>>
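
One common way to satisfy the same-number-of-times rule when the amount of per-rank work is not known up front (a sketch under assumed names, not something taken from the attached code): agree on a global pass count with MPI_Allreduce, and let ranks that have run out of local work still take part in every assembly.

#include <petscmat.h>

/* Sketch: every rank runs the same, globally agreed number of passes.
   Ranks with no local work in a pass insert nothing but still call the
   collective assembly routines, so the call counts match everywhere. */
PetscErrorCode AssembleWithMatchedPasses(Mat A, PetscInt nlocal)
{
  PetscErrorCode ierr;
  MPI_Comm       comm;
  PetscInt       nglobal, pass;

  ierr = PetscObjectGetComm((PetscObject)A, &comm);CHKERRQ(ierr);
  ierr = MPI_Allreduce(&nlocal, &nglobal, 1, MPIU_INT, MPI_MAX, comm);CHKERRQ(ierr);
  for (pass = 0; pass < nglobal; pass++) {
    if (pass < nlocal) {
      /* ... MatSetValues() for this rank's pass-th piece of work ... */
    }
    ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  }
  return 0;
}

If nothing has to be read back between passes, it is simpler still to drop the per-pass assembly entirely and assemble once after the loop, as suggested above.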
>> On 28-01-2018 22:09, Ali Berk Kahraman wrote:
>>> Hello All,
>>>
>>>
>>> The code takes a parallel matrix and calls a function on that matrix. The function fills the specified row of the matrix with the ID of the process that owns that part of the matrix. You can see the short code in the attachment; it is about 80 lines.
>>>
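
For concreteness, a minimal sketch of a routine with the behaviour described (the names are placeholders; PetscRealProblem.c may differ): it writes the owning rank into every entry of one row and deliberately leaves the collective assembly to the caller.

#include <petscmat.h>

/* Sketch of the behaviour described above, not the attached code:
   write the owning rank into one row, on the rank that owns it. */
PetscErrorCode FillRowWithOwnerRank(Mat A, PetscInt row, PetscInt ncols)
{
  PetscErrorCode ierr;
  PetscMPIInt    rank;
  PetscInt       rstart, rend, col;

  ierr = MPI_Comm_rank(PetscObjectComm((PetscObject)A), &rank);CHKERRQ(ierr);
  ierr = MatGetOwnershipRange(A, &rstart, &rend);CHKERRQ(ierr);
  if (row >= rstart && row < rend) {
    for (col = 0; col < ncols; col++) {
      ierr = MatSetValue(A, row, col, (PetscScalar)rank, INSERT_VALUES);CHKERRQ(ierr);
    }
  }
  /* Deliberately no MatAssemblyBegin/End here; see the discussion above
     about calling the collective assembly the same number of times. */
  return 0;
}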
>>>
>>> The problem is that the code reaches a deadlock at some point, usually at the last row owned by each process except the last one (the greatest rank). I use PETSc with the configure options "--with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-mpich --download-fblaslapack".
>>>
>>>
>>> I am a beginner with MPI, so I do not know what may be causing this. My apologies in advance if this is a very trivial problem.
>>>
>>>
>>> Best Regards to All,
>>>
>>>
>>> Ali Berk Kahraman
>>>
>>> M.Sc. Student, Mechanical Eng.
>>>
>>> Bogazici Uni., Istanbul, Turkey
>>>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: PetscRealProblem.c
Type: text/x-csrc
Size: 4580 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20180128/33b68a9d/attachment-0001.bin>