[petsc-users] Use block Jacobi preconditioner with SNES
Ali Reza Khaz'ali
arkhazali at cc.iut.ac.ir
Sat Sep 8 12:23:21 CDT 2018
I am using MatCreateAIJ. After applying your advice, my code is working
without problems. Thanks again!
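
In case it helps anyone reading the archive later, below is a minimal sketch
of the setup that works for me, assuming a PETSc version that provides the
vpbjacobi preconditioner and MatSetVariableBlockSizes(). The block sizes,
preallocation numbers, and variable names are only illustrative, not the ones
from my actual simulator: each rank passes its local row count, computed from
the block sizes it owns, so that no block is split between processes, and the
variable block layout is registered on the matrix.

#include <petscmat.h>

int main(int argc,char **argv)
{
  Mat            A;
  PetscErrorCode ierr;
  PetscInt       nblocks_local = 2;       /* blocks owned by this rank (illustrative) */
  PetscInt       bsizes[2]     = {3,5};   /* variable block sizes on this rank (illustrative) */
  PetscInt       mlocal        = 0,i;

  ierr = PetscInitialize(&argc,&argv,NULL,NULL);if (ierr) return ierr;

  /* local rows = sum of the block sizes owned by this rank */
  for (i=0; i<nblocks_local; i++) mlocal += bsizes[i];

  /* pass the LOCAL size and let PETSc determine the global size, so each
     block stays entirely on one process */
  ierr = MatCreateAIJ(PETSC_COMM_WORLD,mlocal,mlocal,PETSC_DETERMINE,PETSC_DETERMINE,
                      5,NULL,2,NULL,&A);CHKERRQ(ierr);

  /* record the variable block layout for the vpbjacobi preconditioner */
  ierr = MatSetVariableBlockSizes(A,nblocks_local,bsizes);CHKERRQ(ierr);

  /* ... MatSetValues()/MatAssemblyBegin()/End(), create the solution and
     residual Vecs with the same local size (mlocal), then the usual
     SNES/KSP setup ... */

  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

The preconditioner itself is then selected at run time with -pc_type vpbjacobi.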
Best Wishes,
Ali
On 9/8/2018 1:11 AM, Smith, Barry F. wrote:
>
>> On Sep 7, 2018, at 3:22 PM, Ali Reza Khaz'ali <arkhazali at cc.iut.ac.ir> wrote:
>>
>> Thanks a lot. It is working as expected. However, splitting a block matrix over MPI processes sometimes splits one block across two (or possibly more) processes, which causes the block Jacobi preconditioner (and the linear solver) to fail. Is there any way in PETSc to enforce a pre-specified array/matrix distribution over MPI processes? I want to distribute the array/matrix entries so that no block is divided between two MPI processes.
> How are you creating the matrix? If you use MatSetSizes() or MatCreateMPIAIJ(), just pass in the local size, not the global size; as long as you set an appropriate local size, you can ensure each block is associated with a single process. Of course, you need to use the same local size when creating the vectors.
>
>
> Barry
>
>>
>> Many Thanks,
>>
>> Ali
>>
>>
>> On 9/7/2018 3:06 AM, Smith, Barry F. wrote:
>>> It is ready in the branch barry/add-mpivpbjacobi; please let me know if you have any difficulties
>>>
>>> Barry
>>>
>>>
>>>> On Sep 6, 2018, at 4:48 PM, arkhazali at cc.iut.ac.ir wrote:
>>>>
>>>> Yes, I'd be grateful if MPI support could be added, please.
>>>>
>>>> Many thanks,
>>>> Ali
>>>>
>>>> ------ Original message------
>>>> From: Smith, Barry F.
>>>> Date: Fri, Sep 7, 2018 01:45
>>>> To: Ali Reza Khaz'ali;
>>>> Cc: PETSc;
>>>> Subject: Re: [petsc-users] Use block Jacobi preconditioner with SNES
>>>>
>>>> I have not yet added this support; do you need it now?
>>>>
>>>> Barry
>>>>
>>>>
>>>>> On Sep 6, 2018, at 4:12 PM, Ali Reza Khaz'ali wrote:
>>>>> I have to apologize for making this topic so lengthy. Is vpbjacobi supposed to work in parallel mode (with MPI)?
>>>>>
>>>>>
>>>>> On 8/30/2018 1:32 AM, Smith, Barry F. wrote:
>>>>>>> On Aug 29, 2018, at 3:48 PM, Ali Reza Khaz'ali wrote:
>>>>>>>> Can you confirm if your code ran successfully with vpbjacobi and if the convergence history was very similar to that achieved using
>>>>>>>> bjacobi?
>>>>>>> They have a very, very small difference (which is probably due to round-off errors); they generally behave the same. My code ran successfully with both of them, with the same convergence history. However, vpbjacobi seems a little faster.
>>>>>> Great, this is exactly what I would expect. I will put the branch into next for testing, and then it will be in the master branch and in the next release.
>>>>>>
>>>>>> Barry
>>>>>>
>>>>>>> Many thanks,
>>>>>>> Ali
>>>>> --
>>>>> Ali Reza Khaz’ali
>>>>> Assistant Professor of Petroleum Engineering,
>>>>> Department of Chemical Engineering
>>>>> Isfahan University of Technology
>>>>> Isfahan, Iran
>>>>>
>> --
>> Ali Reza Khaz’ali
>> Assistant Professor of Petroleum Engineering,
>> Department of Chemical Engineering
>> Isfahan University of Technology
>> Isfahan, Iran
>>
--
Ali Reza Khaz’ali
Assistant Professor of Petroleum Engineering,
Department of Chemical Engineering
Isfahan University of Technology
Isfahan, Iran