[petsc-users] Iterative solver behavior with increasing number of MPI processes
Matthew Knepley
knepley at gmail.com
Wed Apr 17 11:35:16 CDT 2019
On Wed, Apr 17, 2019 at 11:59 AM Marian Greg via petsc-users <petsc-users at mcs.anl.gov> wrote:
> Thanks Satish for the reply. However, I also observed the same behavior
> with the gamg and sor preconditioners, and with ksp_type bcgs as well as
> gmres. Could you tell me which solvers and preconditioners would behave
> the same regardless of the number of MPI processes I use?
1) SOR in parallel is actually Block Jacobi-SOR (SOR within each process's
block), so it is not invariant
2) Jacobi will be invariant
3) Chebyshev will be invariant
4) GAMG will be invariant if you have an elliptic equation. So for instance
you can use GAMG on SNES ex5 or ex12 and the iteration count will not
increase (see the sketch below).
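For example, a quick way to check this (just a rough sketch; it assumes you
have built SNES ex5 from the tutorials, and the grid sizes here are only
illustrative):

  # The Bratu problem (SNES ex5) is elliptic, so the GAMG iteration counts
  # for the linear solves should stay essentially flat as the rank count grows:
  mpiexec -n 1 ./ex5 -da_grid_x 257 -da_grid_y 257 \
      -ksp_type gmres -pc_type gamg -ksp_converged_reason
  mpiexec -n 8 ./ex5 -da_grid_x 257 -da_grid_y 257 \
      -ksp_type gmres -pc_type gamg -ksp_converged_reason

-ksp_converged_reason (or -ksp_monitor) prints the iteration count for each
linear solve, so you can compare the two runs directly.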
Thanks,
Matt
> Thanks, Mari
>
>
> On Wednesday, April 17, 2019, Balay, Satish <balay at mcs.anl.gov> wrote:
>
>> Yes - the default preconditioner is block Jacobi, with one block on
>> each processor.
>>
>> So when run on 1 proc vs 8 procs the preconditioner is different
>> (1 block for bjacobi vs 8 blocks for bjacobi), hence the difference in
>> convergence.
>>
>> Satish
>>
>> On Wed, 17 Apr 2019, Marian Greg via petsc-users wrote:
>>
>> > Hi All,
>> >
>> > I am seeing strange behavior of the KSP solvers with an increasing
>> > number of MPI processes. The solver takes more and more iterations as
>> > the number of processes grows. Is that a normal situation? I was
>> > expecting to get the same number of iterations with whatever number of
>> > processes I use.
>> >
>> > E.g.
>> > My matrix has about 2 million dofs.
>> > Solving with np 1 takes about 3500 iterations while solving with np 4
>> > takes 6500 iterations for the same convergence criteria.
>> >
>> > Thanks
>> > Mari
>> >
>>
>>
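As a side-by-side check of Satish's point above, the same runs with the
default parallel setup (block Jacobi with one ILU block per process) should
show the iteration count drifting with the process count (again just a
sketch; the option names are the standard PETSc ones):

  # Default preconditioner: one bjacobi block with ILU(0) per rank, so the
  # preconditioner itself changes when -n changes:
  mpiexec -n 1 ./ex5 -ksp_type gmres -pc_type bjacobi -sub_pc_type ilu \
      -ksp_converged_reason
  mpiexec -n 8 ./ex5 -ksp_type gmres -pc_type bjacobi -sub_pc_type ilu \
      -ksp_converged_reason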
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/