[petsc-users] Strange strong scaling result
Mark Adams
mfadams at lbl.gov
Tue Jul 12 09:11:22 CDT 2022
You may get more memory bandwidth with 32 processors vs 1, as Ce mentioned.
Depends on the architecture.
Do you get the whole memory bandwidth on one processor on this machine?
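(One quick way to check is a STREAM-style triad measurement. The sketch below is a rough single-process NumPy approximation, with an arbitrary array size; the canonical tool is the STREAM benchmark itself. Comparing the aggregate figure for 1 rank vs. 32 ranks shows whether a single process already saturates the memory bus.)

```python
import time
import numpy as np

# Rough single-process STREAM-triad sketch. NumPy's temporary array makes
# this an underestimate of true bandwidth; it is only meant to show the
# shape of the measurement, not replace the STREAM benchmark.
n = 5_000_000
a = np.zeros(n)
b = np.ones(n)
c = np.ones(n)

t0 = time.perf_counter()
a[:] = b + 2.0 * c          # triad: a = b + scalar*c
dt = time.perf_counter() - t0

gbytes = 3 * 8 * n / 1e9    # 3 arrays x 8 bytes/element, ignoring temporaries
print(f"approx. triad bandwidth: {gbytes / dt:.1f} GB/s")
```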
On Tue, Jul 12, 2022 at 8:53 AM Matthew Knepley <knepley at gmail.com> wrote:
> On Tue, Jul 12, 2022 at 7:32 AM Ce Qin <qince168 at gmail.com> wrote:
>
>>
>>
>>>>> The linear system is complex-valued. We rewrite it in its real form
>>>>> and solve it using FGMRES with an optimal block-diagonal
>>>>> preconditioner.
>>>>> We use CG with the AMS preconditioner implemented in HYPRE to solve
>>>>> the smaller real linear system arising from applying the block
>>>>> preconditioner.
>>>>> The iteration numbers of FGMRES and CG stay almost constant across
>>>>> all the runs.
>>>>>
>>>>
>>>> So those blocks decrease in size as you add more processes?
>>>>
>>>>
>>>
>> I am sorry for the unclear description of the block-diagonal
>> preconditioner.
>> Let K be the original complex system matrix, and let
>> A = [Kr, -Ki; -Ki, -Kr] be the equivalent real form of K. Let
>> P = [Kr+Ki, 0; 0, Kr+Ki]; it can be proved that P is an optimal
>> preconditioner for A. In our implementation, only Kr, Ki and Kr+Ki
>> are explicitly stored as MATMPIAIJ. We use MATSHELL to represent A and P.
>> We use FGMRES + P to solve Ax = b, and CG + AMS to
>> solve (Kr+Ki)y = c. So the block size never changes.
>>
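(For concreteness, the construction above can be sketched in a few lines of NumPy/SciPy. Dense random stand-ins for Kr and Ki, direct inner solves in place of CG + AMS, plain GMRES in place of FGMRES, and all sizes are invented for illustration; for K = Kr + i*Ki, the system K x = b maps to A [xr; xi] = [br; -bi].)

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

rng = np.random.default_rng(1)
n = 40  # illustrative size; the real problem stores large MATMPIAIJ matrices

# Dense random stand-ins for the real/imaginary parts of K = Kr + i*Ki:
# Kr symmetric positive definite, Ki symmetric positive semidefinite.
B = rng.standard_normal((n, n))
Kr = B @ B.T + n * np.eye(n)
Ki = np.diag(rng.random(n))

# Real form A = [Kr, -Ki; -Ki, -Kr]: K x = b becomes A [xr; xi] = [br; -bi].
A = np.block([[Kr, -Ki], [-Ki, -Kr]])

# Block-diagonal preconditioner P = [Kr+Ki, 0; 0, Kr+Ki]; the inner solves
# (CG + AMS on (Kr+Ki)y = c in the actual code) are direct solves here.
S = Kr + Ki
def apply_Pinv(v):
    return np.concatenate([np.linalg.solve(S, v[:n]),
                           np.linalg.solve(S, v[n:])])
Pinv = LinearOperator((2 * n, 2 * n), matvec=apply_Pinv)

# Reference: solve the original complex system directly.
b = rng.standard_normal(n) + 1j * rng.standard_normal(n)
x_ref = np.linalg.solve(Kr + 1j * Ki, b)

# Solve the real form with preconditioned GMRES.
z, info = gmres(A, np.concatenate([b.real, -b.imag]), M=Pinv, atol=1e-12)
x = z[:n] + 1j * z[n:]
rel_err = np.linalg.norm(x - x_ref) / np.linalg.norm(x_ref)
print(info, rel_err)
```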
>
> Then we have to break down the timings further. I suspect AMS is not
> taking as long, since
> all other operations scale like N.
>
> Thanks,
>
> Matt
>
>
>
>> Best,
>> Ce
>>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
>