[petsc-users] GAMG processor reduction

Dave May dave.mayhem23 at gmail.com
Thu Nov 21 16:13:16 CST 2013

I argue it does matter, as I've seen runs on 32k cores where a huge amount
of time is spent in those global reductions. I can provide an
implementation which uses a sub-communicator (PCSemiRedundant) if someone
thinks doing reductions on fewer cores is beneficial.

I'd like to hear from other users running multigrid-like algorithms at
large core counts who have run into this same issue.


On Thursday, 21 November 2013, Matthew Knepley wrote:

> On Thu, Nov 21, 2013 at 3:43 PM, Dave May <dave.mayhem23 at gmail.com> wrote:
>> Is using the "big communicator" really the right way to go? What happens
>> when I call VecNorm() when the local size on most ranks is 0? The global
>> reduction still has to be performed, and all ranks in the original
>> communicator associated with the fine grid participate.
>> I thought the primary advantage of using fewer ranks with small
>> distributed systems was to avoid exposing the network latency when there
>> is little computational work. I don't see how using the big communicator
>> avoids this.
> It's not just this. You do not want to get to the point where you have one
> point (or fewer) per process, so you rebalance to put a reasonable number
> of unknowns per process and leave the others empty. You could create a
> subcomm with only the nonzero procs to use in the solve. Not sure if this
> is worth it.
>    Matt
>> Am I missing something?
>> Cheers,
>>   Dave
>> On Thursday, 21 November 2013, Jed Brown wrote:
>>> John Mousel <john.mousel at gmail.com> writes:
>>> > Thanks Jed. How does this represent itself in the KSPView output?
>>> I'm afraid it's not there, though you can extract the ownership ranges
>>> from code.
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
