[mpich-discuss] MPI_Allgatherv complexity
Justin Luitjens
luitjens at cs.utah.edu
Mon Nov 2 18:11:33 CST 2009
Here is a graph of the time for an allgather as a function of message size.
We would like to scale up to 98K processors, but as you can see, the time
for the allgather increases dramatically at 49K processors. At 98K the time
increases by around a factor of 4 for the few data points I have tested
(they are not in the graph).
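
For reference, here is a minimal sketch of the kind of timing loop I'm
running (the per-rank count, datatype, and reporting are illustrative
assumptions, not the exact harness behind the graph; a real benchmark
would also average over many iterations):

/* allgatherv_time.c: time one MPI_Allgatherv, report the slowest rank */
#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank, nprocs;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

    int count = (argc > 1) ? atoi(argv[1]) : 1024; /* doubles per rank */

    double *sendbuf    = malloc(count * sizeof(double));
    double *recvbuf    = malloc((size_t)nprocs * count * sizeof(double));
    int    *recvcounts = malloc(nprocs * sizeof(int));
    int    *displs     = malloc(nprocs * sizeof(int));
    for (int i = 0; i < nprocs; i++) {
        recvcounts[i] = count;   /* equal contribution from every rank */
        displs[i] = i * count;
    }
    for (int i = 0; i < count; i++)
        sendbuf[i] = (double)rank;

    MPI_Barrier(MPI_COMM_WORLD);
    double t0 = MPI_Wtime();
    MPI_Allgatherv(sendbuf, count, MPI_DOUBLE,
                   recvbuf, recvcounts, displs, MPI_DOUBLE,
                   MPI_COMM_WORLD);
    double dt = MPI_Wtime() - t0;

    double tmax;   /* the collective is only as fast as the slowest rank */
    MPI_Reduce(&dt, &tmax, 1, MPI_DOUBLE, MPI_MAX, 0, MPI_COMM_WORLD);
    if (rank == 0)
        printf("P = %d, %d doubles/rank: %f s\n", nprocs, count, tmax);

    free(sendbuf); free(recvbuf); free(recvcounts); free(displs);
    MPI_Finalize();
    return 0;
}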
Thanks,
Justin
On Mon, Nov 2, 2009 at 5:08 PM, Rajeev Thakur <thakur at mcs.anl.gov> wrote:
> You can see the algorithms used for different message sizes in MPICH2
> here:
> https://svn.mcs.anl.gov/repos/mpi/mpich2/trunk/src/mpi/coll/allgatherv.c,
> but Cray could be using a modified version. What is the total message size
> being gathered and how many processes are you running on?
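>
> Roughly, the selection logic in that file looks like the sketch below
> (thresholds quoted from memory, so they may differ in your MPICH2
> version and in Cray's library; check the source above):
>
> /* sketch of the algorithm selection in MPIR_Allgatherv (not verbatim) */
> total_bytes = recvtype_size * total_count;  /* sum over all recvcounts */
> if (total_bytes < 524288 && is_power_of_two(comm_size)) {
>     /* recursive doubling: about log2(P) steps */
> } else if (total_bytes < 81920) {
>     /* Bruck's algorithm: short messages, non-power-of-two P */
> } else {
>     /* ring: P-1 steps, bandwidth-efficient for large messages */
> }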
>
> Rajeev
>
>
> ------------------------------
> From: mpich-discuss-bounces at mcs.anl.gov [mailto:mpich-discuss-bounces at mcs.anl.gov] On Behalf Of Justin Luitjens
> Sent: Monday, November 02, 2009 5:59 PM
> To: mpich-discuss at mcs.anl.gov
> Subject: [mpich-discuss] MPI_Allgatherv complexity
>
> Hi,
>
> I'm wondering what the parallel complexity of MPI_Allgatherv is. In the
> worst case I would expect something like N log P, but the timings I'm
> collecting don't seem to match that. I'm running tests on Kraken
> (http://www.nics.tennessee.edu/computing-resources/kraken) and it looks
> like the scaling might be closer to P log P.
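>
> For reference, the usual cost models for the common allgather algorithms
> (a sketch, with alpha = per-message latency, beta = per-byte transfer
> time, N = total bytes gathered, P = number of processes) are roughly:
>
>     recursive doubling:  T ~ alpha*log2(P) + beta*N*(P-1)/P
>     ring:                T ~ alpha*(P-1)   + beta*N*(P-1)/P
>
> so if the implementation switches to a ring at large sizes, the latency
> term grows linearly in P rather than logarithmically.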
>
> Thanks,
>
>
[Attachment: allgather.png, 18128 bytes -- graph of allgather time vs. message size:
<http://lists.mcs.anl.gov/pipermail/mpich-discuss/attachments/20091102/9bcf37d0/attachment.png>]