[mpich-discuss] MPI_REDUCE and complex numbers
Darius Buntinas
buntinas at mcs.anl.gov
Fri Aug 19 10:02:55 CDT 2011
There isn't a max buffer size (aside from running out of memory, but you'd get an error for that). Can you send us a short sample program that demonstrates this? How many processes are you using?
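Something along the lines of the following (untested) sketch would be ideal; the names and dimensions here are just placeholders for your actual case:

  program test_reduce_complex
  use mpi
  implicit none
  integer, parameter :: nrow = 400, ncol = 6000
  complex(kind(1.0d0)), allocatable :: psif(:,:), psiff(:,:)
  integer :: ierr, myid, nprocs, itemp

  call MPI_INIT(ierr)
  call MPI_COMM_RANK(MPI_COMM_WORLD, myid, ierr)
  call MPI_COMM_SIZE(MPI_COMM_WORLD, nprocs, ierr)

  allocate(psif(nrow,ncol), psiff(nrow,ncol))
  ! every rank contributes a known constant, so the reduced sum on
  ! the root is easy to check
  psif  = (1.0d0, 2.0d0)
  psiff = (0.0d0, 0.0d0)

  itemp = nrow*ncol
  call MPI_REDUCE(psif, psiff, itemp, MPI_DOUBLE_COMPLEX, MPI_SUM, &
                  0, MPI_COMM_WORLD, ierr)

  if (myid == 0) print *, 'expected', nprocs*(1.0d0,2.0d0), &
                          'got', psiff(nrow,ncol)

  deallocate(psif, psiff)
  call MPI_FINALIZE(ierr)
  end program test_reduce_complex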
Thanks,
-d
On Aug 18, 2011, at 11:04 PM, Brian Dushaw wrote:
> I recently ran across what may be a bug in MPI_REDUCE in the latest version, mpich2-1.4 (although I can't be sure...)
>
> I have Fortran 90 code that calculates a complex array of size
> about 400x6000. Each row of the array is calculated independently,
> so I apportion subsets of rows across a small cluster. On each node,
> the rows not calculated there are just filled with zeros.
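> (The apportionment itself is just a block split over rows, roughly:
>
>   istart = myid*nrow/nprocs + 1
>   iend   = (myid+1)*nrow/nprocs
>   psif = (0.0d0, 0.0d0)
>   do i = istart, iend
>      ! ... compute row psif(i,:) ...
>   end do
>
> with nprocs nodes and rank myid, so each row is computed on exactly
> one node.)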
>
> When all is done, I call:
>
> call MPI_REDUCE(psif,psiff,itemp,MPI_DOUBLE_COMPLEX,MPI_SUM,0,MPI_COMM_WORLD,ierr)
>
> which just adds up the arrays from all the nodes, giving me the
> final, fully filled complex array psiff on the master node (myid=0).
> itemp is the size nrow*ncol of psiff (about 400*6000).
>
> This worked fine for smaller arrays, but at the larger size of
> 400x6000 (which doesn't seem all that large to me), I would get
> almost all zeros in the psiff array.
>
> I wondered if the array size was so large as to overflow a maximum buffer
> size for MPI_REDUCE.
>
> In any case, I rewrote the above to send back real and imaginary parts
> separately, calling instead:
>
> call MPI_REDUCE(rpsif,rpsiff,itemp,MPI_DOUBLE_PRECISION,MPI_SUM,0,MPI_COMM_WORLD,ierr)
> call MPI_REDUCE(ipsif,ipsiff,itemp,MPI_DOUBLE_PRECISION,MPI_SUM,0,MPI_COMM_WORLD,ierr)
>
> And this seems to work fine.
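> For completeness, the split and recombine around those two calls is
> essentially this (a sketch; the work arrays rpsif, ipsif, rpsiff,
> ipsiff are double precision, same shape as psif):
>
>   rpsif = real(psif)
>   ipsif = aimag(psif)
>   ! ... the two MPI_REDUCE calls above ...
>   if (myid == 0) psiff = cmplx(rpsiff, ipsiff, kind=kind(1.0d0))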
>
> Does anybody have any comments or wise words to shed light on this
> situation? The code takes quite a while to run, so it is slow to
> debug. I suppose I could write a small test case to check more simply
> whether this is a bug or not. Does MPI_REDUCE have a maximum buffer
> size that one should worry about?
>
> Thanks -
>