[mpich-discuss] Invalid buffer pointer - Buffers must not be aliased error from MPI_Reduce

Dave Goodell goodell at mcs.anl.gov
Wed Oct 21 14:27:47 CDT 2009


This is fixed by r5518: https://trac.mcs.anl.gov/projects/mpich2/changeset/5518

Thanks again for reporting this.
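
For anyone hitting this before picking up the fix, here is a minimal sketch
of the standard-conforming in-place pattern (adapted from Ken's reproducer
below, not tested against r5518): the root passes MPI_IN_PLACE as the send
buffer, and since recvbuf is only significant at the root, the other ranks
can simply pass NULL instead of aliasing their send buffer.

    #include <stdio.h>
    #include <mpi.h>

    #define SIZE 10

    int main(int argc, char *argv[])
    {
        int i, rank, size, a[SIZE];

        MPI_Init(&argc, &argv);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        for (i = 0; i < SIZE; i++)
            a[i] = 1;

        if (rank == 0)
            /* root reduces in place: the result overwrites a */
            MPI_Reduce(MPI_IN_PLACE, a, SIZE, MPI_INT, MPI_SUM, 0,
                       MPI_COMM_WORLD);
        else
            /* recvbuf is ignored on non-root ranks, so NULL avoids aliasing */
            MPI_Reduce(a, NULL, SIZE, MPI_INT, MPI_SUM, 0, MPI_COMM_WORLD);

        if (rank == 0)
            for (i = 0; i < SIZE; i++)
                printf("%3d %6d (expected %d)\n", i, a[i], size);

        MPI_Finalize();
        return 0;
    }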

-Dave

On Oct 14, 2009, at 12:50 PM, Dave Goodell wrote:

> This looks like a bug to me.  Reading the current trunk source shows
> that the bug is almost certainly still present, although the code is
> slightly different from what was there in 1.0.8.
>
> I'll fix this up shortly.  Thanks for reporting it.
>
> -Dave
>
> On Oct 14, 2009, at 12:08 PM, Inghram, Kenneth wrote:
>
>> Running mpich2-1.0.8 on x86_64 on RHEL 5.2  
>> (2.6.18-124.el5.HOTFIX.IT239760)
>>
>> I'm receiving the following error when running the code below. The
>> aliased-pointer check should only occur on the receiving (root) rank,
>> should it not?
>>
>> I looked through the changes up to 1.2, but did not see anything
>> that addressed this issue.
>>
>> Is this a bug in 1.0.8?
>>
>> -ken
>>
>>> /opt/mpich2/bin/mpicc mpich2-bug.c -o bug-mpi
>>> export MPD_USE_ROOT_MPD=1
>>> /opt/mpich2/bin/mpiexec -n 2 ./bug-mpi
>>
>> Fatal error in MPI_Reduce: Invalid buffer pointer, error stack:
>> MPI_Reduce(850): MPI_Reduce(sbuf=0x7fffffff5c60, rbuf=0x7fffffff5c60,
>> count=10, MPI_INT, MPI_SUM, root=0, MPI_COMM_WORLD) failed
>> MPI_Reduce(808): Buffers must not be aliased[cli_1]: aborting job:
>> Fatal error in MPI_Reduce: Invalid buffer pointer, error stack:
>> MPI_Reduce(850): MPI_Reduce(sbuf=0x7fffffff5c60, rbuf=0x7fffffff5c60,
>> count=10, MPI_INT, MPI_SUM, root=0, MPI_COMM_WORLD) failed
>> MPI_Reduce(808): Buffers must not be aliased
>>
>> The simple workaround is to change the non-root MPI_Reduce line...
>>
>> from
>>
>> MPI_Reduce(a,a,SIZE,MPI_INT,MPI_SUM,0,MPI_COMM_WORLD);
>>
>> to
>>
>> MPI_Reduce(a,NULL,SIZE,MPI_INT,MPI_SUM,0,MPI_COMM_WORLD);
>>
>>
>> ------  mpich2-bug.c
>>
>> #include <stdio.h>
>> #include <mpi.h>
>>
>> #define SIZE 10
>>
>> int main( int argc, char *argv[])
>> {
>>
>> MPI_Init( &argc, &argv);
>>
>> int rank, size;
>>
>> MPI_Comm_size( MPI_COMM_WORLD, &size);
>> MPI_Comm_rank( MPI_COMM_WORLD, &rank);
>>
>> printf( "Hello from %4d of %4d\n", rank, size);
>>
>> int i, a[SIZE];
>>
>> for( i = 0; i < SIZE; i++)
>>   a[i] = 1;
>>
>> if( rank == 0)
>>   MPI_Reduce(MPI_IN_PLACE,a,SIZE,MPI_INT,MPI_SUM,0,MPI_COMM_WORLD);
>> else
>>   MPI_Reduce(a,a,SIZE,MPI_INT,MPI_SUM,0,MPI_COMM_WORLD);
>>
>> if ( rank == 0)
>>   for( i = 0; i < SIZE; i++)
>>     printf( "%3d %6d %6d\n", i, a[i], i * size);
>>
>> MPI_Finalize();
>>
>> return 0;
>> }
>> _______________________________________________
>> mpich-discuss mailing list
>> mpich-discuss at mcs.anl.gov
>> https://lists.mcs.anl.gov/mailman/listinfo/mpich-discuss
>


