[mpich-discuss] memcpy argument memory ranges overlap
Dave Goodell
goodell at mcs.anl.gov
Mon Mar 1 11:04:26 CST 2010
Do you have a small test program that we can use to reproduce this?
Or does the failure only occur when using CAM?
How many processes are in the MPI job? (It looks like at least 8,
since rank 7 shows up in the output.)
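In case it helps, below is a rough sketch of the kind of small,
standalone test I have in mind, modeled on the failing call in the
error stack (a broadcast of 256 character elements from root 0). The
buffer contents, the iteration count, and the use of C rather than
Fortran are my own guesses, and the real failure happens on a derived
communicator rather than MPI_COMM_WORLD, so this may or may not trip
the same assertion:

    #include <mpi.h>
    #include <stdio.h>
    #include <string.h>

    /* Repeatedly broadcast a 256-byte character buffer from rank 0,
     * mirroring the MPI_Bcast(count=256, MPI_CHARACTER, root=0) call
     * shown in the error stack below. */
    int main(int argc, char **argv)
    {
        char buf[256];
        int rank, i;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        for (i = 0; i < 1000; i++) {
            if (rank == 0)
                memset(buf, 'a', sizeof(buf));
            MPI_Bcast(buf, 256, MPI_CHAR, 0, MPI_COMM_WORLD);
        }

        if (rank == 0)
            printf("bcast test finished without error\n");
        MPI_Finalize();
        return 0;
    }

Ideally you would run it with the same number of processes as the
failing CAM job.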
If you just need to get going right now, you can re-configure MPICH2
with "CPPFLAGS=-DNDEBUG" to disable assertions. However it would be
really nice to be able to find and fix the underlying bug here.
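Roughly, the reconfigure and rebuild would look like this (the
install prefix is only a placeholder; keep whatever other configure
options you used for your existing build):

    ./configure CPPFLAGS=-DNDEBUG --prefix=/path/to/mpich2-install
    make
    make install

After reinstalling, relink the application so it picks up the rebuilt
library.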
-Dave
On Mar 1, 2010, at 9:25 AM, Shouben Zhou wrote:
> I am new to this list and need some help.
>
> I tried to file a bug report (I think this is bug-related), but
> failed to do so, so I am posting my question here in case somebody
> can help.
>
> I see that bug report #1006 reports the same error I am getting.
>
> One of my customers who is using CAM (the Community Atmosphere
> Model) is able to run it with MPICH1 and MPICH2 1.1, but it fails
> with MPICH2 1.2 and above.
>
> Here is a brief excerpt of the error output:
>
> 0: Assertion failed in file helper_fns.c at line 337: 0
> 0: memcpy argument memory ranges overlap, dst_=0xd350718
> src_=0xd350718 len_=4
> 0: 0: internal ABORT - process 0
> 6: Fatal error in PMPI_Bcast: Other MPI error, error stack:
> 6: PMPI_Bcast(1302)......................: MPI_Bcast(buf=0x969e4e0,
> count=256, MPI_CHARACTER, root=0, comm=0xc4000002) failed
> 6: MPIR_Bcast(998).......................: 5: Fatal error in
> PMPI_Bcast: Other MPI error, error stack:
> 6: MPIR_Bcast_scatter_ring_allgather(842): 5:
> PMPI_Bcast(1302)......................: MPI_Bcast(buf=0x969e4e0,
> count=256, MPI_CHARACTER, root=0, comm=0x84000006) failed
> 6: MPIR_Bcast_binomial(157)..............: 5:
> MPIR_Bcast(998).......................: 6:
> MPIC_Recv(83).........................: 5:
> MPIR_Bcast_scatter_ring_allgather(849): 6:
> MPIC_Wait(513)........................: 5:
> MPIR_Bcast_binomial(157)..............: 6:
> MPIDI_CH3I_Progress(150)..............: 5:
> MPIC_Recv(83).........................: 6:
> MPID_nem_mpich2_blocking_recv(917)....: 5:
> MPIC_Wait(513)........................: 7: Fatal error in
> PMPI_Bcast: Other MPI error, error stack:
> 6: MPID_nem_tcp_connpoll(1709)...........: Communication error
>
>
> --
> Shouben Zhou
> Science Systems and Applications Inc.(SSAI)
> 1 Enterprise Pkwy, Hampton, VA 23666
> Tel: (757)951-1905 Fax: (757)951-1900
> Email: Shouben.Zhou at nasa.gov
>
> _______________________________________________
> mpich-discuss mailing list
> mpich-discuss at mcs.anl.gov
> https://lists.mcs.anl.gov/mailman/listinfo/mpich-discuss