[mpich-discuss] (no subject)

Anthony Chan chan at mcs.anl.gov
Fri Dec 17 19:17:31 CST 2010


The send and receive buffers passed to this MPI_Gatherv call in your
CFD code are identical, which the MPI standard does not allow. On the
root, pass MPI_IN_PLACE as the send buffer instead. Recent MPICH2
releases explicitly check for aliased buffers (the "memcpy arguments
alias each other" line in your error stack), which is likely why the
same code ran without complaint on your older installations.

http://www.mcs.anl.gov/research/projects/mpi/mpi-standard/mpi-report-2.0/node145.htm#Node147
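
In case it is useful, here is a minimal, untested sketch of the
in-place pattern in Fortran 90 (the names vals, rcnts, displs, and
myid are placeholders, not FDS variables, and I am assuming one value
per rank as in your trace). On the root, MPI_IN_PLACE replaces the
send buffer and the send count/type arguments are ignored; the other
ranks call MPI_Gatherv as before:

  program gatherv_in_place
     use mpi
     implicit none
     integer :: ierr, myid, nprocs, root, i
     integer, allocatable :: rcnts(:), displs(:)
     double precision, allocatable :: vals(:)

     call MPI_INIT(ierr)
     call MPI_COMM_RANK(MPI_COMM_WORLD, myid, ierr)
     call MPI_COMM_SIZE(MPI_COMM_WORLD, nprocs, ierr)
     root = 0

     ! one double per rank, gathered into vals on the root
     allocate(vals(nprocs), rcnts(nprocs), displs(nprocs))
     rcnts  = 1
     displs = (/ (i - 1, i = 1, nprocs) /)
     vals(myid+1) = dble(myid)     ! each rank stores its value in place

     if (myid == root) then
        ! root: receive in place; send buffer/count/type are ignored
        call MPI_GATHERV(MPI_IN_PLACE, 0, MPI_DATATYPE_NULL,        &
                         vals, rcnts, displs, MPI_DOUBLE_PRECISION, &
                         root, MPI_COMM_WORLD, ierr)
     else
        ! non-root: send normally; receive arguments are ignored here
        call MPI_GATHERV(vals(myid+1), 1, MPI_DOUBLE_PRECISION,     &
                         vals, rcnts, displs, MPI_DOUBLE_PRECISION, &
                         root, MPI_COMM_WORLD, ierr)
     end if

     if (myid == root) print *, 'gathered: ', vals
     call MPI_FINALIZE(ierr)
  end program gatherv_in_place

Since each rank's contribution already sits at its own offset in vals,
the root never copies a buffer onto itself, so the aliasing check is
not triggered.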

A.Chan

----- Original Message -----
> Hi,
> I have used MPICH2 on several computers and clusters in the past to
> run the Fire Dynamics Simulator (FDS) CFD software, but I am having
> issues with a new computer. The following error message is generated:
> 
> "
> Fatal error in PMPI_Gatherv: Internal MPI error!, error stack:
> PMPI_Gatherv(376).....: MPI_Gatherv failed(sbuf=000000003C07F858,
> scount=1, MPI_
> DOUBLE_PRECISION, rbuf=000000003C07F858, rcnts=000000003C0EA8D8,
> displs=00000000
> 3C0EA998, MPI_DOUBLE_PRECISION, root=0, MPI_COMM_WORLD) failed
> MPIR_Gatherv_impl(189):
> MPIR_Gatherv(102).....:
> MPIR_Localcopy(349)...: memcpy arguments alias each other,
> dst=000000003C07F858
> src=000000003C07F858 len=8
> "
> 
> Can you tell me what this means, or what settings may need to be
> changed to allow MPICH2 to run? I am able to bypass the MPICH2 file
> path and run FDS on a single core, so I am fairly certain the issue
> is not with the FDS software itself.
> 
> Thanks,
> David
> 
> 

