[mpich-discuss] Problems with Pcontrol and MPE2 -- fixed, please accept this patch

Anthony Chan chan at mcs.anl.gov
Mon May 3 22:56:32 CDT 2010


Cool.  Let me know if you see the problem resurface again...

A.Chan

----- "Brian Wainscott" <brian at lstc.com> wrote:

> Hi Anthony,
> 
> Well, as far as I can see it looks like you got it that time -- my real
> application runs just fine with these changes.  Thank you!
> 
> Brian
> 
> 
> On 05/01/10 22:44, Anthony Chan wrote:
> > 
> > Hi Brian,
> > 
> > Can you try log_mpi_core.c again ?
> > 
> > https://svn.mcs.anl.gov/repos/mpi/mpich2/trunk/src/mpe2/src/wrappers/src/log_mpi_core.c
> > 
> > You may also need the update in slog2sdk, i.e. the jumpshot and clog2TOslog2 code.
> > 
> > https://svn.mcs.anl.gov/repos/mpi/mpich2/trunk/src/mpe2/src/slog2sdk/lib/jumpshot.jar
> > https://svn.mcs.anl.gov/repos/mpi/mpich2/trunk/src/mpe2/src/slog2sdk/lib/clog2TOslog2.jar
> > 
> > A.Chan
> > 
> > ----- "Anthony Chan" <chan at mcs.anl.gov> wrote:
> > 
> >> Great.  I will look into the bug with your test program.
> >>
> >> Thanks,
> >> A.Chan
> >>
> >> ----- "Brian Wainscott" <brian at lstc.com> wrote:
> >>
> >>> Hi Anthony,
> >>>
> >>> OK, I've created a short program that does this:
> >>>
> >>>  1 -- creates and dups some communicators
> >>>  2 -- frees some of them
> >>>  3 -- uses some of them, then frees them.
> >>>
> >>> It can be compiled without any calls to MPI_Pcontrol, in which case
> >>> it runs fine and produces a log file.  It can also be compiled with
> >>> calls to MPI_Pcontrol(0) just before (1) and a call to
> >>> MPI_Pcontrol(1) between (2) and (3) (see WITH_PCONTROL at the top
> >>> of the source file).
> >>>
> >>> If I build it with the MPI_Pcontrol calls in place, it segfaults
> >>> with your latest changes.  Hopefully this will help you figure out
> >>> what is going on.
> >>>
> >>> I'm using OpenMPI with MPE2, but I really don't think that matters.
> >>> I built it with this command:
> >>>
> >>> mpecc -mpilog -o dynasim dynasim.c
> >>>
> >>> and ran it on 4 processors.
> >>>
> >>> Good luck, and let me know what you find out.
> >>>
> >>> Brian
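
A minimal sketch of the kind of reproducer Brian describes above.  Only
the three numbered steps, the WITH_PCONTROL switch, and the placement of
the MPI_Pcontrol calls come from his description; the particular
communicator operations here are illustrative assumptions, not the
actual dynasim.c:

    /*
     * Sketch only: steps (1)-(3) and the MPI_Pcontrol placement follow
     * the description in this thread; the specific communicator
     * operations are assumed for illustration.
     */
    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_Comm dup1, dup2, split;
        int rank, token = 0;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    #ifdef WITH_PCONTROL
        MPI_Pcontrol(0);    /* turn logging off just before (1) */
    #endif

        /* 1 -- create and dup some communicators */
        MPI_Comm_dup(MPI_COMM_WORLD, &dup1);
        MPI_Comm_dup(dup1, &dup2);
        MPI_Comm_split(MPI_COMM_WORLD, rank % 2, rank, &split);

        /* 2 -- free some of them */
        MPI_Comm_free(&dup2);

    #ifdef WITH_PCONTROL
        MPI_Pcontrol(1);    /* turn logging back on between (2) and (3) */
    #endif

        /* 3 -- use some of them, then free them */
        MPI_Bcast(&token, 1, MPI_INT, 0, dup1);
        MPI_Bcast(&token, 1, MPI_INT, 0, split);
        MPI_Comm_free(&dup1);
        MPI_Comm_free(&split);

        MPI_Finalize();
        return 0;
    }

Under the same assumptions, it could be built and run along the lines of:

    mpecc -mpilog -DWITH_PCONTROL -o dynasim dynasim.c
    mpiexec -n 4 ./dynasim

where -DWITH_PCONTROL is a hypothetical compile-time switch matching the
description above, and the mpiexec line mirrors the 4-processor run.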

