[mpich-discuss] Using g95 to test mpich2-1.4: problems in the test directory
Anthony Chan
chan at mcs.anl.gov
Wed Aug 8 09:52:59 CDT 2012
Can you try the latest stable release, 1.4.1p1?
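If it helps, here is a minimal sketch of how you might build and install it under Cygwin with g95 (the --prefix path below is only an example; adjust it to your setup):

$ tar xzf mpich2-1.4.1p1.tar.gz
$ cd mpich2-1.4.1p1
$ ./configure --prefix=/usr/local/mpich2-1.4.1p1 F77=g95 FC=g95
$ make
$ make install

Make sure the new mpiexec is first in your PATH before rerunning the test suite.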
----- Original Message -----
> Hi,
> I'm sorry if I'm asking too many questions.
> My application needs MPI_Init, so I installed MPICH2-1.4.
> I'm using Windows XP, Cygwin, g95, and a single-node system.
> Thank you, Jayesh Krishna, for answering my questions and letting me
> know I should install the latest Cygwin version.
>
> So I installed the latest Cygwin version on another computer. The
> Cygwin version information is now:
> CYGWIN_NT-5.1 PC03 1.7.16(0.262/5/3) 2012-07-20 22:55 i686 Cygwin
> Now I can run cpi with mpiexec, even with -n <number> equal to or
> greater than 4.
>
> daizy@PC03 /cygdrive/c/cygwin/mpich2-1.4
> $ mpiexec -n 4 ./examples/cpi
> Process 0 of 4 is on PC03
> pi is approximately 3.1415926544231239, Error is 0.0000000008333307
> wall clock time = 0.000000
> Process 1 of 4 is on PC03
> Process 2 of 4 is on PC03
> Process 3 of 4 is on PC03
>
> daizy@PC03 /cygdrive/c/cygwin/mpich2-1.4
> $ mpiexec -n 5 ./examples/cpi
> Process 0 of 5 is on PC03
> pi is approximately 3.1415926544231230, Error is 0.0000000008333298
> wall clock time = 0.000000
> Process 1 of 5 is on PC03
> Process 2 of 5 is on PC03
> Process 3 of 5 is on PC03
> Process 4 of 5 is on PC03
>
> I would appreciate it if someone could help me with the following
> problem: when I run "make testing" in the mpich2-1.4/test directory,
> I still get error messages (the details are saved in testing.txt).
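> For reference, I ran roughly the following (assuming the same source
> tree as in the runs above):
>
> $ cd /cygdrive/c/cygwin/mpich2-1.4/test
> $ make testing
>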
> Processing directory coll
> Looking in ./coll/testlist
> Unexpected output in allred: Fatal error in MPI_Init_thread: Other MPI error, error stack:
> Unexpected output in allred: MPIR_Init_thread(388).................:
> Unexpected output in allred: MPID_Init(139)........................: channel initialization failed
> Unexpected output in allred: MPIDI_CH3_Init(38)....................:
> Unexpected output in allred: MPID_nem_init(196)....................:
> Unexpected output in allred: MPIDI_CH3I_Seg_commit(366)............:
> Unexpected output in allred: MPIU_SHMW_Hnd_deserialize(324)........:
> Unexpected output in allred: MPIU_SHMW_Seg_open(863)...............:
> Unexpected output in allred: MPIU_SHMW_Seg_create_attach_templ(637): open failed - Device or resource busy
> Program allred exited without No Errors
>
> Processing directory info
> Looking in ./info/testlist
> Processing directory init
> Looking in ./init/testlist
> Processing directory pt2pt
> Looking in ./pt2pt/testlist
> Unexpected output in sendflood: Fatal error in MPI_Init: Other MPI error, error stack:
> Unexpected output in sendflood: MPIR_Init_thread(388).................:
> Unexpected output in sendflood: MPID_Init(139)........................: channel initialization failed
> Unexpected output in sendflood: MPIDI_CH3_Init(38)....................:
> Unexpected output in sendflood: MPID_nem_init(196)....................:
> Unexpected output in sendflood: MPIDI_CH3I_Seg_commit(369)............:
> Unexpected output in sendflood: MPIU_SHMW_Seg_attach(925).............:
> Unexpected output in sendflood: MPIU_SHMW_Seg_create_attach_templ(637): open failed - Permission denied
> Unexpected output in sendflood: [proxy:0:0@PC03] HYDU_sock_read (./utils/sock/sock.c:272): read error (Software caused connection abort)
> Unexpected output in sendflood: [proxy:0:0@PC03] pmi_cb (./pm/pmiserv/pmip_cb.c:228): unable to read PMI command
> Unexpected output in sendflood: [proxy:0:0@PC03] HYDT_dmxu_poll_wait_for_event (./tools/demux/demux_poll.c:77): callback returned error status
> Unexpected output in sendflood: [proxy:0:0@PC03] main (./pm/pmiserv/pmip.c:226): demux engine error waiting for event
>
> Are these error messages related to running on a single-node system?
> Can you give me some advice on how to solve the problem?
>
> Best regards,
> Chi-Che Hsieh
>