[mpich-discuss] MPICH2 for ARM
Harald Schuster
schuster at technikum-wien.at
Mon Nov 14 07:35:59 CST 2011
OK, I am now using the mpich2-1.3.2 version with Hydra and start the
program with the parameter "-launcher fork", and it works. I no longer
get a segmentation fault, which is fine. But I have one question about
the fork launcher: is it possible to start a program on different nodes
with this launcher, or must I use the ssh launcher for that? With the
ssh launcher I get the following error, but I think it comes from the
ssh configuration on my node:
[mpiexec at da830] HYDU_create_process
(/mnt/ti_am1707_root/home/default/mpich2-1.3.2/src/pm/hydra/utils/launch/launch.c:69):
execvp error on file ssh (No such file or directory)
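
In case it helps, here is a minimal sketch of what I am trying to do
(the node names and the helloWorld binary are only placeholders, and I
assume the ssh launcher needs an ssh client in the PATH of the
launching node):

# what currently works for me: everything on the local node
mpiexec -launcher fork -np 4 ./helloWorld

# what I would like: processes spread across several nodes
mpiexec -launcher ssh -hosts node1,node2 -np 4 ./helloWorld

# checking whether an ssh client is visible at all
which ssh
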
Best regards
Harald
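
PS: On the question further down about tracing the execution, one way
I can think of is to run a single rank under gdb directly on the
target (assuming gdb is installed there and stdin is forwarded to
rank 0):

mpiexec -launcher fork -np 1 gdb ./cpi
(gdb) run
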
On 14.11.2011 12:59, Harald Schuster wrote:
> Hi
>
> When I use version mpich2-1.3.2 with the process manager Hydra and
> run the "mpirun /bin/hostname" command, I get the following error:
>
> [mpiexec at da830] HYDU_create_process
> (/mnt/ti_am1707_root/home/default/mpich2-1.3.2/src/pm/hydra/utils/launch/launch.c:69):
> execvp error on file ssh (No such file or directory)
>
> Do I need ssh when I use Hydra and ch3:sock?
>
> Harald
>
> On 11.11.2011 22:51, Pavan Balaji wrote:
>>
>> Does the following command work:
>>
>> % mpirun /bin/hostname
>>
>> If this gets a segmentation fault, the problem might be with mpirun.
>> In this case, you can try to debug it using:
>>
>> % gdb mpirun
>> (gdb) r /bin/hostname
>>
>> -- Pavan
>>
>> On 11/11/2011 03:31 AM, Harald Schuster wrote:
>>> Hi
>>>
>>> I now tried the cpi example from the mpich2 sources and I also got a
>>> segmentation fault. I used the following configuration:
>>>
>>> CFLAGS="-static" CC=arm-linux-gcc ./../configure
>>> --prefix=/mnt/ti_am1707_root/usr/mpich2-install --with-pm=hydra
>>> --disable-f77 --disable-fc --disable-cxx --enable-g=dbg,mem,log
>>> --with-device=ch3:sock --host=arm-linux
>>>
>>> How can I trace the execution of the program?
>>>
>>>
>>>
>>> On 10.11.2011 16:33, Harald Schuster wrote:
>>>> I attached the config.log to this mail. For testing I changed the pm
>>>> from hydra to gforker, and now the helloWorld example runs, but when I
>>>> try an example with a broadcast, the program stops without any error
>>>> and never finishes. Do gforker and hydra both use sockets?
>>>>
>>>> On 10.11.2011 16:12, Jeff Hammond wrote:
>>>>> I think it will help if you post config.log.
>>>>>
>>>>> Can you run the test that fails within GDB or Valgrind? I do not
>>>>> think that mpirun itself is segfaulting, but rather that it activates
>>>>> the code path that does. Maybe this much was clear to you already.
>>>>>
>>>>> Jeff
>>>>>
>>>>> On Thu, Nov 10, 2011 at 3:33 AM, Harald Schuster
>>>>> <schuster at technikum-wien.at> wrote:
>>>>>> Hi
>>>>>>
>>>>>> I use MPICH2 on my normal Linux PC and it works fine. My next goal
>>>>>> is to run MPICH2 on an arm926 (am1707), which is simulated in SimSoc
>>>>>> and runs a normal Linux OS. The Linux system is built with buildroot
>>>>>> 2010.05, and I use arm-linux-gcc 4.3.4. Building mpich2 with
>>>>>> arm-linux-gcc works. I used the following configuration:
>>>>>>
>>>>>> CFLAGS="-static" CC=arm-linux-gcc ./../configure
>>>>>> --prefix=/mnt/ti_am1707_root/usr/mpich2-install --with-pm=hydra
>>>>>> --disable-f77 --disable-fc --disable-cxx --disable-sharedlibs
>>>>>> --host=arm-linux --with-device=ch3:sock
>>>>>>
>>>>>> The make install step also works fine. Then I compiled a small
>>>>>> helloWorld example with the mpicc for ARM, and that works as well.
>>>>>> But when I run the helloWorld example on the simulated processor, I
>>>>>> always get a segmentation fault. I start the program with the
>>>>>> following command:
>>>>>>
>>>>>> mpirun -np 1 helloWorld
>>>>>>
>>>>>> When I start the helloWorld program without mpirun, I get the correct
>>>>>> output. So I think the segmentation fault is caused by mpirun, but I
>>>>>> don't know where it happens or what I can do to solve the problem.
>>>>>>
>>>>>> Best regards
>>>>>> Harald