[mpich-discuss] Problem with mpich2 installation

Gus Correa gus at ldeo.columbia.edu
Wed Apr 22 10:48:54 CDT 2009


Hi Ankush

You sent the same question to the OpenMPI list.
It "was working perfectly before" with OpenMPI, not with MPICH2.

Which MPI flavor do you want to use?
They have different installation and runtime requirements,
different libraries, different executables, etc.
What they share is the same basic API and
programming language bindings, which enable code portability.

Also, one may overwrite the other if you install both from RPMs or
with yum.  It is better to install from source, using --prefix to
choose a different installation directory for each one.
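For example, the two builds could be kept apart like this (the version
numbers and install paths below are just placeholders, not a recommendation):

```shell
# Sketch: give each MPI flavor its own --prefix so neither
# installation can overwrite the other (paths are assumptions).
#
#   tar xzf mpich2-1.0.8.tar.gz && cd mpich2-1.0.8
#   ./configure --prefix=/opt/mpich2 && make && make install
#
#   tar xjf openmpi-1.3.tar.bz2 && cd openmpi-1.3
#   ./configure --prefix=/opt/openmpi && make && make install
#
# Then pick one flavor per shell session by putting its bin
# directory first in PATH:
export PATH=/opt/mpich2/bin:$PATH
echo "$PATH" | cut -d: -f1    # first PATH entry is now /opt/mpich2/bin
```

Switching flavors is then just a matter of changing which bin directory
leads the PATH.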

I hope this helps.

Gus Correa
---------------------------------------------------------------------
Gustavo Correa
Lamont-Doherty Earth Observatory - Columbia University
Palisades, NY, 10964-8000 - USA
---------------------------------------------------------------------

Dave Goodell wrote:
> What do you mean by "was working perfectly before"?  Was it working with 
> a previous installation of MPICH2 or a different MPI implementation?  
> Was it working when you just used one host but not with multiple hosts?
> 
> This looks like you have not yet created root's mpd.conf file.  If you 
> insist on running mpd as root, you should follow the directions in 
> section 5.1.6 of the Installer's Guide that I pointed out before.  It 
> tells you that you need to create root's mpd.conf file in /etc/mpd.conf 
> instead of $HOME/.mpd.conf.
> 
> -Dave
> 
> On Apr 22, 2009, at 1:01 AM, Ankush Kaul wrote:
> 
>> We are facing another problem,
>> whenever we try to run mpirun command (which was working perfectly 
>> before) we get this error:
>>
>> /usr/local/bin/mpdroot: open failed for root's mpd conf file
>> mpdtrace (__init__ 1190): forked process failed; status=255
>> Does this have anything to do with MPICH2?  How can we undo it?
>>
>>
>>
>>
>> On Tue, Apr 21, 2009 at 9:12 PM, Dave Goodell <goodell at mcs.anl.gov> 
>> wrote:
>> There are two problems here.  The first is that you do not have an 
>> .mpd.conf file set up.  Doing what mpd tells you to do when you run it 
>> will fix this.
>>
>> The second is that you are trying to run mpd as root.  This is 
>> possible and supported, but not really recommended unless you know 
>> what you are doing.  See section 5.1.6 of the MPICH2 Installer's Guide 
>> [1] for help with running mpd as root.
>>
>> -Dave
>>
>> [1] 
>> http://www.mcs.anl.gov/research/projects/mpich2/documentation/files/mpich2-1.0.8-installguide.pdf 
>>
>>
>>
>> On Apr 21, 2009, at 10:18 AM, Ankush Kaul wrote:
>>
>> After installing mpich2 according to the steps given in the README 
>> file it asks to test using
>> mpd &
>> mpdtrace
>> mpdallexit
>>
>> mpd & gives output :
>> [2] 21367
>> [root@ccomp2 etc]# configuration file /etc/mpd.conf not found
>> A file named .mpd.conf file must be present in the user's home
>> directory (/etc/mpd.conf if root) with read and write access
>> only for the user, and must contain at least a line with:
>> MPD_SECRETWORD=<secretword>
>> One way to safely create this file is to do the following:
>>  cd $HOME
>>  touch .mpd.conf
>>  chmod 600 .mpd.conf
>> and then use an editor to insert a line like
>>  MPD_SECRETWORD=mr45-j9z
>> into the file.  (Of course use some other secret word than mr45-j9z.)
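The recipe quoted above can be collected into one small script (the secret
word here is only a placeholder; as root the file would be /etc/mpd.conf
instead of $HOME/.mpd.conf):

```shell
# Sketch of the .mpd.conf recipe from the mpd output above.
# Choose your own secret word in place of the placeholder.
conf="$HOME/.mpd.conf"          # root would use /etc/mpd.conf instead
touch "$conf"
chmod 600 "$conf"               # read/write access for the owner only
printf 'MPD_SECRETWORD=%s\n' 'change-this-secret' > "$conf"
ls -l "$conf"                   # should show -rw------- permissions
```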
>>
>> and mpdtrace gives error :
>> /usr/local/bin/mpdroot: open failed for root's mpd conf file
>> mpdtrace (__init__ 1190): forked process failed; status=255
>>
>> This is the first time I am doing a project in Linux clustering and I 
>> can't figure out what the problem is here.
>>
>>
>>


