[mpich-discuss] Questions about "Configure"

kaz_wada_JCOM kaz_wada at jcom.home.ne.jp
Sat Sep 10 20:30:14 CDT 2011


Dear Mr. Rajeev Thakur,

Thank you so much, Rajeev. I could complete the test trial.

Wada

> If you are running on a single machine, you don't need a
> machinefile. Just do mpiexec -n 3 hostname.
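
[Editor's note: for readers hitting the same "unable to open host file" error, a minimal sketch of both invocation styles; `node1` and `node2` below are placeholder hostnames, not names from this thread.]

```shell
# On a single machine no host file is needed; this alone suffices:
#   mpiexec -n 3 hostname
#
# To span several machines, list one host per line in a file and
# pass it to mpiexec with -f:
cat > machinefile <<'EOF'
node1
node2
EOF
#   mpiexec -f machinefile -n 3 hostname
```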
>
>On Sep 10, 2011, at 7:54 PM, kaz_wada_JCOM wrote:
>
> Hi, A. Chan,
>
> Thank you very much for your comments.
>
> Following your suggestions and using MPICH2-1.4.1p1, I ran the following
> shell commands:
>   cd /tmp/wadakazusige/mpich2-1.4.1p1
>  /home/wadakazusige/libraries/mpich2-1.4.1p1/configure 
> CC=gcc --disable-f77 --disable-fc --build=i686-pc-cygwin --prefix=/home/wadakazusige/mpich2-install 
> 2>&1 | tee c.txt
>
> and the configuration was completed.
>
> Then, I ran the build and install commands for MPICH2. After adding the
> install directory to PATH, I checked
>  which mpicc
>  which mpiexec
> They showed the appropriate locations of the two commands, so I believe
> the installation was complete.
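
[Editor's note: the build-and-install steps summarized above follow the usual MPICH2 Installer's Guide sequence; this is a sketch using the install prefix chosen in the configure line earlier in the thread.]

```shell
# Build and install after a successful configure (run inside the
# build directory, /tmp/wadakazusige/mpich2-1.4.1p1 in this thread):
#   make 2>&1 | tee m.txt
#   make install 2>&1 | tee mi.txt

# Then add the install bin directory to PATH so the wrappers resolve:
export PATH="/home/wadakazusige/mpich2-install/bin:$PATH"

# Verify; each should print a path under mpich2-install/bin:
#   which mpicc
#   which mpiexec
```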
>
> Next, to test the setup, I entered
>  mpiexec -f machinefile -n 3 hostname
>
> The response was as follows:
> ------------------------------------------------------------
> [mpiexec at Inspiron510m] HYDU_parse_hostfile 
> (/home/wadakazusige/libraries/mpich2-1.4.1p1/src/pm/hydra/utils/args/args.c:303): 
> unable to open host file: machinefile
> [mpiexec at Inspiron510m] mfile_fn 
> (/home/wadakazusige/libraries/mpich2-1.4.1p1/src/pm/hydra/ui/mpich/utils.c:223): 
> error parsing hostfile
> [mpiexec at Inspiron510m] match_arg 
> (/home/wadakazusige/libraries/mpich2-1.4.1p1/src/pm/hydra/utils/args/args.c:115): 
> match handler returned error
> [mpiexec at Inspiron510m] HYDU_parse_array 
> (/home/wadakazusige/libraries/mpich2-1.4.1p1/src/pm/hydra/utils/args/args.c:140): 
> argument matching returned error
> [mpiexec at Inspiron510m] parse_args 
> (/home/wadakazusige/libraries/mpich2-1.4.1p1/src/pm/hydra/ui/mpich/utils.c:1387): 
> error parsing input array
> [mpiexec at Inspiron510m] HYD_uii_mpx_get_parameters 
> (/home/wadakazusige/libraries/mpich2-1.4.1p1/src/pm/hydra/ui/mpich/utils.c:1438): 
> unable to parse user arguments
>
> Usage: ./mpiexec [global opts] [exec1 local opts] : [exec2 local opts] : 
> ...
>
> Global options (passed to all executables):
>
>   Global environment options:
>     -genv {name} {value}             environment variable name and value
>     -genvlist {env1,env2,...}        environment variable list to pass
>     -genvnone                        do not pass any environment variables
>     -genvall                         pass all environment variables not 
> managed
>                                           by the launcher (default)
>
>   Other global options:
>     -f {name}                        file containing the host names
>     -hosts {host list}               comma separated host list
>     -wdir {dirname}                  working directory to use
>     -configfile {name}               config file containing MPMD launch 
> options
>
>
> Local options (passed to individual executables):
>
>   Local environment options:
>     -env {name} {value}              environment variable name and value
>     -envlist {env1,env2,...}         environment variable list to pass
>     -envnone                         do not pass any environment variables
>     -envall                          pass all environment variables 
> (default)
>
>   Other local options:
>     -n/-np {value}                   number of processes
>     {exec_name} {args}               executable name and arguments
>
>
> Hydra specific options (treated as global):
>
>   Launch options:
>     -launcher                        launcher to use ( ssh rsh fork slurm 
> ll lsf sge manual persist)
>     -launcher-exec                   executable to use to launch processes
>     -enable-x/-disable-x             enable or disable X forwarding
>
>   Resource management kernel options:
>     -rmk                             resource management kernel to use ( 
> user slurm ll lsf sge pbs)
>
>   Hybrid programming options:
>     -ranks-per-proc                  assign so many ranks to each process
>
>   Processor topology options:
>     -binding                         process-to-core binding mode
>     -topolib                         processor topology library ()
>
>   Checkpoint/Restart options:
>     -ckpoint-interval                checkpoint interval
>     -ckpoint-prefix                  checkpoint file prefix
>     -ckpoint-num                     checkpoint number to restart
>     -ckpointlib                      checkpointing library (none)
>
>   Demux engine options:
>     -demux                           demux engine ( poll select)
>
>   Other Hydra options:
>     -verbose                         verbose mode
>     -info                            build information
>     -print-all-exitcodes             print exit codes of all processes
>     -iface                           network interface to use
>     -ppn                             processes per node
>     -profile                         turn on internal profiling
>     -prepend-rank                    prepend rank to output
>     -prepend-pattern                 prepend pattern to output
>     -outfile-pattern                 direct stdout to file
>     -errfile-pattern                 direct stderr to file
>     -nameserver                      name server information (host:port 
> format)
>     -disable-auto-cleanup            don't cleanup processes on error
>     -disable-hostname-propagation    let MPICH2 auto-detect the hostname
>     -order-nodes                     order nodes as ascending/descending 
> cores
>
> Please see the instructions provided at
> http://wiki.mcs.anl.gov/mpich2/index.php/Using_the_Hydra_Process_Manager
> for further details
> -------------------------------------------------------------
>
> Could you let me know the reason?
>
> Wada
> > I copied and pasted your command without looking carefully; you should
> > be using 1.4.1p1 instead of 1.4.1.
> >
> > A.Chan
>
> ----- Original Message -----
> > That is odd! The same test in top-level mpich2 worked but the one
> > in src/openpa failed! Both config.guess in src/openpa and the top-level
> > one contain an entry for cygwin (openpa's config.guess is newer). Can
> > you redo configure as follows:
> >
> > > cd /tmp/wadakazusige/mpich2-1.4.1
> > > /home/wadakazusige/libraries/mpich2-1.4.1/configure CC=gcc
> > > --prefix=/home/wadakazusige/mpich2-install 2>&1 | tee c.txt
> >
> > See if it works. If it fails similarly, add "--build=i686-pc-cygwin"
> > and see if things work.
> >
> > A.Chan
> >
> > ----- Original Message -----
> > > Dear Sirs and Madams:
> > >
> > > I am trying to install MPICH2-1.4.1p1 on my stand-alone PC running
> > > Windows XP.
> > > While configuring MPICH2, following your "MPICH2 Installer's Guide",
> > > I got the error message "OpenPA configure failed", as you can see in
> > > the attached c.txt.
> > >
> > > I would like to use MPICH2 under the Cygwin I have already installed.
> > > The commands used to configure MPICH2 are as follows:
> > > cd /tmp/wadakazusige/mpich2-1.4.1
> > > env CC="gcc"
> > > /home/wadakazusige/libraries/mpich2-1.4.1/configure
> > > -prefix=/home/wadakazusige/mpich2-install
> > > 2>&1 | tee c.txt
> > >
> > > Could you please teach me how to fix this problem?
> > >
> > > Thank you very much for your support in advance.
> > >
> > > Best regards,
> > >
> > > Kazushige Wada
> > >
> > > _______________________________________________
> > > mpich-discuss mailing list     mpich-discuss at mcs.anl.gov
> > > To manage subscription options or unsubscribe:
> > > https://lists.mcs.anl.gov/mailman/listinfo/mpich-discuss
> _______________________________________________
> mpich-discuss mailing list     mpich-discuss at mcs.anl.gov
> To manage subscription options or unsubscribe:
> https://lists.mcs.anl.gov/mailman/listinfo/mpich-discuss
