[petsc-dev] Compiling petsc with superlu_dist, hypre and mumps on hopper II

Barry Smith bsmith at mcs.anl.gov
Thu Mar 10 21:51:38 CST 2011


On Mar 10, 2011, at 8:54 PM, Satish Balay wrote:

> On Thu, 10 Mar 2011, Barry Smith wrote:
> 
>> 
>> On Mar 10, 2011, at 4:46 PM, fabien delalondre wrote:
>> 
>>> Hi,
>>> 
>>> Which arguments should I pass to configure to compile petsc on hopper II with superlu_dist, hypre and mumps (downloaded)? Without the external packages --with-mpi=0 works fine, but the external packages then complain about it (no mpi specified). I tried to use and also modify the arch-cray-xt5-opt.py configure file but did not get through it yet (Invalid mpiexec specified: /usr/common/acts/PETSc/3.0.0/bin/mpiexec.aprun). Rather than waste more time, I figured you guys could provide an answer pretty quickly.
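
   For reference, a configure script along these lines is the usual starting point for this kind of build; this is only a sketch - the Cray wrapper names, the /bin/false mpiexec workaround (discussed below), and the exact package list are assumptions to adapt, not a verified hopper recipe:

      #!/usr/bin/env python
      # Sketch in the style of config/examples/arch-cray-xt5-opt.py (flags illustrative)
      configure_options = [
        '--with-cc=cc', '--with-cxx=CC', '--with-fc=ftn',  # Cray compiler wrappers
        '--with-mpiexec=/bin/false',   # satisfy the mpiexec check; nothing is launched
        '--download-superlu_dist=1',
        '--download-hypre=1',
        '--download-mumps=1',
        '--download-scalapack=1',      # MUMPS requires ScaLAPACK and BLACS
        '--download-blacs=1',
      ]
      if __name__ == '__main__':
        import sys, os
        sys.path.insert(0, os.path.abspath('config'))  # run from PETSC_DIR
        import configure
        configure.petsc_configure(configure_options)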
>> 
>>   Satish,
>> 
>>    Why does --with-mpiexec have to be set on hopper? And why to this crazy location that may not always exist? Requiring the user to set this beast to some strange thing at configure time is not reasonable. We need a model that doesn't require strange knowledge like this.
>> 
> 
> The explanation for the strange thing in arch-cray-xt5-opt.py is: I
> started off with a reconfigure.py file from a prior install by NERSC
> folks, so some of this stuff came in from there. [mpiexec.aprun is a
> derivative of one of the scripts we have in PETSC_DIR/bin/mpiexec.*]
> 
> Looks like --with-mpiexec=/bin/false would also work on this machine.
> 
> Currently mpiexec is checked by configure primarily for the 'make
> test' stuff and the 'check-mpi-shared' test.
> 
> We now have --with-batch=1, which generally works for machines where
> things can't be run directly. Here - on the cray - configure seems to
> work directly, without batch and without mpiexec.
> 
> Perhaps we can have configure not abort if 'mpiexec' is not found?

  Sounds good. Have it not abort. Perhaps a warning message? Or perhaps not - no warning message needed.

  Barry

> 
> Satish
> 
> -----
> diff -r 88e278a08859 config/packages/MPI.py
> --- a/config/packages/MPI.py	Sat Mar 05 14:24:18 2011 -0600
> +++ b/config/packages/MPI.py	Thu Mar 10 20:30:24 2011 -0600
> @@ -155,7 +155,7 @@
>       self.mpiexec = 'Not_appropriate_for_batch_systems'
>       self.addMakeMacro('MPIEXEC',self.mpiexec)
>       return
> -    mpiexecs = ['mpiexec -n 1', 'mpirun -n 1', 'mprun -n 1', 'mpiexec', 'mpirun', 'mprun']
> +    mpiexecs = ['mpiexec -n 1', 'mpirun -n 1', 'mprun -n 1', 'mpiexec', 'mpirun', 'mprun', '/bin/false']
>     path    = []
>     if 'with-mpi-dir' in self.framework.argDB:
>       path.append(os.path.join(os.path.abspath(self.framework.argDB['with-mpi-dir']), 'bin'))
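
  To sketch what "not abort" could look like in configureMPIEXEC() - an illustration of the idea only, not the committed change; it assumes the usual BuildSystem helpers getExecutable(), logPrint() and addMakeMacro():

      def configureMPIEXEC(self):
        '''Locate an MPI launcher but do not abort when none is found (sketch)'''
        if self.framework.argDB['with-batch']:
          self.mpiexec = 'Not_appropriate_for_batch_systems'
          self.addMakeMacro('MPIEXEC', self.mpiexec)
          return
        # Try the usual launcher names; getExecutable() sets self.mpiexec on success
        for name in ['mpiexec', 'mpirun', 'mprun']:
          if self.getExecutable(name, resultName = 'mpiexec'):
            self.addMakeMacro('MPIEXEC', self.mpiexec)
            return
        # Nothing found: warn and install a harmless placeholder instead of raising
        self.logPrint('Warning: no mpiexec/mpirun found; "make test" will not run examples')
        self.mpiexec = '/bin/false'
        self.addMakeMacro('MPIEXEC', self.mpiexec)

  With something like that, /bin/false never has to be given by hand and batch systems keep the existing behavior.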



