[petsc-users] Configure takes very long

Hom Nath Gharti hng.email at gmail.com
Wed Apr 26 11:27:06 CDT 2017


Yes, I am compiling on a cluster. Thanks for the advice!

On Wed, Apr 26, 2017 at 12:22 PM, Satish Balay <balay at mcs.anl.gov> wrote:
> Great! Glad it works.
>
> With Intel MPI, I normally use mpiexec.hydra. However, you must be
> using a cluster, and presumably 'srun' is the way to schedule MPI
> jobs on it.
>
> Satish
>
> On Wed, 26 Apr 2017, Hom Nath Gharti wrote:
>
>> Yes indeed! Thanks a lot, Satish! I am using Intel MPI. I replaced
>> mpiexec with srun, and now it configures quickly.
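[A minimal sketch of the adjusted invocation, assuming a SLURM cluster
where srun is the appropriate launcher; all other options are taken from
the original configure command quoted further below:

    ./configure --with-blas-lapack-dir=/opt/intel/compilers_and_libraries_2017.2.174/linux/mkl/lib/intel64 \
        --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 \
        --with-mpiexec=srun --with-debugging=1 \
        --download-scalapack --download-mumps --download-pastix \
        --download-superlu --download-superlu_dist --download-metis \
        --download-parmetis --download-ptscotch --download-hypre

With Intel MPI, --with-mpiexec=mpiexec.hydra would be another
possibility, per Satish's note above.]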
>>
>> Hom Nath
>>
>> On Wed, Apr 26, 2017 at 11:38 AM, Satish Balay <balay at mcs.anl.gov> wrote:
>> > Perhaps mpiexec is hanging.
>> >
>> > What MPI are you using? Are you able to manually run jobs with
>> > mpiexec?
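[A quick manual check along these lines, assuming hostname is available
on the nodes, might be:

    mpiexec -n 2 hostname

If this also hangs when run interactively, for example on a login node
without a job allocation, configure's configureMPIEXEC test would
likely stall at the same point.]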
>> >
>> > Satish
>> >
>> > On Wed, 26 Apr 2017, Hom Nath Gharti wrote:
>> >
>> >> Dear all,
>> >>
>> >> With versions > 3.7.4, I notice that configure takes a very long
>> >> time: about 24 hours!
>> >>
>> >> The configure process hangs at this line:
>> >>
>> >> TESTING: configureMPIEXEC from
>> >> config.packages.MPI(config/BuildSystem/config/packages/MPI.py:143)
>> >>
>> >> Following is my configure command:
>> >>
>> >> ./configure -with-blas-lapack-dir=/opt/intel/compilers_and_libraries_2017.2.174/linux/mkl/lib/intel64
>> >> --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90
>> >> --with-mpiexec=mpiexec --with-debugging=1 --download-scalapack
>> >> --download-mumps --download-pastix --download-superlu
>> >> --download-superlu_dist --download-metis --download-parmetis
>> >> --download-ptscotch --download-hypre
>> >> ===============================================================================
>> >>              Configuring PETSc to compile on your system
>> >> ===============================================================================
>> >> TESTING: configureMPIEXEC from
>> >> config.packages.MPI(config/BuildSystem/config/packages/MPI.py:143)
>> >>
>> >> Am I doing something wrong?
>> >>
>> >> Thanks,
>> >> Hom Nath
>> >>
>> >
>>
>

