[petsc-users] petsc without MPI

Povolotskyi, Mykhailo mpovolot at purdue.edu
Tue Nov 19 14:09:26 CST 2019


Thank you. It is clear now.

On 11/19/2019 3:07 PM, Balay, Satish wrote:
> Not sure why you are looking at this flag and interpreting it; PETSc code uses the flag PETSC_HAVE_MPIUNI to check for a sequential build.
>
> [PETSC_HAVE_MPI only indicates that configure's MPI module is enabled, just like the flags for BLASLAPACK etc.]
>
> Satish
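
For reference, a minimal sketch in C of the check Satish describes (petscsys.h pulls in the generated petscconf.h):

#include <petscsys.h>

#if defined(PETSC_HAVE_MPIUNI)
/* sequential PETSc build: MPI calls are satisfied by the MPIUNI stubs */
#else
/* PETSc built against a real MPI implementation */
#endif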
>
> On Tue, 19 Nov 2019, Povolotskyi, Mykhailo via petsc-users wrote:
>
>> Let me explain the problem.
>>
>> This log file has
>>
>> #ifndef PETSC_HAVE_MPI
>> #define PETSC_HAVE_MPI 1
>> #endif
>>
>> while I need a PETSc build without MPI.
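
This turns out to be expected: as Satish explains above, PETSC_HAVE_MPI only records that configure's MPI module ran. In a sequential (--with-mpi=0) build, petscconf.h should contain both defines, roughly:

#ifndef PETSC_HAVE_MPI
#define PETSC_HAVE_MPI 1        /* MPI module enabled; calls go to the MPIUNI stubs */
#endif
#ifndef PETSC_HAVE_MPIUNI
#define PETSC_HAVE_MPIUNI 1     /* the flag that actually marks a sequential build */
#endif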
>>
>> On 11/19/2019 2:55 PM, Matthew Knepley wrote:
>> The log you sent shows configure completing successfully. Please retry and send the log from a failed run.
>>
>>    Thanks,
>>
>>       Matt
>>
>> On Tue, Nov 19, 2019 at 2:53 PM Povolotskyi, Mykhailo via petsc-users <petsc-users at mcs.anl.gov> wrote:
>> Why did it not work, then?
>>
>> On 11/19/2019 2:51 PM, Balay, Satish wrote:
>>> And I see from configure.log that you are using the correct option.
>>>
>>> Configure Options: --configModules=PETSc.Configure --optionsModule=config.compilerOptions --with-scalar-type=real --with-x=0 --with-hdf5=0 --with-single-library=1 --with-shared-libraries=0 --with-log=0 --with-mpi=0 --with-clanguage=C++ --with-cxx-dialect=C++11 --CXXFLAGS="-fopenmp -fPIC" --CFLAGS="-fopenmp -fPIC" --with-fortran=0 --FFLAGS="-fopenmp -fPIC" --with-64-bit-indices=0 --with-debugging=0 --with-cc=gcc --with-fc=gfortran --with-cxx=g++ COPTFLAGS= CXXOPTFLAGS= FOPTFLAGS= --download-metis=0 --download-superlu_dist=0 --download-parmetis=0 --with-valgrind-dir=/apps/brown/valgrind/3.13.0_gcc-4.8.5 --download-mumps=1 --with-mumps-serial=1 --with-fortran-kernels=0 --with-blaslapack-lib="-Wl,-rpath,/apps/cent7/intel/compilers_and_libraries_2017.1.132/linux/mkl/lib/intel64  -L/apps/cent7/intel/compilers_and_libraries_2017.1.132/linux/mkl/lib/intel64 -lmkl_intel_lp64 -lmkl_gnu_thread -lmkl_core " --with-blacs-lib=/apps/cent7/intel/compilers_and_libraries_2017.1.132/linux/mkl/lib/intel64/libmkl_blacs_intelmpi_lp64.so --with-blacs-include=/apps/cent7/intel/compilers_and_libraries_2017.1.132/linux/mkl/include --with-scalapack=0
>>>
>>> And configure completed successfully. What issue are you encountering? Why do you think it's activating MPI?
>>>
>>> Satish
>>>
>>>
>>> On Tue, 19 Nov 2019, Balay, Satish via petsc-users wrote:
>>>
>>>> On Tue, 19 Nov 2019, Povolotskyi, Mykhailo via petsc-users wrote:
>>>>
>>>>> Hello,
>>>>>
>>>>> I'm trying to build PETSc without MPI.
>>>>>
>>>>> Even if I specify --with_mpi=0, the configuration script still activates
>>>>> MPI.
>>>>>
>>>>> I attach the configure.log.
>>>>>
>>>>> What am I doing wrong?
>>>> The option is --with-mpi=0
>>>>
>>>> Satish
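
That is, hyphens rather than underscores. A corrected minimal invocation, sketched from the working log above (remaining options trimmed):

./configure --with-mpi=0 --with-mumps-serial=1 --download-mumps=1 ...

With the underscore, --with_mpi=0 would not match the option name, which would explain MPI still being enabled.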
>>>>
>>>>
>>>>> Thank you,
>>>>>
>>>>> Michael.
>>>>>
>>>>>
>>
>>
>> --
>> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/
>>


