<div dir="ltr">The log you sent shows configure completing successfully. Please retry and send the log for a failed run.<div><br></div><div>  Thanks,</div><div><br></div><div>     Matt</div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Tue, Nov 19, 2019 at 2:53 PM Povolotskyi, Mykhailo via petsc-users <<a href="mailto:petsc-users@mcs.anl.gov">petsc-users@mcs.anl.gov</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">Why did it not work, then?<br>
<br>
On 11/19/2019 2:51 PM, Balay, Satish wrote:<br>
> And I see from configure.log - you are using the correct option.<br>
><br>
> Configure Options: --configModules=PETSc.Configure --optionsModule=config.compilerOptions --with-scalar-type=real --with-x=0 --with-hdf5=0 --with-single-library=1 --with-shared-libraries=0 --with-log=0 --with-mpi=0 --with-clanguage=C++ --with-cxx-dialect=C++11 --CXXFLAGS="-fopenmp -fPIC" --CFLAGS="-fopenmp -fPIC" --with-fortran=0 --FFLAGS="-fopenmp -fPIC" --with-64-bit-indices=0 --with-debugging=0 --with-cc=gcc --with-fc=gfortran --with-cxx=g++ COPTFLAGS= CXXOPTFLAGS= FOPTFLAGS= --download-metis=0 --download-superlu_dist=0 --download-parmetis=0 --with-valgrind-dir=/apps/brown/valgrind/3.13.0_gcc-4.8.5 --download-mumps=1 --with-mumps-serial=1 --with-fortran-kernels=0 --with-blaslapack-lib="-Wl,-rpath,/apps/cent7/intel/compilers_and_libraries_2017.1.132/linux/mkl/lib/intel64  -L/apps/cent7/intel/compilers_and_libraries_2017.1.132/linux/mkl/lib/intel64 -lmkl_intel_lp64 -lmkl_gnu_thread -lmkl_core " --with-blacs-lib=/apps/cent7/intel/compilers_and_libraries_2017.1.132/linux/mkl/lib/intel64/libmkl_blacs_intelmpi_lp64.so --with-blacs-include=/apps/cent7/intel/compilers_and_libraries_2017.1.132/linux/mkl/include --with-scalapack=0<br>
> <<<<<<<<br>
><br>
> And configure completed successfully. What issue are you encountering? Why do you think it's activating MPI?<br>
><br>
> Satish<br>
><br>
><br>
> On Tue, 19 Nov 2019, Balay, Satish via petsc-users wrote:<br>
><br>
>> On Tue, 19 Nov 2019, Povolotskyi, Mykhailo via petsc-users wrote:<br>
>><br>
>>> Hello,<br>
>>><br>
>>> I'm trying to build PETSC without MPI.<br>
>>><br>
>>> Even if I specify --with_mpi=0, the configuration script still activates<br>
>>> MPI.<br>
>>><br>
>>> I attach the configure.log.<br>
>>><br>
>>> What am I doing wrong?<br>
>> The option is --with-mpi=0<br>
>><br>
>> Satish<br>
>><br>
>><br>
>>> Thank you,<br>
>>><br>
>>> Michael.<br>
>>><br>
>>><br>
<br>
</blockquote></div><br clear="all"><div><br></div>-- <br><div dir="ltr" class="gmail_signature"><div dir="ltr"><div><div dir="ltr"><div><div dir="ltr"><div>What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>-- Norbert Wiener</div><div><br></div><div><a href="http://www.cse.buffalo.edu/~knepley/" target="_blank">https://www.cse.buffalo.edu/~knepley/</a><br></div></div></div></div></div></div></div>