[petsc-dev] PETSc cannot be configured

Matthew Knepley knepley at gmail.com
Mon Jun 9 04:44:07 CDT 2025


Yes, send the configure log.

It seems that you did not build the C++ bindings for MPICH. You can shut off
C++ in PETSc using

  --with-cxx=0
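
For example, the posted configure line could be rerun with that option
appended (a sketch only: options and paths are taken verbatim from your
message, and whether every downloaded package still builds without a C++
compiler is a separate question):

```shell
# Sketch: the original configure invocation with C++ disabled via
# --with-cxx=0, which skips the failing C++ MPI check.
./configure --with-x=0 --with-pic --with-make-np=4 --with-mpi-compilers=1 \
  --with-mpi-dir=/Users/lawkawai/lib/mpich-4.2.3-opt/ \
  --with-scalar-type=real --with-precision=double \
  --with-mumps=1 --download-mumps \
  --with-scalapack=1 --download-scalapack \
  --with-blacs=1 --download-blacs \
  --download-fblaslapack --download-metis --download-hdf5 \
  --with-debugging=no --download-slepc \
  --with-cxx=0 \
  --prefix=/Users/lawkawai/lib/petsc-3.23.3-opt
```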

  Thanks,

     Matt
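
P.S. You can confirm the diagnosis outside of PETSc with a quick link test
(a sketch; the mpicxx path is assumed from your --with-mpi-dir, and this is
roughly what PETSc's CxxMPICheck does):

```shell
# Sketch: try to compile and link MPI_Finalize() with MPICH's C++ wrapper.
# If this fails to link, the MPICH C++ bindings were not built.
cat > conftest.cpp <<'EOF'
#include <mpi.h>
int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);
    MPI_Finalize();
    return 0;
}
EOF
/Users/lawkawai/lib/mpich-4.2.3-opt/bin/mpicxx conftest.cpp -o conftest
```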

On Mon, Jun 9, 2025 at 5:42 AM Jose E. Roman via petsc-dev <
petsc-dev at mcs.anl.gov> wrote:

> You should always attach the configure.log file.
>
> Thanks.
> Jose
>
>
> > El 9 jun 2025, a las 11:14, David Jiawei LUO LIANG <
> 12431140 at mail.sustech.edu.cn> escribió:
> >
> > ./configure --with-x=0 -with-pic --with-make-np=4 --with-mpi-compilers=1
> --with-mpi-dir=/Users/lawkawai/lib/mpich-4.2.3-opt/ --with-scalar-type=real
> --with-precision=double --with-mumps=1 --download-mumps --with-scalapack=1
> --download-scalapack --with-blacs=1 --download-blacs --download-fblaslapack
> --download-metis --download-hdf5 --with-debugging=no --download-slepc
> --prefix=/Users/lawkawai/lib/petsc-3.23.3-opt
> > the error:
> >
> =============================================================================================
> >                          Configuring PETSc to compile on your system
> >
> =============================================================================================
> >
> =============================================================================================
> >                                      ***** WARNING *****
> >   Found environment variable: FFLAGS=-w -fallow-argument-mismatch -O2.
> Ignoring it! Use
> >   "./configure FFLAGS=$FFLAGS" if you really want to use this value
> >
> =============================================================================================
> >
> =============================================================================================
> >                                      ***** WARNING *****
> >   Using default C optimization flags "-g -O3". You might consider
> manually setting optimal
> >   optimization flags for your system with COPTFLAGS="optimization flags"
> see
> >   config/examples/arch-*-opt.py for examples
> >
> =============================================================================================
> >
> =============================================================================================
> >                                      ***** WARNING *****
> >   Using default Cxx optimization flags "-g -O3". You might consider
> manually setting
> >   optimal optimization flags for your system with
> CXXOPTFLAGS="optimization flags" see
> >   config/examples/arch-*-opt.py for examples
> >
> =============================================================================================
> >
> =============================================================================================
> >                                      ***** WARNING *****
> >   Using default FC optimization flags "-g -O". You might consider
> manually setting optimal
> >   optimization flags for your system with FOPTFLAGS="optimization flags"
> see
> >   config/examples/arch-*-opt.py for examples
> >
> =============================================================================================
> >
> =============================================================================================
> >                                      ***** WARNING *****
> >   You have a version of GNU make older than 4.0. It will work, but may
> not support all the
> >   parallel testing options. You can install the latest GNU make with
> your package manager,
> >   such as Brew or MacPorts, or use the --download-make option to get the
> latest GNU make
> >
> =============================================================================================
> > TESTING: CxxMPICheck from
> config.packages.MPI(config/BuildSystem/config/packages/MPI.py:673)
> >
> *********************************************************************************************
> >            UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for
> details):
> >
> ---------------------------------------------------------------------------------------------
> >                        C++ error! MPI_Finalize() could not be located!
> >
> *********************************************************************************************
> >
> >
> > I am sure my MPICH is good; it has been tested. But the PETSc configure
> still fails.
> >
> > David Jiawei LUO LIANG, Southern University of Science and Technology,
> graduate student, 2024 / 1088 Xueyuan Avenue, Nanshan District, Shenzhen,
> Guangdong
> >
>
>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/