[petsc-users] Undefined reference in PETSc 3.13+ with old MPI version

Junchao Zhang junchao.zhang at gmail.com
Mon Apr 12 09:20:22 CDT 2021


On Mon, Apr 12, 2021 at 8:09 AM Satish Balay <balay at mcs.anl.gov> wrote:

> What's the oldest version of mpich or openmpi we should test with?

OpenMPI-1.6.5 or mpich2-1.5, which are the latest releases of each that only
support up to MPI-2.2.

> We can modify one of the tests to use that version of the tarball with
> --download-mpich=URL [or --download-openmpi=URL]
>
> Satish
>
> On Sun, 11 Apr 2021, Junchao Zhang wrote:
>
> > Danyang,
> >   I pushed another commit to the same branch jczhang/fix-mpi3-win to
> > guard uses of MPI_Iallreduce.
> >
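> > A minimal sketch of what such a configure-guarded call can look like is
> > below; it is illustrative only. The macro name PETSC_HAVE_MPI_IALLREDUCE and
> > the helper function are assumptions (a real build would define such a macro
> > from a configure-time link test), not the actual code in the branch.
> >
> > #include <mpi.h>
> >
> > /* Sum an integer across ranks: use the non-blocking MPI_Iallreduce when the
> >  * MPI library provides it (MPI-3.0+), otherwise fall back to the blocking
> >  * MPI_Allreduce so the code still links against e.g. OpenMPI 1.6.5. */
> > static int sum_over_ranks(int local, int *global, MPI_Comm comm)
> > {
> > #if defined(PETSC_HAVE_MPI_IALLREDUCE)
> >   MPI_Request req;
> >   int         err;
> >
> >   err = MPI_Iallreduce(&local, global, 1, MPI_INT, MPI_SUM, comm, &req);
> >   if (err) return err;
> >   /* other work could be overlapped with the reduction here */
> >   return MPI_Wait(&req, MPI_STATUS_IGNORE);
> > #else
> >   return MPI_Allreduce(&local, global, 1, MPI_INT, MPI_SUM, comm);
> > #endif
> > }
> >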
> >   Satish, it seems we need an MPI-2.2 CI to say petsc does not need
> > MPI-3.0?
> >
> > --Junchao Zhang
> >
> >
> > On Sun, Apr 11, 2021 at 1:45 PM Danyang Su <danyang.su at gmail.com> wrote:
> >
> > > Hi Junchao,
> > >
> > >
> > >
> > > I also ported the changes you have made to PETSc 3.13.6 and configured it
> > > with Intel 14.0 and OpenMPI 1.6.5, and it works too.
> > >
> > > There is a similar problem in the PETSc 3.14+ versions, as MPI_Iallreduce
> > > is only available in OpenMPI v1.7+. I would not say this is a bug; it just
> > > requires a newer MPI version.
> > >
> > >
> > >
> > >
> > > /home/danyangs/soft/petsc/petsc-3.14.6/intel-14.0.2-openmpi-1.6.5/lib/libpetsc.so:
> > > undefined reference to `MPI_Iallreduce'
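> > >
> > > To confirm up front which of these symbols a given MPI installation
> > > provides, a one-file link test along the following lines can be built with
> > > the cluster's MPI compiler wrapper (a sketch for illustration; with an
> > > MPI-2.2 library such as OpenMPI 1.6.5 it should fail with the same missing
> > > declarations or undefined references):
> > >
> > > #include <mpi.h>
> > >
> > > /* Reference a couple of the MPI-3.0 symbols that newer PETSc builds use;
> > >  * if this program compiles and links, the MPI library is new enough. */
> > > int main(int argc, char **argv)
> > > {
> > >   int         x = 1, y = 0;
> > >   MPI_Request req;
> > >   MPI_Win     win;
> > >
> > >   MPI_Init(&argc, &argv);
> > >   MPI_Iallreduce(&x, &y, 1, MPI_INT, MPI_SUM, MPI_COMM_WORLD, &req);
> > >   MPI_Wait(&req, MPI_STATUS_IGNORE);
> > >   MPI_Win_create_dynamic(MPI_INFO_NULL, MPI_COMM_WORLD, &win);
> > >   MPI_Win_free(&win);
> > >   MPI_Finalize();
> > >   return 0;
> > > }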
> > >
> > >
> > >
> > > Thanks again for all your help,
> > >
> > >
> > >
> > > Danyang
> > >
> > > *From: *Junchao Zhang <junchao.zhang at gmail.com>
> > > *Date: *Sunday, April 11, 2021 at 7:54 AM
> > > *To: *Danyang Su <danyang.su at gmail.com>
> > > *Cc: *Barry Smith <bsmith at petsc.dev>, "petsc-users at mcs.anl.gov" <
> > > petsc-users at mcs.anl.gov>
> > > *Subject: *Re: [petsc-users] Undefined reference in PETSc 3.13+ with
> old
> > > MPI version
> > >
> > >
> > >
> > > Thanks. Glad to know you have a workaround.
> > >
> > > --Junchao Zhang
> > >
> > >
> > >
> > >
> > >
> > > On Sat, Apr 10, 2021 at 10:06 PM Danyang Su <danyang.su at gmail.com>
> wrote:
> > >
> > > Hi Junchao,
> > >
> > >
> > >
> > > I cannot configure your branch with the same options due to an error in
> > > sowing. I had a similar error before on other clusters with very old
> > > openmpi versions. The problem was solved when openmpi was updated to a
> > > newer one.
> > >
> > >
> > >
> > > At this moment, I have configured a PETSc version with OpenMPI 2.1.6
> > > and it seems to be working properly.
> > >
> > >
> > >
> > > Thanks and have a good rest of the weekend,
> > >
> > >
> > >
> > > Danyang
> > >
> > >
> > >
> > > *From: *Danyang Su <danyang.su at gmail.com>
> > > *Date: *Saturday, April 10, 2021 at 4:08 PM
> > > *To: *Junchao Zhang <junchao.zhang at gmail.com>
> > > *Cc: *Barry Smith <bsmith at petsc.dev>, "petsc-users at mcs.anl.gov" <
> > > petsc-users at mcs.anl.gov>
> > > *Subject: *Re: [petsc-users] Undefined reference in PETSc 3.13+ with
> old
> > > MPI version
> > >
> > >
> > >
> > > Hi Junchao,
> > >
> > >
> > >
> > > The configuration is successful. The error comes from the last step
> > > when I run
> > >
> > >
> > >
> > > make PETSC_DIR=/home/danyangs/soft/petsc/petsc-3.13.6
> > > PETSC_ARCH=linux-intel-openmpi check
> > >
> > >
> > >
> > > ********************Error detected during compile or
> > > link!********************
> > >
> > > *See http://www.mcs.anl.gov/petsc/documentation/faq.html*
> > >
> > > */home/danyangs/soft/petsc/petsc-3.13.6/src/snes/tutorials ex5f*
> > >
> > > ***********************************************************
> > >
> > > mpif90 -fPIC -O3 -march=native -mtune=nativels
> > > -I/home/danyangs/soft/petsc/petsc-3.13.6/include
> > > -I/home/danyangs/soft/petsc/petsc-3.13.6/linux-intel-openmpi/include
> > > ex5f.F90
> > >
> -Wl,-rpath,/home/danyangs/soft/petsc/petsc-3.13.6/linux-intel-openmpi/lib
> > > -L/home/danyangs/soft/petsc/petsc-3.13.6/linux-intel-openmpi/lib
> > >
> -Wl,-rpath,/home/danyangs/soft/petsc/petsc-3.13.6/linux-intel-openmpi/lib
> > > -L/home/danyangs/soft/petsc/petsc-3.13.6/linux-intel-openmpi/lib
> > >
> -Wl,-rpath,/global/software/intel/composer_xe_2013_sp1.2.144/mkl/lib/intel64
> > > -L/global/software/intel/composer_xe_2013_sp1.2.144/mkl/lib/intel64
> > >
> -Wl,-rpath,/global/software/intel/composer_xe_2013_sp1.2.144/compiler/lib/intel64
> > >
> -L/global/software/intel/composer_xe_2013_sp1.2.144/compiler/lib/intel64
> > > -Wl,-rpath,/global/software/openmpi-1.6.5/intel/lib64
> > > -L/global/software/openmpi-1.6.5/intel/lib64
> > > -Wl,-rpath,/global/software/intel/composerxe/mkl/lib/intel64
> > > -L/global/software/intel/composerxe/mkl/lib/intel64
> > > -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/4.4.7
> > > -L/usr/lib/gcc/x86_64-redhat-linux/4.4.7
> > > -Wl,-rpath,/global/software/intel/composerxe/lib/intel64 -lpetsc
> -lHYPRE
> > > -lcmumps -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lscalapack
> > > -lsuperlu -lflapack -lfblas -lX11 -lhdf5hl_fortran -lhdf5_fortran
> -lhdf5_hl
> > > -lhdf5 -lparmetis -lmetis -lstdc++ -ldl -lmpi_f90 -lmpi_f77 -lmpi -lm
> > > -lnuma -lrt -lnsl -lutil -limf -lifport -lifcore -lsvml -lipgo -lintlc
> > > -lpthread -lgcc_s -lirc_s -lstdc++ -ldl -o ex5f
> > >
> > > ifort: command line warning #10159: invalid argument for option '-m'
> > >
> > >
> /home/danyangs/soft/petsc/petsc-3.13.6/linux-intel-openmpi/lib/libpetsc.so:
> > > undefined reference to `MPI_Win_allocate'
> > >
> > >
> /home/danyangs/soft/petsc/petsc-3.13.6/linux-intel-openmpi/lib/libpetsc.so:
> > > undefined reference to `MPI_Win_attach'
> > >
> > >
> /home/danyangs/soft/petsc/petsc-3.13.6/linux-intel-openmpi/lib/libpetsc.so:
> > > undefined reference to `MPI_Win_create_dynamic'
> > >
> > > gmake[4]: *** [ex5f] Error 1
> > >
> > >
> > >
> > > Thanks,
> > >
> > >
> > >
> > > Danyang
> > >
> > >
> > >
> > > *From: *Junchao Zhang <junchao.zhang at gmail.com>
> > > *Date: *Saturday, April 10, 2021 at 3:57 PM
> > > *To: *Danyang Su <danyang.su at gmail.com>
> > > *Cc: *Barry Smith <bsmith at petsc.dev>, "petsc-users at mcs.anl.gov" <
> > > petsc-users at mcs.anl.gov>
> > > *Subject: *Re: [petsc-users] Undefined reference in PETSc 3.13+ with
> old
> > > MPI version
> > >
> > >
> > >
> > > You sent the wrong one. This configure.log was from a successful
> > > configuration. Note that FOPTFLAGS="-O3 -march=native -mtune=nativels"
> > > looks suspicious.
> > >
> > >
> > >
> > > --Junchao Zhang
> > >
> > >
> > >
> > >
> > >
> > > On Sat, Apr 10, 2021 at 5:32 PM Danyang Su <danyang.su at gmail.com>
> wrote:
> > >
> > >
> > >
> > > Hi Junchao,
> > >
> > >
> > >
> > > Thanks for looking into this problem. The configuration log is
> > > attached.
> > >
> > >
> > >
> > > All the best,
> > >
> > >
> > >
> > > Danyang
> > >
> > > *From: *Junchao Zhang <junchao.zhang at gmail.com>
> > > *Date: *Saturday, April 10, 2021 at 2:36 PM
> > > *To: *Danyang Su <danyang.su at gmail.com>
> > > *Cc: *Barry Smith <bsmith at petsc.dev>, "petsc-users at mcs.anl.gov" <
> > > petsc-users at mcs.anl.gov>
> > > *Subject: *Re: [petsc-users] Undefined reference in PETSc 3.13+ with
> old
> > > MPI version
> > >
> > >
> > >
> > > Hi, Danyang,
> > >
> > >
> > >
> > > Send the configure.log. Also, PETSc does not need MPI_Win_allocate etc.
> > > to work. I will have a look.
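> > >
> > > For illustration, a minimal sketch of making the MPI-3.0 dynamic-window
> > > calls optional at compile time is below. The macro name
> > > PETSC_HAVE_MPI_WIN_CREATE_DYNAMIC, the helper function, and the MPI-2
> > > fixed-window fallback are assumptions for this sketch, not PETSc's actual
> > > code.
> > >
> > > #include <mpi.h>
> > >
> > > /* Expose a local buffer through an MPI window.  With an MPI-3.0 library a
> > >  * dynamic window is created and the buffer attached to it; with an MPI-2.2
> > >  * library such as OpenMPI 1.6.5 a fixed window over the buffer is created
> > >  * instead, so MPI_Win_create_dynamic and MPI_Win_attach are never
> > >  * referenced. */
> > > static int expose_buffer(void *buf, MPI_Aint nbytes, MPI_Comm comm,
> > >                          MPI_Win *win)
> > > {
> > > #if defined(PETSC_HAVE_MPI_WIN_CREATE_DYNAMIC)
> > >   int err = MPI_Win_create_dynamic(MPI_INFO_NULL, comm, win);
> > >   if (err) return err;
> > >   return MPI_Win_attach(*win, buf, nbytes);
> > > #else
> > >   return MPI_Win_create(buf, nbytes, 1, MPI_INFO_NULL, comm, win);
> > > #endif
> > > }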
> > >
> > >
> > > --Junchao Zhang
> > >
> > >
> > >
> > >
> > >
> > > On Sat, Apr 10, 2021 at 2:47 PM Danyang Su <danyang.su at gmail.com>
> wrote:
> > >
> > > Hi Barry,
> > >
> > >
> > >
> > > I tried this option before but got ‘Error running configure on OpenMPI’
> > >
> > >
> > >
> > >
> > >
> *******************************************************************************
> > >
> > >          UNABLE to CONFIGURE with GIVEN OPTIONS    (see configure.log
> for
> > > details):
> > >
> > >
> > >
> -------------------------------------------------------------------------------
> > >
> > > Error running configure on OPENMPI
> > >
> > >
> > >
> *******************************************************************************
> > >
> > >   File
> > > "/global/home/danyangs/soft/petsc/petsc-3.14.6/config/configure.py",
> line
> > > 456, in petsc_configure
> > >
> > >     framework.configure(out = sys.stdout)
> > >
> > >   File
> > >
> "/global/home/danyangs/soft/petsc/petsc-3.14.6/config/BuildSystem/config/framework.py",
> > > line 1253, in configure
> > >
> > >     self.processChildren()
> > >
> > >   File
> > >
> "/global/home/danyangs/soft/petsc/petsc-3.14.6/config/BuildSystem/config/framework.py",
> > > line 1242, in processChildren
> > >
> > >     self.serialEvaluation(self.childGraph)
> > >
> > >   File
> > >
> "/global/home/danyangs/soft/petsc/petsc-3.14.6/config/BuildSystem/config/framework.py",
> > > line 1217, in serialEvaluation
> > >
> > >     child.configure()
> > >
> > >   File
> > >
> "/global/home/danyangs/soft/petsc/petsc-3.14.6/config/BuildSystem/config/package.py",
> > > line 1144, in configure
> > >
> > >     self.executeTest(self.configureLibrary)
> > >
> > >   File
> > >
> "/global/home/danyangs/soft/petsc/petsc-3.14.6/config/BuildSystem/config/base.py",
> > > line 140, in executeTest
> > >
> > >     ret = test(*args,**kargs)
> > >
> > >   File
> > >
> "/global/home/danyangs/soft/petsc/petsc-3.14.6/config/BuildSystem/config/package.py",
> > > line 902, in configureLibrary
> > >
> > >     for location, directory, lib, incl in self.generateGuesses():
> > >
> > >   File
> > >
> "/global/home/danyangs/soft/petsc/petsc-3.14.6/config/BuildSystem/config/package.py",
> > > line 476, in generateGuesses
> > >
> > >     d = self.checkDownload()
> > >
> > >   File
> > >
> "/global/home/danyangs/soft/petsc/petsc-3.14.6/config/BuildSystem/config/packages/OpenMPI.py",
> > > line 56, in checkDownload
> > >
> > >     return self.getInstallDir()
> > >
> > >   File
> > >
> "/global/home/danyangs/soft/petsc/petsc-3.14.6/config/BuildSystem/config/package.py",
> > > line 365, in getInstallDir
> > >
> > >     installDir = self.Install()
> > >
> > >   File
> > >
> "/global/home/danyangs/soft/petsc/petsc-3.14.6/config/BuildSystem/config/packages/OpenMPI.py",
> > > line 63, in Install
> > >
> > >     installDir = config.package.GNUPackage.Install(self)
> > >
> > >   File
> > >
> "/global/home/danyangs/soft/petsc/petsc-3.14.6/config/BuildSystem/config/package.py",
> > > line 1667, in Install
> > >
> > >     raise RuntimeError('Error running configure on ' + self.PACKAGE)
> > >
> > >
> > >
> ================================================================================
> > >
> > > Finishing configure run at Sat, 10 Apr 2021 11:57:20 -0700
> > >
> > >
> > >
> ================================================================================
> > >
> > >
> > >
> > > Thanks,
> > >
> > >
> > >
> > > Danyang
> > >
> > >
> > >
> > > *From: *Barry Smith <bsmith at petsc.dev>
> > > *Date: *Saturday, April 10, 2021 at 10:31 AM
> > > *To: *Danyang Su <danyang.su at gmail.com>
> > > *Cc: *"petsc-users at mcs.anl.gov" <petsc-users at mcs.anl.gov>
> > > *Subject: *Re: [petsc-users] Undefined reference in PETSc 3.13+ with
> old
> > > MPI version
> > >
> > >
> > >
> > >
> > >
> > >   Depending on the network, you can remove the ./configure
> > > options --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 and instead use
> > > --with-cc=icc --with-cxx=icpc and --with-fc=ifort --download-openmpi
> > >
> > >
> > >
> > >   Barry
> > >
> > >
> > >
> > >
> > >
> > > On Apr 10, 2021, at 12:18 PM, Danyang Su <danyang.su at gmail.com> wrote:
> > >
> > >
> > >
> > > Dear PETSc developers and users,
> > >
> > >
> > >
> > > I am trying to install the latest PETSc version on an ancient cluster.
> > > The OpenMPI version is 1.6.5 and the compiler is Intel 14.0, which are the
> > > newest available on that cluster. I have no problem installing PETSc up to
> > > version 3.12.5. However, if I try to use PETSc 3.13+, there are three
> > > undefined reference errors for MPI_Win_allocate, MPI_Win_attach and
> > > MPI_Win_create_dynamic. I know these three functions are available from
> > > OpenMPI 2.0+. Because the cluster is not under technical support anymore,
> > > there is no way I can install a newer OpenMPI version or do any updates.
> > > Is it possible to disable these three functions in PETSc 3.13+?
> > >
> > >
> > >
> > > The errors occur in ‘make check’ step:
> > >
> > > /home/dsu/soft/petsc/petsc-3.13.0/linux-intel-openmpi/lib/libpetsc.so:
> > > undefined reference to `MPI_Win_allocate'
> > >
> > > /home/dsu/soft/petsc/petsc-3.13.0/linux-intel-openmpi/lib/libpetsc.so:
> > > undefined reference to `MPI_Win_attach'
> > >
> > > /home/dsu/soft/petsc/petsc-3.13.0/linux-intel-openmpi/lib/libpetsc.so:
> > > undefined reference to `MPI_Win_create_dynamic'
> > >
> > >
> > >
> > > The configuration used is shown below:
> > >
> > > ./configure --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90
> > > --download-mumps --download-scalapack --download-parmetis
> --download-metis
> > > --download-fblaslapack --download-hypre --download-superlu
> > > --download-hdf5=yes --with-debugging=0 COPTFLAGS="-O3 -march=native
> > > -mtune=native" CXXOPTFLAGS="-O3 -march=native -mtune=native"
> FOPTFLAGS="-O3
> > > -march=native -mtune=nativels"
> > >
> > >
> > >
> > > Thanks,
> > >
> > >
> > >
> > > Danyang
> > >
> > >
> > >
> > >
> >
>

