[petsc-users] [petsc-maint] Issues linking petsc header files and lib from FORTRAN codes

Jianbo Long longtuteng249 at gmail.com
Tue Nov 8 09:57:28 CST 2022


Here are the ldd outputs:
>> ldd petsc_3.18_gnu/arch-linux-c-debug/lib/libpetsc.so
linux-vdso.so.1 =>  (0x00007f23e5ff2000)
libflexiblas.so.3 => /cluster/software/FlexiBLAS/3.0.4-GCC-11.2.0/lib/libflexiblas.so.3 (0x00007f23e1b60000)
libpthread.so.0 => /usr/lib64/libpthread.so.0 (0x00007f23e1944000)
libm.so.6 => /usr/lib64/libm.so.6 (0x00007f23e1642000)
libdl.so.2 => /usr/lib64/libdl.so.2 (0x00007f23e143e000)
libmpi_usempif08.so.40 => /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libmpi_usempif08.so.40 (0x00007f23e5fb0000)
libmpi_usempi_ignore_tkr.so.40 => /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libmpi_usempi_ignore_tkr.so.40 (0x00007f23e5fa2000)
libmpi_mpifh.so.40 => /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libmpi_mpifh.so.40 (0x00007f23e5f2a000)
libmpi.so.40 => /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libmpi.so.40 (0x00007f23e5e18000)
libgfortran.so.5 => /cluster/software/GCCcore/11.2.0/lib64/libgfortran.so.5 (0x00007f23e1191000)
libgcc_s.so.1 => /cluster/software/GCCcore/11.2.0/lib64/libgcc_s.so.1 (0x00007f23e5dfe000)
libquadmath.so.0 => /cluster/software/GCCcore/11.2.0/lib64/libquadmath.so.0 (0x00007f23e1149000)
libc.so.6 => /usr/lib64/libc.so.6 (0x00007f23e0d7b000)
/lib64/ld-linux-x86-64.so.2 (0x00007f23e5dd3000)
libopen-rte.so.40 => /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libopen-rte.so.40 (0x00007f23e0cbf000)
libopen-orted-mpir.so => /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libopen-orted-mpir.so (0x00007f23e5df9000)
libopen-pal.so.40 => /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libopen-pal.so.40 (0x00007f23e0c0b000)
librt.so.1 => /lib64/librt.so.1 (0x00007f23e09f6000)
libutil.so.1 => /lib64/libutil.so.1 (0x00007f23e07f3000)
libhwloc.so.15 => /cluster/software/hwloc/2.5.0-GCCcore-11.2.0/lib/libhwloc.so.15 (0x00007f23e0796000)
libpciaccess.so.0 => /cluster/software/libpciaccess/0.16-GCCcore-11.2.0/lib/libpciaccess.so.0 (0x00007f23e078b000)
libxml2.so.2 => /cluster/software/libxml2/2.9.10-GCCcore-11.2.0/lib/libxml2.so.2 (0x00007f23e0617000)
libz.so.1 => /cluster/software/zlib/1.2.11-GCCcore-11.2.0/lib/libz.so.1 (0x00007f23e05fe000)
liblzma.so.5 => /cluster/software/XZ/5.2.5-GCCcore-11.2.0/lib/liblzma.so.5 (0x00007f23e05d6000)
libevent_core-2.0.so.5 => /lib64/libevent_core-2.0.so.5 (0x00007f23e03ab000)
libevent_pthreads-2.0.so.5 => /lib64/libevent_pthreads-2.0.so.5 (0x00007f23e01a8000)

And /cluster/software/GCCcore/11.2.0 is pretty recent (around 2020/2021).
As you can see, I am using OpenMPI. Now I am trying to compile PETSc
without MPI.
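
For reference, here is roughly what that check and the MPI-free configure look like - a sketch only, based on the suggestions in this thread (--with-mpi=0 and the library path are my assumptions, not commands taken verbatim from the logs above):

# confirm whether the built library pulls in libstdc++ at all
ldd petsc_3.18_gnu/arch-linux-c-debug/lib/libpetsc.so | grep -i 'stdc++'

# reconfigure without C++, MPI, or hwloc, letting PETSc download BLAS/LAPACK
./configure --with-cc=gcc --with-fc=gfortran --with-cxx=0 --with-mpi=0 \
            --with-hwloc=0 --download-fblaslapack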


On Tue, Nov 8, 2022 at 4:43 PM Satish Balay <balay at mcs.anl.gov> wrote:

> On Tue, 8 Nov 2022, Satish Balay via petsc-users wrote:
>
> > You don't see 'libstdc++' in the output from 'ldd libpetsc.so' below - so there is no reference to libstdc++ from petsc.
> >
> > Try a clean build of PETSc and see if you still have these issues.
> >
> > ./configure --with-cc=gcc --with-cxx=0 --with-fc=gfortran --download-fblaslapack --download-mpich
>
> Perhaps good to also add: --with-hwloc=0
>
> Satish
>
> >
> > Another way to avoid this issue is to use /usr/bin/gcc and gfortran - i.e. avoid using the tools from /cluster/software/GCCcore.
> > Are they super old versions - that are not suitable?
> >
> > Satish
> >
> >
> >
> > On Tue, 8 Nov 2022, Jianbo Long wrote:
> >
> > > I am suspecting something else as well ...
> > >
> > > Could you elaborate more on "mixing c++ codes compiled with /usr/bin/g++
> > > and compilers in /cluster/software/GCCcore/11.2.0"? My own Fortran code
> > > does not contain any C++ code, and yet for some reason the compiled petsc
> > > library depends on this libstdc++.so.6. I am sure about this because
> > > without linking petsc, I don't have this libstdc++ trouble.
> > >
> > > Thanks,
> > > Jianbo
> > >
> > > On Mon, Nov 7, 2022 at 7:10 PM Satish Balay <balay at mcs.anl.gov> wrote:
> > >
> > > > Likely due to mixing c++ codes compiled with /usr/bin/g++ and compilers in
> > > > /cluster/software/GCCcore/11.2.0
> > > >
> > > > If you still get this with --with-cxx=0 - then the issue is with some other
> > > > [non-petsc library]
> > > >
> > > > Satish
> > > >
> > > > On Mon, 7 Nov 2022, Jianbo Long wrote:
> > > >
> > > > > Hi Satish,
> > > > >
> > > > > I wonder if you know anything about another issue: after compiling petsc on
> > > > > a cluster, when I tried to link my Fortran code with the compiled libpetsc.so,
> > > > > the shared library, I got the following errors:
> > > > > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold: /lib64/libstdc++.so.6: version `CXXABI_1.3.9' not found (required by /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold)
> > > > > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold: /lib64/libstdc++.so.6: version `GLIBCXX_3.4.21' not found (required by /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold)
> > > > > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold: /lib64/libstdc++.so.6: version `GLIBCXX_3.4.29' not found (required by /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold)
> > > > > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold: /lib64/libstdc++.so.6: version `GLIBCXX_3.4.20' not found (required by /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold)
> > > > > /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold: /lib64/libstdc++.so.6: version `CXXABI_1.3.8' not found (required by /cluster/software/binutils/2.37-GCCcore-11.2.0/bin/ld.gold)
> > > > >
> > > > > Not sure if it is related to the discussion in this post
> > > > > (https://gitlab.com/petsc/petsc/-/issues/997), but after I tried the
> > > > > configure option --with-cxx=0, I still got the same errors.
> > > > > My make.log file for compiling petsc is attached here. Also, the
> > > > > dependencies of the compiled petsc are:
> > > > >
> > > > > >>: ldd arch-linux-c-debug/lib/libpetsc.so
> > > > > linux-vdso.so.1 =>  (0x00007ffd80348000)
> > > > > libflexiblas.so.3 => /cluster/software/FlexiBLAS/3.0.4-GCC-11.2.0/lib/libflexiblas.so.3 (0x00007f6e8b93f000)
> > > > > libpthread.so.0 => /usr/lib64/libpthread.so.0 (0x00007f6e8b723000)
> > > > > libm.so.6 => /usr/lib64/libm.so.6 (0x00007f6e8b421000)
> > > > > libdl.so.2 => /usr/lib64/libdl.so.2 (0x00007f6e8b21d000)
> > > > > libmpi_usempif08.so.40 => /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libmpi_usempif08.so.40 (0x00007f6e8fd92000)
> > > > > libmpi_usempi_ignore_tkr.so.40 => /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libmpi_usempi_ignore_tkr.so.40 (0x00007f6e8fd84000)
> > > > > libmpi_mpifh.so.40 => /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libmpi_mpifh.so.40 (0x00007f6e8fd0c000)
> > > > > libmpi.so.40 => /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libmpi.so.40 (0x00007f6e8fbfa000)
> > > > > libgfortran.so.5 => /cluster/software/GCCcore/11.2.0/lib64/libgfortran.so.5 (0x00007f6e8af70000)
> > > > > libgcc_s.so.1 => /cluster/software/GCCcore/11.2.0/lib64/libgcc_s.so.1 (0x00007f6e8fbe0000)
> > > > > libquadmath.so.0 => /cluster/software/GCCcore/11.2.0/lib64/libquadmath.so.0 (0x00007f6e8af28000)
> > > > > libc.so.6 => /usr/lib64/libc.so.6 (0x00007f6e8ab5a000)
> > > > > /lib64/ld-linux-x86-64.so.2 (0x00007f6e8fbb3000)
> > > > > libopen-rte.so.40 => /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libopen-rte.so.40 (0x00007f6e8aa9e000)
> > > > > libopen-orted-mpir.so => /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libopen-orted-mpir.so (0x00007f6e8fbdb000)
> > > > > libopen-pal.so.40 => /cluster/software/OpenMPI/4.1.1-GCC-11.2.0/lib/libopen-pal.so.40 (0x00007f6e8a9ea000)
> > > > > librt.so.1 => /lib64/librt.so.1 (0x00007f6e8a7d5000)
> > > > > libutil.so.1 => /lib64/libutil.so.1 (0x00007f6e8a5d2000)
> > > > > libhwloc.so.15 => /cluster/software/hwloc/2.5.0-GCCcore-11.2.0/lib/libhwloc.so.15 (0x00007f6e8a575000)
> > > > > libpciaccess.so.0 => /cluster/software/libpciaccess/0.16-GCCcore-11.2.0/lib/libpciaccess.so.0 (0x00007f6e8a56a000)
> > > > > libxml2.so.2 => /cluster/software/libxml2/2.9.10-GCCcore-11.2.0/lib/libxml2.so.2 (0x00007f6e8a3f6000)
> > > > > libz.so.1 => /cluster/software/zlib/1.2.11-GCCcore-11.2.0/lib/libz.so.1 (0x00007f6e8a3dd000)
> > > > > liblzma.so.5 => /cluster/software/XZ/5.2.5-GCCcore-11.2.0/lib/liblzma.so.5 (0x00007f6e8a3b5000)
> > > > > libevent_core-2.0.so.5 => /lib64/libevent_core-2.0.so.5 (0x00007f6e8a18a000)
> > > > > libevent_pthreads-2.0.so.5 => /lib64/libevent_pthreads-2.0.so.5 (0x00007f6e89f87000)
> > > > >
> > > > > Thanks very much,
> > > > > Jianbo
> > > > >
> > > > > On Mon, Nov 7, 2022 at 6:01 PM Satish Balay <balay at mcs.anl.gov>
> wrote:
> > > > >
> > > > > > Glad you have it working. Thanks for the update.
> > > > > >
> > > > > > Satish
> > > > > >
> > > > > > On Mon, 7 Nov 2022, Jianbo Long wrote:
> > > > > >
> > > > > > > Hi Satish and Barry,
> > > > > > >
> > > > > > > Thanks very much for the feedback !
> > > > > > >
> > > > > > > It looks like my include file path was not correct !
> > > > > > >
> > > > > > > Bests,
> > > > > > > Jianbo
> > > > > > >
> > > > > > >
> > > > > > > On Fri, Nov 4, 2022 at 6:08 AM Satish Balay <balay at mcs.anl.gov> wrote:
> > > > > > >
> > > > > > > > For ex83f.F90:
> > > > > > > >
> > > > > > > > >>>>>
> > > > > > > > balay at p1 /home/balay/test
> > > > > > > > $ ls
> > > > > > > > ex83f.F90
> > > > > > > > balay at p1 /home/balay/test
> > > > > > > > $ ls
> > > > > > > > ex83f.F90
> > > > > > > > balay at p1 /home/balay/test
> > > > > > > > $ export PETSC_DIR=$HOME/petsc
> > > > > > > > balay at p1 /home/balay/test
> > > > > > > > $ cp $PETSC_DIR/src/ksp/ksp/tests/makefile .
> > > > > > > > balay at p1 /home/balay/test
> > > > > > > > $ make ex83f
> > > > > > > > mpif90 -fPIC -Wall -ffree-line-length-none -ffree-line-length-0 -Wno-lto-type-mismatch -Wno-unused-dummy-argument -g -O0 -I/home/balay/petsc/include -I/home/balay/petsc/arch-linux-c-debug/include ex83f.F90 -Wl,-rpath,/home/balay/petsc/arch-linux-c-debug/lib -L/home/balay/petsc/arch-linux-c-debug/lib -Wl,-rpath,/home/balay/soft/mpich-4.0.1/lib -L/home/balay/soft/mpich-4.0.1/lib -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/12 -L/usr/lib/gcc/x86_64-redhat-linux/12 -lpetsc -llapack -lblas -lm -lX11 -lstdc++ -ldl -lmpifort -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s -lquadmath -lstdc++ -ldl -o ex83f
> > > > > > > > balay at p1 /home/balay/test
> > > > > > > > $
> > > > > > > > <<<<<<
> > > > > > > >
> > > > > > > > Also, when you are adding PETSc to your current project - are you using
> > > > > > > > source files with a .f or .f90 suffix? If so, rename them to a .F or .F90 suffix.
> > > > > > > >
> > > > > > > > If you still have issues, send more details - as Barry indicated - the
> > > > > > > > makefile [with the sources compiled by this makefile] - and the compile log
> > > > > > > > when you attempt to build these sources with this makefile.
> > > > > > > >
> > > > > > > > Satish
> > > > > > > >
> > > > > > > > On Thu, 3 Nov 2022, Barry Smith wrote:
> > > > > > > >
> > > > > > > > >
> > > > > > > > >  Please send your attempted makefile and we'll see if we can get it working.
> > > > > > > > >
> > > > > > > > >   I am not sure if we can organize the include files as Fortran compiler
> > > > > > > > > include files easily. We've always used the preprocessor approach. The
> > > > > > > > > Intel compiler docs indicate that the procedure for finding the Fortran compiler include files
> > > > > > > > > https://www.intel.com/content/www/us/en/develop/documentation/fortran-compiler-oneapi-dev-guide-and-reference/top/program-structure/use-include-files.html
> > > > > > > > > is the same as for the preprocessor include files, so I don't understand how
> > > > > > > > > using the Fortran compiler include file approach would make the
> > > > > > > > > makefiles any simpler for users?
> > > > > > > > >
> > > > > > > > >
> > > > > > > > >   Barry
> > > > > > > > >
> > > > > > > > >
> > > > > > > > > > On Nov 3, 2022, at 8:58 PM, Jianbo Long <longtuteng249 at gmail.com> wrote:
> > > > > > > > > >
> > > > > > > > > > Hello,
> > > > > > > > > >
> > > > > > > > > > I'm struggling to make my FORTRAN code work with petsc, as I cannot
> > > > > > > > > > link the required header files (e.g., petscksp.h) and compiled library
> > > > > > > > > > files to my FORTRAN code.
> > > > > > > > > >
> > > > > > > > > > Compiling petsc was not a problem. However, even with the fortran
> > > > > > > > > > examples (see those on https://petsc.org/main/docs/manual/fortran/) and
> > > > > > > > > > the guide on using petsc in c++ and fortran codes (see Section "Writing
> > > > > > > > > > C/C++ or Fortran Applications" at
> > > > > > > > > > https://petsc.org/main/docs/manual/getting_started/), I still cannot make
> > > > > > > > > > my FORTRAN code work.
> > > > > > > > > >
> > > > > > > > > > The Fortran test code is exactly the example code ex83f.F90 (see the
> > > > > > > > > > attached files). After following the 2nd method in the Guide (see the
> > > > > > > > > > picture below), I still get errors:
> > > > > > > > > >
> > > > > > > > > > petsc/finclude/petscksp.h: No such file or directory
> > > > > > > > > >
> > > > > > > > > > Even if I set up the path of the header file correctly in my own
> > > > > > > > > > makefile without using environment variables, I can still only find the
> > > > > > > > > > file "petscksp.h" for my code. Of course, the trouble is that all other
> > > > > > > > > > header files required by KSP are recursively included in this petscksp.h
> > > > > > > > > > file, and I have no way to link them together for my Fortran code.
> > > > > > > > > >
> > > > > > > > > > So, here are my questions:
> > > > > > > > > > 1) In the Guide, how exactly are we supposed to set up the environment
> > > > > > > > > > variables PETSC_DIR and PETSC_ARCH? More details and examples would be
> > > > > > > > > > extremely helpful!
> > > > > > > > > > 2) Is there a way to get rid of the preprocessor statement
> > > > > > > > > >  #include <petsc/finclude/petscvec.h>
> > > > > > > > > > when using c++/Fortran codes?
> > > > > > > > > >
> > > > > > > > > > For example, when using the MUMPS package in a Fortran code, we can simply
> > > > > > > > > > use the compiler's 'include', rather than a preprocessor, to pull in all the
> > > > > > > > > > variables required by the user's code:
> > > > > > > > > >   INCLUDE 'zmumps_struc.h'
> > > > > > > > > > where the header file zmumps_struc.h is already provided in the
> > > > > > > > > > package. Similarly, I think using petsc in other codes would be much more
> > > > > > > > > > portable and easier if it could work the same way.
> > > > > > > > > >
> > > > > > > > > > (Note: similar issues were discussed before, see
> > > > > > > > > > https://lists.mcs.anl.gov/mailman/htdig/petsc-users/2019-January/037499.html.
> > > > > > > > > > Unfortunately, I have no clue about the solution archived there ...)
> > > > > > > > > >
> > > > > > > > > > Any thoughts and solutions would be much appreciated !
> > > > > > > > > >
> > > > > > > > > > Thanks,
> > > > > > > > > > Jianbo Long
> > > > > > > > > >
> > > > > > > > > > <image.png>
> > > > > > > > > > <ex83f.F90>
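
For question 1 in the quoted message above, a minimal sketch of how PETSC_DIR and PETSC_ARCH are typically set before building - the paths here are placeholders following Satish's example, not the actual cluster paths:

export PETSC_DIR=$HOME/petsc            # top of the PETSc tree used for the build
export PETSC_ARCH=arch-linux-c-debug    # name of the arch directory created by ./configure
make ex83f                              # with the makefile copied from $PETSC_DIR/src/ksp/ksp/tests/
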
> > > > > > > > >
> > > > > > > > >
> > > > > > > >
> > > > > > > >
> > > > > > >
> > > > > >
> > > > > >
> > > > >
> > > >
> > > >
> > >
> >
>
>