[petsc-users] PETSc with Julia Binary Builder

Satish Balay balay at mcs.anl.gov
Fri Jul 2 10:35:38 CDT 2021


On Fri, 2 Jul 2021, Matthew Knepley wrote:

> On Fri, Jul 2, 2021 at 2:05 AM Patrick Sanan <patrick.sanan at gmail.com>
> wrote:
> 
> > As you mention in [4], the proximate cause of the configure failure is
> > this link error [8]:
> >
> 
> That missing function was introduced in GCC 7.0, and is there only for
> i686, not x86_64. This looks like a bad GCC install to me.

>>>>>>>
Checking for program /opt/bin/i686-linux-gnu-libgfortran3-cxx11/cc...found

Executing: cc  -o /tmp/petsc-wfp3a1w4/config.setCompilers/conftest    /tmp/petsc-wfp3a1w4/config.setCompilers/conftest.o  -lpetsc-ufod4vtr9mqHvKIQiVAm
Possible ERROR while running linker: exit code 1
stderr:
/opt/i686-linux-gnu/bin/../lib/gcc/i686-linux-gnu/6.1.0/../../../../i686-linux-gnu/bin/ld: cannot find -lpetsc-ufod4vtr9mqHvKIQiVAm
collect2: error: ld returned 1 exit status
Running Executable WITHOUT threads to time it out
Executing: cc --version
stdout:
i686-linux-gnu-gcc (GCC) 6.1.0

Checking for program /opt/bin/i686-linux-gnu-libgfortran3-cxx11/c++...found
/workspace/destdir/lib/libstdc++.so: undefined reference to `__divmoddi4 at GCC_7.0.0'
<<<<<<

Yeah - it's strange that there is a reference to a @GCC_7.0.0 symbol from /workspace/destdir/lib/libstdc++.so - which appears to be a gcc-6.1.0 install.

And I'm confused by the multiple paths - so it's not clear if they all belong to the same compiler install.

/opt/bin/i686-linux-gnu-libgfortran3-cxx11
/opt/i686-linux-gnu/bin/../lib/gcc/i686-linux-gnu/6.1.0/../../../../i686-linux-gnu/bin/ -> /opt/i686-linux-gnu/i686-linux-gnu/bin/
/workspace/destdir/lib/

So yeah - the compiler install is likely broken. Something to try is --with-cxx=0.
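For concreteness, a hedged sketch of what the configure call from the build_tarballs.jl excerpt might look like with that workaround, combined with the --with-mpi-dir suggestion from earlier in the thread (${prefix} is the BinaryBuilder install prefix; other options from the original script are elided):

```shell
# Illustrative only: disable the (apparently broken) C++ compiler and point
# configure at the MPI installation root instead of separate include/lib flags.
./configure --prefix=${prefix} \
        --with-cxx=0 \
        --with-mpi-dir=${prefix}
```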

Satish

> 
>    Matt
> 
> 
> > Naively, that looks like a problem to be resolved at the level of the C++
> > compiler and MPI.
> >
> > Unless there are wrinkles of this build process that I don't understand
> > (likely), this [6] looks non-standard to me:
> >
> >         includedir="${prefix}/include"
> >         ...
> >         ./configure --prefix=${prefix} \
> >                 ...
> >                 -with-mpi-include="${includedir}" \
> >                 ...
> >
> >
> > Is it possible to configure using --with-mpi-dir, instead of the separate
> > --with-mpi-include and --with-mpi-lib options?
> >
> >
> > As an aside, maybe Satish can say more, but I'm not sure if it's advisable
> > to override variables in the make command [7].
> >
> > [8]
> > https://gist.github.com/jkozdon/c161fb15f2df23c3fbc0a5a095887ef8#file-configure-log-L7795
> > [6]
> > https://gist.github.com/jkozdon/c161fb15f2df23c3fbc0a5a095887ef8#file-build_tarballs-jl-L45
> > [7]
> > https://gist.github.com/jkozdon/c161fb15f2df23c3fbc0a5a095887ef8#file-build_tarballs-jl-L55
> >
> >
> > > Am 02.07.2021 um 06:25 schrieb Kozdon, Jeremy (CIV) <jekozdon at nps.edu>:
> > >
> > > I have been talking with Boris Kaus and Patrick Sanan about trying to
> > revive the Julia PETSc interface wrappers. One of the first things to get
> > going is to use Julia's binary builder [1] to wrap more scalar, real, and
> > int type builds of the PETSc library; the current distribution is just
> > Real, double, Int32. I've been working on a PR for this [2] but have been
> > running into some build issues on some architectures [3].
> > >
> > > I doubt that anyone here is an expert with Julia's binary builder
> > system, but I was wondering if anyone who is better with the PETSc build
> > system can see anything obvious from the configure.log [4] that might help
> > me sort out what's going on.
> > >
> > > This exact script worked on 2020-08-20 [5] to build the libraries, so
> > something has obviously changed with either the Julia build system and/or
> > one (or more!) of the dependency binaries.
> > >
> > > For those that don't know, Julia's binary builder system essentially
> > allows users to download binaries directly from the web for any system that
> > the Julia Programming language distributes binaries for. So a (desktop) user
> > can get MPI, PETSc, etc. without the headache of having to build anything
> > from scratch; obviously on clusters you would still want to use system MPIs
> > and what not.
> > >
> > > ----
> > >
> > > [1] https://github.com/JuliaPackaging/BinaryBuilder.jl
> > > [2] https://github.com/JuliaPackaging/Yggdrasil/pull/3249
> > > [3]
> > https://github.com/JuliaPackaging/Yggdrasil/pull/3249#issuecomment-872698681
> > > [4]
> > https://gist.github.com/jkozdon/c161fb15f2df23c3fbc0a5a095887ef8#file-configure-log
> > > [5]
> > https://github.com/JuliaBinaryWrappers/PETSc_jll.jl/releases/tag/PETSc-v3.13.4%2B0
> >
> >
> 
> 


