[petsc-dev] Fwd: [petsc-users] configure error on Titan with Intel

Mark Adams mfadams at lbl.gov
Thu May 14 17:15:04 CDT 2015


I've commented the autodetect lines out and am waiting for a compute node.

Here is what you wanted.

Thanks,

On Thu, May 14, 2015 at 11:27 AM, Satish Balay <balay at mcs.anl.gov> wrote:

> You mention multiple failures here - but include the log for only one case.
>
> Ideally the language compilers (C, C++, Fortran) should interoperate with
> each other - but when they don't, one has to know the 'compatibility
> libraries' to use.
>
> -lstdc++ is a GNU C++ library. For PGI - the library names could be
>  different.
>
> --with-cxxlib-autodetect=1 - which is the configure default - tries to
> determine this list - but with cray modules this causes every library
> set up via modules to be listed multiple times - sometimes causing
> grief - hence you are using --with-cxxlib-autodetect=0 on cray
> builds.
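A minimal sketch of the two configure variants being compared here; the --with-cxxlib-autodetect values are the real part, while the Cray compiler-wrapper names (cc, CC, ftn) are the usual convention and any other options are placeholders:

```shell
# Cray builds: disable autodetection to avoid the duplicated listing of
# module-provided libraries that Satish describes
./configure --with-cc=cc --with-cxx=CC --with-fc=ftn \
    --with-cxxlib-autodetect=0

# The suggested experiment: re-enable autodetection (the default) and
# see whether configure finds a working compatibility-library list
./configure --with-cc=cc --with-cxx=CC --with-fc=ftn \
    --with-cxxlib-autodetect=1
```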
>
> Can you send me the following info?
>
> cd src/benchmarks
> cc -v sizeof.c
> CC -v sizeof.c
>
> Also you can try --with-cxxlib-autodetect=1 - and see if it works.
>
> Satish
>
> On Thu, 14 May 2015, Mark Adams wrote:
>
> > We have Titan working, but we have a problem: we need -lstdc++ to get
> > PETSc to configure.  Otherwise it fails to build Fortran.  This should
> > only be needed for GNU, but this build uses PGI.  If we remove it, C++
> > works; otherwise there is a missing external.  I guess I can just deploy
> > two versions, one for Fortran and one for C++ (I have a C++ code and a
> > Fortran code that I support).
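The missing external Mark describes is typically a C++ runtime symbol. A minimal sketch (the function name is hypothetical) of why linking C++ objects into a C- or Fortran-driven executable needs -lstdc++ with GNU, or the equivalent libraries for PGI:

```cpp
#include <vector>

// A C++ translation unit meant to be linked into a C or Fortran program.
// std::vector's allocation references operator new / operator delete, which
// live in the C++ runtime library (libstdc++ for GNU).  Linking this object
// with a Fortran or C driver, without that runtime on the link line, yields
// an "undefined reference" -- the missing external described above.
extern "C" int cxx_sum_ones(int n) {
    std::vector<int> v(n, 1);   // requires operator new from the C++ runtime
    int sum = 0;
    for (int x : v) sum += x;
    return sum;
}
```

With GNU toolchains the fix is to link with g++ or add -lstdc++ by hand; PGI's C++ runtime goes by different library names, which is exactly why one needs either autodetection or knowledge of the compiler's compatibility libraries.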
> >
> >
> > On Tue, May 12, 2015 at 9:36 AM, Barry Smith <bsmith at mcs.anl.gov> wrote:
> >
> > >
> > >   Remind me again how much the US taxpayers spend on this machine each
> > > year.
> > >
> > >
> > > Begin forwarded message:
> > >
> > > *From: *Mark Adams <mfadams at lbl.gov>
> > > *Subject: **Re: [petsc-users] configure error on Titan with Intel*
> > > *Date: *May 12, 2015 at 11:12:55 AM CDT
> > > *To: *Barry Smith <bsmith at mcs.anl.gov>
> > > *Cc: *Jed Brown <jed at jedbrown.org>, petsc-users
> > > <petsc-users at mcs.anl.gov>, "David Trebotich" <treb at lbl.gov>,
> > > "D'Azevedo, Ed F." <dazevedoef at ornl.gov>
> > >
> > >
> > >
> > >>   Notes: ./configure ran fine and detected -lhwloc in some standard
> > >> system install location; under normal circumstances it couldn't just
> > >> disappear for a different example.
> > >>
> > >>
> > > I configured in an interactive shell, so on a compute node.  I tried to
> > > 'make ex56' on a login node, as usual.  So I am guessing that it would
> > > have worked if I had run it on a compute node.  They have an
> > > inconsistency, I'm guessing.  I can try it all on a compute node ....
> > >
> > > Mark
> > >
> > >
> > >>
> > >> >
> > >> > On Fri, May 8, 2015 at 5:51 PM, Jed Brown <jed at jedbrown.org> wrote:
> > >> > Satish Balay <balay at mcs.anl.gov> writes:
> > >> >
> > >> > > Also - Perhaps with intel compilers - the recommended blas is
> > >> > > something other than ACML? [like MKL?]. Something to check..
> > >> >
> > >> > MKL (and anything built using the Intel compiler) has the "run
> > >> > slower on AMD" feature.  For any given operation, ACML won't
> > >> > necessarily be faster, but at least it's not intentionally crippled.
> > >> > For most of what PETSc does, any such MKL/ACML difference is
> > >> > irrelevant.
> > >> >
> > >> > <configure.log><make.log>
> > >>
> > >>
> > >
> > >
> >
>
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://lists.mcs.anl.gov/pipermail/petsc-dev/attachments/20150514/b2842f11/attachment.html>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: out
Type: application/octet-stream
Size: 16088 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-dev/attachments/20150514/b2842f11/attachment.obj>
