[petsc-dev] xSDK @ OLCF: PETSc: cannot find C preprocessor
Barry Smith
bsmith at mcs.anl.gov
Mon Feb 13 22:38:25 CST 2017
> On Feb 13, 2017, at 10:28 PM, Satish Balay <balay at mcs.anl.gov> wrote:
>
> On Mon, 13 Feb 2017, Barry Smith wrote:
>
>>
>>> On Feb 13, 2017, at 10:08 PM, Satish Balay <balay at mcs.anl.gov> wrote:
>>>
>>> Ah - yes - I now see the new logs.
>>>
>>> wrt:
>>>
>>>>>> setCompilers.py has the code below that I think is messing with our minds (if the bin directory is not there then we ignore --with-mpi-dir and try all kinds of random
>>>>>> shit. Probably if the bin directory is missing we should just error out immediately!!!!!!
>>>
>>> --with-mpi-dir has this overloaded feature - it tries to provide
>>> compiler wrappers for some folk - and in other cases just provides
>>> include and lib detection. For this second case one can provide a
>>> compiler with --with-cc or let configure guess one.
>>
>> But this particular else-if case actually doesn't provide -I and -L information; it just chugs along, totally ignoring the --with-mpi-dir directory information. Hence I argue that it is wrong and needs to be fixed. If it used --with-mpi-dir to look for includes and libraries that would be a different story, but it doesn't.
>
> It didn't get to the MPI stage where this would happen. It's stuck at the compiler and cpp tests [and the extra --with-cpp options are causing problems, not helping]
I understand, but that is a different issue. Once configure has determined that --with-mpi-dir has no bin directory, it goes off and searches all the paths for a compiler (without setting -I and -L directories related to --with-mpi-dir). That IMHO is goofy and needs to be fixed in PETSc configure (the simplest fix is to error out, which is better than what it does now).
Now the fact that spack on OLCF provides a worthless --with-mpi-dir and its gcc wrapper fails is a different issue. That needs to be debugged separately, but it doesn't mean we shouldn't fix the PETSc configure bug!
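The error-out version could be as simple as something like the following in setCompilers.py (an untested sketch that just reuses the names from the snippet quoted below; the exact placement relative to the existing elif would matter):

  elif self.useMPICompilers() and 'with-mpi-dir' in self.argDB and not os.path.isdir(os.path.join(self.argDB['with-mpi-dir'], 'bin')):
    # bin/ is missing so the wrappers cannot exist; stop here instead of silently
    # falling through to a PATH search that ignores --with-mpi-dir entirely
    raise RuntimeError('--with-mpi-dir='+self.argDB['with-mpi-dir']+' has no bin directory; fix or remove the option')

That way the user is told immediately that the --with-mpi-dir value is bad instead of configure guessing.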
Barry
>
>>
>>>
>>> Sure - all these combinations don't generally make sense - but we
>>> didn't try to put a restriction on this [if configure tests pass - then
>>> that should be fine?]
>>>
>>> Wrt spack - I think it should always provide --with-cc=mpicc and not the --with-mpi-dir option.
>>
>> I'm not sure that spack always has this information properly available - hence my comment in the previous email - so it may not always know what those compiler names (and paths) are. I don't disagree with you, but I cannot refactor spack on my own :-)
>
> from printenv
>
> SPACK_CC=/opt/cray/craype/2.5.5/bin/cc
> MPICC=/autofs/nccs-svm1_home1/mbt/spack/lib/spack/env/gcc/gcc
>
> presumably both are set by spack - and perhaps MPICC is what we should be using?
>
> https://github.com/LLNL/spack/pull/1807/files/7d8ec7ecb8b70436c3d376faf36443a5571a8d01
> modified to (for the MPI case):
>
> '--with-cc=%s' % os.environ['MPICC'],
> '--with-cxx=%s' % os.environ['MPICXX'],
> '--with-fc=%s' % os.environ['MPIF77'],
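>
> perhaps with a guard in case those env vars are not always set - an untested sketch that falls back to the current --with-mpi-dir behavior:
>
> if all(v in os.environ for v in ('MPICC', 'MPICXX', 'MPIF77')):
>     compiler_opts = [
>         '--with-cc=%s' % os.environ['MPICC'],
>         '--with-cxx=%s' % os.environ['MPICXX'],
>         '--with-fc=%s' % os.environ['MPIF77'],
>     ]
> else:
>     compiler_opts = ['--with-mpi-dir=%s' % self.spec['mpi'].prefix]
>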
> Satish
>
>>
>>
>>>
>>> Satish
>>>
>>>
>>> On Mon, 13 Feb 2017, Barry Smith wrote:
>>>
>>>>
>>>> Hmm, I think you are looking at the old (bogus) configure.log; in the most recent one from Mark I see
>>>>
>>>> ================================================================================
>>>> Starting Configure Run at Mon Feb 13 19:57:11 2017
>>>> Configure Options: --configModules=PETSc.Configure --optionsModule=config.compilerOptions --prefix=/autofs/nccs-svm1_home1/mbt/spack/opt/spack/cray-CNL-interlagos/gcc-5.3.0/petsc-develop-n4yqikl4ncgz226t53da65ngxloyzzj2 --with-ssl=0 --with-x=0 --download-c2html=0 --download-sowing=0 --download-hwloc=0 --with-mpi=1 --with-mpi-dir=/opt/cray/mpt/7.4.0/gni/mpich-gnu/5.1 --with-precision=double --with-scalar-type=real --with-shared-libraries=1 --with-debugging=0 --with-64-bit-indices=0 --with-blas-lapack-lib=/autofs/nccs-svm1_home1/mbt/spack/opt/spack/cray-CNL-interlagos/gcc-5.3.0/openblas-0.2.19-imvsrsipx5l3q6ix7ccbaecrszko6pkc/lib/libopenblas.so --with-cxx-dialect=C++11 --with-metis=1 --with-metis-dir=/autofs/nccs-svm1_home1/mbt/spack/opt/spack/cray-CNL-interlagos/gcc-5.3.0/metis-5.1.0-z5n5qrxvvpla6mm6diryhma2pknxglr4 --with-boost=1 --with-boost-dir=/sw/xk6/boost/1.60.0/sles11.3_gnu4.9.0_shared_python343 --with-hdf5=1 --with-hdf5-dir=/opt/cray/hdf5-parallel/1.8.14/GNU/5.1 --with-hypre=1 --with-hypre-dir=/autofs/nccs-svm1_home1/mbt/spack/opt/spack/cray-CNL-interlagos/gcc-5.3.0/hypre-develop-gfzuro2unxym7liqtngi3kh2t4qyfvbm --with-parmetis=1 --with-parmetis-dir=/autofs/nccs-svm1_home1/mbt/spack/opt/spack/cray-CNL-interlagos/gcc-5.3.0/parmetis-4.0.3-kccq4aaihke6skdpv6t3x5oirs3sdzo6 --with-mumps=0 --with-scalapack=0 --with-trilinos=1 --with-trilinos-dir=/autofs/nccs-svm1_home1/mbt/spack/opt/spack/cray-CNL-interlagos/gcc-5.3.0/trilinos-develop-te55jvsqxwsquvwgrcjfown6rufrp5cl --with-superlu_dist-include=/autofs/nccs-svm1_home1/mbt/spack/opt/spack/cray-CNL-interlagos/gcc-5.3.0/superlu-dist-develop-wsx2kmbc25yrxxezzarurdhv7bs2wlpe/include --with-superlu_dist-lib=/autofs/nccs-svm1_home1/mbt/spack/opt/spack/cray-CNL-interlagos/gcc-5.3.0/superlu-dist-develop-wsx2kmbc25yrxxezzarurdhv7bs2wlpe/lib/libsuperlu_dist.a --with-superlu_dist=1
>>>> Working directory: /tmp/mbt/spack-stage/spack-stage-Qpwg5g/petsc
>>>> Machine platform:
>>>> ('Linux', 'titan-ext3', '3.0.101-0.47.86.1.11753.0.PTF-default', '#1 SMP Wed Oct 19 14:11:00 UTC 2016 (56c73f1)', 'x86_64', 'x86_64')
>>>> Python version:
>>>> 2.7.9 (default, May 6 2016, 10:34:30)
>>>> [GCC 4.9.0 20140422 (Cray Inc.)]
>>>>
>>>> In his email he said that previously he had changed some things to try to get a successful compile; I suspect he added the --with-cc stuff. When I run spack I can never get it to pass --with-cc to PETSc configure.
>>>>
>>>> BTW: we should have configure.log always include a hash of the PETSc dir so that we can immediately see if someone changed some files and hence got us totally confused.
>>>>
>>>> We'll learn more once he responds.
>>>>
>>>> Barry
>>>>
>>>>
>>>>
>>>>> On Feb 13, 2017, at 9:48 PM, Satish Balay <balay at mcs.anl.gov> wrote:
>>>>>
>>>>> Barry,
>>>>>
>>>>> I don't have an account. Hong might still have her account active. We can check tomorrow.
>>>>>
>>>>> from configure.log
>>>>>>>>>>>>>>>
>>>>> Starting Configure Run at Mon Feb 13 16:17:20 2017
>>>>> Configure Options: --configModules=PETSc.Configure --optionsModule=config.compilerOptions --prefix=/autofs/nccs-svm1_home1/mbt/spack/opt/spack/cray-CNL-interlagos/gcc-5.3.0/petsc-develop-n4yqikl4ncgz226t53da65ngxloyzzj2 --with-ssl=0 --with-x=0 --download-c2html=0 --download-sowing=0 --download-hwloc=0 --with-cc=/autofs/nccs-svm1_home1/mbt/spack/lib/spack/env/gcc/gcc --with-cxx=/autofs/nccs-svm1_home1/mbt/spack/lib/spack/env/gcc/g++ --with-fc=/autofs/nccs-svm1_home1/mbt/spack/lib/spack/env/gcc/gfortran --with-mpi=1 --with-mpi-dir=/opt/cray/mpt/7.4.0/gni/mpich-gnu/5.1 --with-cpp=cpp --with-cxxcpp=cpp --with-precision=double --with-scalar-type=real --with-shared-libraries=1 --with-debugging=0 --with-64-bit-indices=0 --with-blas-lapack-lib=/autofs/nccs-svm1_home1/mbt/spack/opt/spack/cray-CNL-interlagos/gcc-5.3.0/openblas-0.2.19-imvsrsipx5l3q6ix7ccbaecrszko6pkc/lib/libopenblas.so --with-cxx-dialect=C++11 --with-metis=1 --with-metis-dir=/autofs/nccs-svm1_home1/mbt/spack/opt/spack/cray-CNL-interlagos/gcc-5.3.0/metis-5.1.0-z5n5qrxvvpla6mm6diryhma2pknxglr4 --with-boost=1 --with-boost-dir=/sw/xk6/boost/1.60.0/sles11.3_gnu4.9.0_shared_python343 --with-hdf5=1 --with-hdf5-dir=/opt/cray/hdf5-parallel/1.8.14/GNU/5.1 --with-hypre=1 --with-hypre-dir=/autofs/nccs-svm1_home1/mbt/spack/opt/spack/cray-CNL-interlagos/gcc-5.3.0/hypre-develop-gfzuro2unxym7liqtngi3kh2t4qyfvbm --with-parmetis=1 --with-parmetis-dir=/autofs/nccs-svm1_home1/mbt/spack/opt/spack/cray-CNL-interlagos/gcc-5.3.0/parmetis-4.0.3-kccq4aaihke6skdpv6t3x5oirs3sdzo6 --with-mumps=0 --with-scalapack=0 --with-trilinos=1 --with-trilinos-dir=/autofs/nccs-svm1_home1/mbt/spack/opt/spack/cray-CNL-interlagos/gcc-5.3.0/trilinos-develop-te55jvsqxwsquvwgrcjfown6rufrp5cl --with-superlu_dist-include=/autofs/nccs-svm1_home1/mbt/spack/opt/spack/cray-CNL-interlagos/gcc-5.3.0/superlu-dist-develop-wsx2kmbc25yrxxezzarurdhv7bs2wlpe/include --with-superlu_dist-lib=/autofs/nccs-svm1_home1/mbt/spack/opt/spack/cray-CNL-interlagos/gcc-5.3.0/superlu-dist-develop-wsx2kmbc25yrxxezzarurdhv7bs2wlpe/lib/libsuperlu_dist.a --with-superlu_dist=1
>>>>> Working directory: /tmp/mbt/spack-stage/spack-stage-c8EKUl/petsc
>>>>> <<<<<<<<
>>>>>
>>>>> spack is passing in --with-cc - as this stuff is printed well before any of this code is executed.
>>>>>
>>>>> When both '--with-cc' and '--with-mpi-dir' options are specified - configure uses 'cc' - and then tries to guess the include and library list from the mpi-dir location.
>>>>>
>>>>> Satish
>>>>>
>>>>> On Mon, 13 Feb 2017, Barry Smith wrote:
>>>>>
>>>>>>
>>>>>> Do you have an account on this crazy OLCF machine? It will be much faster if we do things directly rather than work through a third person who knows even less than us.
>>>>>> Earlier I think he was fucking around with the --with-cc options himself, which is why they appeared in the command line options for ./configure when PETSc and Spack
>>>>>> would not have set them.
>>>>>>
>>>>>> setCompilers.py has the code below that I think is messing with our minds (if the bin directory is not there then we ignore --with-mpi-dir and try all kinds of random
>>>>>> shit. Probably if the bin directory is missing we should just error out immediately!!!!!!
>>>>>>
>>>>>>
>>>>>> elif self.useMPICompilers() and 'with-mpi-dir' in self.argDB and os.path.isdir(os.path.join(self.argDB['with-mpi-dir'], 'bin')):
>>>>>>   self.usedMPICompilers = 1
>>>>>>   yield os.path.join(self.argDB['with-mpi-dir'], 'bin', 'mpicc')
>>>>>>   yield os.path.join(self.argDB['with-mpi-dir'], 'bin', 'mpcc')
>>>>>>   yield os.path.join(self.argDB['with-mpi-dir'], 'bin', 'hcc')
>>>>>>   yield os.path.join(self.argDB['with-mpi-dir'], 'bin', 'mpcc_r')
>>>>>>   self.usedMPICompilers = 0
>>>>>>   raise RuntimeError('MPI compiler wrappers in '+self.argDB['with-mpi-dir']+'/bin do not work. See http://www.mcs.anl.gov/petsc/documentation/faq.html#mpi-compilers')
>>>>>> else:
>>>>>>   if self.useMPICompilers():
>>>>>>     self.usedMPICompilers = 1
>>>>>>
>>>>>> In this case I don't think spack should be providing this "bogus" --with-mpi-dir, but that is a separate issue. Spack needs to be smarter about knowing what is and what is
>>>>>> not an MPI compiler wrapper.
>>>>>>
>>>>>> Barry
>>>>>>
>>>>>> I agree with you that it might be better for spack to pass the MPI compiler names rather than --with-mpi-dir; the problem is that spack still hasn't totally bought into
>>>>>> the "always use the MPI compiler wrappers as the compilers" model, so it might falsely provide the non-MPI compilers. A crude check along the lines below would at least catch the obvious cases.
>>>>>>
>>>>>>
>>>>>> Begin forwarded message:
>>>>>>
>>>>>> From: "Berrill, Mark A." <berrillma at ornl.gov>
>>>>>> Subject: Re: xSDK @ OLCF: PETSc: cannot find C preprocessor
>>>>>> Date: February 13, 2017 at 7:03:43 PM CST
>>>>>> To: Satish Balay <balay at mcs.anl.gov>, Barry Smith <bsmith at mcs.anl.gov>
>>>>>> Cc: "McInnes, Lois Curfman" <curfman at mcs.anl.gov>, "Sarich, Jason J." <sarich at mcs.anl.gov>
>>>>>>
>>>>>> Everybody,
>>>>>>
>>>>>> I did rebuild and I get consistent behavior.
>>>>>>
>>>>>> Barry, unfortunately, the slow build times are partially due to the filesystems, none of which are particularly fast. Spack uses the tmp folder and I am using NFS.
>>>>>> I wanted to verify that everything else continued to work, and even without Trilinos it still takes quite a while to build from the start.
>>>>>>
>>>>>> I ran the updated version and I am still seeing the same error. Note that there is a slight difference in the configure command due to changes I had made trying to
>>>>>> fix the issue. The latest copy (attached) does not have any of my changes and is the current git version of barry/xsdk.
>>>>>>
>>>>>> Thank you for your help,
>>>>>> Mark
>>>>>>
>>>>>> ________________________________________
>>>>>> From: Satish Balay <balay at mcs.anl.gov>
>>>>>> Sent: Monday, February 13, 2017 7:11 PM
>>>>>> To: Barry Smith
>>>>>> Cc: Berrill, Mark A.; McInnes, Lois Curfman; Sarich, Jason J.
>>>>>> Subject: Re: xSDK @ OLCF: PETSc: cannot find C preprocessor
>>>>>>
>>>>>> --with-mpi-dir is an overloaded option. And we wanted configure to do
>>>>>> everything it can before giving up. I guess we could flag an error
>>>>>> when both --with-mpi-dir and --with-cc are specified [but this
>>>>>> combination is basic functionality for other packages]
>>>>>>
>>>>>> Wrt --with-cc - I see it at:
>>>>>> https://github.com/LLNL/spack/pull/1807/files/7d8ec7ecb8b70436c3d376faf36443a5571a8d01
>>>>>>
>>>>>> Satish
>>>>>>
>>>>>> On Mon, 13 Feb 2017, Barry Smith wrote:
>>>>>>
>>>>>>
>>>>>> Satish,
>>>>>>
>>>>>> The /autofs/nccs-svm1_home1/mbt/spack/lib/spack/env/gcc/gcc is actually just a wrapper that calls the underlying compiler. It looks like there is a
>>>>>> problem with either the wrapper or the underlying compiler, in that it prints messages about missing files. On the other hand, it looks like the compiler
>>>>>> actually works. Even with the CPP, since the include file is printed, it seems the cpp works, but the warning messages that are printed cause PETSc's
>>>>>> configure to conclude that the cpp failed.
>>>>>>
>>>>>> The use of --with-mpi-dir predates my use of spack for PETSc:
>>>>>>
>>>>>> b909da75 var/spack/repos/builtin/packages/petsc/package.py (Denis Davydov 2016-03-30 19:26:03 +0200 139)         compiler_opts = [
>>>>>> b909da75 var/spack/repos/builtin/packages/petsc/package.py (Denis Davydov 2016-03-30 19:26:03 +0200 140)             '--with-mpi=1',
>>>>>> b909da75 var/spack/repos/builtin/packages/petsc/package.py (Denis Davydov 2016-03-30 19:26:03 +0200 141)             '--with-mpi-dir=%s' % self.spec['mpi'].prefix,
>>>>>> b909da75 var/spack/repos/builtin/packages/petsc/package.py (Denis Davydov 2016-03-30 19:26:03 +0200 142)         ]
>>>>>>
>>>>>> If --with-mpi-dir doesn't work in PETSc then we should remove it; otherwise, how are you going to prevent people from using it? You can't police everyone
>>>>>> in the world not to use it. It should work.
>>>>>>
>>>>>> I cannot figure out where the --with-cc stuff is coming from. I think it has to be spack but I cannot see where. The petsc spack file is not setting
>>>>>> --with-cc anywhere that I can see. I need to debug.
>>>>>>
>>>>>> Barry
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Feb 13, 2017, at 4:24 PM, Satish Balay <balay at mcs.anl.gov> wrote:
>>>>>>
>>>>>> Barry,
>>>>>>
>>>>>> although using cpp is a bad workaround for the original problem [we would
>>>>>> have to look at the configure.log that triggered that change] - the
>>>>>> current issue appears to be a broken 'gcc' [as indicated below]
>>>>>>
>>>>>> Satish
>>>>>>
>>>>>> On Mon, 13 Feb 2017, Barry Smith wrote:
>>>>>>
>>>>>>
>>>>>> This is the same problem Jason had (Jason, this is why it is important to get bugs you find fixed upstream as fast as possible and
>>>>>> not sit on them). The PETSc spack file has this horrible thing:
>>>>>>
>>>>>> if sys.platform != "darwin":
>>>>>>     compiler_opts.extend([
>>>>>>         '--with-cpp=cpp',
>>>>>>         '--with-cxxcpp=cpp',
>>>>>>     ])
>>>>>>
>>>>>>
>>>>>> Which was put in by
>>>>>>
>>>>>> 88af9f78 var/spack/repos/builtin/packages/petsc/package.py (Erik Schnetter 2016-09-26 13:38:02 -0400 139)         if sys.platform != "darwin":
>>>>>> 88af9f78 var/spack/repos/builtin/packages/petsc/package.py (Erik Schnetter 2016-09-26 13:38:02 -0400 140)             compiler_opts.extend([
>>>>>> 00bac8f2 var/spack/repos/builtin/packages/petsc/package.py (Erik Schnetter 2016-09-20 13:05:57 -0400 141)                 '--with-cpp=cpp',
>>>>>> 00bac8f2 var/spack/repos/builtin/packages/petsc/package.py (Erik Schnetter 2016-09-20 13:05:57 -0400 142)                 '--with-cxxcpp=cpp',
>>>>>> 88af9f78 var/spack/repos/builtin/packages/petsc/package.py (Erik Schnetter 2016-09-26 13:38:02 -0400 143)             ])
>>>>>>
>>>>>> Please also send me your user ID on GitHub so I can mark issues/pull requests with your name.
>>>>>>
>>>>>> I have commented on the pull request that introduced this problem: https://github.com/LLNL/spack/pull/1807
>>>>>>
>>>>>>
>>>>>> I have pushed code that comments out this bad code. Please run, in the spack directory:
>>>>>>
>>>>>> git checkout developer
>>>>>> git branch -D barry/xsdk
>>>>>> git pull
>>>>>> git checkout barry/xsdk
>>>>>>
>>>>>> ./bin/spack install petsc
>>>>>>
>>>>>> and again send the configure.log to petsc-maint at mcs.anl.gov if it fails. If it fails it should fail for a different reason.
>>>>>>
>>>>>>
>>>>>> Barry
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Feb 13, 2017, at 4:01 PM, Satish Balay <balay at mcs.anl.gov> wrote:
>>>>>>
>>>>>> Looks like the 'gcc' compiler installed by spack is broken - or misbehaving
>>>>>>
>>>>>> Executing: /autofs/nccs-svm1_home1/mbt/spack/lib/spack/env/gcc/gcc -c -o /tmp/petsc-RkoVhw/config.setCompilers/conftest.o -I/tmp/petsc-RkoVhw/config.setCompilers /tmp/petsc-RkoVhw/config.setCompilers/conftest.c
>>>>>> Possible ERROR while running compiler:
>>>>>> stderr:
>>>>>> /autofs/nccs-svm1_home1/mbt/spack/lib/spack/env/gcc/gcc: line 358: /autofs/nccs-svm1_home1/mbt/spack/bin/spack-cc-petsc@develop%gcc@5.3.0~boost~complex~debug+double+hdf5+hypre~int64+metis+mpi~mumps+shared+superlu-dist+trilinos arch=cray-CNL-interlagos /n4yqikl.in.log: No such file or directory
>>>>>> /autofs/nccs-svm1_home1/mbt/spack/lib/spack/env/gcc/gcc: line 359: /autofs/nccs-svm1_home1/mbt/spack/bin/spack-cc-petsc@develop%gcc@5.3.0~boost~complex~debug+double+hdf5+hypre~int64+metis+mpi~mumps+shared+superlu-dist+trilinos arch=cray-CNL-interlagos /n4yqikl.out.log: No such file or directory
>>>>>> Source:
>>>>>> #include "confdefs.h"
>>>>>> #include "conffix.h"
>>>>>>
>>>>>> int main() {
>>>>>> ;
>>>>>> return 0;
>>>>>> }
>>>>>> <<<<<<<<<<<
>>>>>> A couple of notes to Barry,
>>>>>>
>>>>>> The following is suspicious:
>>>>>>
>>>>>> --with-mpi-dir=/opt/cray/mpt/7.4.0/gni/mpich-gnu/5.1 --with-cpp=cpp --with-cxxcpp=cpp
>>>>>>
>>>>>> For one - we use cpp only when the compiler's cpp is broken. [and then it also needs to find mpi.h somehow]
>>>>>>
>>>>>> And for a package manager like spack - it's best that it does not use package-dir options (especially for MPI).
>>>>>>
>>>>>> --with-mpi-dir=/opt/cray/mpt/7.4.0/gni/mpich-gnu/5.1
>>>>>> --with-metis-dir=/autofs/nccs-svm1_home1/mbt/spack/opt/spack/cray-CNL-interlagos/gcc-5.3.0/metis-5.1.0-z5n5qrxvvpla6mm6diryhma2pknxglr4
>>>>>> --with-boost-dir=/sw/xk6/boost/1.60.0/sles11.3_gnu4.9.0_shared_python343
>>>>>> --with-hdf5-dir=/opt/cray/hdf5-parallel/1.8.14/GNU/5.1
>>>>>> --with-hypre-dir=/autofs/nccs-svm1_home1/mbt/spack/opt/spack/cray-CNL-interlagos/gcc-5.3.0/hypre-develop-gfzuro2unxym7liqtngi3kh2t4qyfvbm
>>>>>> --with-parmetis-dir=/autofs/nccs-svm1_home1/mbt/spack/opt/spack/cray-CNL-interlagos/gcc-5.3.0/parmetis-4.0.3-kccq4aaihke6skdpv6t3x5oirs3sdzo6
>>>>>> --with-trilinos-dir=/autofs/nccs-svm1_home1/mbt/spack/opt/spack/cray-CNL-interlagos/gcc-5.3.0/trilinos-develop-te55jvsqxwsquvwgrcjfown6rufrp5cl
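>>>>>>
>>>>>> An untested sketch of the alternative in the petsc package.py - configure already accepts the explicit include/lib form (see the superlu_dist options above); this uses hypre as the example and assumes the installed library is named libHYPRE.so:
>>>>>>
>>>>>> # join_path is spack's own path helper, available in package files
>>>>>> hypre = self.spec['hypre']
>>>>>> compiler_opts.extend([
>>>>>>     '--with-hypre-include=%s' % hypre.prefix.include,
>>>>>>     '--with-hypre-lib=%s' % join_path(hypre.prefix.lib, 'libHYPRE.so'),
>>>>>> ])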
>>>>>>
>>>>>> Satish
>>>>>>
>>>>>> On Mon, 13 Feb 2017, Berrill, Mark A. wrote:
>>>>>>
>>>>>> Satish,
>>>>>>
>>>>>> Please see the attached tarball which contains the log file. I am rebuilding (with a fresh checkout of
>>>>>> the barry/xsdk branch) to verify I didn't somehow introduce a bug. This will take several hours.
>>>>>>
>>>>>> Thanks,
>>>>>> Mark
>>>>>>
>>>>>> ________________________________________
>>>>>> From: Satish Balay <balay at mcs.anl.gov>
>>>>>> Sent: Monday, February 13, 2017 4:30 PM
>>>>>> To: McInnes, Lois Curfman
>>>>>> Cc: Berrill, Mark A.; Smith, Barry F.; Sarich, Jason J.
>>>>>> Subject: Re: xSDK @ OLCF: PETSc: cannot find C preprocessor
>>>>>>
>>>>>> Mark,
>>>>>>
>>>>>> Could you send us configure.log from the petsc build - for this CPP failure?
>>>>>>
>>>>>> Thanks,
>>>>>> Satish
>>>>>>
>>>>>> On Mon, 13 Feb 2017, McInnes, Lois Curfman wrote:
>>>>>>
>>>>>>
>>>>>> Hi Mark: I see from updated notes here that you’ve made it past the compilers.yaml problem and
>>>>>> now have a problem with PETSc not finding a C
>>>>>> preprocessor:
>>>>>> https://docs.google.com/document/d/10VzhJbQRVml9fLozZc4YrXM8-kze91RwMwbrqnX0hns/edit#
>>>>>>
>>>>>> Barry, Satish, Jason: Can you help Mark with these PETSc/xSDK issues at OLCF?
>>>>>>
>>>>>> Thanks,
>>>>>> Lois
>>>>>>
>>>>>> From: Lois Curfman McInnes <curfman at mcs.anl.gov>
>>>>>> Date: Monday, February 13, 2017 at 9:20 AM
>>>>>> To: "Berrill, Mark A." <berrillma at ornl.gov>, Todd Gamblin <gamblin2 at llnl.gov>
>>>>>> Cc: Jason Sarich <jason.sarich at gmail.com>, "Klinvex, Alicia Marie" <amklinv at sandia.gov>,
>>>>>> "Willenbring, James M" <jmwille at sandia.gov>, "Smith, Barry
>>>>>> F." <bsmith at mcs.anl.gov>
>>>>>> Subject: xSDK @ OLCF: Spack hang reading compilers.yaml file
>>>>>>
>>>>>>
>>>>>> Hi Mark — Thanks for your work on this and for the update. Forwarding to Todd, who is the
>>>>>> expert for all things Spack :-) … And to Jason and
>>>>>> Alicia, who have done xSDK testing at ALCF and NERSC and have reported various Spack problems,
>>>>>> + Jim and Barry, in case they have comments.
>>>>>>
>>>>>> Todd encourages us to report xSDK Spack problems using the xSDK tag for Spack issues; then he
>>>>>> and other Spack developers can help more
>>>>>> effectively, as the info is visible to everyone.
>>>>>>
>>>>>> All: Mark’s problem is: spack appears to hang reading the compilers.yaml file.
>>>>>>
>>>>>> Todd: Please provide any guidance that Mark would need to report this. I seem to recall that
>>>>>> some permission was needed when Jason first
>>>>>> reported a Spack issue with the xSDK tag.
>>>>>>
>>>>>> All: Please speak up if you have comments.
>>>>>>
>>>>>> Mark: I put a comment in the xSDK Test Plan file:
>>>>>> https://docs.google.com/document/d/10VzhJbQRVml9fLozZc4YrXM8-kze91RwMwbrqnX0hns/edit#
>>>>>> We will use this file in tomorrow’s xSDK meeting. So we could put updates about progress
>>>>>> here.
>>>>>>
>>>>>> Thanks to everyone.
>>>>>> Lois
>>>>>>
>>>>>> From: "Berrill, Mark A." <berrillma at ornl.gov>
>>>>>> Date: Monday, February 13, 2017 at 9:08 AM
>>>>>> To: Lois Curfman McInnes <curfman at mcs.anl.gov>
>>>>>> Subject: Re: xSDK @ OLCF: how's it going?
>>>>>>
>>>>>> Lois,
>>>>>>
>>>>>>
>>>>>> I was able to figure out the CMake issue, and made significant progress. I decided to
>>>>>> verify everything with a fresh build and now
>>>>>> spack appears to hang reading the compilers.yaml file. Do you know if anybody is seeing
>>>>>> this, or who would be the best person to
>>>>>> contact? This is preventing me from doing anything at this point and I am afraid I am
>>>>>> lost.
>>>>>>
>>>>>>
>>>>>> Thanks,
>>>>>>
>>>>>> Mark
>>>>>>
>>>>>>
>>>>>> _____________________________________________________________________________________________________________________________________________________
>>>>>> From: McInnes, Lois Curfman <curfman at mcs.anl.gov>
>>>>>> Sent: Wednesday, February 08, 2017 3:22 PM
>>>>>> To: Berrill, Mark A.
>>>>>> Subject: Re: xSDK @ OLCF: how's it going?
>>>>>>
>>>>>> Hi Mark -- no worries. Thanks for the update and all your work.
>>>>>>
>>>>>> Could you please put a few notes about the problem either in email to this list of people or
>>>>>> in the google doc? They may be able to
>>>>>> help, and especially since we're trying hard to get to the point where we can claim 'alpha'
>>>>>> release status at cse17, people are on call
>>>>>> to pitch in as needed.
>>>>>>
>>>>>> Thanks so much.
>>>>>> Lois
>>>>>>
>>>>>> On Feb 8, 2017, at 2:52 PM, Berrill, Mark A. <berrillma at ornl.gov> wrote:
>>>>>>
>>>>>> Lois,
>>>>>>
>>>>>>
>>>>>> Sorry I wasn't able to call in yesterday. I am still figuring out the CDash issue. I am
>>>>>> not sure who the best person to ask
>>>>>> is; I have been trying to follow up with UA here to see if they have seen the issue.
>>>>>>
>>>>>>
>>>>>> Thanks,
>>>>>>
>>>>>> Mark
>>>>>>
>>>>>>
>>>>>>
>>>>>> _____________________________________________________________________________________________________________________________________________________
>>>>>> From: McInnes, Lois Curfman <curfman at mcs.anl.gov>
>>>>>> Sent: Wednesday, February 08, 2017 1:51 PM
>>>>>> To: Berrill, Mark A.
>>>>>> Cc: Willenbring, James M; Smith, Barry F.; Heroux, Michael A; Bernholdt, David E.; Todd
>>>>>> Gamblin
>>>>>> Subject: xSDK @ OLCF: how's it going?
>>>>>>
>>>>>> Hi Mark — I enjoyed talking with you last week in Knoxville.
>>>>>>
>>>>>> I’m writing to touch base on how things are progressing with xSDK testing at OLCF. Our goal
>>>>>> is to have the Spack install for all
>>>>>> xSDK packages fully tested and working on key machines by the time of our next xSDK telecon on
>>>>>> Tues, Feb 14. How are things going
>>>>>> at OLCF? If you are encountering any difficulties, please do let us know so that people can
>>>>>> pitch in as needed.
>>>>>>
>>>>>> All comments welcome.
>>>>>>
>>>>>> Many thanks,
>>>>>> Lois
>>>>>>
>>>>>> From: <ideas-xsdk-bounces at lists.mcs.anl.gov> on behalf of Lois Curfman McInnes
>>>>>> <curfman at mcs.anl.gov>
>>>>>> Date: Monday, February 6, 2017 at 2:11 PM
>>>>>> To: IDEAS xSDK <ideas-xsdk at lists.mcs.anl.gov>
>>>>>> Subject: [ideas-xsdk] xSDK telecon: Tues, Feb 7, 11:30 am PT
>>>>>>
>>>>>>
>>>>>> Dear xSDK folks:
>>>>>>
>>>>>> Recall that we'll meet on Tues, Feb 7, 11:30 am PT to touch base on xSDK release planning
>>>>>> (emphasis: Spack + testing).
>>>>>> Recall the timeline:
>>>>>> * Feb 7: target date for resolving issues with the build
>>>>>> * Feb 14: target date for xSDK release
>>>>>> * Feb 23: need a complete poster for SIAM CSE17 meeting
>>>>>> We’re continuing to take notes in the following Google doc: IDEAS Project > xSDK > xSDK Test
>>>>>> Plan:
>>>>>>
>>>>>> direct link:
>>>>>> https://docs.google.com/document/d/10VzhJbQRVml9fLozZc4YrXM8-kze91RwMwbrqnX0hns/edit#heading=h.yc78ebcn4w7u
>>>>>>
>>>>>> We encourage anyone who has updates to be prepared to discuss at the telecon and add notes to
>>>>>> the doc.
>>>>>>
>>>>>> Thanks,
>>>>>> Lois
>>>>>>
>>>>>> -------
>>>>>>
>>>>>> Video conference information:
>>>>>>
>>>>>> To join or start the meeting, go to:
>>>>>>
>>>>>> * https://bluejeans.com/765958151?g=mn2xeztnmfxea3ldomxgc3tmfztw65q=
>>>>>>
>>>>>> Just want to dial in? (http://bluejeans.com/numbers)
>>>>>>
>>>>>> 1) Enter meeting phone number: +1 408 740 7256
>>>>>>    +1 888 240 2560 (US Toll Free)
>>>>>>    +1 408 317 9253 (Alternate Number)
>>>>>>
>>>>>> 2) Enter Meeting ID: 765958151