[petsc-dev] 4.2 and PETSc
Matthew Knepley
knepley at gmail.com
Fri Apr 27 11:06:00 CDT 2012
On Fri, Apr 27, 2012 at 11:53 AM, Daniel Lowell <redratio1 at gmail.com> wrote:
> Even with the new build system I still had to go through and eliminate the
> -g3 flags in petscvariables:
>
> CC_FLAGS = -Wall -Wwrite-strings -Wno-strict-aliasing
> -Wno-unknown-pragmas -g3 -fno-inline -O0
> PCC_LINKER_FLAGS = -Wall -Wwrite-strings -Wno-strict-aliasing
> -Wno-unknown-pragmas -g3 -fno-inline -O0
> PCC_FLAGS = -Wall -Wwrite-strings -Wno-strict-aliasing
> -Wno-unknown-pragmas -g3 -fno-inline -O0
>
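A minimal sketch of scripting that manual -g3 removal, assuming petscvariables
lives in the conventional $PETSC_DIR/$PETSC_ARCH/conf/ location (the path and
the sed approach are assumptions, not something stated in this thread):

    # Hypothetical helper: back up petscvariables, then rewrite -g3 to plain -g.
    # The path below is an assumption about the generated tree layout.
    cp $PETSC_DIR/$PETSC_ARCH/conf/petscvariables{,.bak}
    sed -i 's/ -g3 / -g /g' $PETSC_DIR/$PETSC_ARCH/conf/petscvariables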
I just ran this last night. There were no more -g3's. As Satish said, send
logs.
Matt
> On Mon, Apr 23, 2012 at 4:43 PM, Daniel Lowell <redratio1 at gmail.com>wrote:
>
>> Sending the make.log to petsc-maint now.
>>
>>
>> On Mon, Apr 23, 2012 at 4:38 PM, Satish Balay <balay at mcs.anl.gov> wrote:
>>
>>> but you missed copy/pasting the actual compile command, so the info
>>> here is incomplete.
>>>
>>> satish
>>>
>>> On Mon, 23 Apr 2012, Daniel Lowell wrote:
>>>
>>> > I added -v in the local folder's makefile to produce verbose output.
>>> >
>>> > On Mon, Apr 23, 2012 at 4:33 PM, Satish Balay <balay at mcs.anl.gov>
>>> wrote:
>>> >
>>> > > Hm - we've seen these errors with -g3 but not with -g. Are you sure
>>> > > there is no '-g3' somewhere on the compile command?
>>> > >
>>> > > > With the latest petsc-dev/buildsystem one should not see -g3 [set by
>>> > > > configure].
>>> > >
>>> > > Also - why is there 'cicc' in a petsc build?
>>> > >
>>> > > Satish
>>> > >
>>> > > On Mon, 23 Apr 2012, Daniel Lowell wrote:
>>> > >
>>> > > > Hi all,
>>> > > >
>>> > > >
>>> > > > Has anyone run into any problems with NVCC 4.1 or 4.2 on PETSc with
>>> > > > arch=sm_20?
>>> > > > I am having real issues with it. I keep getting this:
>>> > > >
>>> > > > #$ cicc -arch compute_20 -m64 -ftz=0 -prec_div=1 -prec_sqrt=1
>>> > > > -fmad=1 -g -O0 "/tmp/tmpxft_00000b8b_00000000-10_vecgpu"
>>> > > > "/tmp/tmpxft_00000b8b_00000000-7_vecgpu.cpp3.i" -o
>>> > > > "/tmp/tmpxft_00000b8b_00000000-2_vecgpu.ptx"
>>> > > > <built-in>(2): error: "__STDC_HOSTED__" is predefined; attempted
>>> > > > redefinition ignored
>>> > > >
>>> > > > <built-in>(8): error: "__WCHAR_TYPE__" is predefined; attempted
>>> > > > redefinition ignored
>>> > > >
>>> > > > <built-in>(115): error: "__x86_64" is predefined; attempted
>>> > > > redefinition ignored
>>> > > >
>>> > > > <built-in>(116): error: "__x86_64__" is predefined; attempted
>>> > > > redefinition ignored
>>> > > >
>>> > > > <built-in>(126): error: "__linux__" is predefined; attempted
>>> > > > redefinition ignored
>>> > > >
>>> > > > <built-in>(128): error: "__unix__" is predefined; attempted
>>> > > > redefinition ignored
>>> > > >
>>> > > > 6 errors detected in the compilation of
>>> > > > "/tmp/tmpxft_00000b8b_00000000-7_vecgpu.cpp3.i".
>>> > > > # --error 0x1 --
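Satish's -g3 suspicion (below and above in this thread) can be checked outside
the PETSc build. The sketch that follows is only an illustration, assuming a
hypothetical vecgpu.cu source file; it relies on nvcc's documented -Xcompiler
option to forward a flag to the host compiler, and on the thread's observation
that -g3, not -g, produces these errors:

    # Hedged sketch: forwarding -g3 to the host gcc (which records macro
    # definitions in its debug output) is expected to reproduce the
    # "predefined; attempted redefinition" errors from cicc, while plain -g
    # should compile cleanly. File name and flag set are illustrative only.
    nvcc -arch=sm_20 -m64 -g -G -Xcompiler -g3 -c vecgpu.cu -o vecgpu.o
    nvcc -arch=sm_20 -m64 -g -G -Xcompiler -g  -c vecgpu.cu -o vecgpu.o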
>>> > > >
>>> > > >
>>> > > > 4.0 works fine, no issues.
>>> > > > The 4.0 version uses the older nvopencc; 4.1 and 4.2 use something
>>> > > > called cicc for target architecture sm_2x. Of course, the updated
>>> > > > nvcc documentation makes no mention of cicc at all.
>>> > > >
>>> > > > I tried 4.2 with both GCC 4.3 and GCC 4.4, with no difference in behavior.
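On that GCC 4.3 / 4.4 comparison: nvcc's documented --compiler-bindir option
selects the host compiler directory per build, so the two compilers can be
tested without changing the system default; the directory below is purely
illustrative:

    # Illustrative only: point nvcc at a specific host gcc installation.
    nvcc -arch=sm_20 --compiler-bindir /opt/gcc-4.4/bin -c vecgpu.cu -o vecgpu.o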
>>> > > >
>>> > > >
>>> > > > PETSc configuration script:
>>> > > >
>>> > > > MPI_DIR="--with-mpi-dir=$MPICH2_HOME"
>>> > > > LAD="--download-f-blas-lapack=yes"
>>> > > > CUD="--with-cuda-dir=/soft/cuda-4.2/cuda"
>>> > > > ./config/configure.py $MPI_DIR $LAD $CUD \
>>> > > > --with-debugging=1 \
>>> > > > --with-cudac="nvcc -m64 -g -G" \
>>> > > > --with-precision=double \
>>> > > > --with-clanguage=c \
>>> > > > --with-cuda-arch=sm_20
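An alternative to stripping -g3 after the fact would be to hand configure
explicit host debug flags; this is a hedged sketch that assumes the petsc-dev
configure of that era honored CFLAGS/CXXFLAGS arguments the way current PETSc
does:

    # Hypothetical variant of the script above: request plain -g -O0 host flags
    # so that -g3 never lands in petscvariables. The CFLAGS/CXXFLAGS arguments
    # are an assumption, not taken from this thread.
    ./config/configure.py $MPI_DIR $LAD $CUD \
      --with-debugging=1 \
      --with-cudac="nvcc -m64 -g -G" \
      --with-precision=double \
      --with-clanguage=c \
      --with-cuda-arch=sm_20 \
      CFLAGS="-g -O0" CXXFLAGS="-g -O0"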
>>> > > >
>>> > > >
>>> > > >
>>> > > >
>>> > > >
>>> > >
>>> > >
>>> >
>>> >
>>> >
>>>
>>>
>>
>>
>> --
>> Daniel Lowell
>>
>>
>
>
> --
> Daniel Lowell
>
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener