[mpich-discuss] Cannot build mpich2-1.0.8p1 (nemesis) with PGI 8.0-4 on Linux x86_64

Gus Correa gus at ldeo.columbia.edu
Wed Apr 1 15:26:43 CDT 2009


Hi Dave, list

Thank you again for your persistent help.
See my comments inline, please.

Dave Goodell wrote:
> On Mar 31, 2009, at 4:21 PM, Gus Correa wrote:
> 
>> I decompressed the tarball on a fresh new subdirectory,
>> configured again, ran make again.
>> However, I still get the same exact error.
>>
>> Moreover, the preprocessor macro
>> HAVE_GCC_AND_X86_64_ASM
>> continues not to be defined in the
>> src/include/mpichconf.h file,
>> and most likely continues to be the source for the error.
>>
>> By contrast, configure.in is identical to my patched
>> version (with the pgcc_atomics.patch file).
>>
>> I will send you the output of configure, config.log
>> and src/include/mpichconf.h off list.
>>
>> And here is the "diff" between the configure script from your
>> latest tarball
>> and the configure script from the 1.0.8p1 after I patched it with
>> pgcc_atomics.patch:
>>
>> [build_pgi-8.0-4]% diff ../configure \
>> /home/swinst/mpich2/1.0.8p1/mpich2-1.0.8p1/configure
>> 29449,29451c29449
>>
>> Did I do anything wrong?
> 
> 
> No, I don't think that you did anything wrong.  There are just other 
> bugs with respect to PGI in our configure logic that I haven't spotted yet.
> 
> This would all go a lot easier if I had a modern PGI compiler available 
> for testing.  

Good luck with this.
I hope DOE will put some money for a PGI license
in Argonne's new budget.
Seriously, not an April Fools' joke. :)

> In the mean time, I would recommend just editing the 
> generated mpichconf.h after the configure step and before the make step 
> to define HAVE_GCC_AND_X86_64_ASM to 1.  That ought to fix your 
> immediate problem, although there may be others that will appear once 
> you are over this hurdle.
> 
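For reference, Dave's workaround can be sketched as a few shell commands. The `demo` directory and mock header below are only for illustration (a real build would run this from the build directory, where configure generates `src/include/mpichconf.h`):

```shell
# Hedged sketch of the workaround: after `configure` and before `make`,
# append the missing macro to the generated src/include/mpichconf.h.
# The demo directory and mock header are illustrative stand-ins.
mkdir -p demo/src/include
printf '/* generated by configure */\n' > demo/src/include/mpichconf.h

hdr=demo/src/include/mpichconf.h
# Only append if the macro is not already defined, so re-runs are safe.
grep -q '^#define HAVE_GCC_AND_X86_64_ASM' "$hdr" ||
  printf '#define HAVE_GCC_AND_X86_64_ASM 1\n' >> "$hdr"
```

Guarding the append with `grep -q` keeps the edit idempotent, which matters if you re-run the step while iterating on the build.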

I tried to rebuild after editing mpichconf.h
in two slightly different ways:
first defining both HAVE_GCC_AND_X86_64_ASM and HAVE_GCC_ATTRIBUTE to 1,
then defining only HAVE_GCC_AND_X86_64_ASM to 1.
In both cases I get "multiple definition" errors from the linker,
like this:

/home/swinst/mpich2/1.0.8p1-pgcc/mpich2-1.0.8p1-pgcc/build_pgi-8.0-4/lib/libmpich.a(ch3_progress.o):/home/swinst/mpich2/1.0.8p1-pgcc/mpich2-1.0.8p1-pgcc/src/mpid/ch3/channels/nemesis/src/ch3_progress.c:961: 
first defined here
/home/swinst/mpich2/1.0.8p1-pgcc/mpich2-1.0.8p1-pgcc/build_pgi-8.0-4/lib/libmpich.a(ch3i_eagernoncontig.o): 
In function `MPID_nem_mpich2_blocking_recv':
/home/swinst/mpich2/1.0.8p1-pgcc/mpich2-1.0.8p1-pgcc/src/mpid/ch3/channels/nemesis/src/ch3i_eagernoncontig.c:1039: 
multiple definition of `MPID_nem_mpich2_blocking_recv'
/home/swinst/mpich2/1.0.8p1-pgcc/mpich2-1.0.8p1-pgcc/build_pgi-8.0-4/lib/libmpich.a(ch3_progress.o):/home/swinst/mpich2/1.0.8p1-pgcc/mpich2-1.0.8p1-pgcc/src/mpid/ch3/channels/nemesis/src/ch3_progress.c:1039: 
first defined here
/home/swinst/mpich2/1.0.8p1-pgcc/mpich2-1.0.8p1-pgcc/build_pgi-8.0-4/lib/libmpich.a(ch3i_eagernoncontig.o): 
In function `MPID_nem_mpich2_release_cell':
/home/swinst/mpich2/1.0.8p1-pgcc/mpich2-1.0.8p1-pgcc/src/mpid/ch3/channels/nemesis/src/ch3i_eagernoncontig.c:1170: 
multiple definition of `MPID_nem_mpich2_release_cell'
/home/swinst/mpich2/1.0.8p1-pgcc/mpich2-1.0.8p1-pgcc/build_pgi-8.0-4/lib/libmpich.a(ch3_progress.o):/home/swinst/mpich2/1.0.8p1-pgcc/mpich2-1.0.8p1-pgcc/src/mpid/ch3/channels/nemesis/src/ch3_progress.c:1170: 
first defined here
make[1]: *** [cpi] Error 2
make[1]: Leaving directory 
`/home/swinst/mpich2/1.0.8p1-pgcc/mpich2-1.0.8p1-pgcc/build_pgi-8.0-4/examples'
make: *** [all-redirect] Error 2

I may have done something wrong,
but I didn't pursue the issue any further.

> I'm not sure that there's much more I can do for you in the short-term 
> until I get my hands on a PGI compiler installation.  
> In the medium-term 
> I'll see if I can get this resolved for the 1.1.0 stable release (likely 
> some time in May).
> 
> Sorry,
> -Dave

Oh, never mind.
For now I can most likely live with the hybrid build (GNU+PGI)
and the other builds with Intel and GNU.

It all depends on how the different codes that I need to compile
and run will behave.
Ideally, I would like to run them with different MPI builds,
and for each code select the build that performs better.
However, most users are happy enough when
the code simply compiles and runs correctly.

Thank you again for your help.
I look forward to the new release, with the PGI fixes.

Regards,
Gus Correa
---------------------------------------------------------------------
Gustavo Correa
Lamont-Doherty Earth Observatory - Columbia University
Palisades, NY, 10964-8000 - USA
---------------------------------------------------------------------

