[MPICH] SSM channel on Solaris 64bit
Rajeev Thakur
thakur at mcs.anl.gov
Wed Apr 25 08:51:08 CDT 2007
The ssm channel used to work on Solaris, but it does not currently. That is
why the line "Please volunteer to fix this code" was added to the error
message :-). I will update the release notes.
Rajeev
> -----Original Message-----
> From: owner-mpich-discuss at mcs.anl.gov
> [mailto:owner-mpich-discuss at mcs.anl.gov] On Behalf Of James S Perrin
> Sent: Wednesday, April 25, 2007 5:10 AM
> To: mpich
> Subject: [MPICH] SSM channel on Solaris 64bit
>
> Hi,
> I am trying to configure MPICH2 1.0.5p4 on Solaris 9 with the Sun Studio
> 11 compilers. I wish to use the ch3:ssm channel. From the release notes:
>
> "The ssm channel uses special interprocess locks (often assembly) that
> may not work with some compilers or machine architectures. ... It also
> works in Windows and Solaris environments."
>
> I'm presuming this means it works only when using gcc (the default) under
> Solaris, as I got the following error:
>
> configure: error: Use of inline process locks is not supported. Please
> volunteer to fix this code
> configure: error: Configure of src/mpid/common/locks failed!
>
> I had set the following environment variables and run configure as:
>
> CC=cc
> CXX=CC
> CFLAGS=-xarch=v9 -xcode=pic32 -xs -O -xmemalign=8s
> CXXFLAGS=-xarch=v9 -xcode=pic32 -xs -O -xmemalign=8s
>
> ./configure --with-mpe --with-pm=mpd --with-device=ch3:ssm \
>             --with-thread-package=pthreads --disable-f77 --disable-f90
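> 
> (As an aside, the ch3:sock device, which as far as I understand does not
> need the inline lock code, might be a fallback; a minimal, untested sketch
> reusing the same environment variables as above:
> 
> ./configure --with-mpe --with-pm=mpd --with-device=ch3:sock \
>             --with-thread-package=pthreads --disable-f77 --disable-f90
> 
> but I would prefer ssm for its shared-memory path if it can be made to work.)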
>
> I just need my suspicions confirmed.
>
> Regards
> James
> --
> ------------------------------------------------------------------------
> James S. Perrin,                  | email: james.perrin at manchester.ac.uk
> Manchester Visualization Centre,  | web: www.mc.manchester.ac.uk
> Kilburn Building, The University, | tel: +44 161 275 6945
> Manchester, England. M13 9PL.     | fax: +44 161 275 0637
> ------------------------------------------------------------------------
> "The test of intellect is the refusal to belabour the obvious"
>                                                 - Alfred Bester
> ------------------------------------------------------------------------