[petsc-dev] building with mpich on OSX
Satish Balay
balay at mcs.anl.gov
Wed Jul 28 23:41:39 CDT 2010
>>>
sh: GNU Fortran (GCC) 4.2.1 (Apple Inc. build 5659) + GF 4.2.4
ar: /Users/hong/soft/petsc-dev/linux-hong/lib/libscalapack.a is a fat file (use libtool(1) or lipo(1) and ar(1) on it)
ar: /Users/hong/soft/petsc-dev/linux-hong/lib/libscalapack.a: Inappropriate file type or format
<<<
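The ar error above means libscalapack.a was built as a multi-architecture
(fat) archive, which OSX ar cannot operate on directly. One quick check,
using the lipo tool named in the error message (path taken from the log):

lipo -info /Users/hong/soft/petsc-dev/linux-hong/lib/libscalapack.a

If that reports more than one architecture, the 32/64-bit mix is the problem.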
Try:
rm -rf linux-hong externalpackages
and use --with-fc='gfortran -m64'
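A minimal sketch of the whole sequence, assuming the same PETSC_ARCH
(linux-hong) as in the log - the exact option set here is pieced together
from this thread, not a verified command line:

cd /Users/hong/soft/petsc-dev
rm -rf linux-hong externalpackages   # wipe the stale build and packages
./config/configure.py --with-fc='gfortran -m64' \
    --download-mpich=1 --download-mpich-pm=gforker \
    --download-scalapack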
I now recommend the gfortran from http://hpc.sourceforge.net/ instead of
the one from http://r.research.att.com/tools/. [If the att.com compiler is
used, it should be 4.2.3, not 4.2.1.]
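To verify which gfortran configure will pick up, and its version (just a
sanity check, not from the original thread):

which gfortran
gfortran --version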
Satish
On Wed, 28 Jul 2010, Hong Zhang wrote:
> Satish:
>
> >> Sorry - you need '--download-mpich=1 --download-mpich-pm=gforker'
>
> Adding '--download-mpich=1' works.
> However, '--download-scalapack' fails on this new machine.
> Log file is attached.
>
> Hong
> >
>
> >> Satish
> >>
> >> On Wed, 28 Jul 2010, Hong Zhang wrote:
> >>
> >>> Satish,
> >>> On my new MacBook Air (OSX 10.6.4), configure ran smoothly with
> >>> '--download-openmpi' but failed with '--download-mpich-pm=gforker':
> >>> ...
> >>> C++ error! MPI_Finalize() could not be located!
> >>> *******************************************************************************
> >>> File "./config/configure.py", line 270, in petsc_configure
> >>> framework.configure(out = sys.stdout)
> >>> File "/Users/hong/soft/petsc-dev/config/BuildSystem/config/framework.py",
> >>> line 946, in configure
> >>> child.configure()
> >>> File "/Users/hong/soft/petsc-dev/config/BuildSystem/config/package.py",
> >>> line 478, in configure
> >>> self.executeTest(self.configureLibrary)
> >>> File "/Users/hong/soft/petsc-dev/config/BuildSystem/config/base.py",
> >>> line 97, in executeTest
> >>> ret = apply(test, args,kargs)
> >>> File "/Users/hong/soft/petsc-dev/config/BuildSystem/config/packages/MPI.py",
> >>> line 716, in configureLibrary
> >>> self.executeTest(self.CxxMPICheck)
> >>> File "/Users/hong/soft/petsc-dev/config/BuildSystem/config/base.py",
> >>> line 97, in executeTest
> >>> ret = apply(test, args,kargs)
> >>> File "/Users/hong/soft/petsc-dev/config/BuildSystem/config/packages/MPI.py",
> >>> line 634, in CxxMPICheck
> >>> raise RuntimeError('C++ error! MPI_Finalize() could not be located!')
> >>>
> >>> The log file is attached.
> >>>
> >>> Hong
> >>>
> >>>
> >>> On Wed, Jul 28, 2010 at 4:30 PM, Satish Balay <balay at mcs.anl.gov> wrote:
> >>>> The current workaround for mpich on OSX is to use:
> >>>> --download-mpich-pm=gforker
> >>>>
> >>>> Satish
> >>>>
> >>>> On Wed, 28 Jul 2010, Wesley Smith wrote:
> >>>>
> >>>>> Does anyone know if mpich is building properly on OSX? In
> >>>>> externalpackages, I have mpich2-trunk-r6644. When I build through
> >>>>> PETSc, I get:
> >>>>>
> >>>>>
> >>>>> ld: duplicate symbol _HYDT_bind_info in
> >>>>> /Users/wesleysmith/Documents/projects/toposynth/PetSc/petsc-dev/externalpackages/mpich2-trunk-r6644/src/pm/hydra/.libs/libhydra.a(bind_hwloc.o)
> >>>>> and /Users/wesleysmith/Documents/projects/toposynth/PetSc/petsc-dev/externalpackages/mpich2-trunk-r6644/src/pm/hydra/.libs/libhydra.a(bind.o)
> >>>>> collect2: ld returned 1 exit status
> >>>>> make[4]: *** [hydra_nameserver] Error 1
> >>>>> make[3]: *** [all-recursive] Error 1
> >>>>> make[2]: *** [all-redirect] Error 1
> >>>>> make[1]: *** [all-redirect] Error 2
> >>>>> make: *** [all-redirect] Error 2
> >>>>> *******************************************************************************
> >>>>>
> >>>>>
> >>>>> wes
> >>>>>
> >>>>
> >>>>
> >>>
> >
> >
>