building petsc on a POWER5+ machine

Barry Smith bsmith at mcs.anl.gov
Fri Mar 16 14:00:15 CDT 2007


  Thomas,

   Please send configure.log to petsc-maint at mcs.anl.gov

   Thanks

   Barry


On Fri, 16 Mar 2007, Thomas Geenen wrote:

> Dear PETSc users,
> 
> I built PETSc on a POWER5+ machine.
> Now I would like to add MUMPS.
> If I do that the way it should be done, I end up with a version in
> which MUMPS hangs during the factorization phase.
> If I build MUMPS standalone, linking to ESSL and PESSL, I get a
> working version of MUMPS.
> 
> I tried telling the PETSc configuration to use ESSL and PESSL instead
> of ScaLAPACK and BLACS, like this:
> --with-scalapack=yes --with-scalapack-lib=/usr/lib/libpessl.a
> --with-blacs=yes --with-blacs-lib=/usr/lib/libessl.a
> --with-scalapack-include=/usr/include/
> --with-blacs-include=/usr/include/
> 
> resulting in
> =================================================================================
> TESTING: check from
> config.libraries(/ptmp/huangwei/petsc-2.3.2-p8/python/BuildSystem/config/libraries.py:108)
> *********************************************************************************
>         UNABLE to CONFIGURE with GIVEN OPTIONS    (see configure.log
> for details):
> ---------------------------------------------------------------------------------------
> --with-blacs-lib=['/usr/lib/libessl.a'] and
> --with-blacs-include=/usr/include/ did not work
> *********************************************************************************
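> 
> (Presumably the same attempt could also be written with the bracketed
> list syntax that configure itself prints above, passing both PESSL and
> ESSL to each test at once; whether this particular combination actually
> satisfies the ScaLAPACK/BLACS checks is only my assumption:)
> 
> ./config/configure.py \
>   --with-scalapack-lib=[/usr/lib/libpessl.a,/usr/lib/libessl.a] \
>   --with-scalapack-include=/usr/include/ \
>   --with-blacs-lib=[/usr/lib/libpessl.a,/usr/lib/libessl.a] \
>   --with-blacs-include=/usr/include/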
> 
> So how can I either link my working version of MUMPS to PETSc without
> having a ScaLAPACK or BLACS lib or include,
> or use ESSL and PESSL as substitutes for ScaLAPACK and BLACS and rebuild PETSc?
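> 
> (For the first route, what I have in mind is roughly the following,
> assuming this configure version accepts the generic --with-mumps-include
> and --with-mumps-lib options for a prebuilt library; the paths and the
> exact set of MUMPS libraries below are placeholders, not a tested
> combination:)
> 
> ./config/configure.py \
>   --with-mumps-include=/path/to/MUMPS_4.6.3/include \
>   --with-mumps-lib=[/path/to/MUMPS_4.6.3/lib/libdmumps.a,/path/to/MUMPS_4.6.3/lib/libpord.a,/usr/lib/libpessl.a,/usr/lib/libessl.a]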
> 
> cheers
> Thomas
> 
> Complete config option list:
> ./config/configure.py --with-memcmp-ok --with-endian=big
> --sizeof_char=1 --sizeof_void_p=8 --sizeof_short=2 --sizeof_int=4
> --sizeof_long=8 --sizeof_long_long=8 --sizeof_float=4
> --sizeof_double=8 --bits_per_byte=8 --sizeof_MPI_Comm=4
> --sizeof_MPI_Fint=4 --with-batch --with-shared=0
> --with-mpirun=mpirun.lsf --with-mpi-dir=/usr/lpp/ppe.poe/
> --with-mpi-shared=yes --with-fc=mpxlf_r --with-cc=mpcc_r
> --with-hypre=yes --with-mumps=yes
> --with-hypre-dir=/blhome/geenen/source/petsc-2.3.2-p8/externalpackages/hypre-1.11.1b/src/hypre/
> --with-mumps-dir=/blhome/geenen/source/petsc-2.3.2-p8/externalpackages/MUMPS_4.6.3/
> --with-scalapack=yes --with-scalapack-lib=/usr/lib/libpessl.a
> --with-blacs=yes --with-blacs-lib=/usr/lib/libessl.a
> --with-scalapack-include=/usr/include/
> --with-blacs-include=/usr/include/
> 
> 



