[petsc-dev] PETSC's compilation
Satish Balay
balay at mcs.anl.gov
Mon Mar 22 17:58:30 CDT 2010
As mentioned before - it's 'mpiexec'.
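
For example, with the --prefix used here, something like the following should launch the application on 2 processes (./myapp is just a placeholder for your executable - adjust the name and process count to your setup):

  /home/david/petscdir/bin/mpiexec -n 2 ./myapp
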
Satish
On Mon, 22 Mar 2010, David sheehan wrote:
> The "bin" directory has the following files:
>
> adiforfix.py     mpich2version    mpiexec.poe       parkill              taucc.py
> adprocess.py     mpicxx           mpiexec.prun      parseargs.py         TOPSGenerator.py
> chibaoutput      mpiexec          mpiexec.sshsync   petsc_libtool        TOPSInstaller.py
> configVars.py    mpiexec.chiba    mpiexec.uni       petscmpiexec         update.py
> hostnames.chiba  mpiexec.gmalloc  mpiexec.valgrind  popup                urlget
> matlab           mpiexec.lam      mpif77            portabilitycheck.py  urlget.py
> mpicc            mpiexec.llrun    mpif90            processSummary.py    win32fe
> which one is "mpirun"? thanks.
>
> David
>
> On Mon, Mar 22, 2010 at 5:44 PM, Satish Balay <balay at mcs.anl.gov> wrote:
>
> >
> > You should look for mpiexec. It will be in the 'bin' dir for the
> > prefix used - i.e. /home/david/petscdir/bin/mpiexec
> >
> > Satish
> >
> >
> > On Mon, 22 Mar 2010, David sheehan wrote:
> >
> > > I built PETSc successfully as follows:
> > > ./configure --prefix=/home/david/petscdir --with-cc=gcc
> > > --with-cxx=g++ --with-fc=ifort --download-mpich=1 --download-hypre=1
> > > make
> > > make install
> > > make test
> > >
> > > Also, I can link it with my application. Now I need to run my
> > > application with PETSc. Where can I find "mpirun" for the downloaded
> > > MPICH to run my application? thanks.
> > >
> > > David
> > >
> > >
> > >
> > > On Mon, Mar 22, 2010 at 4:45 PM, Satish Balay <balay at mcs.anl.gov> wrote:
> > >
> > > > You can't use 2 MPIs at the same time, i.e. use either
> > > > --download-mpich=1 or --download-openmpi=1 - but not both.
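> > > >
> > > > For example, your configure line quoted below with the extra MPI
> > > > dropped would look something like this [a sketch, assuming MPICH is
> > > > the one you want]:
> > > >
> > > >   ./configure --with-cc=gcc --with-cxx=g++ --with-fc=ifort \
> > > >     --download-mpich=1 --download-hypre=1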
> > > >
> > > > Satish
> > > >
> > > > On Mon, 22 Mar 2010, David sheehan wrote:
> > > >
> > > > > You mean,
> > > > > ./configure --with-cc=gcc --with-cxx=g++ --with-fc=ifort
> > > > > --download-mpich=1 --download-hypre=1 --download-openmpi=1
> > > > > thanks.
> > > > >
> > > > > David
> > > > >
> > > > > On Mon, Mar 22, 2010 at 3:58 PM, Matthew Knepley <knepley at gmail.com>
> > > > > wrote:
> > > > >
> > > > > > On Mon, Mar 22, 2010 at 3:54 PM, David sheehan
> > > > > > <david.sheehanjr at gmail.com> wrote:
> > > > > >
> > > > > >> I don't have icc and icpc. I only have ifort and the GNU compilers
> > > > > >> such as g77, gcc, and g++. Since my application code works very well
> > > > > >> with ifort, can I build PETSc plus Hypre with ifort and the GNU
> > > > > >> compilers? thanks.
> > > > > >>
> > > > > >
> > > > > > It might be possible (though not guaranteed if ifort conflicts with
> > > > > > GNU somehow). You just provide these compilers to configure and use
> > > > > > --download-openmpi.
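> > > > > >
> > > > > > For instance, something along these lines should work [just a
> > > > > > sketch, reusing the option spellings from elsewhere in this thread]:
> > > > > >
> > > > > >   ./configure --with-cc=gcc --with-cxx=g++ --with-fc=ifort \
> > > > > >     --download-openmpi=1 --download-hypre=1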
> > > > > >
> > > > > > Matt
> > > > > >
> > > > > >
> > > > > >>
> > > > > >> David
> > > > > >>
> > > > > >> On Mon, Mar 22, 2010 at 3:29 PM, Satish Balay <balay at mcs.anl.gov>
> > > > > >> wrote:
> > > > > >>
> > > > > >>> If you have installation issues - send the relevant logs [in this
> > > > > >>> case configure.log] to petsc-maint.
> > > > > >>>
> > > > > >>> Here you appear to be trying too many things. But it's not clear if
> > > > > >>> you are installing openmpi/lam yourself - or using the default from
> > > > > >>> Red Hat 3.4. [note - the default openmpi will be built with the gnu
> > > > > >>> compilers - so it is unusable from ifort].
> > > > > >>>
> > > > > >>> Things to do:
> > > > > >>>
> > > > > >>> - decide on the compilers you want to use.
> > > > > >>> - build PETSc and MPI with these compilers
> > > > > >>>
> > > > > >>> [e.g.: configure CC=icc CXX=icpc FC=ifort --download-mpich=1
> > > > > >>> --download-hypre=1]
> > > > > >>>
> > > > > >>> Note: alternative f90 compilers usable on Linux are gfortran and g95
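> > > > > >>>
> > > > > >>> [and if you'd rather stay entirely with the GNU toolchain, a
> > > > > >>> similar line - just a sketch - would be: configure CC=gcc CXX=g++
> > > > > >>> FC=gfortran --download-mpich=1 --download-hypre=1]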
> > > > > >>>
> > > > > >>> Satish
> > > > > >>>
> > > > > >>> On Mon, 22 Mar 2010, David sheehan wrote:
> > > > > >>>
> > > > > >>> > Hi,
> > > > > >>> > I am trying to compile PETSc with hypre. I have Intel Fortran,
> > > > > >>> > openmpi, and LAM MPI (gcc version 3.4.6 (Red Hat 3.4.6-3))
> > > > > >>> > available.
> > > > > >>> >
> > > > > >>> > Since my application code is in Fortran with dynamic memory
> > > > > >>> > allocation, I have to use Intel Fortran (ifort) as the compiler
> > > > > >>> > to link PETSc with my application code. I can build
> > > > > >>> > PETSc (petsc-3.0.0-p11) plus hypre with LAM MPI, but without
> > > > > >>> > ifort, successfully. However, I cannot use ifort to link PETSc
> > > > > >>> > with my application successfully. The link always shows errors
> > > > > >>> > about "undefined reference" to MPI calls in my application code
> > > > > >>> > such as 'mpi_send_', 'mpi_recv_', 'mpi_waitany_',
> > > > > >>> > 'mpi_get_count_', etc.
> > > > > >>> >
> > > > > >>> > Can anyone help me out? thanks in advance.
> > > > > >>> >
> > > > > >>> > Best Regards,
> > > > > >>> >
> > > > > >>> > David
> > > > > >>> >
> > > > > >>>
> > > > > >>>
> > > > > >>
> > > > > >
> > > > > >
> > > > > > --
> > > > > > What most experimenters take for granted before they begin their
> > > > > > experiments is infinitely more interesting than any results to which
> > > > > > their experiments lead.
> > > > > > -- Norbert Wiener
> > > > > >
> > > > >
> > > >
> > > >
> > >
> >
> >
>