mpi not working

Ben Tay zonexo at gmail.com
Tue Jan 16 01:56:26 CST 2007


Hi Pan,

I also got very big library files when I built PETSc with MPICH2.

Btw, I have tried several options, but I still don't understand why I can't get MPI to work with PETSc.

The 4 processes are launched together, but each one runs its own separate copy of the code.

I just use:


      integer :: nprocs, rank, ierr

      call PetscInitialize(PETSC_NULL_CHARACTER, ierr)
      call MPI_Comm_rank(PETSC_COMM_WORLD, rank, ierr)
      call MPI_Comm_size(PETSC_COMM_WORLD, nprocs, ierr)
      print *, rank, nprocs
      call PetscFinalize(ierr)



The answer I get is "0, 1" repeated 4 times, instead of "0, 4", "1, 4", "2, 4", "3, 4".
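If I understand it right, nprocs coming back as 1 on every process means each copy of a.out starts its own single-process MPI world, which usually points to a mismatch between the MPI library the executable is linked against and the mpirun that launches it. A small check I could try is to ask MPI_COMM_WORLD directly and print the node each rank lands on. This is only a sketch: the program name commcheck is made up, and I'm assuming petsc.h pulls in mpif.h so that MPI_COMM_WORLD and MPI_MAX_PROCESSOR_NAME are visible:

      program commcheck
      implicit none
#include "include/finclude/petsc.h"
      integer :: rank, nprocs, namelen, ierr
      character(len=MPI_MAX_PROCESSOR_NAME) :: node

      call PetscInitialize(PETSC_NULL_CHARACTER, ierr)
!     Query the raw MPI world, not just PETSC_COMM_WORLD: if every
!     process still reports nprocs = 1 here, the job was not started
!     by the MPI installation the executable is linked against.
      call MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)
      call MPI_Comm_size(MPI_COMM_WORLD, nprocs, ierr)
      call MPI_Get_processor_name(node, namelen, ierr)
      print *, 'rank', rank, 'of', nprocs, 'on ', node(1:namelen)
      call PetscFinalize(ierr)
      end program commcheck

Run the same way as a.out, four different ranks out of 4 would mean the launcher and the linked library do agree.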

I'm using my school's server's MPICH, and it works if I just compile the code as pure MPI.

Btw, to send the job to 4 processors, I need to use a script file:

#BSUB -o std-output
#BSUB -q linux_parallel_test
#BSUB -n 4
/usr/lsf6/bin/mpijob_gm /opt/mpich/myrinet/intel/bin/mpirun a.out

I wonder if the problem lies here...
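One thing I notice when comparing this with the configure output quoted below: PETSc was built with --with-mpi-dir=/opt/mpich/intel/, while the script launches through /opt/mpich/myrinet/intel/bin/mpirun, which looks like a different MPICH installation. Purely as a guess (I don't know whether mpijob_gm accepts another mpirun path), the script pointed at the matching installation would be:

#BSUB -o std-output
#BSUB -q linux_parallel_test
#BSUB -n 4
/usr/lsf6/bin/mpijob_gm /opt/mpich/intel/bin/mpirun a.out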



Thank you.



On 1/16/07, li pan <li76pan at yahoo.com> wrote:
>
> I did try to download and install petsc-2.3.2, but ended
> up with the error: mpich cannot be downloaded & installed,
> please install mpich for windows manually.
> On the mpich2 homepage I didn't choose the Windows version
> but the source code version, and compiled it myself. Then I
> gave the --with-mpi-dir="install dir" option. PETSc was
> configured, and now it's doing "make".
> One interesting thing is that I installed PETSc on Linux
> before, and the MPI libraries were very large
> (libmpich.a == 60 MB). But this time in Cygwin it was
> only several MB.
>
> best
>
> pan
>
>
> --- Ben Tay <zonexo at gmail.com> wrote:
>
> > hi,
> >
> > i install PETSc using the following command:
> >
> > ./config/configure.py --with-vendor-compilers=intel
> > --with-gnu-compilers=0
> >
> > --with-blas-lapack-dir=/lsftmp/g0306332/inter/mkl/lib/32
> > --with-mpi-dir=/opt/mpich/intel/ --with-x=0
> > --with-shared
> >
> > then i got:
> >
> > Compilers:
> >   C Compiler:         /opt/mpich/intel/bin/mpicc -fPIC -g
> >   Fortran Compiler:   /opt/mpich/intel/bin/mpif90 -I. -fPIC -g -w90 -w
> > Linkers:
> >   Shared linker:   /opt/mpich/intel/bin/mpicc -shared -fPIC -g
> >   Dynamic linker:   /opt/mpich/intel/bin/mpicc -shared -fPIC -g
> > PETSc:
> >   PETSC_ARCH: linux-mpif90
> >   PETSC_DIR: /nas/lsftmp/g0306332/petsc-2.3.2-p8
> >   **
> >   ** Now build and test the libraries with "make all test"
> >   **
> >   Clanguage: C
> >   Scalar type: real
> > MPI:
> >   Includes: ['/opt/mpich/intel/include']
> >   PETSc shared libraries: enabled
> >   PETSc dynamic libraries: disabled
> > BLAS/LAPACK:
> >   -Wl,-rpath,/lsftmp/g0306332/inter/mkl/lib/32 -L/lsftmp/g0306332/inter/mkl/lib/32 -lmkl_lapack -lmkl_ia32 -lguide
> >
> > I ran "make all test" and everything seems fine:
> >
> > /opt/mpich/intel/bin/mpicc -c -fPIC -g
> >
> > -I/nas/lsftmp/g0306332/petsc-2.3.2-p8
> > -I/nas/lsftmp/g0306332/petsc-2.3.2-p8/bmake/linux-mpif90
> > -I/nas/lsftmp/g0306332/petsc-2.3.2-p8/include
> > -I/opt/mpich/intel/include
> > -D__SDIR__="src/snes/examples/tutorials/" ex19.c
> > /opt/mpich/intel/bin/mpicc -fPIC -g -o ex19 ex19.o
> > -Wl,-rpath,/nas/lsftmp/g0306332/petsc-2.3.2-p8/lib/linux-mpif90
> >
> > -L/nas/lsftmp/g0306332/petsc-2.3.2-p8/lib/linux-mpif90
> > -lpetscsnes -lpetscksp -lpetscdm -lpetscmat
> > -lpetscvec -lpetsc
> > -Wl,-rpath,/lsftmp/g0306332/inter/mkl/lib/32
> > -L/lsftmp/g0306332/inter/mkl/lib/32 -lmkl_lapack
> > -lmkl_ia32 -lguide
> > -lPEPCF90 -Wl,-rpath,/opt/intel/compiler70/ia32/lib
> > -Wl,-rpath,/opt/mpich/intel/lib
> > -L/opt/mpich/intel/lib -Wl,-rpath,-rpath
> > -Wl,-rpath,-ldl -L-ldl -lmpich
> > -Wl,-rpath,/opt/intel/compiler70/ia32/lib
> > -Wl,-rpath,/opt/intel/compiler70/ia32/lib
> > -L/opt/intel/compiler70/ia32/lib
> > -Wl,-rpath,/usr/lib -Wl,-rpath,/usr/lib -L/usr/lib
> > -limf -lirc -lcprts -lcxa
> > -lunwind -ldl -lmpichf90 -lPEPCF90
> > -Wl,-rpath,/opt/intel/compiler70/ia32/lib
> > -L/opt/intel/compiler70/ia32/lib -Wl,-rpath,/usr/lib
> > -L/usr/lib -lintrins
> > -lIEPCF90 -lF90 -lm  -Wl,-rpath,\ -Wl,-rpath,\ -L\
> > -ldl -lmpich
> > -Wl,-rpath,/opt/intel/compiler70/ia32/lib
> > -L/opt/intel/compiler70/ia32/lib
> > -Wl,-rpath,/usr/lib -L/usr/lib -limf -lirc -lcprts
> > -lcxa -lunwind -ldl
> > /bin/rm -f ex19.o
> > C/C++ example src/snes/examples/tutorials/ex19 run successfully with 1 MPI process
> > C/C++ example src/snes/examples/tutorials/ex19 run successfully with 2 MPI processes
> > Fortran example src/snes/examples/tutorials/ex5f run successfully with 1 MPI process
> > Completed test examples
> >
> > I then tried to run my own parallel code. It's a
> > simple code which prints
> > the rank of each processor.
> >
> > If I compile the code using just mpif90 test.F (using just mpif.h),
> >
> > I get 0,1,2,3 (4 processors).
> >
> > However, if I change the code to use petsc.h etc.,
> > i.e.:
> >
> >
> >         program ns2d_c
> >
> >         implicit none
> >
> > #include "include/finclude/petsc.h"
> > #include "include/finclude/petscvec.h"
> > #include "include/finclude/petscmat.h"
> > #include "include/finclude/petscksp.h"
> > #include "include/finclude/petscpc.h"
> > #include "include/finclude/petscmat.h90"
> >
> >         integer,parameter :: size_x=8,size_y=4
> >
> >         integer :: ierr,Istart_p,Iend_p,Ntot,Istart_m,Iend_m,k
> >
> >         PetscMPIInt     nprocs,rank
> >
> >         call PetscInitialize(PETSC_NULL_CHARACTER,ierr)
> >
> >         call MPI_Comm_rank(PETSC_COMM_WORLD,rank,ierr)
> >
> >         call MPI_Comm_size(PETSC_COMM_WORLD,nprocs,ierr)
> >
> >         end program ns2d_c
> >
> >
> >
> > I then rename the file to ex2f.F and use "make ex2f".
> >
> > The result I get is something like 0,0,0,0.
> >
> >
> >
> > Why is this so?
> >
> > Thank you.
> >
>
>
>
>
>

