make test
Satish Balay
balay at mcs.anl.gov
Tue Jul 7 12:54:50 CDT 2009
> /usr/bin/mpiCC: No such file or directory
You are using --download-mpich with PETSc - but compiling your code
with mpiCC from a different MPI install? It won't work.
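For what it's worth, --download-mpich builds its own compiler wrappers,
typically under $PETSC_DIR/$PETSC_ARCH/bin (the arch name
linux-gnu-c-debug below is taken from your log; check your configure
output for the exact location). A quick way to see the mismatch:

  # wrapper your application build is picking up
  which mpiCC
  # wrappers from the MPICH that PETSc downloaded and built
  ls $PETSC_DIR/linux-gnu-c-debug/bin/

Linking PETSc libraries built against one MPI with a compiler wrapper
from another MPI is exactly what gives the undefined MPI symbols below.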
Is your code C++? If so - suggest building PETSc with these additional options:
'--with-cxx=g++ --with-clanguage=cxx'
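i.e. rerun configure with the same options you used before plus those
two, something along the lines of:

  ./config/configure.py --with-cc=gcc --with-cxx=g++ --with-clanguage=cxx \
      --with-fc=gfortran --download-f-blas-lapack=1 --download-mpich=1

and then rebuild the libraries ('make all') and rerun 'make test'.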
And then use the PETSc makefile format for your application code [that sets
all the make variables and targets needed to build PETSc
applications]. For example, check src/ksp/ksp/examples/tutorials/makefile
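A minimal sketch of such a makefile ('myapp' is just a placeholder
target name; copy the exact include line from the tutorial makefile
above - the one below is what I recall petsc-3.0.0 using):

  CFLAGS   =
  FFLAGS   =
  CPPFLAGS =
  FPPFLAGS =

  include ${PETSC_DIR}/conf/base

  # the two recipe lines must start with a tab
  myapp: myapp.o chkopts
  	-${CLINKER} -o myapp myapp.o ${PETSC_LIB}
  	${RM} myapp.o

This way your application is compiled and linked with the same MPI
wrappers and flags that PETSc itself was built with.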
Satish
On Tue, 7 Jul 2009, Yixun Liu wrote:
> Hi,
> I use the command,
> ./config/configure.py --with-cc=gcc --with-fc=gfortran --download-f-blas-lapack=1 --download-mpich=1
>
> and 'make test' succeeds.
>
> But when I compile my PETSc-based application, I get the following errors:
>
>
> Linking CXX executable ../../../bin/PETScSolver
> /home/scratch/yixun/petsc-3.0.0-p3/linux-gnu-c-debug/lib/libpetscvec.a(vpscat.o):
> In function `VecScatterCreateCommon_PtoS':
> /home/scratch/yixun/petsc-3.0.0-p3/src/vec/vec/utils/vpscat.c:1770:
> undefined reference to `MPI_Type_create_indexed_block'
> /home/scratch/yixun/petsc-3.0.0-p3/src/vec/vec/utils/vpscat.c:1792:
> undefined reference to `MPI_Type_create_indexed_block'
> collect2: ld returned 1 exit status
> /usr/bin/mpiCC: No such file or directory
> gmake[2]: *** [bin/PETScSolver] Error 1
> gmake[1]: ***
> [PersoPkgs/oclatzPkg/MeshRegister/CMakeFiles/PETScSolver.dir/all] Error 2
> gmake: *** [all] Error 2
>
>
>
> Does this mean that I need to set LD_LIBRARY_PATH to the MPICH2 installation path?
>
> Thanks.
>
> Satish Balay wrote:
> > On Sun, 5 Jul 2009, Yixun Liu wrote:
> >
> >
> >> I run it on my computer.
> >>
> >> md[/home/scratch/yixun/petsc-3.0.0-p3/src/ksp/ksp/examples/tutorials>make
> >> ex2
> >>
> >> mpicc -o ex2.o -c -Wall -Wwrite-strings -Wno-strict-aliasing -g3
> >> -I/home/scratch/yixun/petsc-3.0.0-p3/src/dm/mesh/sieve
> >> -I/home/scratch/yixun/petsc-3.0.0-p3/linux-gnu-c-debug/include
> >> -I/home/scratch/yixun/petsc-3.0.0-p3/include
> >> -I/usr/lib64/mpi/gcc/openmpi/include -I/usr/lib64/mpi/gcc/openmpi/lib64
> >> -D__SDIR__="src/ksp/ksp/examples/tutorials/" ex2.c
> >> mpicc -Wall -Wwrite-strings -Wno-strict-aliasing -g3 -o ex2 ex2.o
> >> -Wl,-rpath,/home/scratch/yixun/petsc-3.0.0-p3/linux-gnu-c-debug/lib
> >> -L/home/scratch/yixun/petsc-3.0.0-p3/linux-gnu-c-debug/lib -lpetscksp
> >> -lpetscdm -lpetscmat -lpetscvec -lpetsc -lX11 -llapack -lblas
> >> -L/usr/lib64/mpi/gcc/openmpi/lib64
> >> -L/usr/lib64/gcc/x86_64-suse-linux/4.3 -L/usr/lib64 -L/lib64
> >> -L/usr/x86_64-suse-linux/lib -ldl -lmpi -lopen-rte -lopen-pal -lnsl
> >> -lutil -lgcc_s -lpthread -lmpi_f90 -lmpi_f77 -lgfortranbegin -lgfortran
> >> -lm -lm -L/usr/lib64/gcc/x86_64-suse-linux -L/usr/x86_64-suse-linux/bin
> >> -L/lib -lm -lm -ldl -lmpi -lopen-rte -lopen-pal -lnsl -lutil -lgcc_s
> >> -lpthread -ldl
> >> /bin/rm -f ex2.o
> >>
> >
> > Did you install this OpenMPI - or did someone-else/sysadmin install it for you?
> >
> >
> >> md[/home/scratch/yixun/petsc-3.0.0-p3/src/ksp/ksp/examples/tutorials>mpiexec
> >> -n 2 ./ex2
> >>
> >> DAT: library load failure: /usr/lib64/libdaplcma.so.1: undefined symbol:
> >> dat_registry_add_provider
> >> DAT: library load failure: /usr/lib64/libdaplcma.so.1: undefined symbol:
> >> dat_registry_add_provider
> >>
> >
> >
> >> --------------------------------------------------------------------------
> >>
> >> WARNING: Failed to open "OpenIB-cma"
> >> [DAT_PROVIDER_NOT_FOUND:DAT_NAME_NOT_REGISTERED].
> >> This may be a real error or it may be an invalid entry in the uDAPL
> >> Registry which is contained in the dat.conf file. Contact your local
> >> System Administrator to confirm the availability of the interfaces in
> >> the dat.conf file.
> >>
> >
> > Your mpiexec is trying to run on infiniband and failing?
> >
> >
> >> --------------------------------------------------------------------------
> >> [0,1,1]: uDAPL on host md was unable to find any NICs.
> >> Another transport will be used instead, although this may result in
> >> lower performance.
> >> --------------------------------------------------------------------------
> >> Norm of error 0.000411674 iterations 7
> >>
> >
> > And then it falls back to 'sockets' - and then successfully runs the PETSc example.
> >
> > So something is wrong with your MPI usage. I guess you'll have to
> > check with your sysadmin how to correctly use infiniband.
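> > One quick workaround you could try (assuming your OpenMPI honors MCA
> > parameters on the command line; component names vary by build) is to
> > force the TCP transport and skip the uDAPL/InfiniBand path entirely:
> >
> >   mpiexec --mca btl tcp,self -n 2 ./ex2
> >
> > But getting infiniband set up properly with the sysadmin is the real fix.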
> >
> > Satish
> >
> >
>
>