[petsc-users] --with-mpi=0

Tabrez Ali tabrezali at gmail.com
Sat Dec 18 11:57:59 CST 2021


Satish,

If you replace PETSC_COMM_WORLD with MPI_COMM_WORLD in ex8f.F90 and compile
it against 3.14, it works; compiling the same file against 3.15 or 3.16
fails, e.g.:

stali at i5:~$ cd /tmp/petsc-3.14.6/src/vec/vec/tutorials/

stali at i5:/tmp/petsc-3.14.6/src/vec/vec/tutorials$ grep MPI_COMM_WORLD ex8f.F90
  call MPI_Comm_rank(MPI_COMM_WORLD,rank,ierr)
  call VecCreate(MPI_COMM_WORLD,x,ierr);CHKERRA(ierr)
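
(For reference, a stripped-down standalone version of the pattern being
compiled looks roughly like the sketch below; the boilerplate around the
two lines above is assumed, not copied verbatim from ex8f.F90.)

! Sketch of the reproducer: under --with-mpi=0 with 3.15/3.16 it is the
! MPI_COMM_WORLD symbol that triggers the error shown further below.
program ex8f_sketch
#include <petsc/finclude/petscvec.h>
      use petscvec
      implicit none

      PetscErrorCode :: ierr
      PetscMPIInt    :: rank
      Vec            :: x

      call PetscInitialize(PETSC_NULL_CHARACTER,ierr)
      if (ierr /= 0) stop 'PetscInitialize failed'

      call MPI_Comm_rank(MPI_COMM_WORLD,rank,ierr)
      call VecCreate(MPI_COMM_WORLD,x,ierr);CHKERRA(ierr)

      call VecDestroy(x,ierr);CHKERRA(ierr)
      call PetscFinalize(ierr)
end program ex8f_sketch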

stali at i5:/tmp/petsc-3.14.6/src/vec/vec/tutorials$ make ex8f PETSC_DIR=/tmp/petsc-3.14.6
gfortran -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g
-fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g
 -I/tmp/petsc-3.14.6/include
-I/tmp/petsc-3.14.6/arch-linux2-c-debug/include     ex8f.F90
 -Wl,-rpath,/tmp/petsc-3.14.6/arch-linux2-c-debug/lib
-L/tmp/petsc-3.14.6/arch-linux2-c-debug/lib
-Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/6
-L/usr/lib/gcc/x86_64-linux-gnu/6 -Wl,-rpath,/usr/lib/x86_64-linux-gnu
-L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu
-L/lib/x86_64-linux-gnu -lpetsc -llapack -lblas -lpthread -lm -lstdc++ -ldl
-lgfortran -lm -lgfortran -lm -lgcc_s -lquadmath -lstdc++ -ldl -o ex8f

stali at i5:/tmp/petsc-3.14.6/src/vec/vec/tutorials$ rm ex8f

stali at i5:/tmp/petsc-3.14.6/src/vec/vec/tutorials$ make ex8f PETSC_DIR=/tmp/petsc-3.16.1
gfortran -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g -O0
  -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g -O0
 -I/tmp/petsc-3.16.1/include
-I/tmp/petsc-3.16.1/arch-linux2-c-debug/include     ex8f.F90
 -Wl,-rpath,/tmp/petsc-3.16.1/arch-linux2-c-debug/lib
-L/tmp/petsc-3.16.1/arch-linux2-c-debug/lib
-Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/6
-L/usr/lib/gcc/x86_64-linux-gnu/6 -lpetsc -llapack -lblas -lpthread -lm
-lstdc++ -ldl -lgfortran -lm -lgfortran -lm -lgcc_s -lquadmath -lstdc++
-ldl -o ex8f
ex8f.F90:29:41:

   call MPI_Comm_rank(MPI_COMM_WORLD,rank,ierr)
                                         1
Error: Symbol ‘mpi_comm_world’ at (1) has no IMPLICIT type
/tmp/petsc-3.16.1/lib/petsc/conf/test:24: recipe for target 'ex8f' failed
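
(For what it's worth, the stock ex8f.F90 with PETSC_COMM_WORLD, which is
defined by the PETSc Fortran modules themselves, does not hit this. A
minimal sketch along those lines, with the boilerplate again assumed:)

program rank_sketch
#include <petsc/finclude/petscsys.h>
      use petscsys
      implicit none

      PetscErrorCode :: ierr
      PetscMPIInt    :: rank

      call PetscInitialize(PETSC_NULL_CHARACTER,ierr)
      if (ierr /= 0) stop 'PetscInitialize failed'
      call MPI_Comm_rank(PETSC_COMM_WORLD,rank,ierr)
      print *, 'rank =', rank
      call PetscFinalize(ierr)
end program rank_sketch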

Regards,

Tabrez

On Sat, Dec 18, 2021 at 10:42 AM Satish Balay <balay at mcs.anl.gov> wrote:

> Do you get this error with a petsc example that calls MPI_Comm_rank()?
>
> Say src/vec/vec/tutorials/ex8f.F90
>
> Satish
>
> [balay at pj01 tutorials]$ make ex8f
> gfortran -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g
> -O0   -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g -O0
> -I/home/balay/petsc/include -I/home/balay/petsc/arch-linux-c-debug/include
>    ex8f.F90  -Wl,-rpath,/home/balay/petsc/arch-linux-c-debug/lib
> -L/home/balay/petsc/arch-linux-c-debug/lib
> -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/9
> -L/usr/lib/gcc/x86_64-redhat-linux/9 -lpetsc -llapack -lblas -lpthread -lm
> -lX11 -lstdc++ -ldl -lgfortran -lm -lgfortran -lm -lgcc_s -lquadmath
> -lstdc++ -ldl -o ex8f
> [balay at pj01 tutorials]$ ./ex8f
> Vec Object: 1 MPI processes
>   type: seq
> 4.
> [balay at pj01 tutorials]$ nm -Ao ex8f |grep mpi_comm_rank
> ex8f:                 U petsc_mpi_comm_rank_
> [balay at pj01 tutorials]$
>
>
>
> On Fri, 17 Dec 2021, Tabrez Ali wrote:
>
> > Hi,
> >
> > I am trying to compile Fortran code with PETSc 3.16 built without MPI,
> > i.e., --with-mpi=0, and am getting the following error:
> >
> >    call MPI_Comm_rank(MPI_COMM_WORLD,rank,ierr)
> >                                          1
> > Error: Symbol ‘mpi_comm_world’ at (1) has no IMPLICIT type
> >
> > There are no issues with PETSc 3.14 or prior versions. Any ideas as to
> > what could be wrong?
> >
> > I do see the following note (below) in
> > https://petsc.org/main/docs/changes/315/ but I am not sure if it's related:
> >
> > *Add configure option --with-mpi-f90module-visibility [default=``1``].
> > With 0, mpi.mod will not be visible in use code (via petscsys.mod) -
> > so mpi_f08 can now be used*
> >
> > Regards,
> >
> > Tabrez
> >
>