[petsc-users] PETSc on fedora 36

Barry Smith bsmith at petsc.dev
Tue Aug 23 10:21:31 CDT 2022


  1) Run the trivial PETSc program with 2 MPI ranks and the -info option (for example, mpirun -np 2 ./e -info) and send the output

  2) Run ldd on the PETSc program and on the pure MPI program to see if they are both using the same MPI libraries

  3) Add an MPI_Init() immediately before the PetscInitialize() and an MPI_Finalize() immediately after the PetscFinalize(); does the program then think it is parallel?

  4) Is PETSC_HAVE_MPI_INIT_THREAD defined in $PETSC_ARCH/include/petscconf.h? If so, this means MPI_Init_thread() is used to start up MPI instead of MPI_Init(). If you change MPI_Init() in your standalone MPI code to MPI_Init_thread(), does it still run properly in parallel? A rough sketch combining steps 3 and 4 is below.
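
  For steps 3 and 4, here is a minimal, untested sketch of what the modified test program could look like (MPI_THREAD_FUNNELED is only an illustrative threading level; swap in whatever your standalone MPI code uses):

#include <petsc.h>

int main(int argc, char **argv) {
  PetscErrorCode ierr;
  PetscMPIInt    rank;
  int            provided;

  /* Step 3: initialize MPI ourselves so that PETSc does not.
     Step 4: if PETSC_HAVE_MPI_INIT_THREAD is defined, test with
     MPI_Init_thread(); otherwise plain MPI_Init(&argc, &argv) is
     the call to try here. */
  MPI_Init_thread(&argc, &argv, MPI_THREAD_FUNNELED, &provided);

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank); CHKERRQ(ierr);
  ierr = PetscPrintf(PETSC_COMM_WORLD, "My rank is %d\n", rank); CHKERRQ(ierr);
  ierr = PetscFinalize();

  /* PETSc does not finalize an MPI it did not initialize, so do it here. */
  MPI_Finalize();
  return (int)ierr;
}

  Note that PetscPrintf() on PETSC_COMM_WORLD prints only from rank 0, so a correctly working parallel run prints that line exactly once; one line per process means each process is running with its own MPI_COMM_WORLD of size 1.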





  Barry


> On Aug 23, 2022, at 6:34 AM, Rafel Amer Ramon <rafel.amer at upc.edu> wrote:
> 
> 
> Hi,
> 
> mpicc and mpirun are from the package openmpi-devel-4.1.2-3.fc36.x86_64
> and the petsc64 library is linked with  /usr/lib64/openmpi/lib/libmpi.so.40
> ~# ldd /lib64/libpetsc64.so.3.16.4 
> 	linux-vdso.so.1 (0x00007fff5becc000)
> 	libflexiblas64.so.3 => /lib64/libflexiblas64.so.3 (0x00007f3030e30000)
> 	libcgns.so.4.2 => /usr/lib64/openmpi/lib/libcgns.so.4.2 (0x00007f3030d46000)
> 	libhdf5.so.200 => /usr/lib64/openmpi/lib/libhdf5.so.200 (0x00007f303091a000)
> 	libm.so.6 => /lib64/libm.so.6 (0x00007f303083c000)
> 	libX11.so.6 => /lib64/libX11.so.6 (0x00007f30306f4000)
> 	libstdc++.so.6 => /lib64/libstdc++.so.6 (0x00007f30304c0000)
> 	libgcc_s.so.1 => /lib64/libgcc_s.so.1 (0x00007f303049e000)
> 	libc.so.6 => /lib64/libc.so.6 (0x00007f303029c000)
> 	libgfortran.so.5 => /lib64/libgfortran.so.5 (0x00007f302ffcf000)
> 	libquadmath.so.0 => /lib64/libquadmath.so.0 (0x00007f302ff87000)
> 	/lib64/ld-linux-x86-64.so.2 (0x00007f30327bb000)
> 	libmpi.so.40 => /usr/lib64/openmpi/lib/libmpi.so.40 (0x00007f302fe59000)
> 	libsz.so.2 => /lib64/libsz.so.2 (0x00007f302fe50000)
> 	libz.so.1 => /lib64/libz.so.1 (0x00007f302fe34000)
> 	libxcb.so.1 => /lib64/libxcb.so.1 (0x00007f302fe08000)
> 	libopen-rte.so.40 => /usr/lib64/openmpi/lib/libopen-rte.so.40 (0x00007f302fd4d000)
> 	libopen-pal.so.40 => /usr/lib64/openmpi/lib/libopen-pal.so.40 (0x00007f302fc9f000)
> 	libhwloc.so.15 => /lib64/libhwloc.so.15 (0x00007f302fc42000)
> 	libevent_core-2.1.so.7 => /lib64/libevent_core-2.1.so.7 (0x00007f302fc07000)
> 	libevent_pthreads-2.1.so.7 => /lib64/libevent_pthreads-2.1.so.7 (0x00007f302fc02000)
> 	libXau.so.6 => /lib64/libXau.so.6 (0x00007f302fbfc000)
> 
> I think that it's correct, but it doesn't work.
> 
> Best regards,
> 
> Rafel Amer
> 
> On 22/8/22 at 20:36, Barry Smith wrote:
>> 
>>   Are you sure the mpirun you are using matches the MPI that PETSc was built with?
>> 
>>> On Aug 22, 2022, at 2:31 PM, Rafel Amer Ramon <rafel.amer at upc.edu> wrote:
>>> 
>>> Hi,
>>> 
>>> I have installed the following packages on fedora 36
>>> 
>>> ~# rpm -qa | grep petsc
>>> petsc64-3.16.4-3.fc36.x86_64
>>> petsc-3.16.4-3.fc36.x86_64
>>> petsc-openmpi-3.16.4-3.fc36.x86_64
>>> petsc-openmpi-devel-3.16.4-3.fc36.x86_64
>>> python3-petsc-openmpi-3.16.4-3.fc36.x86_64
>>> petsc-devel-3.16.4-3.fc36.x86_64
>>> petsc64-devel-3.16.4-3.fc36.x86_64
>>> 
>>> ~# rpm -qa | grep openmpi
>>> openmpi-4.1.2-3.fc36.x86_64
>>> ptscotch-openmpi-6.1.2-2.fc36.x86_64
>>> scalapack-openmpi-2.1.0-11.fc36.x86_64
>>> openmpi-devel-4.1.2-3.fc36.x86_64
>>> MUMPS-openmpi-5.4.1-2.fc36.x86_64
>>> superlu_dist-openmpi-6.1.1-9.fc36.x86_64
>>> hdf5-openmpi-1.12.1-5.fc36.x86_64
>>> hypre-openmpi-2.18.2-6.fc36.x86_64
>>> petsc-openmpi-3.16.4-3.fc36.x86_64
>>> fftw-openmpi-libs-double-3.3.10-2.fc36.x86_64
>>> fftw-openmpi-libs-long-3.3.10-2.fc36.x86_64
>>> fftw-openmpi-libs-single-3.3.10-2.fc36.x86_64
>>> fftw-openmpi-libs-3.3.10-2.fc36.x86_64
>>> fftw-openmpi-devel-3.3.10-2.fc36.x86_64
>>> petsc-openmpi-devel-3.16.4-3.fc36.x86_64
>>> python3-petsc-openmpi-3.16.4-3.fc36.x86_64
>>> scalapack-openmpi-devel-2.1.0-11.fc36.x86_64
>>> python3-openmpi-4.1.2-3.fc36.x86_64
>>> hdf5-openmpi-devel-1.12.1-5.fc36.x86_64
>>> cgnslib-openmpi-4.2.0-6.fc36.x86_64
>>> cgnslib-openmpi-devel-4.2.0-6.fc36.x86_64
>>> 
>>> and I try to compile and run the program written in the file e.c
>>> 
>>> ~# cat e.c
>>> #include <petsc.h>
>>> 
>>> int main(int argc, char **argv) {
>>>   PetscErrorCode ierr;
>>>   PetscMPIInt    rank;
>>> 
>>>   ierr = PetscInitialize(&argc,&argv,NULL,
>>>       "Compute e in parallel with PETSc.\n\n"); if (ierr) return ierr;
>>>   ierr = MPI_Comm_rank(PETSC_COMM_WORLD,&rank); CHKERRQ(ierr);
>>>   ierr = PetscPrintf(PETSC_COMM_WORLD,"My rank is %d\n",rank); CHKERRQ(ierr);
>>> 
>>>   
>>>   return PetscFinalize();
>>> }
>>> 
>>> I compile it with the command
>>> 
>>> ~# mpicc -o e e.c -I/usr/include/petsc64 -lpetsc64
>>> 
>>> and I run it with 
>>> 
>>> ~# mpirun -np 8 ./e
>>> My rank is 0
>>> My rank is 0
>>> My rank is 0
>>> My rank is 0
>>> My rank is 0
>>> My rank is 0
>>> My rank is 0
>>> My rank is 0
>>> 
>>> but all my processes get rank 0.
>>> 
>>> Do you know what I'm doing wrong?
>>> 
>>> Thank you!!
>>> 
>>> Best regards,
>>> 
>>> Rafel Amer
>> 
> 
