[petsc-users] PETSc on fedora 36

Matthew Knepley knepley at gmail.com
Tue Aug 23 09:04:27 CDT 2022


On Tue, Aug 23, 2022 at 9:48 AM Rafel Amer Ramon <rafel.amer at upc.edu> wrote:

>
> Hi,
>
> yes, I can compile and run it:
>
> ~$ mpicc -o pi-mpi pi-mpi.c -lm
> ~$ mpirun -np 16 --hostfile ~/hosts ./pi-mpi
> Number of intervals: 2048
> Result:   3.1411043724
> Accuracy: 0.0004882812
> Time:     0.0314383930
> ~$
>
>
PETSc does not control the launch, so you must be using a different MPI:
eight copies of "My rank is 0" mean each process initialized MPI as a
singleton, which happens when the mpirun used does not match the MPI that
libpetsc64 is linked against. For building the PETSc executable, try
instead the minimal makefile

e: e.o
	${CLINKER} -o e e.o ${PETSC_LIB}

include ${PETSC_DIR}/lib/petsc/conf/variables
include ${PETSC_DIR}/lib/petsc/conf/rules
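
With PETSC_DIR set to the PETSc installation (the path below is an
assumption for the Fedora petsc64 packages; check where petsc64-devel
puts the conf directory), building and running would look like:

~$ export PETSC_DIR=/usr/lib64/petsc   # assumed location, verify on your system
~$ make e
~$ /usr/lib64/openmpi/bin/mpirun -np 8 ./e

Giving the full path to the OpenMPI mpirun ensures the launcher matches
the libmpi.so.40 that libpetsc64 is linked against.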

  Thanks,

     Matt


> Best regards,
>
> Rafel Amer
>
>
>
>
> On 23/8/22 at 14:46, Matthew Knepley wrote:
>
> Can you run anything in parallel? Say, the small sample code that
> calculates pi?
>
>
> https://www.cs.usfca.edu/~mmalensek/cs220/schedule/code/week09/pi-mpi.c.html
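>
> For reference, a minimal pure-MPI rank check (a sketch independent of
> PETSc, along the same lines as that example) would be:
>
> #include <mpi.h>
> #include <stdio.h>
>
> int main(int argc, char **argv) {
>   int rank, size;
>   MPI_Init(&argc, &argv);                /* start MPI */
>   MPI_Comm_rank(MPI_COMM_WORLD, &rank);  /* rank of this process */
>   MPI_Comm_size(MPI_COMM_WORLD, &size);  /* total process count */
>   printf("rank %d of %d\n", rank, size);
>   MPI_Finalize();
>   return 0;
> }
>
> If this prints distinct ranks under your mpirun while the PETSc binary
> does not, the launcher and the MPI library PETSc uses are mismatched.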
>
>   Thanks,
>
>       Matt
>
> On Tue, Aug 23, 2022 at 6:34 AM Rafel Amer Ramon <rafel.amer at upc.edu>
> wrote:
>
>>
>> Hi,
>>
>> mpicc and mpirun are from the package openmpi-devel-4.1.2-3.fc36.x86_64
>> and the petsc64 library is linked with
>> /usr/lib64/openmpi/lib/libmpi.so.40
>>
>> ~# ldd /lib64/libpetsc64.so.3.16.4
>> 	linux-vdso.so.1 (0x00007fff5becc000)
>> 	libflexiblas64.so.3 => /lib64/libflexiblas64.so.3 (0x00007f3030e30000)
>> 	libcgns.so.4.2 => /usr/lib64/openmpi/lib/libcgns.so.4.2 (0x00007f3030d46000)
>> 	libhdf5.so.200 => /usr/lib64/openmpi/lib/libhdf5.so.200 (0x00007f303091a000)
>> 	libm.so.6 => /lib64/libm.so.6 (0x00007f303083c000)
>> 	libX11.so.6 => /lib64/libX11.so.6 (0x00007f30306f4000)
>> 	libstdc++.so.6 => /lib64/libstdc++.so.6 (0x00007f30304c0000)
>> 	libgcc_s.so.1 => /lib64/libgcc_s.so.1 (0x00007f303049e000)
>> 	libc.so.6 => /lib64/libc.so.6 (0x00007f303029c000)
>> 	libgfortran.so.5 => /lib64/libgfortran.so.5 (0x00007f302ffcf000)
>> 	libquadmath.so.0 => /lib64/libquadmath.so.0 (0x00007f302ff87000)
>> 	/lib64/ld-linux-x86-64.so.2 (0x00007f30327bb000)
>> 	libmpi.so.40 => /usr/lib64/openmpi/lib/libmpi.so.40 (0x00007f302fe59000)
>> 	libsz.so.2 => /lib64/libsz.so.2 (0x00007f302fe50000)
>> 	libz.so.1 => /lib64/libz.so.1 (0x00007f302fe34000)
>> 	libxcb.so.1 => /lib64/libxcb.so.1 (0x00007f302fe08000)
>> 	libopen-rte.so.40 => /usr/lib64/openmpi/lib/libopen-rte.so.40 (0x00007f302fd4d000)
>> 	libopen-pal.so.40 => /usr/lib64/openmpi/lib/libopen-pal.so.40 (0x00007f302fc9f000)
>> 	libhwloc.so.15 => /lib64/libhwloc.so.15 (0x00007f302fc42000)
>> 	libevent_core-2.1.so.7 => /lib64/libevent_core-2.1.so.7 (0x00007f302fc07000)
>> 	libevent_pthreads-2.1.so.7 => /lib64/libevent_pthreads-2.1.so.7 (0x00007f302fc02000)
>> 	libXau.so.6 => /lib64/libXau.so.6 (0x00007f302fbfc000)
>>
>>
>> I think it's correct, but it doesn't work.
>>
>> Best regards,
>>
>> Rafel Amer
>>
>> On 22/8/22 at 20:36, Barry Smith wrote:
>>
>>
>>   Are you sure the mpirun you are using matches the MPI that PETSc was
>> built with?
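>>
>> A quick way to compare the two (assuming the standard Fedora layout,
>> where OpenMPI lives under /usr/lib64/openmpi) is:
>>
>> ~$ which mpirun mpicc
>> ~$ mpirun --version
>> ~$ ldd ./e | grep libmpi
>>
>> The libmpi.so that ldd reports should sit under the same prefix as the
>> mpirun being used.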
>>
>> On Aug 22, 2022, at 2:31 PM, Rafel Amer Ramon <rafel.amer at upc.edu> wrote:
>>
>> Hi,
>>
>> I have installed the following packages on fedora 36
>>
>> ~# rpm -qa | grep petsc
>> petsc64-3.16.4-3.fc36.x86_64
>> petsc-3.16.4-3.fc36.x86_64
>> petsc-openmpi-3.16.4-3.fc36.x86_64
>> petsc-openmpi-devel-3.16.4-3.fc36.x86_64
>> python3-petsc-openmpi-3.16.4-3.fc36.x86_64
>> petsc-devel-3.16.4-3.fc36.x86_64
>> petsc64-devel-3.16.4-3.fc36.x86_64
>> ~# rpm -qa | grep openmpi
>> openmpi-4.1.2-3.fc36.x86_64
>> ptscotch-openmpi-6.1.2-2.fc36.x86_64
>> scalapack-openmpi-2.1.0-11.fc36.x86_64
>> openmpi-devel-4.1.2-3.fc36.x86_64
>> MUMPS-openmpi-5.4.1-2.fc36.x86_64
>> superlu_dist-openmpi-6.1.1-9.fc36.x86_64
>> hdf5-openmpi-1.12.1-5.fc36.x86_64
>> hypre-openmpi-2.18.2-6.fc36.x86_64
>> petsc-openmpi-3.16.4-3.fc36.x86_64
>> fftw-openmpi-libs-double-3.3.10-2.fc36.x86_64
>> fftw-openmpi-libs-long-3.3.10-2.fc36.x86_64
>> fftw-openmpi-libs-single-3.3.10-2.fc36.x86_64
>> fftw-openmpi-libs-3.3.10-2.fc36.x86_64
>> fftw-openmpi-devel-3.3.10-2.fc36.x86_64
>> petsc-openmpi-devel-3.16.4-3.fc36.x86_64
>> python3-petsc-openmpi-3.16.4-3.fc36.x86_64
>> scalapack-openmpi-devel-2.1.0-11.fc36.x86_64
>> python3-openmpi-4.1.2-3.fc36.x86_64
>> hdf5-openmpi-devel-1.12.1-5.fc36.x86_64
>> cgnslib-openmpi-4.2.0-6.fc36.x86_64
>> cgnslib-openmpi-devel-4.2.0-6.fc36.x86_64
>>
>> and I try to compile and run the program written in the file e.c
>>
>> ~# cat e.c
>> #include <petsc.h>
>>
>> int main(int argc, char **argv) {
>>   PetscErrorCode ierr;
>>   PetscMPIInt    rank;
>>
>>   ierr = PetscInitialize(&argc,&argv,NULL,
>>       "Compute e in parallel with PETSc.\n\n"); if (ierr) return ierr;
>>   ierr = MPI_Comm_rank(PETSC_COMM_WORLD,&rank); CHKERRQ(ierr);
>>   ierr = PetscPrintf(PETSC_COMM_WORLD,"My rank is %d\n",rank); CHKERRQ(ierr);
>>
>>   return PetscFinalize();
>> }
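>>
>> (Note that PetscPrintf on PETSC_COMM_WORLD prints only from the first
>> process, so even a correct parallel run would show a single line here.
>> To get one line per rank, the print would need to be something like
>>
>>   ierr = PetscSynchronizedPrintf(PETSC_COMM_WORLD,"My rank is %d\n",rank); CHKERRQ(ierr);
>>   ierr = PetscSynchronizedFlush(PETSC_COMM_WORLD,PETSC_STDOUT); CHKERRQ(ierr);
>>
>> That is an aside, though: eight separate "My rank is 0" lines indicate
>> an MPI mismatch rather than a printing issue.)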
>>
>> I compile it with the command
>>
>> ~# mpicc -o e e.c -I/usr/include/petsc64 -lpetsc64
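>>
>> (As a check, mpicc --showme prints the underlying compile line, showing
>> which MPI's headers and libraries this wrapper uses:
>>
>> ~# mpicc --showme
>>
>> For Fedora's OpenMPI the paths it prints should belong to the OpenMPI
>> install.)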
>>
>> and I run it with
>>
>> ~# mpirun -np 8 ./e
>> My rank is 0
>> My rank is 0
>> My rank is 0
>> My rank is 0
>> My rank is 0
>> My rank is 0
>> My rank is 0
>> My rank is 0
>>
>> but all my processes get rank 0.
>>
>> Do you know what I'm doing wrong?
>>
>> Thank you!!
>>
>> Best regards,
>>
>> Rafel Amer
>>
>>
>>
>>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
>
>
>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/