1) Run the trivial PETSc program with 2 MPI ranks and -info (send the output).

2) Do an ldd on the PETSc program and the pure MPI program to see if they are all using the same libraries.

3) Add an MPI_Init() immediately before the PetscInitialize() and an MPI_Finalize() immediately after the PetscFinalize(); does the program think it is parallel then? (A sketch of this test follows the list.)

4) Is PETSC_HAVE_MPI_INIT_THREAD defined in $PETSC_ARCH/include/petscconf.h? If so, MPI_Init_thread() is used to start up MPI instead of MPI_Init(). If you change MPI_Init() in your standalone MPI code to MPI_Init_thread(), does it still run properly in parallel?
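A minimal sketch of the test in points 3 and 4 (not from the original message; it reuses the e.c program quoted below, and the MPI_Init_thread() variant for point 4 is shown in a comment):

#include <petsc.h>

int main(int argc, char **argv)
{
  PetscErrorCode ierr;
  PetscMPIInt    rank;
  int            mpirank;

  /* Point 3: start MPI ourselves, before PETSc has a chance to. */
  MPI_Init(&argc, &argv);
  /* Point 4: instead of MPI_Init() above, try
       int provided;
       MPI_Init_thread(&argc, &argv, MPI_THREAD_FUNNELED, &provided);
     to mimic what PETSc does when PETSC_HAVE_MPI_INIT_THREAD is defined. */

  /* Rank according to MPI itself, before PETSc is involved. */
  MPI_Comm_rank(MPI_COMM_WORLD, &mpirank);
  printf("MPI rank is %d\n", mpirank);

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank); CHKERRQ(ierr);
  ierr = PetscPrintf(PETSC_COMM_SELF, "PETSc rank is %d\n", rank); CHKERRQ(ierr);
  ierr = PetscFinalize(); if (ierr) return ierr;

  /* PETSc does not finalize an MPI it did not start, so we do it here. */
  MPI_Finalize();
  return 0;
}

Run it as in point 1, e.g. mpirun -np 2 ./e -info; if every process reports MPI rank 0, the launcher and the library are not from the same MPI.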

  Barry

On Aug 23, 2022, at 6:34 AM, Rafel Amer Ramon <rafel.amer@upc.edu> wrote:

Hi,

mpicc and mpirun are from the package openmpi-devel-4.1.2-3.fc36.x86_64,
and the petsc64 library is linked with /usr/lib64/openmpi/lib/libmpi.so.40:
<pre class="">~# ldd /lib64/libpetsc64.so.3.16.4
linux-vdso.so.1 (0x00007fff5becc000)
libflexiblas64.so.3 => /lib64/libflexiblas64.so.3 (0x00007f3030e30000)
libcgns.so.4.2 => /usr/lib64/openmpi/lib/libcgns.so.4.2 (0x00007f3030d46000)
libhdf5.so.200 => /usr/lib64/openmpi/lib/libhdf5.so.200 (0x00007f303091a000)
libm.so.6 => /lib64/libm.so.6 (0x00007f303083c000)
libX11.so.6 => /lib64/libX11.so.6 (0x00007f30306f4000)
libstdc++.so.6 => /lib64/libstdc++.so.6 (0x00007f30304c0000)
libgcc_s.so.1 => /lib64/libgcc_s.so.1 (0x00007f303049e000)
libc.so.6 => /lib64/libc.so.6 (0x00007f303029c000)
libgfortran.so.5 => /lib64/libgfortran.so.5 (0x00007f302ffcf000)
libquadmath.so.0 => /lib64/libquadmath.so.0 (0x00007f302ff87000)
/lib64/ld-linux-x86-64.so.2 (0x00007f30327bb000)
<font color="#0000ff" class="">libmpi.so.40 => /usr/lib64/openmpi/lib/libmpi.so.40 (0x00007f302fe59000)</font>
libsz.so.2 => /lib64/libsz.so.2 (0x00007f302fe50000)
libz.so.1 => /lib64/libz.so.1 (0x00007f302fe34000)
libxcb.so.1 => /lib64/libxcb.so.1 (0x00007f302fe08000)
libopen-rte.so.40 => /usr/lib64/openmpi/lib/libopen-rte.so.40 (0x00007f302fd4d000)
libopen-pal.so.40 => /usr/lib64/openmpi/lib/libopen-pal.so.40 (0x00007f302fc9f000)
libhwloc.so.15 => /lib64/libhwloc.so.15 (0x00007f302fc42000)
libevent_core-2.1.so.7 => /lib64/libevent_core-2.1.so.7 (0x00007f302fc07000)
libevent_pthreads-2.1.so.7 => /lib64/libevent_pthreads-2.1.so.7 (0x00007f302fc02000)
libXau.so.6 => /lib64/libXau.so.6 (0x00007f302fbfc000)

I think that it's correct, but it doesn't work.
<br class="">
Best regards,<br class="">
<br class="">
Rafel Amer<br class="">
<br class="">
El 22/8/22 a les 20:36, Barry Smith ha escrit:<br class="">
</div>
<blockquote type="cite" cite="mid:358BFF34-D941-42AB-8648-871186EF64DB@petsc.dev" class="">
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8" class="">
<div class=""><br class="">
</div>

Are you sure the mpirun you are using matches the MPI that PETSc was built with? One quick way to check is sketched below.
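
(Not part of the original message: a quick check, assuming the Fedora layout from this thread, where Open MPI lives under /usr/lib64/openmpi.)

~# which mpirun            # should resolve under /usr/lib64/openmpi/bin
~# mpirun --version
~# ldd ./e | grep libmpi   # should show /usr/lib64/openmpi/lib/libmpi.so.40

On Fedora the Open MPI launcher is normally put on the PATH by loading the MPI module (module load mpi/openmpi-x86_64); if a different mpirun starts the processes, each one typically runs as a singleton and reports rank 0 of its own MPI_COMM_WORLD.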
<div class=""><br class="">
<blockquote type="cite" class="">
<div class="">On Aug 22, 2022, at 2:31 PM, Rafel Amer Ramon
<<a href="mailto:rafel.amer@upc.edu" class="moz-txt-link-freetext" moz-do-not-send="true">rafel.amer@upc.edu</a>>
wrote:</div>
<br class="Apple-interchange-newline">
<div class="">
<meta http-equiv="content-type" content="text/html;
charset=UTF-8" class="">
<div class=""> Hi,<br class="">
<br class="">
I have installed the following packages on fedora 36<br class="">
<br class="">
<pre class=""><font class="" color="#0000ff">~# rpm -qa | grep petsc
petsc64-3.16.4-3.fc36.x86_64
petsc-3.16.4-3.fc36.x86_64
petsc-openmpi-3.16.4-3.fc36.x86_64
petsc-openmpi-devel-3.16.4-3.fc36.x86_64
python3-petsc-openmpi-3.16.4-3.fc36.x86_64
petsc-devel-3.16.4-3.fc36.x86_64
petsc64-devel-3.16.4-3.fc36.x86_64
<font class="" color="#0000ff">~# rpm -qa | grep openmpi
openmpi-4.1.2-3.fc36.x86_64
ptscotch-openmpi-6.1.2-2.fc36.x86_64
scalapack-openmpi-2.1.0-11.fc36.x86_64
openmpi-devel-4.1.2-3.fc36.x86_64
MUMPS-openmpi-5.4.1-2.fc36.x86_64
superlu_dist-openmpi-6.1.1-9.fc36.x86_64
hdf5-openmpi-1.12.1-5.fc36.x86_64
hypre-openmpi-2.18.2-6.fc36.x86_64
petsc-openmpi-3.16.4-3.fc36.x86_64
fftw-openmpi-libs-double-3.3.10-2.fc36.x86_64
fftw-openmpi-libs-long-3.3.10-2.fc36.x86_64
fftw-openmpi-libs-single-3.3.10-2.fc36.x86_64
fftw-openmpi-libs-3.3.10-2.fc36.x86_64
fftw-openmpi-devel-3.3.10-2.fc36.x86_64
petsc-openmpi-devel-3.16.4-3.fc36.x86_64
python3-petsc-openmpi-3.16.4-3.fc36.x86_64
scalapack-openmpi-devel-2.1.0-11.fc36.x86_64
python3-openmpi-4.1.2-3.fc36.x86_64
hdf5-openmpi-devel-1.12.1-5.fc36.x86_64
cgnslib-openmpi-4.2.0-6.fc36.x86_64
cgnslib-openmpi-devel-4.2.0-6.fc36.x86_64

and I try to compile and run the program written in the file e.c:
<pre class=""><font class="" color="#0000ff">~# cat e.c
#include <petsc.h>
int main(int argc, char **argv) {
PetscErrorCode ierr;
PetscMPIInt rank;
ierr = PetscInitialize(&argc,&argv,NULL,
"Compute e in parallel with PETSc.\n\n"); if (ierr) return ierr;
ierr = MPI_Comm_rank(PETSC_COMM_WORLD,&rank); CHKERRQ(ierr);
ierr = PetscPrintf(PETSC_COMM_WORLD,"My rank is %d\n",rank); CHKERRQ(ierr);
return PetscFinalize();
}

I compile it with the command

~# mpicc -o e e.c -I/usr/include/petsc64 -lpetsc64

and I run it with

~# mpirun -np 8 ./e
<font class="" color="#0000ff">My rank is 0
My rank is 0
My rank is 0
My rank is 0
My rank is 0
My rank is 0
My rank is 0
My rank is 0

but all my processes get rank 0.

Do you know what I'm doing wrong?

Thank you!!

Best regards,

Rafel Amer