<div dir="ltr">Dear Satish,<div><br></div><div>Thanks for the answer. Your suggestion makes a lot of sense, but this is what I get as a result of that:</div><div><br></div><div><font face="monospace" color="#0000ff">Running check examples to verify correct installation<br>Using PETSC_DIR=/home/niceno/Development/petsc-debug and PETSC_ARCH=arch-linux-c-debug<br>Possible error running C/C++ src/snes/tutorials/ex19 with 1 MPI process<br>See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html">http://www.mcs.anl.gov/petsc/documentation/faq.html</a><br>Invalid MIT-MAGIC-COOKIE-1 keylid velocity = 0.0016, prandtl # = 1., grashof # = 1.<br>Number of SNES iterations = 2<br>Possible error running C/C++ src/snes/tutorials/ex19 with 2 MPI processes<br>See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html">http://www.mcs.anl.gov/petsc/documentation/faq.html</a><br>Invalid MIT-MAGIC-COOKIE-1 keylid velocity = 0.0016, prandtl # = 1., grashof # = 1.<br>Number of SNES iterations = 2<br>Possible error running Fortran example src/snes/tutorials/ex5f with 1 MPI process<br>See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html">http://www.mcs.anl.gov/petsc/documentation/faq.html</a><br>Invalid MIT-MAGIC-COOKIE-1 keyNumber of SNES iterations = 4<br>Completed test examples</font><br></div><div><br></div><div>I am getting the "Possible error running Fortran example" warning with this. This somehow looks more severe to me. But I could be wrong.</div><div><br></div><div>Any suggestions what to do?</div><div><br></div><div><br></div><div> Kind regards,</div><div><br></div><div> Bojan</div><div><br></div><div><br></div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Wed, Feb 9, 2022 at 5:49 PM Satish Balay <<a href="mailto:balay@mcs.anl.gov">balay@mcs.anl.gov</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">To clarify:<br>
<br>
You are using --download-openmpi=yes with PETSc. However, you say:
<br>
> > The mpif90 command which<br>
> > I use to compile the code wraps gfortran with OpenMPI
<br>
This suggests a different install of OpenMPI is used to build your code.<br>
<br>
One way to resolve this is to delete the current build of PETSc and rebuild it with this same MPI [the one you are using with your application]:
<br>
./configure --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 --download-fblaslapack --download-metis --download-parmetis --download-cmake<br>
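
Before re-running configure, it may also help to confirm that the MPI wrappers first in your PATH belong to the same OpenMPI install you build your application with. A minimal sanity check (note: --showme is OpenMPI-specific; MPICH uses -show instead):

which mpicc mpicxx mpif90
mpif90 --showme    # prints the underlying compiler and flags the wrapper uses
mpif90 --version   # should report the same gfortran used to build your code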
<br>
Also, PETSc provides a makefile format that minimizes such conflicts:
<br>
<a href="https://petsc.org/release/docs/manual/getting_started/#writing-c-c-or-fortran-applications" rel="noreferrer" target="_blank">https://petsc.org/release/docs/manual/getting_started/#writing-c-c-or-fortran-applications</a><br>
<br>
Satish<br>
<br>
On Wed, 9 Feb 2022, Balay, Satish via petsc-users wrote:<br>
<br>
> Are you using the same MPI to build both PETSc and your application?
> <br>
> Satish<br>
> <br>
> On Wed, 2022-02-09 at 05:21 +0100, Bojan Niceno wrote:<br>
> > To whom it may concern,<br>
> > <br>
> > <br>
> > I am working on a Fortran (2003) computational fluid dynamics solver,<br>
> > which is actually quite mature: it was parallelized with MPI from the
> > very beginning and comes with its own suite of Krylov solvers.
> > Although the code is self-contained, I am inclined to believe that it
> > would be better to use PETSc instead of my own home-grown solvers.<br>
> > <br>
> > In the attempt to do so, I have installed PETSc 3.16.4 with the following
> > options:<br>
> > <br>
> > ./configure --with-debugging=yes --download-openmpi=yes --download-fblaslapack=yes --download-metis=yes --download-parmetis=yes --download-cmake=yes
> > <br>
> > on a workstation running Ubuntu 20.04 LTS. The mpif90 command which<br>
> > I use to compile the code wraps gfortran with OpenMPI, hence the
> > option "--download-openmpi=yes" when configuring PETSc.<br>
> > <br>
> > Anyhow, the installation of PETSc went fine and I managed to link and run it
> > with my code, but I am getting the following messages during<br>
> > compilation:<br>
> > <br>
> > Petsc_Mod.f90:18:6:<br>
> > <br>
> > 18 | use PetscMat, only: tMat, MAT_FINAL_ASSEMBLY<br>
> > | 1<br>
> > Warning: Named COMMON block ‘mpi_fortran_bottom’ at (1) shall be of<br>
> > the same size as elsewhere (4 vs 8 bytes)<br>
> > <br>
> > Petsc_Mod.f90 is a module I wrote for interfacing with PETSc. All works,
> > but these messages give me a reason to worry.<br>
> > <br>
> > Can you tell what causes these warnings? I would guess they might
> > appear if one mixes OpenMPI with MPICH, but I don't think I even have<br>
> > MPICH on my system.<br>
> > <br>
> > Please let me know what you think.
> > <br>
> > Cheers,<br>
> > <br>
> > Bojan<br>