<div dir="ltr">Dear both,<div><br></div><div>Allow me to update you on the issue. I tried to re-compile PETSc with different configuration options as Satish suggested, and went further on by specifying exact location of OpenMPI libraries and include files to the ones installed by PETSc (for those configurations for which I used "--download-openmpi=1") and the original problem, the warning <font face="monospace" color="#0000ff">Named COMMON block ‘mpi_fortran_bottom’ at (1) shall be of the same size as elsewhere (4 vs 8 bytes)</font>, prevailed.</div><div><br></div><div>In desperation, I completely removed OpenMPI from my workstation to make sure that only those which are downloaded with PETSc are used, yet the warning was still there. (That resolved the <font face="monospace" color="#0000ff">Invalid MIT-MAGIC-COOKIE-1</font> warning at least)</div><div><br></div><div>Now I am wondering if the problem originates from the fact that I already have all the necessary MPI routines developed in Fortran? All calls, including the basic <font face="monospace">MPI_Init</font>, <font face="monospace">MPI_Comm_Size</font> and <font face="monospace">MPI_Comm_Rank</font>, are done from Fortran. I actually have a module called <font face="monospace">Comm_Mod</font> which does all MPI-related calls, and this module contains line <font face="monospace">include 'mpif.h'.</font><font face="arial, sans-serif"> That include statement does take the file from PETSc installation as no other MPI installation is left on my system, but still it somehow seems to be the origin of the warning on common blocks I observe. Now I am wondering if the </font><font face="monospace">include 'mpif.h'</font><font face="arial, sans-serif"> from Fortran somehow collides with the option </font>include <font face="monospace">${PETSC_DIR}/lib/petsc/conf/variables</font> I put in my makefile in order to compile with PETSc.</div><div><br></div><div>I am really not sure if it is possible to have main program and all MPI initialization done from Fortran (as I have now) and then plug PETSc on top of it? Should that be possible?</div><div><br></div><div> Kind regards,</div><div><br></div><div> Bojan</div><div><br></div><div>P.S. The sequential version works fine, I can compile without warning and can call PETSc solvers from Fortran without a glitch.</div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Thu, Feb 10, 2022 at 5:08 PM Bojan Niceno <<a href="mailto:bojan.niceno.scientist@gmail.com">bojan.niceno.scientist@gmail.com</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr">Dear Satish,<div><br></div><div>Thanks for the advice. I will try in a few hours because it is almost dinner time with me (I am in Europe) and I am supposed to go out with a friend this evening.</div><div><br></div><div>Will let you know. Thanks for help, I highly appreciate it.</div><div><br></div><div><br></div><div> Kind regards,</div><div><br></div><div> Bojan</div><div><br></div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Thu, Feb 10, 2022 at 5:06 PM Satish Balay <<a href="mailto:balay@mcs.anl.gov" target="_blank">balay@mcs.anl.gov</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">Hm - this is strange.<br>
<br>
Do you have 'xauth' installed?<br>
<br>
I would make sure xauth is installed, delete ~/.Xauthority - and reboot [or restart the X server]<br>
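For example [assuming Ubuntu's packaging - adjust as needed]:<br>
<br>
dpkg -l xauth          # check that the xauth package is installed<br>
rm -f ~/.Xauthority    # then restart the X server (or reboot)<br>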
<br>
Yeah - it might not work - but perhaps worth a try..<br>
<br>
Or perhaps it's not X11 related..<br>
<br>
I would also try 'strace' on an application that is producing this message - to see if I can narrow down further..<br>
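For example [using the ex2 binary from the commands below - the trace file name is arbitrary]:<br>
<br>
strace -f -o ex2.trace ./ex2<br>
grep -i -e xauth -e x11 ex2.trace<br>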
<br>
Do you get this message with both of these runs?:<br>
<br>
cd src/ksp/ksp/tutorials<br>
make ex2<br>
mpiexec -n 1 ./ex2<br>
./ex2<br>
<br>
Satish<br>
<br>
On Thu, 10 Feb 2022, Bojan Niceno wrote:<br>
<br>
> Dear both,<br>
> <br>
> I work on an ASUS ROG laptop and don't use any NFS. Everything is on one<br>
> computer, one disk. That is why I couldn't resolve the Invalid Magic<br>
> Cookie, because all the advice I've found about it concerns the remote<br>
> access/display. It is not an issue for me. My laptop has an Nvidia<br>
> GeForce RTX graphics card; maybe the Ubuntu drivers are simply not able to<br>
> cope with it. I am out of ideas, really.<br>
> <br>
> <br>
> Cheers,<br>
> <br>
> Bojan<br>
> <br>
> On Thu, Feb 10, 2022 at 4:53 PM Satish Balay <<a href="mailto:balay@mcs.anl.gov" target="_blank">balay@mcs.anl.gov</a>> wrote:<br>
> <br>
> > Do the compute nodes and frontend share the same NFS?<br>
> ><br>
> > I would try the following [to see if they work]:<br>
> ><br>
> > - delete ~/.Xauthority [first check with 'xauth list']<br>
> > - setup ssh to not use X - i.e add the following to ~/.ssh/config<br>
> ><br>
> > ForwardX11 no<br>
> > ForwardX11Trusted no<br>
> ><br>
> > [this can be tailored to apply only to your specific compute nodes - if<br>
> > needed]<br>
> ><br>
> > Satish<br>
> ><br>
> > On Thu, 10 Feb 2022, Matthew Knepley wrote:<br>
> ><br>
> > > On Thu, Feb 10, 2022 at 10:40 AM Bojan Niceno <<br>
> > > <a href="mailto:bojan.niceno.scientist@gmail.com" target="_blank">bojan.niceno.scientist@gmail.com</a>> wrote:<br>
> > ><br>
> > > > Thanks a lot, now I feel much better.<br>
> > > ><br>
> > > > By the way, I can't get around the invalid magic cookie. It has been<br>
> > > > occurring ever since I installed the OS (Ubuntu 20.04), so I eventually<br>
> > > > gave up and decided to live with it :-D<br>
> > > ><br>
> > ><br>
> > ><br>
> > <a href="https://unix.stackexchange.com/questions/199891/invalid-mit-magic-cookie-1-key-when-trying-to-run-program-remotely" rel="noreferrer" target="_blank">https://unix.stackexchange.com/questions/199891/invalid-mit-magic-cookie-1-key-when-trying-to-run-program-remotely</a><br>
> > ><br>
> > > Thanks,<br>
> > ><br>
> > > Matt<br>
> > ><br>
> > ><br>
> > > > Cheers,<br>
> > > ><br>
> > > > Bojan<br>
> > > ><br>
> > > > On Thu, Feb 10, 2022 at 4:37 PM Matthew Knepley <<a href="mailto:knepley@gmail.com" target="_blank">knepley@gmail.com</a>><br>
> > wrote:<br>
> > > ><br>
> > > >> On Thu, Feb 10, 2022 at 10:34 AM Bojan Niceno <<br>
> > > >> <a href="mailto:bojan.niceno.scientist@gmail.com" target="_blank">bojan.niceno.scientist@gmail.com</a>> wrote:<br>
> > > >><br>
> > > >>> Dear Satish,<br>
> > > >>><br>
> > > >>> Thanks for the answer. Your suggestion makes a lot of sense, but this<br>
> > > >>> is what I get as a result of that:<br>
> > > >>><br>
> > > >>> Running check examples to verify correct installation<br>
> > > >>> Using PETSC_DIR=/home/niceno/Development/petsc-debug and<br>
> > > >>> PETSC_ARCH=arch-linux-c-debug<br>
> > > >>> Possible error running C/C++ src/snes/tutorials/ex19 with 1 MPI process<br>
> > > >>> See http://www.mcs.anl.gov/petsc/documentation/faq.html<br>
> > > >>> Invalid MIT-MAGIC-COOKIE-1 keylid velocity = 0.0016, prandtl # = 1.,<br>
> > > >>> grashof # = 1.<br>
> > > >>> Number of SNES iterations = 2<br>
> > > >>> Possible error running C/C++ src/snes/tutorials/ex19 with 2 MPI processes<br>
> > > >>> See http://www.mcs.anl.gov/petsc/documentation/faq.html<br>
> > > >>> Invalid MIT-MAGIC-COOKIE-1 keylid velocity = 0.0016, prandtl # = 1.,<br>
> > > >>> grashof # = 1.<br>
> > > >>> Number of SNES iterations = 2<br>
> > > >>> Possible error running Fortran example src/snes/tutorials/ex5f with 1 MPI process<br>
> > > >>> See http://www.mcs.anl.gov/petsc/documentation/faq.html<br>
> > > >>> Invalid MIT-MAGIC-COOKIE-1 keyNumber of SNES iterations = 4<br>
> > > >>> Completed test examples<br>
> > > >>><br>
> > > >>> I am getting the "Possible error running Fortran example" warning with<br>
> > > >>> this. This somehow looks more severe to me. But I could be wrong.<br>
> > > >>><br>
> > > >><br>
> > > >> You are getting this message because your MPI implementation is printing<br>
> > > >><br>
> > > >> Invalid MIT-MAGIC-COOKIE-1 key<br>
> > > >><br>
> > > >> It is still running fine, but this is an MPI configuration issue.<br>
> > > >><br>
> > > >> Thanks,<br>
> > > >><br>
> > > >> Matt<br>
> > > >><br>
> > > >>> Any suggestions what to do?<br>
> > > >>><br>
> > > >>><br>
> > > >>> Kind regards,<br>
> > > >>><br>
> > > >>> Bojan<br>
> > > >>><br>
> > > >>><br>
> > > >>><br>
> > > >>> On Wed, Feb 9, 2022 at 5:49 PM Satish Balay <<a href="mailto:balay@mcs.anl.gov" target="_blank">balay@mcs.anl.gov</a>><br>
> > wrote:<br>
> > > >>><br>
> > > >>>> To clarify:<br>
> > > >>>><br>
> > > >>>> you are using --download-openmpi=yes with petsc. However you say:<br>
> > > >>>><br>
> > > >>>> > > The mpif90 command which I use to compile the code, wraps gfortran with OpenMPI<br>
> > > >>>><br>
> > > >>>> This suggests a different install of OpenMPI is used to build your code.<br>
> > > >>>><br>
> > > >>>> One way to resolve this is - delete current build of PETSc - and<br>
> > > >>>> rebuild it with this same MPI [that you are using with your application]<br>
> > > >>>><br>
> > > >>>> ./configure --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90<br>
> > > >>>> --download-fblaslapack --download-metis --download-parmetis --download-cmake<br>
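> > > >>>><br>
> > > >>>> [and perhaps double-check what the mpif90 in your path actually wraps - e.g., for OpenMPI wrappers:]<br>
> > > >>>><br>
> > > >>>> which mpif90<br>
> > > >>>> mpif90 --showme<br>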
> > > >>>><br>
> > > >>>> Also PETSc provides a makefile format that minimizes such conflicts:<br>
> > > >>>><br>
> > > >>>> https://petsc.org/release/docs/manual/getting_started/#writing-c-c-or-fortran-applications<br>
> > > >>>><br>
> > > >>>> Satish<br>
> > > >>>><br>
> > > >>>> On Wed, 9 Feb 2022, Balay, Satish via petsc-users wrote:<br>
> > > >>>><br>
> > > >>>> > Are you using the same MPI to build both PETSc and your application?<br>
> > > >>>> ><br>
> > > >>>> > Satish<br>
> > > >>>> ><br>
> > > >>>> > On Wed, 2022-02-09 at 05:21 +0100, Bojan Niceno wrote:<br>
> > > >>>> > > To whom it may concern,<br>
> > > >>>> > ><br>
> > > >>>> > ><br>
> > > >>>> > > I am working on a Fortran (2003) computational fluid dynamics solver,<br>
> > > >>>> > > which is actually quite mature, was parallelized with MPI from the<br>
> > > >>>> > > very beginning, and comes with its own suite of Krylov solvers.<br>
> > > >>>> > > Although the code is self-sustained, I am inclined to believe that it<br>
> > > >>>> > > would be better to use PETSc instead of my own home-grown solvers.<br>
> > > >>>> > ><br>
> > > >>>> > > In the attempt to do so, I have installed PETSc 3.16.4 with the<br>
> > > >>>> > > following options:<br>
> > > >>>> > ><br>
> > > >>>> > > ./configure --with-debugging=yes --download-openmpi=yes<br>
> > > >>>> > > --download-fblaslapack=yes --download-metis=yes --download-parmetis=yes<br>
> > > >>>> > > --download-cmake=yes<br>
> > > >>>> > ><br>
> > > >>>> > > on a workstation running Ubuntu 20.04 LTS. The mpif90 command which<br>
> > > >>>> > > I use to compile the code, wraps gfortran with OpenMPI, hence the<br>
> > > >>>> > > option "--download-openmpi=yes" when configuring PETSc.<br>
> > > >>>> > ><br>
> > > >>>> > > Anyhow, the installation of PETSc went fine, and I managed to link and<br>
> > > >>>> > > run it with my code, but I am getting the following messages during<br>
> > > >>>> > > compilation:<br>
> > > >>>> > ><br>
> > > >>>> > > Petsc_Mod.f90:18:6:<br>
> > > >>>> > ><br>
> > > >>>> > >    18 | use PetscMat, only: tMat, MAT_FINAL_ASSEMBLY<br>
> > > >>>> > >       |     1<br>
> > > >>>> > > Warning: Named COMMON block ‘mpi_fortran_bottom’ at (1) shall be of<br>
> > > >>>> > > the same size as elsewhere (4 vs 8 bytes)<br>
> > > >>>> > ><br>
> > > >>>> > > Petsc_Mod.f90 is a module I wrote for interfacing PETSc. All works,<br>
> > > >>>> > > but these messages give me a reason to worry.<br>
> > > >>>> > ><br>
> > > >>>> > > Can you tell what causes these warnings? I would guess they might<br>
> > > >>>> > > appear if one mixes OpenMPI with MPICH, but I don't think I even have<br>
> > > >>>> > > MPICH on my system.<br>
> > > >>>> > ><br>
> > > >>>> > > Please let me know what you think about it.<br>
> > > >>>> > ><br>
> > > >>>> > > Cheers,<br>
> > > >>>> > ><br>
> > > >>>> > > Bojan<br>
> > > >>>> > ><br>
> > > >>>> > ><br>
> > > >>>> > ><br>
> > > >>>> > ><br>
> > > >>>> ><br>
> > > >>>> ><br>
> > > >>>><br>
> > > >>><br>
> > > >><br>
> > > >> --<br>
> > > >> What most experimenters take for granted before they begin their<br>
> > > >> experiments is infinitely more interesting than any results to which their<br>
> > > >> experiments lead.<br>
> > > >> -- Norbert Wiener<br>
> > > >><br>
> > > >> <a href="https://www.cse.buffalo.edu/~knepley/" rel="noreferrer" target="_blank">https://www.cse.buffalo.edu/~knepley/</a><br>
> > > >> <<a href="http://www.cse.buffalo.edu/~knepley/" rel="noreferrer" target="_blank">http://www.cse.buffalo.edu/~knepley/</a>><br>
> > > >><br>
> > > ><br>
> > ><br>
> > ><br>
> ><br>
> <br>