[petsc-users] Warning while compiling Fortran with PETSc
Bojan Niceno
bojan.niceno.scientist at gmail.com
Thu Feb 10 09:59:35 CST 2022
Dear both,
I work on an ASUS ROG laptop and don't use any NFS. Everything is on one
computer, one disk. That is why I couldn't resolve the Invalid Magic
Cookie: all the advice I've found about it concerns remote
access/display, which is not an issue for me. My laptop has an Nvidia
GeForce RTX graphics card; maybe the Ubuntu drivers are simply not able
to cope with it. I am out of ideas, really.
Cheers,
Bojan
On Thu, Feb 10, 2022 at 4:53 PM Satish Balay <balay at mcs.anl.gov> wrote:
> Do the compute nodes and frontend share the same NFS?
>
> I would try the following [to see if they work]:
>
> - delete ~/.Xauthority [first check with 'xauth list']
> - setup ssh to not use X - i.e add the following to ~/.ssh/config
>
> ForwardX11 no
> ForwardX11Trusted no
>
> [this can be tailored to apply only to your specific compute nodes - if
> needed]
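>
> A host-limited variant might look like this [a sketch - "node*" is a
> hypothetical pattern; substitute your compute nodes' names]:
>
>   Host node*    # hypothetical pattern - replace with your node names
>     ForwardX11 no
>     ForwardX11Trusted no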
>
> Satish
>
> On Thu, 10 Feb 2022, Matthew Knepley wrote:
>
> > On Thu, Feb 10, 2022 at 10:40 AM Bojan Niceno <
> > bojan.niceno.scientist at gmail.com> wrote:
> >
> > > Thanks a lot, now I feel much better.
> > >
> > > By the way, I can't get around the invalid magic cookie. It has been
> > > occurring ever since I installed the OS (Ubuntu 20.04), so I
> > > eventually gave up and decided to live with it :-D
> > >
> >
> >
> > https://unix.stackexchange.com/questions/199891/invalid-mit-magic-cookie-1-key-when-trying-to-run-program-remotely
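> >
> > A commonly suggested remedy [a sketch - assuming a stale cookie for
> > the local display :0 is the cause] is to regenerate it:
> >
> >   rm ~/.Xauthority             # drop the stale cookies
> >   xauth generate :0 . trusted  # create a fresh MIT-MAGIC-COOKIE-1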
> >
> > Thanks,
> >
> > Matt
> >
> >
> > > Cheers,
> > >
> > > Bojan
> > >
> > > On Thu, Feb 10, 2022 at 4:37 PM Matthew Knepley <knepley at gmail.com>
> > > wrote:
> > >
> > >> On Thu, Feb 10, 2022 at 10:34 AM Bojan Niceno <
> > >> bojan.niceno.scientist at gmail.com> wrote:
> > >>
> > >>> Dear Satish,
> > >>>
> > >>> Thanks for the answer. Your suggestion makes a lot of sense, but
> > >>> this is what I get as a result of that:
> > >>>
> > >>> Running check examples to verify correct installation
> > >>> Using PETSC_DIR=/home/niceno/Development/petsc-debug and
> > >>> PETSC_ARCH=arch-linux-c-debug
> > >>> Possible error running C/C++ src/snes/tutorials/ex19 with 1 MPI
> > >>> process
> > >>> See http://www.mcs.anl.gov/petsc/documentation/faq.html
> > >>> Invalid MIT-MAGIC-COOKIE-1 keylid velocity = 0.0016, prandtl # = 1.,
> > >>> grashof # = 1.
> > >>> Number of SNES iterations = 2
> > >>> Possible error running C/C++ src/snes/tutorials/ex19 with 2 MPI
> > >>> processes
> > >>> See http://www.mcs.anl.gov/petsc/documentation/faq.html
> > >>> Invalid MIT-MAGIC-COOKIE-1 keylid velocity = 0.0016, prandtl # = 1.,
> > >>> grashof # = 1.
> > >>> Number of SNES iterations = 2
> > >>> Possible error running Fortran example src/snes/tutorials/ex5f with 1
> > >>> MPI process
> > >>> See http://www.mcs.anl.gov/petsc/documentation/faq.html
> > >>> Invalid MIT-MAGIC-COOKIE-1 keyNumber of SNES iterations = 4
> > >>> Completed test examples
> > >>>
> > >>> I am getting the "Possible error running Fortran example" warning
> > >>> with this. This somehow looks more severe to me. But I could be wrong.
> > >>>
> > >>
> > >> You are getting this message because your MPI implementation is
> > >> printing
> > >>
> > >> Invalid MIT-MAGIC-COOKIE-1 key
> > >>
> > >> It is still running fine, but this is an MPI configuration issue.
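> > >>
> > >> If you want to see this for yourself [a sketch - paths assume the
> > >> PETSC_DIR and PETSC_ARCH from your log above], run the example by
> > >> hand with the MPI that PETSc built; the cookie message comes from
> > >> MPI startup, not from the solver:
> > >>
> > >>   cd $PETSC_DIR/src/snes/tutorials && make ex19
> > >>   # mpiexec from --download-openmpi lives under the arch tree
> > >>   $PETSC_DIR/$PETSC_ARCH/bin/mpiexec -n 2 ./ex19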
> > >>
> > >> Thanks,
> > >>
> > >> Matt
> > >>
> > >>> Any suggestions on what to do?
> > >>>
> > >>>
> > >>> Kind regards,
> > >>>
> > >>> Bojan
> > >>>
> > >>>
> > >>>
> > >>> On Wed, Feb 9, 2022 at 5:49 PM Satish Balay <balay at mcs.anl.gov>
> > >>> wrote:
> > >>>
> > >>>> To clarify:
> > >>>>
> > >>>> You are using --download-openmpi=yes with PETSc. However, you say:
> > >>>>
> > >>>> > > The mpif90 command which
> > >>>> > > I use to compile the code, wraps gfortran with OpenMPI
> > >>>>
> > >>>> This suggests a different install of OpenMPI is used to build your
> > >>>> code.
> > >>>>
> > >>>> One way to resolve this is to delete the current build of PETSc and
> > >>>> rebuild it with this same MPI [the one you are using with your
> > >>>> application]:
> > >>>>
> > >>>> ./configure --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90
> > >>>> --download-fblaslapack --download-metis --download-parmetis
> > >>>> --download-cmake
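> > >>>>
> > >>>> [To delete the current build first - a sketch, assuming the
> > >>>> arch-linux-c-debug arch from your check-examples log - remove the
> > >>>> arch directory, then rerun configure as above:
> > >>>>
> > >>>>   rm -rf $PETSC_DIR/arch-linux-c-debug
> > >>>> ]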
> > >>>>
> > >>>> Also, PETSc provides a makefile format that minimizes such conflicts:
> > >>>>
> > >>>>
> > >>>>
> > >>>> https://petsc.org/release/docs/manual/getting_started/#writing-c-c-or-fortran-applications
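> > >>>>
> > >>>> A minimal user makefile in that format might look like this [a
> > >>>> sketch - "main.F90" is a hypothetical source file; FLINKER and
> > >>>> PETSC_LIB are defined by the included PETSc files]:
> > >>>>
> > >>>>   # pull in PETSc's compiler wrappers, flags and suffix rules
> > >>>>   include ${PETSC_DIR}/lib/petsc/conf/variables
> > >>>>   include ${PETSC_DIR}/lib/petsc/conf/rules
> > >>>>
> > >>>>   main: main.o
> > >>>>           ${FLINKER} -o main main.o ${PETSC_LIB}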
> > >>>>
> > >>>> Satish
> > >>>>
> > >>>> On Wed, 9 Feb 2022, Balay, Satish via petsc-users wrote:
> > >>>>
> > >>>> > Are you using the same MPI to build both PETSc and your
> > >>>> > application?
> > >>>> >
> > >>>> > Satish
> > >>>> >
> > >>>> > On Wed, 2022-02-09 at 05:21 +0100, Bojan Niceno wrote:
> > >>>> > > To whom it may concern,
> > >>>> > >
> > >>>> > >
> > >>>> > > I am working on a Fortran (2003) computational fluid dynamics
> > >>>> > > solver, which is actually quite mature: it was parallelized with
> > >>>> > > MPI from the very beginning and comes with its own suite of
> > >>>> > > Krylov solvers. Although the code is self-contained, I am
> > >>>> > > inclined to believe that it would be better to use PETSc instead
> > >>>> > > of my own home-grown solvers.
> > >>>> > >
> > >>>> > > In an attempt to do so, I have installed PETSc 3.16.4 with the
> > >>>> > > following options:
> > >>>> > >
> > >>>> > > ./configure --with-debugging=yes --download-openmpi=yes
> > >>>> > > --download-fblaslapack=yes --download-metis=yes
> > >>>> > > --download-parmetis=yes --download-cmake=yes
> > >>>> > >
> > >>>> > > on a workstation running Ubuntu 20.04 LTS. The mpif90 command
> > >>>> > > that I use to compile the code wraps gfortran with OpenMPI,
> > >>>> > > hence the option "--download-openmpi=yes" when configuring
> > >>>> > > PETSc.
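> > >>>> > >
> > >>>> > > [A quick way to check which install a wrapper belongs to - Open
> > >>>> > > MPI's wrappers accept --showme, which prints the underlying
> > >>>> > > compile line:
> > >>>> > >
> > >>>> > >   which mpif90
> > >>>> > >   mpif90 --showme
> > >>>> > > ]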
> > >>>> > >
> > >>>> > > Anyhow, the installation of PETSc went fine and I managed to
> > >>>> > > link and run it with my code, but I am getting the following
> > >>>> > > messages during compilation:
> > >>>> > >
> > >>>> > > Petsc_Mod.f90:18:6:
> > >>>> > >
> > >>>> > > 18 | use PetscMat, only: tMat, MAT_FINAL_ASSEMBLY
> > >>>> > > | 1
> > >>>> > > Warning: Named COMMON block ‘mpi_fortran_bottom’ at (1) shall
> > >>>> > > be of the same size as elsewhere (4 vs 8 bytes)
> > >>>> > >
> > >>>> > > Petsc_Mod.f90 is a module I wrote for interfacing with PETSc.
> > >>>> > > Everything works, but these messages give me a reason to worry.
> > >>>> > >
> > >>>> > > Can you tell what causes these warnings? I would guess they
> > >>>> > > might appear if one mixes OpenMPI with MPICH, but I don't think
> > >>>> > > I even have MPICH on my system.
> > >>>> > >
> > >>>> > > Please let me know what you think about it.
> > >>>> > >
> > >>>> > > Cheers,
> > >>>> > >
> > >>>> > > Bojan
> > >>>> > >
> > >>>> > >
> > >>>> > >
> > >>>> > >
> > >>>> >
> > >>>> >
> > >>>>
> > >>>
> > >>
> > >> --
> > >> What most experimenters take for granted before they begin their
> > >> experiments is infinitely more interesting than any results to which
> > >> their experiments lead.
> > >> -- Norbert Wiener
> > >>
> > >> https://www.cse.buffalo.edu/~knepley/
> > >>
> > >
> >
> >
>