<html xmlns:o="urn:schemas-microsoft-com:office:office" xmlns:w="urn:schemas-microsoft-com:office:word" xmlns:m="http://schemas.microsoft.com/office/2004/12/omml" xmlns="http://www.w3.org/TR/REC-html40">
<head>
<meta http-equiv="Content-Type" content="text/html; charset=Windows-1252">
<meta name="Generator" content="Microsoft Word 15 (filtered medium)">
<style><!--
/* Font Definitions */
@font-face
{font-family:"Cambria Math";
panose-1:2 4 5 3 5 4 6 3 2 4;}
@font-face
{font-family:Calibri;
panose-1:2 15 5 2 2 2 4 3 2 4;}
/* Style Definitions */
p.MsoNormal, li.MsoNormal, div.MsoNormal
{margin:0cm;
margin-bottom:.0001pt;
font-size:11.0pt;
font-family:"Calibri",sans-serif;}
a:link, span.MsoHyperlink
{mso-style-priority:99;
color:blue;
text-decoration:underline;}
.MsoChpDefault
{mso-style-type:export-only;}
@page WordSection1
{size:612.0pt 792.0pt;
margin:70.85pt 2.0cm 2.0cm 2.0cm;}
div.WordSection1
{page:WordSection1;}
--></style>
</head>
<body lang="IT" link="blue" vlink="#954F72">
<div class="WordSection1">
<p class="MsoNormal">Ok, I see, but this seems to boil down to compiling MS-MPI with mingw in Cygwin myself.</p>
<p class="MsoNormal"><o:p> </o:p></p>
<p class="MsoNormal">That should be doable, maybe by following how they managed to do it in MSYS2 in the first place (the instructions are available for each package).</p>
<p class="MsoNormal"><o:p> </o:p></p>
<p class="MsoNormal">Actually, now I see I might have messed up when working in Cygwin, because the mingw toolchain there might still be a cross compiler (while it isn’t in MSYS2), so it might have required something like “--host=x86_64-w64-mingw32”.</p>
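<p class="MsoNormal">A quick way to tell the two cases apart (an editorial sketch, not from the thread; the triplet names are the usual mingw-w64 ones and should be treated as assumptions):</p>

```shell
# If Cygwin's mingw gcc is a cross compiler, its target triplet differs from
# the build machine's. -dumpmachine prints the target triplet:
x86_64-w64-mingw32-gcc -dumpmachine   # a mingw cross compiler reports x86_64-w64-mingw32
gcc -dumpmachine                      # Cygwin's native gcc reports a *-cygwin triplet

# Autoconf-style builds then need the host triplet spelled out, e.g.:
./configure --build=x86_64-pc-cygwin --host=x86_64-w64-mingw32
```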
<p class="MsoNormal"><o:p> </o:p></p>
<p class="MsoNormal">I will update you if I make any progress on this.</p>
<p class="MsoNormal"><o:p> </o:p></p>
<p class="MsoNormal">Paolo</p>
<p class="MsoNormal"><o:p> </o:p></p>
<p class="MsoNormal">Sent from <a href="https://go.microsoft.com/fwlink/?LinkId=550986">
Mail</a> for Windows 10</p>
<p class="MsoNormal"><o:p> </o:p></p>
<div style="mso-element:para-border-div;border:none;border-top:solid #E1E1E1 1.0pt;padding:3.0pt 0cm 0cm 0cm">
<p class="MsoNormal" style="border:none;padding:0cm"><b>From: </b><a href="mailto:balay@mcs.anl.gov">Satish Balay</a><br>
<b>Sent: </b>Monday, July 6, 2020 20:31<br>
<b>To: </b><a href="mailto:paololampitella@hotmail.com">Paolo Lampitella</a><br>
<b>Cc: </b><a href="mailto:petsc-users@mcs.anl.gov">petsc-users</a>; <a href="mailto:pierre.jolivet@enseeiht.fr">
Pierre Jolivet</a><br>
<b>Subject: </b>Re: [petsc-users] R: PETSc and Windows 10</p>
</div>
<p class="MsoNormal"><o:p> </o:p></p>
<p class="MsoNormal">I was thinking in terms of: If using mingw-gcc from cygwin - then it could be used in the same way as mingw-gcc in msys2 is used - i.e with MS-MPI etc..<br>
<br>
[one can install mingw-gcc in cygwin - which is different than cygwin native gcc - perhaps this is similar to mingw-gcc install in msys2]<br>
<br>
I haven't tried this though..<br>
<br>
Likely cygwin doesn't have the equivalent of mingw-w64-x86_64-msmpi - for easy use of MS-MPI from mingw-gfortran<br>
<br>
Satish<br>
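Since Cygwin apparently has no equivalent of mingw-w64-x86_64-msmpi, one would have to recreate its main artifact by hand. A hedged sketch of what that package essentially automates (tool and path names are assumptions; gendef and dlltool ship with the mingw-w64 toolchain):

```shell
# Build a MinGW import library for the MS-MPI runtime DLL by hand:
gendef /c/Windows/System32/msmpi.dll        # extracts the exports into msmpi.def
dlltool -d msmpi.def -l libmsmpi.a -D msmpi.dll

# Then compile against the MS-MPI SDK headers and link the new import library:
# gcc hello.c -I"/c/Program Files (x86)/Microsoft SDKs/MPI/Include" -L. -lmsmpi
```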
<br>
On Mon, 6 Jul 2020, Paolo Lampitella wrote:<br>
<br>
> Dear Satish,<br>
> <br>
> Yes indeed, or at least that is my understanding. Still, my experience so far with Cygwin has been, let’s say, mixed.<br>
> <br>
> I wasn’t able to compile MPICH myself, with either gcc or mingw.<br>
> <br>
> When letting PETSc compile MPICH as well, I was successful only with gcc, not with mingw.<br>
> <br>
> I didn’t even try compiling OpenMPI with mingw, as PETSc compilation already failed using the OpenMPI available through the Cygwin packages (which is based on gcc and not mingw).<br>
> <br>
> Not sure if this is my fault, but in the end it didn’t go well.<br>
> <br>
> Paolo<br>
> <br>
> Sent from Mail<<a href="https://go.microsoft.com/fwlink/?LinkId=550986">https://go.microsoft.com/fwlink/?LinkId=550986</a>> for Windows 10<br>
> <br>
> From: Satish Balay<<a href="mailto:balay@mcs.anl.gov">mailto:balay@mcs.anl.gov</a>><br>
> Sent: Sunday, July 5, 2020 23:50<br>
> To: Paolo Lampitella<<a href="mailto:paololampitella@hotmail.com">mailto:paololampitella@hotmail.com</a>><br>
> Cc: Pierre Jolivet<<a href="mailto:pierre.jolivet@enseeiht.fr">mailto:pierre.jolivet@enseeiht.fr</a>>; petsc-users<<a href="mailto:petsc-users@mcs.anl.gov">mailto:petsc-users@mcs.anl.gov</a>><br>
> Subject: Re: [petsc-users] PETSc and Windows 10<br>
> <br>
> Sounds like there are different mingw tools and msys2 tools.<br>
> <br>
> So I guess one could use mingw compilers even from cygwin [using cygwin tools] - i.e mingw compilers don't really need msys2 tools to work.<br>
> <br>
> Satish<br>
> <br>
> On Sun, 5 Jul 2020, Paolo Lampitella wrote:<br>
> <br>
> > Unfortunately, even PETSC_ARCH=i didn't work out. And while --with-single-library=0 wasn't really appealing to me, it worked, only to fail later on make test.<br>
> ><br>
> > I guess all these differences are due to the fortran bindings and/or gcc 10.<br>
> ><br>
> > However, until I discover how they are different, I guess I'll be fine with /usr/bin/ar<br>
> ><br>
> > Paolo<br>
> ><br>
> ><br>
> ><br>
> > Sent from my Samsung Galaxy smartphone.<br>
> ><br>
> ><br>
> ><br>
> > -------- Original message --------<br>
> > From: Paolo Lampitella <paololampitella@hotmail.com><br>
> > Date: 05/07/20 14:00 (GMT+01:00)<br>
> > To: Pierre Jolivet <pierre.jolivet@enseeiht.fr><br>
> > Cc: Matthew Knepley <knepley@gmail.com>, petsc-users <petsc-users@mcs.anl.gov><br>
> > Subject: RE: [petsc-users] PETSc and Windows 10<br>
> ><br>
> > Thank you very much Pierre.<br>
> ><br>
> > I'll keep you informed in case I see any relevant change from the tests when using your suggestion.<br>
> ><br>
> > Paolo<br>
> ><br>
> ><br>
> ><br>
> > Sent from my Samsung Galaxy smartphone.<br>
> ><br>
> ><br>
> ><br>
> > -------- Original message --------<br>
> > From: Pierre Jolivet <pierre.jolivet@enseeiht.fr><br>
> > Date: 05/07/20 13:45 (GMT+01:00)<br>
> > To: Paolo Lampitella <paololampitella@hotmail.com><br>
> > Cc: Matthew Knepley <knepley@gmail.com>, petsc-users <petsc-users@mcs.anl.gov><br>
> > Subject: Re: [petsc-users] PETSc and Windows 10<br>
> ><br>
> > Hello Paolo,<br>
> ><br>
> > On 5 Jul 2020, at 1:15 PM, Paolo Lampitella <paololampitella@hotmail.com<mailto:paololampitella@hotmail.com>> wrote:<br>
> ><br>
> > Dear all,<br>
> ><br>
> > I just want to update you on my journey to PETSc compilation in Windows under MSYS2+MINGW64<br>
> ><br>
> > Unfortunately, I haven’t been able to compile petsc-slepc through FreeFEM but, as my final goal also required Fortran bindings (though I only needed blas, lapack, metis and hypre), I decided to follow my own route using the useful information from Pierre.<br>
> ><br>
> ><br>
> > * I started by installing MPI from <a href="https://www.microsoft.com/en-us/download/details.aspx?id=100593">
https://www.microsoft.com/en-us/download/details.aspx?id=100593</a>. I don’t think the SDK is actually needed in my specific workflow, but I installed it as well together with mpisetup.<br>
> > * Then I installed MSYS2 just following the wizard. Opened the MSYS2 terminal and updated with pacman -Syuu, closed if asked, reopened it and used again pacman -Syuu several times until no more updates were available. Closed it and opened it back.<br>
> > * Under the MSYS2 terminal installed just the following packages:<br>
> ><br>
> ><br>
> ><br>
> > * pacman -S base-devel git gcc gcc-fortran<br>
> > * pacman -S mingw-w64-x86_64-toolchain<br>
> > * pacman -S mingw-w64-x86_64-cmake<br>
> > * pacman -S mingw-w64-x86_64-msmpi<br>
> ><br>
> ><br>
> ><br>
> > * Closed the MSYS2 terminal and opened the MINGW64 one, went to /mingw64/include and compiled my mpi module following
<a href="https://www.scivision.dev/windows-mpi-msys2/">https://www.scivision.dev/windows-mpi-msys2/</a>:<br>
> ><br>
> ><br>
> ><br>
> > * gfortran -c mpi.f90 -fno-range-check -fallow-invalid-boz<br>
> ><br>
> ><br>
> > However, I will keep an eye on the MS-MPI GitHub repository because the fortran side seems to be far from perfect.<br>
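As a sanity check that the freshly compiled mpi.mod is usable, a minimal smoke test can be run from the MINGW64 terminal (a sketch; the file names are illustrative, not from the thread):

```shell
# Write and build a tiny 'use mpi' program against the module compiled above
cat > hello_mpi.f90 <<'EOF'
program hello
  use mpi                       ! picks up the mpi.mod built in /mingw64/include
  implicit none
  integer :: ierr, rank
  call MPI_Init(ierr)
  call MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)
  print *, 'Hello from rank', rank
  call MPI_Finalize(ierr)
end program hello
EOF
gfortran hello_mpi.f90 -I/mingw64/include -lmsmpi -o hello_mpi.exe
"/c/Program Files/Microsoft MPI/Bin/mpiexec" -n 2 ./hello_mpi.exe
```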
> ><br>
> ><br>
> > * Then I downloaded the 3.13.3 version of petsc and configured it, still under the MINGW64 terminal, with the following command:<br>
> ><br>
> ><br>
> > /usr/bin/python ./configure --prefix=/home/paolo/petsc --with-ar=/usr/bin/ar<br>
> > --with-shared-libraries=0 --with-debugging=0 --with-windows-graphics=0 --with-x=0<br>
> > COPTFLAGS="-O3 -mtune=native"<br>
> > CXXOPTFLAGS="-O3 -mtune=native"<br>
> > FOPTFLAGS="-O3 -mtune=native"<br>
> > FFLAGS=-fallow-invalid-boz<br>
> > --with-mpiexec="/C/Program\ Files/Microsoft\ MPI/Bin/mpiexec"<br>
> > --download-fblaslapack --download-metis --download-hypre<br>
> > --download-metis-cmake-arguments='-G "MSYS Makefiles"'<br>
> > --download-hypre-configure-arguments="--build=x86_64-linux-gnu --host=x86_64-linux-gnu"<br>
> ><br>
> > Note that I bypassed uninstalling python in mingw64 (which doesn’t work) by using /usr/bin/python, and that, unlike Pierre, I also needed to use the MSYS2 archiver (/usr/bin/ar) instead of the mingw64 one (/mingw64/bin/ar, which shows up in
Pierre’s configure), as also mentioned here <a href="http://hillyuan.blogspot.com/2017/11/build-petsc-in-windows-under-mingw64.html">
http://hillyuan.blogspot.com/2017/11/build-petsc-in-windows-under-mingw64.html</a>, probably because of this issue:
<a href="https://stackoverflow.com/questions/37504625/ar-on-msys2-shell-receives-truncated-paths-when-called-from-makefile">
https://stackoverflow.com/questions/37504625/ar-on-msys2-shell-receives-truncated-paths-when-called-from-makefile</a>.<br>
> ><br>
> > You are right that you can avoid uninstalling mingw-w64-x86_64-python if you can supply the proper Python yourself (we don’t have that luxury in our Makefile).<br>
> > If you want to avoid using that AR, and stick to /mingw64/bin/ar (not sure what the pros and cons are), you can either:<br>
> > - use another PETSC_ARCH (very short, like pw, for petsc-windows);<br>
> > - use --with-single-library=0.<br>
> > See this post on GitLab <a href="https://gitlab.com/petsc/petsc/-/issues/647#note_373507681">
https://gitlab.com/petsc/petsc/-/issues/647#note_373507681</a><br>
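Spelled out as configure lines, Pierre’s two alternatives might look like this (a sketch assembled from the options already shown in this thread, not a tested command):

```shell
# Alternative 1: keep /mingw64/bin/ar but use a very short PETSC_ARCH ('pw'),
# so the paths ar receives stay short enough to avoid the truncation issue:
/usr/bin/python ./configure PETSC_ARCH=pw --prefix=/home/paolo/petsc \
  --with-shared-libraries=0 --with-debugging=0 --with-x=0

# Alternative 2: split libpetsc into several smaller libraries, so each
# ar invocation handles fewer (and shorter) argument lists:
/usr/bin/python ./configure --with-single-library=0 --prefix=/home/paolo/petsc \
  --with-shared-libraries=0 --with-debugging=0 --with-x=0
```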
> > The OS I’m referring to is indeed my Windows + MSYS2 box.<br>
> ><br>
> > Thanks,<br>
> > Pierre<br>
> ><br>
> > Then make all, make install and make check all went smoothly. Also, I don’t know exactly what --with-x=0 and --with-windows-graphics=0 do (they should just disable X11 and Windows graphics support), but I think it is stuff I don’t need (yet configure worked with windows-graphics as well).<br>
> ><br>
> ><br>
> > * Finally I launched make test. As some tests failed, I replicated the same install procedure on all the systems I have available on this same Windows machine (Ubuntu 20.04 and Centos 8 under a VirtualBox 6.0.22 VM, Ubuntu 20.04 under WSL1 and the MSYS2-MINGW64
toolchain). I am attaching a file with the results printed to screen (not sure about which file should be used for a comparison/check). Note, however, that the tests in MSYS2 started with some cyclic reference issues for some .mod files, but this doesn’t show
up in any file I could check.<br>
> ><br>
> ><br>
> > I am still left with some doubts about the archiver, the cyclic reference errors and the differences in the test results, but I am able to link my code with petsc. Unfortunately, as this Windows porting is part of a large code restructuring, I can’t do
much more with it from my code right now. But if you can suggest some specific tutorial to use as a test, also in parallel, I would be glad to dig deeper into the matter.<br>
> ><br>
> > Best regards<br>
> ><br>
> > Paolo<br>
> ><br>
> > Sent from Mail<<a href="https://go.microsoft.com/fwlink/?LinkId=550986">https://go.microsoft.com/fwlink/?LinkId=550986</a>> for Windows 10<br>
> ><br>
> > From: Pierre Jolivet<<a href="mailto:pierre.jolivet@enseeiht.fr">mailto:pierre.jolivet@enseeiht.fr</a>><br>
> > Sent: Tuesday, June 30, 2020 15:22<br>
> > To: Paolo Lampitella<<a href="mailto:paololampitella@hotmail.com">mailto:paololampitella@hotmail.com</a>><br>
> > Cc: Matthew Knepley<<a href="mailto:knepley@gmail.com">mailto:knepley@gmail.com</a>>; petsc-users<<a href="mailto:petsc-users@mcs.anl.gov">mailto:petsc-users@mcs.anl.gov</a>><br>
> > Subject: Re: [petsc-users] PETSc and Windows 10<br>
> ><br>
> > Please use the 3.13.2 tarball, this was fixed by Satish in the previous commit I already linked (<a href="https://gitlab.com/petsc/petsc/-/commit/2cd8068296b34e127f055bb32f556e3599f17523">https://gitlab.com/petsc/petsc/-/commit/2cd8068296b34e127f055bb32f556e3599f17523</a>).<br>
> > (If you want FreeFEM to do the dirty work for you, just switch to the develop branch, and redo “make petsc-slepc”)<br>
> > But I think you’ve got everything you need now for a smooth compilation :)<br>
> ><br>
> > Thanks,<br>
> > Pierre<br>
> ><br>
> ><br>
> > On 30 Jun 2020, at 3:09 PM, Paolo Lampitella <paololampitella@hotmail.com<mailto:paololampitella@hotmail.com>> wrote:<br>
> ><br>
> > Dear Pierre,<br>
> ><br>
> > thanks for the fast response. Unfortunately it still fails, but now in the configure of ScaLAPACK<br>
> > (which means that it went ok for slepc, tetgen, metis, parmetis, ptscotch, superlu and suitesparse).<br>
> ><br>
> > The way I applied the modification is by manually editing the Makefile in the 3rdparty/ff-petsc folder, adding -fallow-invalid-boz to both CFLAGS and FFLAGS (this entry added by me). Then executed make petsc-slepc.<br>
> ><br>
> > As my project is much less ambitious, I have a good feeling that I will be able to use your Makefile successfully, but as I am kind of slow I thought it would be useful for you to know. The configure.log is attached. This time the error is:<br>
> ><br>
> > Rank mismatch between actual argument at (1) and actual argument at (2) (scalar and rank-1)<br>
> ><br>
> > in subroutine pclarf.f of ScaLAPACK.<br>
> ><br>
> > However, before attempting with my project, I have a few questions about your Makefile, in particular this piece:<br>
> ><br>
> > --with-mpi-lib=/c/Windows/System32/msmpi.dll --with-mpi-include=/home/paolo/FreeFem-sources/3rdparty/include/msmpi --with-mpiexec="/C/Program\ Files/Microsoft\ MPI/Bin/mpiexec"<br>
> ><br>
> > I see from MPI.py that I should not use ‘--with-mpi-lib/include’ if I want to use my now working mpi wrappers. Is this correct?<br>
> ><br>
> > Paolo<br>
> ><br>
> > Sent from Mail<<a href="https://go.microsoft.com/fwlink/?LinkId=550986">https://go.microsoft.com/fwlink/?LinkId=550986</a>> for Windows 10<br>
> ><br>
> > From: Pierre Jolivet<<a href="mailto:pierre.jolivet@enseeiht.fr">mailto:pierre.jolivet@enseeiht.fr</a>><br>
> > Sent: Monday, June 29, 2020 21:37<br>
> > To: Paolo Lampitella<<a href="mailto:paololampitella@hotmail.com">mailto:paololampitella@hotmail.com</a>><br>
> > Cc: Matthew Knepley<<a href="mailto:knepley@gmail.com">mailto:knepley@gmail.com</a>>; petsc-users<<a href="mailto:petsc-users@mcs.anl.gov">mailto:petsc-users@mcs.anl.gov</a>><br>
> > Subject: Re: [petsc-users] PETSc and Windows 10<br>
> ><br>
> > I do not give up easily on Windows problems:<br>
> > 1) that’s around 50% of our (FreeFEM) user-base (and I want them to use PETSc and SLEPc, ofc…)<br>
> > 2) most people I work with from corporations just have Windows laptops/desktops and I always recommend MSYS because it’s very lightweight and you can pass .exe around<br>
> > 3) I’ve bothered enough Satish, Jed, and Matt on GitLab to take (at least partially) the blame now when it doesn’t work on MSYS<br>
> ><br>
> > That being said, the magic keyword is the added flag FFLAGS="-fallow-invalid-boz" (see, I told you ./configure issues were easier to deal with than the others).<br>
> > Here you’ll see that everything goes through just fine (sorry, it took me a long time to post this because everything is slow on my VM):<br>
> > 1) <a href="http://jolivet.perso.enseeiht.fr/win10/configure.log">http://jolivet.perso.enseeiht.fr/win10/configure.log</a><br>
> > 2) <a href="http://jolivet.perso.enseeiht.fr/win10/make.log">http://jolivet.perso.enseeiht.fr/win10/make.log</a> (both steps #1 and #2 in MSYS terminal, gcc/gfortran 10, MS-MPI see screenshot)<br>
> > 3) <a href="http://jolivet.perso.enseeiht.fr/win10/ex2.txt">http://jolivet.perso.enseeiht.fr/win10/ex2.txt</a> (Command Prompt, 4 processes + MUMPS, I can send you the .exe if you want to try on your machine)<br>
> > I just realize that I didn’t generate the Fortran bindings, but you can see I compiled MUMPS and ScaLAPACK, so that shouldn’t be a problem.<br>
> > Or if there is a problem, we will need to fix this in PETSc.<br>
> ><br>
> > I’ll push this added flag to the FreeFEM repo, thanks for reminding me of the brokenness of gcc/gfortran 10 + MS-MPI.<br>
> > Here’s hoping this won’t affect PETSc ./configure with previous gcc/gfortran versions (unlikely, this option is apparently 13 years old:
<a href="https://gcc.gnu.org/bugzilla/show_bug.cgi?id=29471">https://gcc.gnu.org/bugzilla/show_bug.cgi?id=29471</a>)<br>
> ><br>
> > Let me know of the next hiccup, if any.<br>
> > Thanks,<br>
> > Pierre<br>
> ><br>
> ><br>
> ><br>
> > On 29 Jun 2020, at 8:09 PM, Paolo Lampitella <paololampitella@hotmail.com<mailto:paololampitella@hotmail.com>> wrote:<br>
> ><br>
> > Dear Pierre,<br>
> ><br>
> > thanks again for your time<br>
> ><br>
> > I guess there is no way for me to use the toolchain you are using (I don’t remember having any choice on which version of MSYS or GCC I could install)<br>
> ><br>
> > Paolo<br>
> ><br>
> > Sent from Mail<<a href="https://go.microsoft.com/fwlink/?LinkId=550986">https://go.microsoft.com/fwlink/?LinkId=550986</a>> for Windows 10<br>
> ><br>
> > From: Pierre Jolivet<<a href="mailto:pierre.jolivet@enseeiht.fr">mailto:pierre.jolivet@enseeiht.fr</a>><br>
> > Sent: Monday, June 29, 2020 20:01<br>
> > To: Matthew Knepley<<a href="mailto:knepley@gmail.com">mailto:knepley@gmail.com</a>><br>
> > Cc: Paolo Lampitella<<a href="mailto:paololampitella@hotmail.com">mailto:paololampitella@hotmail.com</a>>; petsc-users<<a href="mailto:petsc-users@mcs.anl.gov">mailto:petsc-users@mcs.anl.gov</a>><br>
> > Subject: Re: [petsc-users] PETSc and Windows 10<br>
> ><br>
> ><br>
> ><br>
> ><br>
> ><br>
> ><br>
> > On 29 Jun 2020, at 7:47 PM, Matthew Knepley <knepley@gmail.com<mailto:knepley@gmail.com>> wrote:<br>
> ><br>
> > On Mon, Jun 29, 2020 at 1:35 PM Paolo Lampitella <paololampitella@hotmail.com<mailto:paololampitella@hotmail.com>> wrote:<br>
> > Dear Pierre, sorry to bother you, but I already have some issues. What I did:<br>
> ><br>
> ><br>
> > * pacman -R mingw-w64-x86_64-python mingw-w64-x86_64-gdb (is gdb also troublesome?)<br>
> > * Followed points 6 and 7 at <a href="https://doc.freefem.org/introduction/installation.html#compilation-on-windows">
https://doc.freefem.org/introduction/installation.html#compilation-on-windows</a><br>
> ><br>
> > I first got a warning on the configure at point 6, as --disable-hips is not recognized. Then, on make ‘petsc-slepc’ of point 7 (no SUDO=sudo flag was necessary) I got to this point:<br>
> ><br>
> > tar xzf ../pkg/petsc-lite-3.13.0.tar.gz<br>
> > patch -p1 < petsc-suitesparse.patch<br>
> > patching file petsc-3.13.0/config/BuildSystem/config/packages/SuiteSparse.py<br>
> > touch petsc-3.13.0/tag-tar<br>
> > cd petsc-3.13.0 && ./configure MAKEFLAGS='' \<br>
> > --prefix=/home/paolo/freefem/ff-petsc//r \<br>
> > --with-debugging=0 COPTFLAGS='-O3 -mtune=generic' CXXOPTFLAGS='-O3 -mtune=generic' FOPTFLAGS='-O3 -mtune=generic' --with-cxx-dialect=C++11 --with-ssl=0 --with-x=0 --with-fortran-bindings=0 --with-shared-libraries=0 --with-cc='gcc' --with-cxx='g++'
--with-fc='gfortran' CXXFLAGS='-fno-stack-protector' CFLAGS='-fno-stack-protector' --with-scalar-type=real --with-mpi-lib='/c/Windows/System32/msmpi.dll' --with-mpi-include='/home/paolo/FreeFem-sources/3rdparty/include/msmpi' --with-mpiexec='/C/Program\ Files/Microsoft\
MPI/Bin/mpiexec' --with-blaslapack-include='' --with-blaslapack-lib='/mingw64/bin/libopenblas.dll' --download-scalapack --download-metis --download-ptscotch --download-mumps --download-hypre --download-parmetis --download-superlu --download-suitesparse --download-tetgen
--download-slepc '--download-metis-cmake-arguments=-G "MSYS Makefiles"' '--download-parmetis-cmake-arguments=-G "MSYS Makefiles"' '--download-superlu-cmake-arguments=-G "MSYS Makefiles"' '<br>
--<br>
> download-hypre-configure-arguments=--build=x86_64-linux-gnu --host=x86_64-linux-gnu' PETSC_ARCH=fr<br>
> > ===============================================================================<br>
> > Configuring PETSc to compile on your system<br>
> > ===============================================================================<br>
> > TESTING: FortranMPICheck from config.packages.MPI(config/BuildSystem/config/pack*******************************************************************************<br>
> > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details):<br>
> > -------------------------------------------------------------------------------<br>
> > Fortran error! mpi_init() could not be located!<br>
> > *******************************************************************************<br>
> ><br>
> > make: *** [Makefile:210: petsc-3.13.0/tag-conf-real] Error 1<br>
> ><br>
> > Note that I didn’t add anything to any PATH variable, because this is not mentioned in your documentation.<br>
> ><br>
> > On a side note, this is the same error I got when trying to build PETSc in Cygwin with the default OpenMPI available in Cygwin.<br>
> ><br>
> > I am attaching the configure.log… it seems to me that the error comes from the configure trying to include the mpif.h in your folder and not using the -fallow-invalid-boz flag that I had to use, for example, to compile mpi.f90 into mpi.mod<br>
> ><br>
> > But I’m not sure why this is happening<br>
> ><br>
> > Pierre,<br>
> ><br>
> > Could this be due to gcc 10?<br>
> ><br>
> > Sorry, I’m slow. You are right. Our workers use gcc 9, everything is fine, but I see on my VM which I updated that I use gcc 10 and had to disable Fortran, I guess the MUMPS run I showcased was with a prior PETSc build.<br>
> > I’ll try to resolve this and will keep you posted.<br>
> > They really caught a lot of people off guard with gfortran 10…<br>
> ><br>
> > Thanks,<br>
> > Pierre<br>
> ><br>
> ><br>
> ><br>
> ><br>
> > Executing: gfortran -c -o /tmp/petsc-ur0cff6a/config.libraries/conftest.o -I/tmp/petsc-ur0cff6a/config.compilers -I/tmp/petsc-ur0cff6a/config.setCompilers -I/tmp/petsc-ur0cff6a/config.compilersFortran -I/tmp/petsc-ur0cff6a/config.libraries -Wall -ffree-line-length-0
-Wno-unused-dummy-argument -O3 -mtune=generic -I/home/paolo/FreeFem-sources/3rdparty/include/msmpi /tmp/petsc-ur0cff6a/config.libraries/conftest.F90<br>
> > Possible ERROR while running compiler: exit code 1<br>
> > stderr:<br>
> > C:/msys64/home/paolo/FreeFem-sources/3rdparty/include/msmpi/mpif.h:227:36:<br>
> ><br>
> > 227 | PARAMETER (MPI_DATATYPE_NULL=z'0c000000')<br>
> > | 1<br>
> > Error: BOZ literal constant at (1) is neither a data-stmt-constant nor an actual argument to INT, REAL, DBLE, or CMPLX intrinsic function [see '-fno-allow-invalid-boz']<br>
> > C:/msys64/home/paolo/FreeFem-sources/3rdparty/include/msmpi/mpif.h:303:27:<br>
> ><br>
> > 303 | PARAMETER (MPI_CHAR=z'4c000101')<br>
> > | 1<br>
> > Error: BOZ literal constant at (1) is neither a data-stmt-constant nor an actual argument to INT, REAL, DBLE, or CMPLX intrinsic function [see '-fno-allow-invalid-boz']<br>
> > C:/msys64/home/paolo/FreeFem-sources/3rdparty/include/msmpi/mpif.h:305:36:<br>
> ><br>
> > 305 | PARAMETER (MPI_UNSIGNED_CHAR=z'4c000102')<br>
> > | 1<br>
> ><br>
> > Thanks,<br>
> ><br>
> > Matt<br>
> ><br>
> > Thanks<br>
> ><br>
> > Paolo<br>
> ><br>
> > Sent from Mail<<a href="https://go.microsoft.com/fwlink/?LinkId=550986">https://go.microsoft.com/fwlink/?LinkId=550986</a>> for Windows 10<br>
> ><br>
> > From: Pierre Jolivet<<a href="mailto:pierre.jolivet@enseeiht.fr">mailto:pierre.jolivet@enseeiht.fr</a>><br>
> > Sent: Monday, June 29, 2020 18:34<br>
> > To: Paolo Lampitella<<a href="mailto:paololampitella@hotmail.com">mailto:paololampitella@hotmail.com</a>><br>
> > Cc: Satish Balay<<a href="mailto:balay@mcs.anl.gov">mailto:balay@mcs.anl.gov</a>>; petsc-users<<a href="mailto:petsc-users@mcs.anl.gov">mailto:petsc-users@mcs.anl.gov</a>><br>
> > Subject: Re: [petsc-users] PETSc and Windows 10<br>
> ><br>
> ><br>
> ><br>
> > On 29 Jun 2020, at 6:27 PM, Paolo Lampitella <paololampitella@hotmail.com<mailto:paololampitella@hotmail.com>> wrote:<br>
> ><br>
> > I think I made the first step of having mingw64 from msys2 working with ms-mpi.<br>
> ><br>
> > I found that the issue I was having was related to:<br>
> ><br>
> > <a href="https://gcc.gnu.org/bugzilla/show_bug.cgi?id=91556">https://gcc.gnu.org/bugzilla/show_bug.cgi?id=91556</a><br>
> ><br>
> > and, probably (but impossible to check now), I was using an msys2 and/or mingw mpi package before this fix:<br>
> ><br>
> > <a href="https://github.com/msys2/MINGW-packages/commit/11b4cff3d2ec7411037b692b0ad5a9f3e9b9978d#diff-eac59989e3096be97d940c8f47b50fba">
https://github.com/msys2/MINGW-packages/commit/11b4cff3d2ec7411037b692b0ad5a9f3e9b9978d#diff-eac59989e3096be97d940c8f47b50fba</a><br>
> ><br>
> > Admittedly, I never used gcc 10 before on any machine. Still, I feel that reporting that sort of error in that way is,<br>
> > at least, misleading (I would have preferred the initial implementation, as mentioned in the gcc bug tracker).<br>
> ><br>
> > A second thing that I was not used to, and which made me more uncertain of the procedure I was following, is having to compile the mpi module myself. There are several versions of this out there, but I decided to stick with this one:<br>
> ><br>
> > <a href="https://www.scivision.dev/windows-mpi-msys2/">https://www.scivision.dev/windows-mpi-msys2/</a><br>
> ><br>
> > even if there seems to be no need to include -fno-range-check and the current mpi.f90 version is different from the mpif.h as reported here:<br>
> ><br>
> > <a href="https://github.com/microsoft/Microsoft-MPI/issues/33">https://github.com/microsoft/Microsoft-MPI/issues/33</a><br>
> ><br>
> > which, to me, are both signs of a lack of attention to the Fortran side by those who maintain this thing.<br>
> ><br>
> > In summary, this is the procedure I followed so far (on a 64 bit machine with Windows 10):<br>
> ><br>
> ><br>
> > * Install MSYS2 from <a href="https://www.msys2.org/">https://www.msys2.org/</a> and just follow the install wizard<br>
> > * Open the MSYS2 terminal and execute: pacman -Syuu<br>
> > * Close the terminal when asked and reopen it<br>
> > * Keep executing ‘pacman -Syuu’ until nothing else needs to be updated<br>
> > * Close the MSYS2 terminal and reopen it (I guess because I was in paranoid mode), then install packages with:<br>
> ><br>
> ><br>
> > pacman -S base-devel git gcc gcc-fortran bsdcpio lndir pax-git unzip<br>
> > pacman -S mingw-w64-x86_64-toolchain<br>
> > pacman -S mingw-w64-x86_64-msmpi<br>
> > pacman -S mingw-w64-x86_64-cmake<br>
> > pacman -S mingw-w64-x86_64-freeglut<br>
> > pacman -S mingw-w64-x86_64-gsl<br>
> > pacman -S mingw-w64-x86_64-libmicroutils<br>
> > pacman -S mingw-w64-x86_64-hdf5<br>
> > pacman -S mingw-w64-x86_64-openblas<br>
> > pacman -S mingw-w64-x86_64-arpack<br>
> > pacman -S mingw-w64-x86_64-jq<br>
> ><br>
> > This set should include all the libraries mentioned by Pierre and/or used by his Jenkins, as the final scope here is to have PETSc and dependencies working. But I think that for pure MPI one could stop at msmpi (or even, maybe, just install msmpi and have
the dependencies figured out by pacman). Honestly, I don’t remember the exact order I used to install the packages, but this should not affect things. Also, as I was still in paranoid mode, I kept executing ‘pacman -Syuu’ after each package was installed.
After this, close the MSYS2 terminal.<br>
> ><br>
> ><br>
> > * Open the MINGW64 terminal and create the .mod file out of the mpi.f90 file, as mentioned here
<a href="https://www.scivision.dev/windows-mpi-msys2/">https://www.scivision.dev/windows-mpi-msys2/</a>, with:<br>
> ><br>
> ><br>
> > cd /mingw64/include<br>
> > gfortran -c mpi.f90 -fno-range-check -fallow-invalid-boz<br>
> ><br>
> > Ah, yes, that’s new to gfortran 10 (we use gfortran 9 on our workers), which is now what ships with MSYS2 (we haven’t updated yet). Sorry that I forgot about that.<br>
> ><br>
> > This is needed to ‘USE mpi’ (as opposed to INCLUDE ‘mpif.h’)<br>
> ><br>
> ><br>
> > * Install the latest MS-MPI (both sdk and setup) from <a href="https://www.microsoft.com/en-us/download/details.aspx?id=100593">
https://www.microsoft.com/en-us/download/details.aspx?id=100593</a><br>
> ><br>
> ><br>
> > At this point I’ve been able to compile (using the MINGW64 terminal) different mpi test programs and they run as expected in the classical Windows prompt. I added this function to my .bashrc in MSYS2 in order to easily copy the required dependencies out
of MSYS:<br>
> ><br>
> > function copydep() { ldd $1 | grep "=> /$2" | awk '{print $3}' | xargs -I '{}' cp -v '{}' .; }<br>
> ><br>
> > which can be used, with the MINGW64 terminal, by navigating to the folder where the final executable, say, my.exe, resides (even if under a Windows path) and executing:<br>
> ><br>
> > copydep my.exe mingw64<br>
> ><br>
> > This, of course, must be done before actually trying to execute the .exe in the windows cmd prompt.<br>
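The pipeline inside copydep can be seen in isolation on canned ldd output (the sample lines below are made up for illustration; the real function feeds the extracted paths to cp):

```shell
# grep keeps only lines whose DLL resolves under /mingw64; awk extracts the path
ldd_sample='msys-2.0.dll => /usr/bin/msys-2.0.dll (0x180040000)
libgfortran-5.dll => /mingw64/bin/libgfortran-5.dll (0x6ec40000)
KERNEL32.DLL => /c/WINDOWS/System32/KERNEL32.DLL (0x7ff83b000000)'
echo "$ldd_sample" | grep "=> /mingw64" | awk '{print $3}'
# prints /mingw64/bin/libgfortran-5.dll, the one dependency that must be copied
```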
> ><br>
> > Hopefully, I should now be able to follow Pierre’s instructions for PETSc (but first I wanna give a try to the system python before removing it)<br>
> ><br>
> > Looks like the hard part is over. It’s usually easier to deal with ./configure issues.<br>
> > If you have weird errors like “incomplete Cygwin install” or whatever, this is the kind of issues I was referring to earlier.<br>
> > In that case, what I’d suggest is just, as before:<br>
> > pacman -R mingw-w64-x86_64-python mingw-w64-x86_64-gdb<br>
> > pacman -S python<br>
> ><br>
> > Thanks,<br>
> > Pierre<br>
> ><br>
> > Thanks<br>
> ><br>
> > Paolo<br>
> ><br>
> ><br>
> ><br>
> ><br>
> > --<br>
> > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>
> > -- Norbert Wiener<br>
> ><br>
> > <a href="https://www.cse.buffalo.edu/~knepley/">https://www.cse.buffalo.edu/~knepley/</a><br>
> ><br>
> ><br>
> > <configure.log><br>
> ><br>
> ><br>
> > <petsc_test.txt><br>
> ><br>
> ><br>
> <br>
> <o:p></o:p></p>
<p class="MsoNormal"><o:p> </o:p></p>
</div>
</body>
</html>