[petsc-users] PETSc and Windows 10

Paolo Lampitella paololampitella at hotmail.com
Sun Jul 5 07:00:45 CDT 2020


Thank you very much Pierre.

I'll keep you informed in case I see any relevant change in the tests when using your suggestion.

Paolo



Sent from my Samsung Galaxy smartphone.



-------- Original message --------
From: Pierre Jolivet <pierre.jolivet at enseeiht.fr>
Date: 05/07/20 13:45 (GMT+01:00)
To: Paolo Lampitella <paololampitella at hotmail.com>
Cc: Matthew Knepley <knepley at gmail.com>, petsc-users <petsc-users at mcs.anl.gov>
Subject: Re: [petsc-users] PETSc and Windows 10

Hello Paolo,

On 5 Jul 2020, at 1:15 PM, Paolo Lampitella <paololampitella at hotmail.com> wrote:

Dear all,

I just want to update you on my journey to PETSc compilation on Windows under MSYS2+MINGW64.

Unfortunately, I haven’t been able to compile petsc-slepc through FreeFEM but, as my final goal also required the Fortran bindings (while I only needed blas, lapack, metis and hypre), I decided to follow my own route using the useful information from Pierre.


  *   I started by installing MPI from https://www.microsoft.com/en-us/download/details.aspx?id=100593. I don’t think the SDK is actually needed in my specific workflow, but I installed it as well together with mpisetup.
  *   Then I installed MSYS2, just following the wizard. I opened the MSYS2 terminal and updated with pacman -Syuu, closed it when asked, reopened it, and ran pacman -Syuu again several times until no more updates were available. Then I closed it and opened it again.
  *   Under the MSYS2 terminal installed just the following packages:



     *   pacman -S base-devel git gcc gcc-fortran
     *   pacman -S mingw-w64-x86_64-toolchain
     *   pacman -S mingw-w64-x86_64-cmake
     *   pacman -S mingw-w64-x86_64-msmpi



  *   Closed the MSYS2 terminal and opened the MINGW64 one, went to /mingw64/include and compiled my mpi module following https://www.scivision.dev/windows-mpi-msys2/:



     *   gfortran -c mpi.f90 -fno-range-check -fallow-invalid-boz


However, I will keep an eye on the MS-MPI GitHub repository because the Fortran side seems to be far from perfect (a small sanity test for the freshly built module is sketched just below).
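
As a quick sanity check that the freshly built mpi.mod and the MS-MPI runtime play together (this is only an illustrative test of mine, not something the PETSc build needs; I’m assuming the libmsmpi import library that mingw-w64-x86_64-msmpi installs under /mingw64/lib), something along these lines can be compiled and run from the MINGW64 terminal:

cat > hello_mpi.f90 << 'EOF'
program hello_mpi
  use mpi                    ! the module just built in /mingw64/include
  implicit none
  integer :: ierr, rank, nprocs
  call MPI_Init(ierr)
  call MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)
  call MPI_Comm_size(MPI_COMM_WORLD, nprocs, ierr)
  print *, 'Hello from rank', rank, 'of', nprocs
  call MPI_Finalize(ierr)
end program hello_mpi
EOF
gfortran hello_mpi.f90 -I/mingw64/include -lmsmpi -o hello_mpi.exe
/C/Program\ Files/Microsoft\ MPI/Bin/mpiexec -n 2 ./hello_mpi.exe

(to run it from the plain Windows prompt instead, the mingw64 runtime DLLs first have to be copied next to the executable, as discussed later in this thread)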


  *   Then I downloaded the 3.13.3 version of PETSc and configured it, still under the MINGW64 terminal, with the following command:


/usr/bin/python ./configure --prefix=/home/paolo/petsc --with-ar=/usr/bin/ar
--with-shared-libraries=0 --with-debugging=0 --with-windows-graphics=0 --with-x=0
COPTFLAGS="-O3 -mtune=native"
CXXOPTFLAGS="-O3 -mtune=native"
FOPTFLAGS="-O3 -mtune=native"
FFLAGS=-fallow-invalid-boz
--with-mpiexec="/C/Program\ Files/Microsoft\ MPI/Bin/mpiexec"
--download-fblaslapack --download-metis --download-hypre
--download-metis-cmake-arguments='-G "MSYS Makefiles"'
--download-hypre-configure-arguments="--build=x86_64-linux-gnu --host=x86_64-linux-gnu"

Note that I just bypassed uninstalling python in mingw64 (which doesn’t work) by using /usr/bin/python, and that, as opposed to Pierre, I also needed to use the MSYS2 archiver (/usr/bin/ar) instead of the mingw64 one (/mingw64/bin/ar, which shows up in Pierre’s configure), as also mentioned here http://hillyuan.blogspot.com/2017/11/build-petsc-in-windows-under-mingw64.html, probably because of this issue: https://stackoverflow.com/questions/37504625/ar-on-msys2-shell-receives-truncated-paths-when-called-from-makefile.

You are right that you can avoid uninstalling mingw-w64-x86_64-python if you can supply the proper Python yourself (we don’t have that luxury in our Makefile).
If you want to avoid using that AR, and stick to /mingw64/bin/ar (not sure what the pros and cons are), you can either:
- use another PETSC_ARCH (very short, like pw, for petsc-windows);
- use --with-single-library=0.
See this post on GitLab https://gitlab.com/petsc/petsc/-/issues/647#note_373507681
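
In configure terms, the two alternatives would look roughly like this (just a sketch, everything else unchanged with respect to your command line; as far as I understand, the point is to keep the argument lists handed to ar short enough not to get truncated):

./configure PETSC_ARCH=pw --with-ar=/mingw64/bin/ar [...same options as above...]
./configure --with-single-library=0 --with-ar=/mingw64/bin/ar [...same options as above...]
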
The OS I’m referring to is indeed my Windows + MSYS2 box.

Thanks,
Pierre

Then make all, make install and make check all went smoothly. Also, I don’t know exactly what --with-x=0 and --with-windows-graphics=0 do, but I think it is stuff that I don’t need (though configure worked with windows graphics enabled as well).
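
(For completeness, those are just the standard commands that configure itself prints at the end of its output; the source-tree path and the PETSC_ARCH name below are placeholders for whatever applies on a given system:

make PETSC_DIR=/home/paolo/petsc-3.13.3 PETSC_ARCH=arch-mswin-c-opt all
make PETSC_DIR=/home/paolo/petsc-3.13.3 PETSC_ARCH=arch-mswin-c-opt install
make PETSC_DIR=/home/paolo/petsc PETSC_ARCH="" check

with the check run against the --prefix install in /home/paolo/petsc.)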


  *   Finally, I launched make test. As some tests failed, I replicated the same install procedure on all the systems I have available on this same Windows machine (Ubuntu 20.04 and CentOS 8 under a VirtualBox 6.0.22 VM, Ubuntu 20.04 under WSL1, and the MSYS2-MINGW64 toolchain). I am attaching a file with the results printed to screen (I am not sure which file should be used for a comparison/check). Note, however, that the tests in MSYS2 started with some cyclic reference issues for some .mod files, although this doesn’t show up in any file I could check.


I am still left with some doubts about the archiver, the cyclic reference errors and the differences in the test results, but I am able to link my code with PETSc. Unfortunately, as this Windows port is part of a larger code restructuring, I can’t do much more with it from my code right now. But if you can suggest a specific tutorial to use as a test, also for the parallel case, I would be glad to dig deeper into the matter.
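
(To be concrete, what I have in mind is a minimal parallel run along these lines — the paths and options are only my guess, using the KSP tutorial ex2 from Pierre’s earlier message; in 3.13 it sits under src/ksp/ksp/examples/tutorials of the source tree:

cd src/ksp/ksp/examples/tutorials
make PETSC_DIR=/home/paolo/petsc PETSC_ARCH="" ex2
/C/Program\ Files/Microsoft\ MPI/Bin/mpiexec -n 4 ./ex2.exe -ksp_monitor -ksp_converged_reason

run from the MINGW64 terminal, or from the Windows prompt after copying the needed DLLs next to the executable.)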

Best regards

Paolo

Sent from Mail<https://go.microsoft.com/fwlink/?LinkId=550986> for Windows 10

From: Pierre Jolivet <pierre.jolivet at enseeiht.fr>
Sent: Tuesday, 30 June 2020 15:22
To: Paolo Lampitella <paololampitella at hotmail.com>
Cc: Matthew Knepley <knepley at gmail.com>; petsc-users <petsc-users at mcs.anl.gov>
Subject: Re: [petsc-users] PETSc and Windows 10

Please use the 3.13.2 tarball; this was fixed by Satish in the previous commit I already linked (https://gitlab.com/petsc/petsc/-/commit/2cd8068296b34e127f055bb32f556e3599f17523).
(If you want FreeFEM to do the dirty work for you, just switch to the develop branch, and redo “make petsc-slepc”)
But I think you’ve got everything you need now for a smooth compilation :)

Thanks,
Pierre


On 30 Jun 2020, at 3:09 PM, Paolo Lampitella <paololampitella at hotmail.com> wrote:

Dear Pierre,

thanks for the fast response. Unfortunately it still fails, but now in the configure of ScaLAPACK
(which means that it went ok for slepc, tetgen, metis, parmetis, ptscotch, superlu and suitesparse).

The way I applied the modification is by manually editing the Makefile in the 3rdparty/ff-petsc folder, adding -fallow-invalid-boz to both CFLAGS and FFLAGS (this entry added by me). Then executed make petsc-slepc.

As my project is much less ambitious, I have a good feeling that I will be able to use your Makefile successfully, but as I am kind of slow, I thought it would be useful for you to know. The configure.log is attached. This time the error is:

Rank mismatch between actual argument at (1) and actual argument at (2) (scalar and rank-1)

in subroutine pclarf.f of ScaLAPACK.

However, before attempting this with my project, I have a few questions about your Makefile, in particular this piece:

--with-mpi-lib=/c/Windows/System32/msmpi.dll --with-mpi-include=/home/paolo/FreeFem-sources/3rdparty/include/msmpi --with-mpiexec="/C/Program\ Files/Microsoft\ MPI/Bin/mpiexec"

I see from MPI.py that I should not use ‘--with-mpi-lib/include’ if I want to use my now working mpi wrappers. Is this correct?
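
(If I read MPI.py correctly, configure wants MPI specified in only one of the usual mutually exclusive ways, roughly:

--with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90        # let compiler wrappers provide MPI
--with-mpi-dir=/path/to/mpi                               # or point it at an MPI installation root
--with-mpi-include=/path/to/include --with-mpi-lib=/path/to/libmsmpi.a   # or spell out include/lib explicitly

so mixing the explicit include/lib style with working wrappers is what should be avoided — but please correct me if I got that wrong.)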

Paolo

Sent from Mail<https://go.microsoft.com/fwlink/?LinkId=550986> for Windows 10

From: Pierre Jolivet <pierre.jolivet at enseeiht.fr>
Sent: Monday, 29 June 2020 21:37
To: Paolo Lampitella <paololampitella at hotmail.com>
Cc: Matthew Knepley <knepley at gmail.com>; petsc-users <petsc-users at mcs.anl.gov>
Subject: Re: [petsc-users] PETSc and Windows 10

I do not give up easily on Windows problems:
1) that’s around 50% of our (FreeFEM) user-base (and I want them to use PETSc and SLEPc, ofc…)
2) most people I work with from corporations just have Windows laptops/desktops and I always recommend MSYS because it’s very lightweight and you can pass .exe around
3) I’ve bothered Satish, Jed, and Matt enough on GitLab to take (at least partially) the blame now when it doesn’t work on MSYS

That being said, the magic keyword is the added flag FFLAGS="-fallow-invalid-boz" (see, I told you ./configure issues were easier to deal with than the others).
Here you’ll see that everything goes through just fine (sorry, it took me a long time to post this because everything is slow on my VM):
1) http://jolivet.perso.enseeiht.fr/win10/configure.log
2) http://jolivet.perso.enseeiht.fr/win10/make.log (both steps #1 and #2 in the MSYS terminal, gcc/gfortran 10, MS-MPI; see screenshot)
3) http://jolivet.perso.enseeiht.fr/win10/ex2.txt (Command Prompt, 4 processes + MUMPS, I can send you the .exe if you want to try on your machine)
I just realized that I didn’t generate the Fortran bindings, but you can see I compiled MUMPS and ScaLAPACK, so that shouldn’t be a problem.
Or if there is a problem, we will need to fix this in PETSc.
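
(For the record, the recipe above passes --with-fortran-bindings=0 to ./configure; for a Fortran code you would drop that option or set

--with-fortran-bindings=1

explicitly — if I remember correctly, the bindings are built by default anyway when a Fortran compiler is found.)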

I’ll push this added flag to the FreeFEM repo, thanks for reminding me of the brokenness of gcc/gfortran 10 + MS-MPI.
Here’s hoping this won’t affect the PETSc ./configure with previous gcc/gfortran versions (unlikely, this option is apparently 13 years old: https://gcc.gnu.org/bugzilla/show_bug.cgi?id=29471).

Let me know of the next hiccup, if any.
Thanks,
Pierre



On 29 Jun 2020, at 8:09 PM, Paolo Lampitella <paololampitella at hotmail.com> wrote:

Dear Pierre,

thanks again for your time

I guess there is no way for me to use the toolchain you are using (I don’t remember having any choice on which version of MSYS or GCC I could install)

Paolo

Sent from Mail<https://go.microsoft.com/fwlink/?LinkId=550986> for Windows 10

From: Pierre Jolivet <pierre.jolivet at enseeiht.fr>
Sent: Monday, 29 June 2020 20:01
To: Matthew Knepley <knepley at gmail.com>
Cc: Paolo Lampitella <paololampitella at hotmail.com>; petsc-users <petsc-users at mcs.anl.gov>
Subject: Re: [petsc-users] PETSc and Windows 10






On 29 Jun 2020, at 7:47 PM, Matthew Knepley <knepley at gmail.com> wrote:

On Mon, Jun 29, 2020 at 1:35 PM Paolo Lampitella <paololampitella at hotmail.com> wrote:
Dear Pierre, sorry to bother you, but I already have some issues. What I did:


  *   pacman -R mingw-w64-x86_64-python mingw-w64-x86_64-gdb (is gdb also troublesome?)
  *   Followed points 6 and 7 at https://doc.freefem.org/introduction/installation.html#compilation-on-windows

I first got a warning from the configure at point 6, as --disable-hips is not recognized. Then, on ‘make petsc-slepc’ at point 7 (no SUDO=sudo flag was necessary), I got to this point:

tar xzf ../pkg/petsc-lite-3.13.0.tar.gz
patch -p1 < petsc-suitesparse.patch
patching file petsc-3.13.0/config/BuildSystem/config/packages/SuiteSparse.py
touch petsc-3.13.0/tag-tar
cd petsc-3.13.0 && ./configure MAKEFLAGS='' \
        --prefix=/home/paolo/freefem/ff-petsc//r \
        --with-debugging=0 COPTFLAGS='-O3 -mtune=generic' CXXOPTFLAGS='-O3 -mtune=generic' FOPTFLAGS='-O3 -mtune=generic' --with-cxx-dialect=C++11 --with-ssl=0 --with-x=0 --with-fortran-bindings=0 --with-shared-libraries=0 --with-cc='gcc' --with-cxx='g++' --with-fc='gfortran' CXXFLAGS='-fno-stack-protector' CFLAGS='-fno-stack-protector' --with-scalar-type=real --with-mpi-lib='/c/Windows/System32/msmpi.dll' --with-mpi-include='/home/paolo/FreeFem-sources/3rdparty/include/msmpi' --with-mpiexec='/C/Program\ Files/Microsoft\ MPI/Bin/mpiexec' --with-blaslapack-include='' --with-blaslapack-lib='/mingw64/bin/libopenblas.dll' --download-scalapack --download-metis --download-ptscotch --download-mumps --download-hypre --download-parmetis --download-superlu --download-suitesparse --download-tetgen --download-slepc '--download-metis-cmake-arguments=-G "MSYS Makefiles"' '--download-parmetis-cmake-arguments=-G "MSYS Makefiles"' '--download-superlu-cmake-arguments=-G "MSYS Makefiles"' '--download-hypre-configure-arguments=--build=x86_64-linux-gnu --host=x86_64-linux-gnu' PETSC_ARCH=fr
===============================================================================
             Configuring PETSc to compile on your system
===============================================================================
TESTING: FortranMPICheck from config.packages.MPI(config/BuildSystem/config/pack*******************************************************************************
         UNABLE to CONFIGURE with GIVEN OPTIONS    (see configure.log for details):
-------------------------------------------------------------------------------
Fortran error! mpi_init() could not be located!
*******************************************************************************

make: *** [Makefile:210: petsc-3.13.0/tag-conf-real] Errore 1

Note that I didn’t add anything to any PATH variable, because this is not mentioned in your documentation.

On a side note, this is the same error I got when trying to build PETSc in Cygwin with the default OpenMPI available in Cygwin.

I am attaching the configure.log… it seems to me that the error comes from configure trying to include the mpif.h from your folder without the -fallow-invalid-boz flag that I had to use, for example, to compile mpi.f90 into mpi.mod.

But I’m not sure why this is happening

Pierre,

Could this be due to gcc 10?

Sorry, I’m slow. You are right: our workers use gcc 9 and everything is fine there, but I see on my VM, which I updated, that I use gcc 10 and had to disable Fortran. I guess the MUMPS run I showcased was with a prior PETSc build.
I’ll try to resolve this and will keep you posted.
They really caught a lot of people off guard with gfortran 10…

Thanks,
Pierre




Executing: gfortran -c -o /tmp/petsc-ur0cff6a/config.libraries/conftest.o -I/tmp/petsc-ur0cff6a/config.compilers -I/tmp/petsc-ur0cff6a/config.setCompilers -I/tmp/petsc-ur0cff6a/config.compilersFortran -I/tmp/petsc-ur0cff6a/config.libraries  -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -O3 -mtune=generic   -I/home/paolo/FreeFem-sources/3rdparty/include/msmpi /tmp/petsc-ur0cff6a/config.libraries/conftest.F90
Possible ERROR while running compiler: exit code 1
stderr:
C:/msys64/home/paolo/FreeFem-sources/3rdparty/include/msmpi/mpif.h:227:36:

  227 |        PARAMETER (MPI_DATATYPE_NULL=z'0c000000')
      |                                    1
Error: BOZ literal constant at (1) is neither a data-stmt-constant nor an actual argument to INT, REAL, DBLE, or CMPLX intrinsic function [see '-fno-allow-invalid-boz']
C:/msys64/home/paolo/FreeFem-sources/3rdparty/include/msmpi/mpif.h:303:27:

  303 |        PARAMETER (MPI_CHAR=z'4c000101')
      |                           1
Error: BOZ literal constant at (1) is neither a data-stmt-constant nor an actual argument to INT, REAL, DBLE, or CMPLX intrinsic function [see '-fno-allow-invalid-boz']
C:/msys64/home/paolo/FreeFem-sources/3rdparty/include/msmpi/mpif.h:305:36:

  305 |        PARAMETER (MPI_UNSIGNED_CHAR=z'4c000102')
      |                                    1

  Thanks,

     Matt

Thanks

Paolo

Sent from Mail<https://go.microsoft.com/fwlink/?LinkId=550986> for Windows 10

From: Pierre Jolivet <pierre.jolivet at enseeiht.fr>
Sent: Monday, 29 June 2020 18:34
To: Paolo Lampitella <paololampitella at hotmail.com>
Cc: Satish Balay <balay at mcs.anl.gov>; petsc-users <petsc-users at mcs.anl.gov>
Subject: Re: [petsc-users] PETSc and Windows 10



On 29 Jun 2020, at 6:27 PM, Paolo Lampitella <paololampitella at hotmail.com> wrote:

I think I have made the first step of getting mingw64 from MSYS2 working with MS-MPI.

I found that the issue I was having was related to:

https://gcc.gnu.org/bugzilla/show_bug.cgi?id=91556

and, probably (but impossible to check now), I was using an msys2 and/or mingw mpi package before this fix:

https://github.com/msys2/MINGW-packages/commit/11b4cff3d2ec7411037b692b0ad5a9f3e9b9978d#diff-eac59989e3096be97d940c8f47b50fba

Admittedly, I never used gcc 10 before on any machine. Still, I feel that reporting that sort of error in that way is,
at least, misleading (I would have preferred the initial implementation as mentioned in the gcc bug track).

A second thing that I was not used to, and which made me more uncertain about the procedure I was following, is having to compile the MPI module myself. There are several versions of this out there, but I decided to stick with this one:

https://www.scivision.dev/windows-mpi-msys2/

even if there seems to be no need to include -fno-range-check, and the current mpi.f90 version is not in sync with the mpif.h, as reported here:

https://github.com/microsoft/Microsoft-MPI/issues/33

which, to me, are both signs of a lack of attention to the Fortran side by those who maintain this thing.

In summary, this is the procedure I followed so far (on a 64 bit machine with Windows 10):


  *   Install MSYS2 from https://www.msys2.org/ and just follow the install wizard
  *   Open the MSYS2 terminal and execute: pacman -Syuu
  *   Close the terminal when asked and reopen it
  *   Keep executing ‘pacman -Syuu’ until nothing else needs to be updated
  *   Close the MSYS2 terminal and reopen it (I guess because I was in paranoid mode), then install the packages with:


pacman -S base-devel git gcc gcc-fortran bsdcpio lndir pax-git unzip
pacman -S mingw-w64-x86_64-toolchain
pacman -S mingw-w64-x86_64-msmpi
pacman -S mingw-w64-x86_64-cmake
pacman -S mingw-w64-x86_64-freeglut
pacman -S mingw-w64-x86_64-gsl
pacman -S mingw-w64-x86_64-libmicroutils
pacman -S mingw-w64-x86_64-hdf5
pacman -S mingw-w64-x86_64-openblas
pacman -S mingw-w64-x86_64-arpack
pacman -S mingw-w64-x86_64-jq

This set should include all the libraries mentioned by Pierre and/or used by his Jenkins setup, as the final goal here is to have PETSc and its dependencies working. But I think that for pure MPI one could stop at msmpi (maybe even just install msmpi and have the dependencies figured out by pacman). Honestly, I don’t remember the exact order in which I installed the packages, but this should not affect things. Also, as I was still in paranoid mode, I kept executing ‘pacman -Syuu’ after each package was installed. After this, close the MSYS2 terminal.


  *   Open the MINGW64 terminal and create the .mod file out of the mpi.f90 file, as mentioned here https://www.scivision.dev/windows-mpi-msys2/, with:


cd /mingw64/include
gfortran -c mpi.f90 -fno-range-check -fallow-invalid-boz

Ah, yes, that’s new to gfortran 10 (we use gfortran 9 on our workers), which is now what ships with MSYS2 (we haven’t updated yet). Sorry that I forgot about that.

This is needed to ‘USE mpi’ (as opposed to INCLUDE ‘mpif.h’)


  *   Install the latest MS-MPI (both sdk and setup) from https://www.microsoft.com/en-us/download/details.aspx?id=100593


At this point I’ve been able to compile (using the MINGW64 terminal) different MPI test programs, and they run as expected in the classical Windows prompt. I added this function to my .bashrc in MSYS2 in order to easily copy the required dependencies out of MSYS:

function copydep() { ldd $1 | grep "=> /$2" | awk '{print $3}' | xargs -I '{}' cp -v '{}' .; }

which can be used, with the MINGW64 terminal, by navigating to the folder where the final executable, say, my.exe, resides (even if under a Windows path) and executing:

copydep my.exe mingw64

This, of course, must be done before actually trying to execute the .exe in the windows cmd prompt.
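
For clarity, this is all the one-liner above does, stage by stage (same function, just spread out and commented):

function copydep() {
  ldd $1 |                       # list the DLLs the executable depends on
  grep "=> /$2" |                # keep only those resolved under the given prefix, e.g. /mingw64
  awk '{print $3}' |             # extract the full path column
  xargs -I '{}' cp -v '{}' . ;   # copy each of them next to the executable
}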

Hopefully, I should now be able to follow Pierre’s instructions for PETSc (but first I want to give the system python a try before removing it).

Looks like the hard part is over. It’s usually easier to deal with ./configure issues.
If you have weird errors like “incomplete Cygwin install” or whatever, this is the kind of issue I was referring to earlier.
In that case, what I’d suggest is just, as before:
pacman -R mingw-w64-x86_64-python mingw-w64-x86_64-gdb
pacman -S python

Thanks,
Pierre

Thanks

Paolo




--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/


<configure.log>


<petsc_test.txt>
