[petsc-users] Solver compilation with 64-bit version of PETSc under Windows 10 using Cygwin
Smith, Barry F.
bsmith at mcs.anl.gov
Mon Jan 20 07:32:04 CST 2020
First you need to figure out what is triggering:
C:/MPI/Bin/mpiexec.exe: error while loading shared libraries: ?: cannot open shared object file: No such file or directory
Googling it finds all kinds of suggestions for Linux. But Windows? Maybe the debugger will help.
Second
> VecNorm_Seq line 221 /cygdrive/d/Computational_geomechanics/installation/petsc-barry/src/vec/vec/impls/seq/bvec2.c
Debugger is best to find out what is triggering this. Since it is the C side of things it would be odd that the Fortran change affects it.
Barry
> On Jan 20, 2020, at 4:43 AM, Дмитрий Мельничук <dmitry.melnichuk at geosteertech.com> wrote:
>
> Thank you so much for your assistance!
>
> As far as I have been able to find out, the errors "Type mismatch in argument ‘ierr’" have been successfully fixed.
> But execution of the command "make PETSC_DIR=/cygdrive/d/... PETSC_ARCH=arch-mswin-c-debug check" leads to the appearance of a Segmentation Violation error.
>
> I compiled PETSc with Microsoft MPI v10.
> Does it make sense to compile PETSc with another MPI implementation (such as MPICH) in order to resolve the issue?
>
> Error message:
> Running test examples to verify correct installation
> Using PETSC_DIR=/cygdrive/d/Computational_geomechanics/installation/petsc-barry and PETSC_ARCH=arch-mswin-c-debug
> Possible error running C/C++ src/snes/examples/tutorials/ex19 with 1 MPI process
> See http://www.mcs.anl.gov/petsc/documentation/faq.html
> C:/MPI/Bin/mpiexec.exe: error while loading shared libraries: ?: cannot open shared object file: No such file or directory
> Possible error running C/C++ src/snes/examples/tutorials/ex19 with 2 MPI processes
> See http://www.mcs.anl.gov/petsc/documentation/faq.html
> C:/MPI/Bin/mpiexec.exe: error while loading shared libraries: ?: cannot open shared object file: No such file or directory
> Possible error running Fortran example src/snes/examples/tutorials/ex5f with 1 MPI process
> See http://www.mcs.anl.gov/petsc/documentation/faq.html
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> [0]PETSC ERROR: or see https://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
> [0]PETSC ERROR: likely location of problem given in stack below
> [0]PETSC ERROR: --------------------- Stack Frames ------------------------------------
> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
> [0]PETSC ERROR: INSTEAD the line number of the start of the function
> [0]PETSC ERROR: is given.
> [0]PETSC ERROR: [0] VecNorm_Seq line 221 /cygdrive/d/Computational_geomechanics/installation/petsc-barry/src/vec/vec/impls/seq/bvec2.c
> [0]PETSC ERROR: [0] VecNorm line 213 /cygdrive/d/Computational_geomechanics/installation/petsc-barry/src/vec/vec/interface/rvector.c
> [0]PETSC ERROR: [0] SNESSolve_NEWTONLS line 144 /cygdrive/d/Computational_geomechanics/installation/petsc-barry/src/snes/impls/ls/ls.c
> [0]PETSC ERROR: [0] SNESSolve line 4375 /cygdrive/d/Computational_geomechanics/installation/petsc-barry/src/snes/interface/snes.c
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [0]PETSC ERROR: Signal received
> [0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [0]PETSC ERROR: Petsc Development GIT revision: unknown GIT Date: unknown
> [0]PETSC ERROR: ./ex5f on a arch-mswin-c-debug named DESKTOP-R88IMOB by useruser Mon Jan 20 09:18:34 2020
> [0]PETSC ERROR: Configure options --with-cc=x86_64-w64-mingw32-gcc --with-cxx=x86_64-w64-mingw32-g++ --with-fc=x86_64-w64-mingw32-gfortran --with-mpi-include=/cygdrive/c/MPISDK/Include --with-mpi-lib=/cygdrive/c/MPISDK/Lib/libmsmpi.a --with-mpi-mpiexec=/cygdrive/c/MPI/Bin/mpiexec.exe --with-debugging=yes -CFLAGS=-O2 -CXXFLAGS=-O2 -FFLAGS="-O2 -static-libgfortran -static -lpthread -fno-range-check -fdefault-integer-8" --download-fblaslapack --with-shared-libraries=no --with-64-bit-indices --force
> [0]PETSC ERROR: #1 User provided function() line 0 in unknown file
>
> job aborted:
> [ranks] message
>
> [0] application aborted
> aborting MPI_COMM_WORLD (comm=0x44000000), error 50152059, comm rank 0
>
> ---- error analysis -----
>
> [0] on DESKTOP-R88IMOB
> ./ex5f aborted the job. abort code 50152059
>
> ---- error analysis -----
> Completed test examples
>
> Kind regards,
> Dmitry Melnichuk
>
> 19.01.2020, 07:47, "Smith, Barry F." <bsmith at mcs.anl.gov>:
>
> Dmitry,
>
> I have completed and tested the branch barry/2020-01-15/support-default-integer-8; it is undergoing testing now: https://gitlab.com/petsc/petsc/merge_requests/2456
>
> Please give it a try. Note that MPI has no support for integer promotion, so YOU must ensure that any MPI calls from Fortran pass 4-byte integers, not promoted 8-byte integers.
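> A minimal sketch of what this means in practice (variable names are hypothetical, not from Defmod):
>
> ! With -fdefault-integer-8, a plain INTEGER becomes 8 bytes, but the
> ! MPI Fortran bindings still expect 4-byte integers. Declare the
> ! arguments of direct MPI calls with an explicit 4-byte kind.
> integer(kind=4) :: mpi_rank, mpi_size, mpi_ierr  ! hypothetical names
>
> call MPI_Comm_rank(MPI_COMM_WORLD, mpi_rank, mpi_ierr)
> call MPI_Comm_size(MPI_COMM_WORLD, mpi_size, mpi_ierr)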
>
> I have tested it with recent versions of MPICH and Open MPI; it is fragile at compile time and may fail to compile with other versions of MPI.
>
> Good luck,
>
> Barry
>
> I do not recommend this approach for integer promotion in Fortran. Just blindly promoting all integers can often lead to problems. I recommend using the kind mechanism of
> Fortran to ensure that each variable is the type you want; you can recompile with different options to promote the kind-declared variables you wish. Of course this is more intrusive and requires changes to the Fortran code.
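> The kind-based scheme recommended above could be sketched as follows (a hypothetical layout, not Defmod's actual code):
>
> ! Define one kind parameter and use it for every variable that should
> ! track PETSc's index size; switch 4-/8-byte builds by changing this
> ! single definition instead of promoting everything with -fdefault-integer-8.
> #if defined(USE_64BIT_INDICES)
>       integer, parameter :: ip = selected_int_kind(18)   ! 8-byte integers
> #else
>       integer, parameter :: ip = selected_int_kind(9)    ! 4-byte integers
> #endif
>
>       integer(ip)    :: nrows, ncols   ! promoted only where declared
>       PetscErrorCode :: ierr           ! left at PETSc's own kind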
>
>
> On Jan 15, 2020, at 7:00 AM, Дмитрий Мельничук <dmitry.melnichuk at geosteertech.com> wrote:
>
> Hello all!
>
> At present time I need to compile solver called Defmod (https://bitbucket.org/stali/defmod/wiki/Home), which is written in Fortran 95.
> Defmod uses PETSc for solving linear algebra system.
> Solver compilation with 32-bit version of PETSc does not cause any problem.
> But solver compilation with the 64-bit version of PETSc produces an error related to the size of the PETSc ierr variable.
>
> 1. For example, consider the following statements written in Fortran:
>
>
> PetscErrorCode :: ierr_m
> PetscInt :: ierr
> ...
> ...
> call VecDuplicate(Vec_U,Vec_Um,ierr)
> call VecCopy(Vec_U,Vec_Um,ierr)
> call VecGetLocalSize(Vec_U,j,ierr)
> call VecGetOwnershipRange(Vec_U,j1,j2,ierr_m)
>
>
> As can be seen, the first three subroutines require ierr to be of size INTEGER(8), while the last subroutine (VecGetOwnershipRange) requires ierr to be of size INTEGER(4).
> Using the same integer kind for all four calls gives an error:
>
> There is no specific subroutine for the generic ‘vecgetownershiprange’ at (1)
>
> 2. Another example is:
>
>
> call MatAssemblyBegin(Mat_K,Mat_Final_Assembly,ierr)
> CHKERRA(ierr)
> call MatAssemblyEnd(Mat_K,Mat_Final_Assembly,ierr)
>
>
> I am not able to define an appropriate size of ierr in CHKERRA(ierr). If I choose INTEGER(8), the error "Type mismatch in argument ‘ierr’ at (1); passed INTEGER(8) to INTEGER(4)" occurs.
> If I define ierr as INTEGER(4), the error "Type mismatch in argument ‘ierr’ at (1); passed INTEGER(4) to INTEGER(8)" appears.
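> For illustration, the usual PETSc Fortran convention (a sketch of the intended usage once the interface kinds are consistent, not quoted from Defmod) is to declare every error argument as PetscErrorCode rather than a fixed INTEGER kind:
>
> ! PetscErrorCode is PETSc's own error-code kind, so one declaration
> ! works for all PETSc calls and for CHKERRA in both 32- and 64-bit builds.
>       PetscErrorCode :: ierr
>       PetscInt       :: j1, j2
>
>       call VecDuplicate(Vec_U, Vec_Um, ierr)
>       CHKERRA(ierr)
>       call VecGetOwnershipRange(Vec_U, j1, j2, ierr)
>       CHKERRA(ierr)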
>
>
> 3. If I change the sizes of the ierr variables as the error messages require, the compilation completes successfully, but an error occurs when calculating the RHS vector, with the following message:
>
> [0]PETSC ERROR: Out of range index value -4 cannot be negative
>
>
> Command to configure 32-bit version of PETSc under Windows 10 using Cygwin:
> ./configure --with-cc=x86_64-w64-mingw32-gcc --with-cxx=x86_64-w64-mingw32-g++ --with-fc=x86_64-w64-mingw32-gfortran --download-fblaslapack --with-mpi-include=/cygdrive/c/MPISDK/Include --with-mpi-lib=/cygdrive/c/MPISDK/Lib/libmsmpi.a --with-mpi-mpiexec=/cygdrive/c/MPI/Bin/mpiexec.exe --with-debugging=yes -CFLAGS='-O2' -CXXFLAGS='-O2' -FFLAGS='-O2 -static-libgfortran -static -lpthread -fno-range-check' --with-shared-libraries=no
>
> Command to configure 64-bit version of PETSc under Windows 10 using Cygwin:
> ./configure --with-cc=x86_64-w64-mingw32-gcc --with-cxx=x86_64-w64-mingw32-g++ --with-fc=x86_64-w64-mingw32-gfortran --download-fblaslapack --with-mpi-include=/cygdrive/c/MPISDK/Include --with-mpi-lib=/cygdrive/c/MPISDK/Lib/libmsmpi.a --with-mpi-mpiexec=/cygdrive/c/MPI/Bin/mpiexec.exe --with-debugging=yes -CFLAGS='-O2' -CXXFLAGS='-O2' -FFLAGS='-O2 -static-libgfortran -static -lpthread -fno-range-check -fdefault-integer-8' --with-shared-libraries=no --with-64-bit-indices --known-64-bit-blas-indices
>
>
> Kind regards,
> Dmitry Melnichuk
>