[petsc-users] Compiling PETSC with Intel OneAPI compilers and OpenMPI
Samar Khatiwala
samar.khatiwala at earth.ox.ac.uk
Mon May 15 12:22:18 CDT 2023
Hi Marcos,
Yes, I compiled with clang instead of icc (no particular reason for this; I tend to use gcc/clang). I use MPICH 4.1.1, which I first built with clang and ifort:
FC=ifort ./configure --prefix=/usr/local/mpich4 --enable-two-level-namespace
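For completeness, a full build sequence along those lines would look roughly like the following; the explicit CC/CXX settings and the install steps are filled in from the description above ("built with clang and ifort") and are assumptions, not copied from the original message:
FC=ifort CC=clang CXX=clang++ ./configure --prefix=/usr/local/mpich4 --enable-two-level-namespace
make
make install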
Samar
On May 15, 2023, at 6:07 PM, Vanella, Marcos (Fed) <marcos.vanella at nist.gov> wrote:
Hi Samar, what MPI library do you use? Did you compile it with clang instead of icc?
Thanks,
Marcos
________________________________
From: Samar Khatiwala <samar.khatiwala at earth.ox.ac.uk>
Sent: Monday, May 15, 2023 1:05 PM
To: Matthew Knepley <knepley at gmail.com>
Cc: Vanella, Marcos (Fed) <marcos.vanella at nist.gov>; petsc-users at mcs.anl.gov <petsc-users at mcs.anl.gov>
Subject: Re: [petsc-users] Compiling PETSC with Intel OneAPI compilers and OpenMPI
Hi, for what it’s worth, clang + ifort from OneAPI 2023 update 1 works fine for me on both Intel and M2 Macs. So it might just be a matter of upgrading.
Samar
On May 15, 2023, at 5:53 PM, Matthew Knepley <knepley at gmail.com> wrote:
Send us
$PETSC_ARCH/include/petscconf.h
Thanks,
Matt
On Mon, May 15, 2023 at 12:49 PM Vanella, Marcos (Fed) <marcos.vanella at nist.gov> wrote:
Hi Matt, I configured the library like this:
$ ./configure --with-blaslapack-dir=/opt/intel/oneapi/mkl/2022.2.1 --with-debugging=0 --with-shared-libraries=0 --download-make
and compiled. I still get segmentation-fault errors when running the check examples. See below:
$ make PETSC_DIR=/Users/mnv/Documents/Software/petsc-3.19.1 PETSC_ARCH=arch-darwin-c-opt check
Running check examples to verify correct installation
Using PETSC_DIR=/Users/mnv/Documents/Software/petsc-3.19.1 and PETSC_ARCH=arch-darwin-c-opt
*******************Error detected during compile or link!*******************
See https://petsc.org/release/faq/
/Users/mnv/Documents/Software/petsc-3.19.1/src/snes/tutorials ex19
*********************************************************************************
mpicc -Wl,-bind_at_load -Wl,-multiply_defined,suppress -Wl,-multiply_defined -Wl,suppress -Wl,-commons,use_dylibs -Wl,-search_paths_first -Wl,-no_compact_unwind -fPIC -wd1572 -Wno-unknown-pragmas -g -O3 -I/Users/mnv/Documents/Software/petsc-3.19.1/include -I/Users/mnv/Documents/Software/petsc-3.19.1/arch-darwin-c-opt/include -I/opt/X11/include -std=c99 ex19.c -L/Users/mnv/Documents/Software/petsc-3.19.1/arch-darwin-c-opt/lib -Wl,-rpath,/opt/intel/oneapi/mkl/2022.2.1/lib -L/opt/intel/oneapi/mkl/2022.2.1/lib -Wl,-rpath,/opt/X11/lib -L/opt/X11/lib -L/opt/openmpi414_oneapi22u3/lib -Wl,-rpath,/opt/intel/oneapi/compiler/2022.2.1/mac/compiler/lib -L/opt/intel/oneapi/tbb/2021.7.1/lib -L/opt/intel/oneapi/ippcp/2021.6.2/lib -L/opt/intel/oneapi/ipp/2021.6.2/lib -L/opt/intel/oneapi/dnnl/2022.2.1/cpu_iomp/lib -L/opt/intel/oneapi/dal/2021.7.1/lib -L/opt/intel/oneapi/compiler/2022.2.1/mac/compiler/lib -L/Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/usr/lib -Wl,-rpath,/opt/intel/oneapi/compiler/2022.2.1/mac/bin/intel64/../../compiler/lib -L/opt/intel/oneapi/compiler/2022.2.1/mac/bin/intel64/../../compiler/lib -Wl,-rpath,/Library/Developer/CommandLineTools/usr/lib/clang/14.0.3/lib/darwin -L/Library/Developer/CommandLineTools/usr/lib/clang/14.0.3/lib/darwin -lpetsc -lmkl_intel_lp64 -lmkl_core -lmkl_sequential -lpthread -lX11 -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lopen-rte -lopen-pal -limf -lm -lz -lifport -lifcoremt -lsvml -lipgo -lirc -lpthread -lclang_rt.osx -lmpi -lopen-rte -lopen-pal -limf -lm -lz -lsvml -lirng -lc++ -lipgo -ldecimal -lirc -lclang_rt.osx -lmpi -lopen-rte -lopen-pal -limf -lm -lz -lsvml -lirng -lc++ -lipgo -ldecimal -lirc -lclang_rt.osx -o ex19
icc: remark #10441: The Intel(R) C++ Compiler Classic (ICC) is deprecated and will be removed from product release in the second half of 2023. The Intel(R) oneAPI DPC++/C++ Compiler (ICX) is the recommended compiler moving forward. Please transition to use this compiler. Use '-diag-disable=10441' to disable this message.
In file included from /Users/mnv/Documents/Software/petsc-3.19.1/include/petscsys.h(44),
from /Users/mnv/Documents/Software/petsc-3.19.1/include/petscvec.h(9),
from /Users/mnv/Documents/Software/petsc-3.19.1/include/petscmat.h(7),
from /Users/mnv/Documents/Software/petsc-3.19.1/include/petscpc.h(7),
from /Users/mnv/Documents/Software/petsc-3.19.1/include/petscksp.h(7),
from /Users/mnv/Documents/Software/petsc-3.19.1/include/petscsnes.h(7),
from ex19.c(68):
/Users/mnv/Documents/Software/petsc-3.19.1/include/petscsystypes.h(68): warning #2621: attribute "warn_unused_result" does not apply here
PETSC_ERROR_CODE_TYPEDEF enum PETSC_ERROR_CODE_NODISCARD {
^
Possible error running C/C++ src/snes/tutorials/ex19 with 1 MPI process
See https://petsc.org/release/faq/
[excess:37807] *** Process received signal ***
[excess:37807] Signal: Segmentation fault: 11 (11)
[excess:37807] Signal code: Address not mapped (1)
[excess:37807] Failing at address: 0x7f
[excess:37807] *** End of error message ***
--------------------------------------------------------------------------
Primary job terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpiexec noticed that process rank 0 with PID 0 on node excess exited on signal 11 (Segmentation fault: 11).
--------------------------------------------------------------------------
Possible error running C/C++ src/snes/tutorials/ex19 with 2 MPI processes
See https://petsc.org/release/faq/
[excess:37831] *** Process received signal ***
[excess:37831] Signal: Segmentation fault: 11 (11)
[excess:37831] Signal code: Address not mapped (1)
[excess:37831] Failing at address: 0x7f
[excess:37831] *** End of error message ***
[excess:37832] *** Process received signal ***
[excess:37832] Signal: Segmentation fault: 11 (11)
[excess:37832] Signal code: Address not mapped (1)
[excess:37832] Failing at address: 0x7f
[excess:37832] *** End of error message ***
--------------------------------------------------------------------------
Primary job terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpiexec noticed that process rank 1 with PID 0 on node excess exited on signal 11 (Segmentation fault: 11).
--------------------------------------------------------------------------
Possible error running Fortran example src/snes/tutorials/ex5f with 1 MPI process
See https://petsc.org/release/faq/
forrtl: severe (174): SIGSEGV, segmentation fault occurred
Image PC Routine Line Source
libifcoremt.dylib 000000010B7F7FE4 for__signal_handl Unknown Unknown
libsystem_platfor 00007FF8024C25ED _sigtramp Unknown Unknown
ex5f 00000001087AFA38 PetscGetArchType Unknown Unknown
ex5f 000000010887913B PetscErrorPrintfI Unknown Unknown
ex5f 000000010878D227 PetscInitialize_C Unknown Unknown
ex5f 000000010879D289 petscinitializef_ Unknown Unknown
ex5f 0000000108713C09 petscsys_mp_petsc Unknown Unknown
ex5f 0000000108710B5D MAIN__ Unknown Unknown
ex5f 0000000108710AEE main Unknown Unknown
dyld 00007FF80213B41F start Unknown Unknown
--------------------------------------------------------------------------
Primary job terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpiexec detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
Process name: [[48108,1],0]
Exit code: 174
--------------------------------------------------------------------------
Completed test examples
Error while running make check
make[1]: *** [check] Error 1
make: *** [check] Error 2
________________________________
From: Vanella, Marcos (Fed) <marcos.vanella at nist.gov>
Sent: Monday, May 15, 2023 12:20 PM
To: Matthew Knepley <knepley at gmail.com>
Cc: petsc-users at mcs.anl.gov <petsc-users at mcs.anl.gov>
Subject: Re: [petsc-users] Compiling PETSC with Intel OneAPI compilers and OpenMPI
Thank you Matt, I'll try this and let you know.
Marcos
________________________________
From: Matthew Knepley <knepley at gmail.com>
Sent: Monday, May 15, 2023 12:08 PM
To: Vanella, Marcos (Fed) <marcos.vanella at nist.gov>
Cc: petsc-users at mcs.anl.gov <petsc-users at mcs.anl.gov>
Subject: Re: [petsc-users] Compiling PETSC with Intel OneAPI compilers and OpenMPI
On Mon, May 15, 2023 at 11:19 AM Vanella, Marcos (Fed) via petsc-users <petsc-users at mcs.anl.gov> wrote:
Hello, I'm trying to compile the PETSc library version 3.19.1 with OpenMPI 4.1.4 and the OneAPI 2022 Update 2 Intel compiler suite on a Mac with macOS Ventura 13.3.1.
I can compile PETSc in debug mode with the configure and make lines below, and I can run the PETSc tests, which seem fine.
When I compile the library in optimized mode, using either -O3 or -O1, for example configuring with:
I hate to yell "compiler bug" when this happens, but it sure seems like one. Can you just use
--with-debugging=0
without the custom COPTFLAGS, CXXOPTFLAGS, FOPTFLAGS? If that works, it is almost certainly a compiler bug. If not, then we can go in the debugger and see what is failing.
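Concretely, and only as an illustration (this exact line is not from the original message), that suggestion amounts to rerunning configure with the same options shown further down but without the optimization-flag overrides, e.g.:
./configure --prefix=/opt/petsc-oneapi22u3 --with-blaslapack-dir=/opt/intel/oneapi/mkl/2022.2.1 --with-debugging=0 --with-shared-libraries=0 --download-make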
Thanks,
Matt
$ ./configure --prefix=/opt/petsc-oneapi22u3 --with-blaslapack-dir=/opt/intel/oneapi/mkl/2022.2.1 COPTFLAGS='-m64 -O1 -g -diag-disable=10441' CXXOPTFLAGS='-m64 -O1 -g -diag-disable=10441' FOPTFLAGS='-m64 -O1 -g' LDFLAGS='-m64' --with-debugging=0 --with-shared-libraries=0 --download-make
and using mpicc (icc) and mpif90 (ifort) from Open MPI, the static library compiles. Yet, right off the bat, I see this segfault error in the first PETSc example:
$ make PETSC_DIR=/Users/mnv/Documents/Software/petsc-3.19.1 PETSC_ARCH=arch-darwin-c-opt test
/Users/mnv/Documents/Software/petsc-3.19.1/arch-darwin-c-opt/bin/make --no-print-directory -f /Users/mnv/Documents/Software/petsc-3.19.1/gmakefile.test PETSC_ARCH=arch-darwin-c-opt PETSC_DIR=/Users/mnv/Documents/Software/petsc-3.19.1 test
/opt/intel/oneapi/intelpython/latest/bin/python3 /Users/mnv/Documents/Software/petsc-3.19.1/config/gmakegentest.py --petsc-dir=/Users/mnv/Documents/Software/petsc-3.19.1 --petsc-arch=arch-darwin-c-opt --testdir=./arch-darwin-c-opt/tests
Using MAKEFLAGS: --no-print-directory -- PETSC_ARCH=arch-darwin-c-opt PETSC_DIR=/Users/mnv/Documents/Software/petsc-3.19.1
CC arch-darwin-c-opt/tests/sys/classes/draw/tests/ex1.o
In file included from /Users/mnv/Documents/Software/petsc-3.19.1/include/petscsys.h(44),
from /Users/mnv/Documents/Software/petsc-3.19.1/src/sys/classes/draw/tests/ex1.c(4):
/Users/mnv/Documents/Software/petsc-3.19.1/include/petscsystypes.h(68): warning #2621: attribute "warn_unused_result" does not apply here
PETSC_ERROR_CODE_TYPEDEF enum PETSC_ERROR_CODE_NODISCARD {
^
CLINKER arch-darwin-c-opt/tests/sys/classes/draw/tests/ex1
TEST arch-darwin-c-opt/tests/counts/sys_classes_draw_tests-ex1_1.counts
not ok sys_classes_draw_tests-ex1_1 # Error code: 139
# [excess:98681] *** Process received signal ***
# [excess:98681] Signal: Segmentation fault: 11 (11)
# [excess:98681] Signal code: Address not mapped (1)
# [excess:98681] Failing at address: 0x7f
# [excess:98681] *** End of error message ***
# --------------------------------------------------------------------------
# Primary job terminated normally, but 1 process returned
# a non-zero exit code. Per user-direction, the job has been aborted.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# mpiexec noticed that process rank 0 with PID 0 on node excess exited on signal 11 (Segmentation fault: 11).
# --------------------------------------------------------------------------
ok sys_classes_draw_tests-ex1_1 # SKIP Command failed so no diff
I see the same segfault error in all PETSc examples.
Any help is most appreciated; I'm just starting to work with PETSc. Our plan is to use the linear solver from PETSc for the Poisson equation in our numerical scheme and to test this on a GPU cluster. Any guidelines on how to interface PETSc with a Fortran code, and any personal experience doing so, would also be most appreciated!
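As a purely illustrative starting point for the Fortran question (this sketch is not from the thread; the file name, problem size, and variable names are made up, and it assumes the module-based Fortran bindings available since PETSc 3.8), the following program assembles a 1D Laplacian and solves it with KSP:

      program poisson1d
#include <petsc/finclude/petscksp.h>
      use petscksp
      implicit none
      Mat              A
      Vec              x, b
      KSP              ksp
      PetscErrorCode   ierr
      PetscInt         n, i, ione, nc, Istart, Iend, row(1), col(3)
      PetscScalar      vals(3), rhs

      call PetscInitialize(PETSC_NULL_CHARACTER, ierr)
      n = 100                          ! grid size, made up for the example
      ione = 1
      rhs = 1.0

      ! Assemble the 1D Laplacian stencil [-1 2 -1], row by row (0-based global indices)
      call MatCreate(PETSC_COMM_WORLD, A, ierr)
      call MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n, ierr)
      call MatSetFromOptions(A, ierr)
      call MatSetUp(A, ierr)
      call MatGetOwnershipRange(A, Istart, Iend, ierr)
      do i = Istart, Iend-1
         row(1) = i
         if (i == 0) then
            nc = 2; col(1) = 0; col(2) = 1
            vals(1) = 2.0; vals(2) = -1.0
         else if (i == n-1) then
            nc = 2; col(1) = n-2; col(2) = n-1
            vals(1) = -1.0; vals(2) = 2.0
         else
            nc = 3; col(1) = i-1; col(2) = i; col(3) = i+1
            vals(1) = -1.0; vals(2) = 2.0; vals(3) = -1.0
         end if
         call MatSetValues(A, ione, row, nc, col, vals, INSERT_VALUES, ierr)
      end do
      call MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY, ierr)
      call MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY, ierr)

      ! Right-hand side b = 1 and solution vector x
      call VecCreate(PETSC_COMM_WORLD, b, ierr)
      call VecSetSizes(b, PETSC_DECIDE, n, ierr)
      call VecSetFromOptions(b, ierr)
      call VecDuplicate(b, x, ierr)
      call VecSet(b, rhs, ierr)

      ! Solve A x = b; the solver and preconditioner are chosen at run time
      call KSPCreate(PETSC_COMM_WORLD, ksp, ierr)
      call KSPSetOperators(ksp, A, A, ierr)
      call KSPSetFromOptions(ksp, ierr)
      call KSPSolve(ksp, b, x, ierr)

      call KSPDestroy(ksp, ierr)
      call VecDestroy(x, ierr)
      call VecDestroy(b, ierr)
      call MatDestroy(A, ierr)
      call PetscFinalize(ierr)
      end program poisson1d

Such a file (e.g. poisson1d.F90) is usually built with a small makefile that includes ${PETSC_DIR}/lib/petsc/conf/variables and ${PETSC_DIR}/lib/petsc/conf/rules, so it is preprocessed and linked against the same PETSc build the configure lines above produce; solver choices such as -ksp_type cg -pc_type gamg can then be supplied on the command line thanks to KSPSetFromOptions.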
Marcos
--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/