[petsc-users] Error "Attempting to use an MPI routine before initializing MPICH" after compiling PETSc with Intel MPI and GCC

Matthew Knepley knepley at gmail.com
Wed Oct 13 05:36:01 CDT 2021


On Wed, Oct 13, 2021 at 6:32 AM Roland Richter <roland.richter at ntnu.no>
wrote:

> Yes, the first part (which works) consists of a compile line and a link
> line, while the second command combines compiling and linking in a single
> step.
>
The link line in the first does not tell us anything because MPI is not
even present. It is being pulled in, I presume, from libarmadillo, which we
cannot see. It still seems most likely, as Stefano said, that you are
mixing versions of MPI.
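
One quick way to confirm that would be to check what the wrapper and
libarmadillo actually pull in, for example (a rough sketch using the paths
from your commands):

  /opt/intel/oneapi/mpi/2021.4.0/bin/mpicxx -show
  ldd /opt/armadillo/lib64/libarmadillo.so | grep -i mpi

If the libmpi the wrapper links against and the one libarmadillo resolves
come from different installations, that could produce exactly this kind of
initialization error.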

  Thanks,

     Matt


> On 13.10.21 at 12:26, Matthew Knepley wrote:
>
> On Wed, Oct 13, 2021 at 5:53 AM Roland Richter <roland.richter at ntnu.no>
> wrote:
>
>> Hei,
>>
>> I noticed a difference between when the program works and when it does
>> not. The code runs fine if I compile it via a CMake file and load PETSc
>> there. If I use the compilation line included in the makefiles, the code
>> fails with the mentioned error. The CMake-generated compilation line
>> (including Armadillo, because my test sample contained Armadillo code) is
>>
> One of these is a compile command and the other is a link command.
>
>    Matt
>
>> /opt/intel/oneapi/mpi/2021.4.0/bin/mpicxx -D__INSDIR__="" -I/include
>> -I/opt/petsc/include -I/opt/armadillo/include -std=c++0x -g -MD -MT
>> CMakeFiles/main.dir/source/main.cpp.o -MF
>> CMakeFiles/main.dir/source/main.cpp.o.d -o
>> CMakeFiles/main.dir/source/main.cpp.o -c source/main.cpp
>> /opt/intel/oneapi/mpi/2021.4.0/bin/mpicxx -rdynamic
>> CMakeFiles/main.dir/source/main.cpp.o -o main_short
>> -Wl,-rpath,/opt/petsc/lib:/opt/armadillo/lib64 /opt/petsc/lib/libpetsc.so
>> /opt/armadillo/lib64/libarmadillo.so
>>
>> Meanwhile, the original compilation line from PETSc is
>>
>> mpicxx -mavx2 -march=native -O3 -fPIC -fopenmp    -I/opt/petsc/include
>> -I/opt/armadillo/include -I/opt/intel/oneapi/mkl/latest/include
>> -I/opt/fftw3/include -I/opt/hdf5/include -I/opt/boost/include
>> source/main.cpp -Wl,-rpath,/opt/petsc/lib -L/opt/petsc/lib
>> -Wl,-rpath,/opt/petsc/lib -L/opt/petsc/lib
>> -L/opt/intel/oneapi/mkl/latest/lib/intel64 -Wl,-rpath,/opt/fftw3/lib64
>> -L/opt/fftw3/lib64 -Wl,-rpath,/opt/armadillo/lib64 -L/opt/armadillo/lib64
>> -Wl,-rpath,/opt/intel/oneapi/mkl/latest/lib/intel64
>> -Wl,-rpath,/opt/hdf5/lib -L/opt/hdf5/lib
>> -Wl,-rpath,/opt/intel/oneapi/mpi/2021.4.0/lib/release
>> -L/opt/intel/oneapi/mpi/2021.4.0/lib/release
>> -Wl,-rpath,/opt/intel/oneapi/mpi/2021.4.0/lib
>> -L/opt/intel/oneapi/mpi/2021.4.0/lib
>> -Wl,-rpath,/usr/lib64/gcc/x86_64-suse-linux/11
>> -L/usr/lib64/gcc/x86_64-suse-linux/11
>> -Wl,-rpath,/opt/intel/oneapi/vpl/2021.6.0/lib
>> -L/opt/intel/oneapi/vpl/2021.6.0/lib
>> -Wl,-rpath,/opt/intel/oneapi/tbb/2021.4.0/lib/intel64/gcc4.8
>> -L/opt/intel/oneapi/tbb/2021.4.0/lib/intel64/gcc4.8
>> -Wl,-rpath,/opt/intel/oneapi/mpi/2021.4.0/libfabric/lib
>> -L/opt/intel/oneapi/mpi/2021.4.0/libfabric/lib
>> -Wl,-rpath,/opt/intel/oneapi/mkl/2021.4.0/lib/intel64
>> -L/opt/intel/oneapi/mkl/2021.4.0/lib/intel64
>> -Wl,-rpath,/opt/intel/oneapi/ipp/2021.4.0/lib/intel64
>> -L/opt/intel/oneapi/ipp/2021.4.0/lib/intel64
>> -Wl,-rpath,/opt/intel/oneapi/ippcp/2021.4.0/lib/intel64
>> -L/opt/intel/oneapi/ippcp/2021.4.0/lib/intel64
>> -Wl,-rpath,/opt/intel/oneapi/dnnl/2021.4.0/cpu_dpcpp_gpu_dpcpp/lib
>> -L/opt/intel/oneapi/dnnl/2021.4.0/cpu_dpcpp_gpu_dpcpp/lib
>> -Wl,-rpath,/opt/intel/oneapi/dal/2021.4.0/lib/intel64
>> -L/opt/intel/oneapi/dal/2021.4.0/lib/intel64
>> -Wl,-rpath,/opt/intel/oneapi/compiler/2021.4.0/linux/compiler/lib/intel64_lin
>> -L/opt/intel/oneapi/compiler/2021.4.0/linux/compiler/lib/intel64_lin
>> -Wl,-rpath,/opt/intel/oneapi/compiler/2021.4.0/linux/lib
>> -L/opt/intel/oneapi/compiler/2021.4.0/linux/lib
>> -Wl,-rpath,/opt/intel/oneapi/clck/2021.4.0/lib/intel64
>> -L/opt/intel/oneapi/clck/2021.4.0/lib/intel64
>> -Wl,-rpath,/opt/intel/oneapi/ccl/2021.4.0/lib/cpu_gpu_dpcpp
>> -L/opt/intel/oneapi/ccl/2021.4.0/lib/cpu_gpu_dpcpp
>> -Wl,-rpath,/usr/x86_64-suse-linux/lib -L/usr/x86_64-suse-linux/lib
>> -larmadillo -lpetsc -lHYPRE -lcmumps -ldmumps -lsmumps -lzmumps
>> -lmumps_common -lpord -lmkl_scalapack_lp64 -lmkl_blacs_intelmpi_lp64 -lspqr
>> -lumfpack -lklu -lcholmod -lbtf -lccolamd -lcolamd -lcamd -lamd
>> -lsuitesparseconfig -lsuperlu -lsuperlu_dist -lEl -lElSuiteSparse -lpmrrr
>> -lfftw3_mpi -lfftw3 -lp4est -lsc -lmkl_intel_lp64 -lmkl_core
>> -lmkl_intel_thread -liomp5 -ldl -lpthread -lptesmumps -lptscotchparmetis
>> -lptscotch -lptscotcherr -lesmumps -lscotch -lscotcherr -lhdf5_hl -lhdf5
>> -lparmetis -lmetis -lm -lz -lmuparser -lX11 -lstdc++ -ldl -lmpifort -lmpi
>> -lrt -lpthread -lgfortran -lm -lgfortran -lm -lgcc_s -lquadmath -lrt
>> -lquadmath -lstdc++ -ldl -o main_long
>>
>> Both executables link against the same libraries, but ldd lists them in a
>> different order.
>>
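>> For reference, the comparison was essentially the following (binary names
>> taken from the link commands above):
>>
>> ldd ./main_short | grep -i mpi
>> ldd ./main_long | grep -i mpi
>>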
>> Does that explain the observed behavior?
>>
>> Thanks,
>>
>> regards,
>>
>> Roland
>> On 11.10.21 at 15:13, Roland Richter wrote:
>>
>> Hei,
>>
>> the following code works fine:
>>
>> #include <iostream>
>> #include <petsc.h>
>>
>> static char help[] = "Solves 2D Poisson equation using multigrid.\n\n";
>> int main(int argc,char **argv) {
>>     PetscInitialize(&argc,&argv,(char*)0,help); // also initializes MPI if it is not yet initialized
>>     std::cout << "Hello World\n";
>>     PetscFinalize();
>>     return 0;
>> }
>>
>> Regards,
>>
>> Roland
>> On 11.10.21 at 14:34, Stefano Zampini wrote:
>>
>> Can you try a simple program that only calls PetscInitialize/PetscFinalize?
>>
>>
>> On Oct 11, 2021, at 3:30 PM, Roland Richter <roland.richter at ntnu.no>
>> wrote:
>>
>> At least according to configure.log, mpiexec was defined as
>>
>> Checking for program /opt/intel/oneapi/mpi/2021.4.0//bin/mpiexec...found
>>                   Defined make macro "MPIEXECEXECUTABLE" to
>> "/opt/intel/oneapi/mpi/2021.4.0/bin/mpiexec"
>>
>> When running ex19 with this mpiexec, it fails with the usual error, even
>> though all configuration steps worked fine. I have attached the
>> configuration log.
>>
>> Regards,
>>
>> Roland
>> On 11.10.21 at 14:24, Stefano Zampini wrote:
>>
>> You are most probably using a different mpiexec than the one used to
>> compile PETSc.
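>>
>> A quick check would be something like the following (the binary name here
>> is just an example):
>>
>> command -v mpiexec
>> mpiexec --version
>> ldd ./ex19 | grep -i mpi
>>
>> The launcher found on the PATH should belong to the same Intel MPI
>> installation as the libmpi the executable resolves at run time.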
>>
>>
>>
>> On Oct 11, 2021, at 3:23 PM, Roland Richter <roland.richter at ntnu.no>
>> wrote:
>>
>> I tried ./ex19 (SNES example), mpirun ./ex19, and mpirun -n 1 ./ex19, all
>> with the same result.
>>
>> Regards,
>>
>> Roland
>> On 11.10.21 at 14:22, Matthew Knepley wrote:
>>
>> On Mon, Oct 11, 2021 at 8:07 AM Roland Richter <roland.richter at ntnu.no>
>> wrote:
>>
>>> Hei,
>>>
>>> At least in gdb it fails with
>>>
>>> Attempting to use an MPI routine before initializing MPICH
>>> [Inferior 1 (process 7854) exited with code 01]
>>> (gdb) backtrace
>>> No stack.
>>>
>>
>> What were you running? If it never makes it into PETSc code, I am not
>> sure what we are
>> doing to cause this.
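>>
>> If the process exits before gdb can show anything, breaking on exit
>> sometimes preserves a usable backtrace, along these lines (a rough
>> sketch, the binary name is just an example):
>>
>> gdb ./ex19
>> (gdb) break exit
>> (gdb) run
>> (gdb) backtrace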
>>
>>   Thanks,
>>
>>      Matt
>>
>>
>>> Regards,
>>>
>>> Roland
>>> On 11.10.21 at 13:57, Matthew Knepley wrote:
>>>
>>> On Mon, Oct 11, 2021 at 5:24 AM Roland Richter <roland.richter at ntnu.no>
>>> wrote:
>>>
>>>> Hei,
>>>>
>>>> I compiled PETSc with Intel MPI (MPICH-based) and GCC as the compiler
>>>> (i.e., using Intel oneAPI together with the supplied mpicxx wrapper).
>>>> Compilation and installation worked fine, but running the tests resulted
>>>> in the error "Attempting to use an MPI routine before initializing
>>>> MPICH". A simple test program (attached) worked fine with the same
>>>> combination.
>>>>
>>>> What could be the reason for that?
>>>>
>>>
>>> Hi Roland,
>>>
>>> Can you get a stack trace for this error using the debugger?
>>>
>>>   Thanks,
>>>
>>>      Matt
>>>
>>>
>>>> Thanks!
>>>>
>>>> Regards,
>>>>
>>>> Roland Richter
>>>>
>>>
>>>
>>
>>
>> <configure.log>
>>
>>
>>
>
>
>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/

