[petsc-users] Full OpenMP strategy MUMPS

Smith, Barry F. bsmith at mcs.anl.gov
Mon Jul 29 12:58:23 CDT 2019


  Please look at https://www.mcs.anl.gov/petsc/petsc-dev/docs/manualpages/Mat/MATSOLVERMUMPS.html, which has a clearer explanation of the two ways one can use MUMPS with OpenMP (note the petsc-dev in the URL).

  In your case you want to use 
  
OMP_NUM_THREADS=8  mpirun -n 2 ./my_program 

This will run two MPI processes, with MUMPS using 8 OpenMP threads within each process.

You cannot mix the OMP_NUM_THREADS=<n> setting and the -mat_mumps_use_omp_threads <n> option; they are for different situations.
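
For concreteness, the two launch modes look roughly like this (a sketch assuming a 16-core shared-memory node, with ./my_program standing in for your executable):

    # Mode 1: few MPI ranks, MUMPS itself multithreads via OpenMP
    OMP_NUM_THREADS=8 mpirun -n 2 ./my_program

    # Mode 2: as many MPI ranks as cores; PETSc regroups the ranks so that
    # MUMPS runs on a subset of them with 8 OpenMP threads each (the thread
    # count must not exceed the number of ranks sharing a node's memory)
    mpirun -n 16 ./my_program -mat_mumps_use_omp_threads 8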

  Barry




> 
> On Jul 29, 2019, at 12:23 PM, Piotr Sierant via petsc-users <petsc-users at mcs.anl.gov> wrote:
> 
> Hello everyone,
> 
> I am trying to use PETSc with MUMPS in the full OpenMP strategy, motivated by data from
> SciPost Phys. 5, 045 (2018), which suggest that I can reduce RAM usage for particular system sizes this way.
> I cannot get it to work in the way it is described at 
> https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MATSOLVERMUMPS.html
> Having configured PETSc with the following options:
> Configure options --download-mpich --download-scalapack --download-cmake --with-openmp --download-metis --download-mumps --with-threadsafety --with-log=0 --with-debugging=0 --download-hwloc --with-blaslapack-dir=/opt/intel/Compiler/19.0/compilers_and_libraries_2019.4.243/linux/mkl
> (I have also configured SLEPc, which I need for shift-and-invert, which in turn requires the LU decomposition from MUMPS.) With OMP_NUM_THREADS=16 I execute:
> mpirun -n 2 ./my_program -mat_mumps_use_omp_threads 8
> obtaining 
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [0]PETSC ERROR: Argument out of range
> [0]PETSC ERROR: number of OpenMP threads 8 can not be < 1 or > the MPI shared memory communicator size 1
> (...)
> What should I do to be able to run the code with a single (or a few) MPI rank(s), with MUMPS using 8 OpenMP threads? I will greatly appreciate any comments.
> 
> Thanks, Piotr
> 


