[petsc-users] Doubts on direct solver with mumps

Barry Smith bsmith at petsc.dev
Tue Mar 4 14:22:36 CST 2025



> On Mar 4, 2025, at 2:02 PM, Emmanuel Ayala <juaneah at gmail.com> wrote:
> 
> Hello everyone.
> 
> I'm trying to solve a linear system (which comes from a 3D FEM problem on a structured DM mesh) with a direct solver. I configured the PETSc installation with MUMPS (--download-mumps --download-scalapack --download-parmetis --download-metis --download-hwloc, without PTScotch) and I have the following code:
> 
>     // K is the stiffness matrix, assembled correctly
>     // U is the solution vector
>     // RHS is the right hand side of the linear equation
>     
>     Mat Kfactor;
> 
>     ierr = MatGetFactor(K,MATSOLVERMUMPS, MAT_FACTOR_CHOLESKY, &Kfactor); CHKERRQ(ierr);
>     ierr = MatCholeskyFactorSymbolic(Kfactor,K,0,0); CHKERRQ(ierr);
>     ierr = MatCholeskyFactorNumeric(Kfactor,K,0); CHKERRQ(ierr);
>     ierr = MatSolve(Kfactor,RHS,U); CHKERRQ(ierr);


Note 1) These four lines of code above are not needed if you are using KSPSolve to solve the system; the options below will trigger the same factorization (see the sketch after the quoted options).
>     
>     and run with options: 
>     -pc_type cholesky -pc_factor_mat_solver_type mumps -mat_mumps_icntl_1 1 -mat_mumps_icntl_13 0 -mat_mumps_icntl_28 2 -mat_mumps_icntl_29 2
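
Regarding Note 1, here is a minimal sketch of the KSPSolve path (assuming the same K, RHS, U, and ierr declarations as in the snippet above; the Cholesky/MUMPS choice then comes entirely from the command-line options quoted above, not from code):

    KSP ksp;
    ierr = KSPCreate(PETSC_COMM_WORLD,&ksp); CHKERRQ(ierr);
    ierr = KSPSetOperators(ksp,K,K); CHKERRQ(ierr);
    /* picks up -pc_type cholesky -pc_factor_mat_solver_type mumps from the command line */
    ierr = KSPSetFromOptions(ksp); CHKERRQ(ierr);
    /* symbolic and numeric factorization plus the triangular solves all happen inside the first KSPSolve */
    ierr = KSPSolve(ksp,RHS,U); CHKERRQ(ierr);
    ierr = KSPDestroy(&ksp); CHKERRQ(ierr);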

Note 2) It is imperative that you use a good BLAS library and an optimized build when using MUMPS. Do not use --download-fblaslapack (or friends), and use --with-debugging=0 in your configure options.
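
As a rough example (the BLAS/LAPACK path is a placeholder for whatever optimized library is installed on your machine), the relevant configure options would look something like:

    ./configure --with-debugging=0 --with-blaslapack-dir=/path/to/optimized/blas \
        --download-mumps --download-scalapack --download-parmetis --download-metis --download-hwloc \
        COPTFLAGS=-O3 CXXOPTFLAGS=-O3 FOPTFLAGS=-O3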

Note 3) If you are running sequentially (no MPI), then also ensure you have --with-openmp in your configure options and set an appropriate number of OpenMP threads when you run your program.
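
For example (the executable name and thread count here are just placeholders), the thread count can be set with the standard OpenMP environment variable before launching the program:

    export OMP_NUM_THREADS=8
    ./my_fem_program -pc_type cholesky -pc_factor_mat_solver_type mumps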

> 
> PROBLEM:    
> I got the correct solution, but MatCholeskyFactorNumeric() takes far too long to complete, while MatCholeskyFactorSymbolic() and MatSolve() are very fast. The test uses a square K matrix with 700k dofs; MatCholeskyFactorNumeric() takes around 14 minutes, whereas an iterative solver (KSPCG/PCJACOBI) reaches the solution in 5 seconds. Any suggestions?
> 
> Thanks in advance.


