[petsc-users] Doubts on direct solver with mumps

Emmanuel Ayala juaneah at gmail.com
Tue Mar 4 15:21:13 CST 2025


Thanks for the notes.

On Tue, Mar 4, 2025 at 2:22 PM, Barry Smith (bsmith at petsc.dev)
wrote:

>
>
> > On Mar 4, 2025, at 2:02 PM, Emmanuel Ayala <juaneah at gmail.com> wrote:
> >
> > Hello everyone.
> >
> > I'm trying to solve a linear system (which comes from a 3D FEM with a
> structured DM mesh) with a direct solver. I configured the PETSc installation
> with MUMPS (--download-mumps --download-scalapack --download-parmetis
> --download-metis --download-hwloc, without PT-Scotch) and I have the
> following functions:
> >
> >     // K is the stiffness matrix, assembled correctly
> >     // U is the solution vector
> >     // RHS is the right hand side of the linear equation
> >
> >     Mat Kfactor;
> >
> >     ierr = MatGetFactor(K,MATSOLVERMUMPS, MAT_FACTOR_CHOLESKY,
> &Kfactor); CHKERRQ(ierr);
> >     ierr = MatCholeskyFactorSymbolic(Kfactor,K,0,0); CHKERRQ(ierr);
> >     ierr = MatCholeskyFactorNumeric(Kfactor,K,0); CHKERRQ(ierr);
> >     ierr = MatSolve(Kfactor,RHS,U); CHKERRQ(ierr);
>
>
> Note 1) These four lines of code above are not needed if you are using
> KSPSolve to solve the system. The options below will trigger this.
> >
> >     and run with options:
> >     -pc_type cholesky -pc_factor_mat_solver_type mumps
> -mat_mumps_icntl_1 1 -mat_mumps_icntl_13 0 -mat_mumps_icntl_28 2
> -mat_mumps_icntl_29 2
>
OK. I'm running the direct solver and the iterative one separately, as in the
sketch below.
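
For reference, a minimal sketch of the KSPSolve path Barry describes, assuming
K, RHS, and U are the same objects as in my snippet above (the ksp variable is
new here; this is a sketch under those assumptions, not verbatim code from my
application):

    KSP ksp;
    ierr = KSPCreate(PETSC_COMM_WORLD, &ksp); CHKERRQ(ierr);
    ierr = KSPSetOperators(ksp, K, K); CHKERRQ(ierr);
    /* KSPPREONLY skips Krylov iterations and just applies the PC,
       i.e. the direct factorization and solve */
    ierr = KSPSetType(ksp, KSPPREONLY); CHKERRQ(ierr);
    /* picks up -pc_type cholesky -pc_factor_mat_solver_type mumps
       and the -mat_mumps_icntl_* options from the command line */
    ierr = KSPSetFromOptions(ksp); CHKERRQ(ierr);
    ierr = KSPSolve(ksp, RHS, U); CHKERRQ(ierr);
    ierr = KSPDestroy(&ksp); CHKERRQ(ierr);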

>
> Note 2) It is imperative that you use a good BLAS library and optimization
> when using MUMPS. Do not use --download-fblaslapack (or friends), and use
> --with-debugging=0 in the configure options.
>
Right. I configured it with --with-debugging=0 and --download-fblaslapack. *So,
which is a good BLAS library?* From the PETSc docs: "...One can use
--download-f2cblaslapack --download-blis..."
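
For illustration, a hedged sketch of a configure line with an optimized BLAS
(OpenBLAS via --download-openblas here; pointing --with-blaslapack-dir at an
MKL install is another common route; the -O3 optimization flags are assumptions
about a typical setup):

    ./configure --with-debugging=0 --download-openblas \
      --download-mumps --download-scalapack --download-parmetis \
      --download-metis --download-hwloc \
      COPTFLAGS='-O3' CXXOPTFLAGS='-O3' FOPTFLAGS='-O3'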

>
> Note 3) If you are running sequentially (no MPI) then also ensure you have
> --with-openmp in your configure options and set an appropriate value for
> the number of OpenMP threads when you run your program.
>
I'm running with MPI; a typical launch is sketched below.
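
For example (the executable name ./app and the process count are placeholders;
the options are the ones from above):

    mpiexec -n 8 ./app -pc_type cholesky -pc_factor_mat_solver_type mumps \
      -mat_mumps_icntl_28 2 -mat_mumps_icntl_29 2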

Regards.

>
>
> >
> > PROBLEM:
> > I got the correct solution, but MatCholeskyFactorNumeric()
> takes too long to complete. MatCholeskyFactorSymbolic() and
> MatSolve() are very fast. The test uses a square K matrix with 700k DOFs, and
> MatCholeskyFactorNumeric() takes around 14 minutes, while an iterative
> solver (KSPCG/PCJACOBI) takes 5 seconds to reach the solution. Any
> suggestions?
> >
> > Thanks in advance.
>
>