[petsc-users] Doubts on direct solver with mumps
Emmanuel Ayala
juaneah at gmail.com
Tue Mar 4 22:58:58 CST 2025
On Tue, Mar 4, 2025 at 4:25 p.m., Pierre Jolivet (pierre at joliv.et)
wrote:
>
>
> On 4 Mar 2025, at 11:17 PM, Emmanuel Ayala <juaneah at gmail.com> wrote:
>
>
> Well, I just tested the configuration with --download-f2cblaslapack --download-blis,
> and the direct Cholesky solver with MUMPS became 10x faster :D
>
>
> If your problem is big enough (and 700k unknowns should be by far big
> enough), you should turn on BLR (block low-rank), which will give both a
> performance boost and memory savings.
> But a good BLAS is mandatory indeed.
>
Thanks for the advice, I will try it.
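
If I read the MUMPS documentation correctly, BLR is activated through
ICNTL(35) and its compression tolerance is set through CNTL(7), which
PETSc exposes as command-line options. A minimal sketch of what I plan to
try (option names taken from the MUMPS/PETSc manuals, the tolerance value
is only illustrative):

  -mat_mumps_icntl_35 1 -mat_mumps_cntl_7 1e-8
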
Regards.
>
> Thanks,
> Pierre
>
> Thanks!
>
> On Tue, Mar 4, 2025 at 3:21 p.m., Emmanuel Ayala (juaneah at gmail.com)
> wrote:
>
>> Thanks for the notes.
>>
>> On Tue, Mar 4, 2025 at 2:22 p.m., Barry Smith (bsmith at petsc.dev)
>> wrote:
>>
>>>
>>>
>>> > On Mar 4, 2025, at 2:02 PM, Emmanuel Ayala <juaneah at gmail.com> wrote:
>>> >
>>> > Hello everyone.
>>> >
>>> > I'm trying to solve a linear system (which comes from a 3D FEM with a
>>> structured DM mesh) with a direct solver. I configured the PETSc
>>> installation with MUMPS (--download-mumps --download-scalapack
>>> --download-parmetis --download-metis --download-hwloc, without
>>> PT-Scotch) and I have the following function calls:
>>> >
>>> > // K is the stiffness matrix, assembled correctly
>>> > // U is the solution vector
>>> > // RHS is the right hand side of the linear equation
>>> >
>>> > Mat Kfactor;
>>> >
>>> > ierr = MatGetFactor(K,MATSOLVERMUMPS, MAT_FACTOR_CHOLESKY,
>>> &Kfactor); CHKERRQ(ierr);
>>> > ierr = MatCholeskyFactorSymbolic(Kfactor,K,0,0); CHKERRQ(ierr);
>>> > ierr = MatCholeskyFactorNumeric(Kfactor,K,0); CHKERRQ(ierr);
>>> > ierr = MatSolve(Kfactor,RHS,U); CHKERRQ(ierr);
>>>
>>>
>>> Note 1) these four lines of code above are not needed if you are using
>>> KSPSolve to solve the system. The options below will trigger this.
>>> >
>>> > and run with options:
>>> > -pc_type cholesky -pc_factor_mat_solver_type mumps
>>> -mat_mumps_icntl_1 1 -mat_mumps_icntl_13 0 -mat_mumps_icntl_28 2
>>> -mat_mumps_icntl_29 2
>>>
>> OK. I'm running the direct solver and the iterative one separately.
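>>
>> Just to confirm I understood Note 1: the explicit factorization calls
>> could be dropped and KSPSolve left to pick the Cholesky/MUMPS path from
>> the run-time options. A minimal sketch of what I have in mind (untested,
>> assuming K, RHS and U are already set up as above):
>>
>>   KSP ksp;
>>   ierr = KSPCreate(PETSC_COMM_WORLD,&ksp); CHKERRQ(ierr);
>>   ierr = KSPSetOperators(ksp,K,K); CHKERRQ(ierr);
>>   // picks up -pc_type cholesky -pc_factor_mat_solver_type mumps, ICNTLs, etc.
>>   ierr = KSPSetFromOptions(ksp); CHKERRQ(ierr);
>>   ierr = KSPSolve(ksp,RHS,U); CHKERRQ(ierr);
>>   ierr = KSPDestroy(&ksp); CHKERRQ(ierr);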
>>
>>>
>>> Note 2) it is imperative you use a good BLAS library and optimization
>>> when using MUMPS. Do not use --download-fblaslapack (or friends), and use
>>> --with-debugging=0 in the configure options.
>>>
>> Right. I configured it with --with-debugging=0 and
>> --download-fblaslapack. *So, which is a good BLAS library?* From the
>> PETSc docs: "...One can use --download-f2cblaslapack --download-blis..."
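>>
>> If that is the recommended route, I suppose the new configure line would
>> look roughly like this (a sketch only, my other options unchanged):
>>
>>   ./configure --with-debugging=0 --download-f2cblaslapack --download-blis \
>>     --download-mumps --download-scalapack --download-metis \
>>     --download-parmetis --download-hwloc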
>>
>>>
>>> Note 3) If you are running sequentially (no MPI) then also ensure you
>>> have --with-openmp in your configure options and set an appropriate value
>>> for the number of OpenMP threads when you run your program.
>>>
>> I'm running with MPI.
>>
>> Regards.
>>
>>>
>>>
>>> >
>>> > PROBLEM:
>>> > I got the correct solution, but MatCholeskyFactorNumeric() takes too
>>> much time to complete, while MatCholeskyFactorSymbolic() and MatSolve()
>>> are very fast. The test uses a square K matrix with 700k DOFs, and
>>> MatCholeskyFactorNumeric() takes around 14 minutes, while an iterative
>>> solver (KSPCG/PCJACOBI) takes 5 seconds to reach the solution. Any
>>> suggestions?
>>> >
>>> > Thanks in advance.
>>>
>>>
>