[petsc-users] 1D parallel split of dense matrix

Yann Jobic yann.jobic at univ-amu.fr
Thu Sep 17 16:26:20 CDT 2020


Hello Pierre,
I just tested it, and it's working just fine!
I thought I might run into problems using MUMPS, as I had only skimmed
your article and MatMatSolve may be involved, but no, it's working and
I get correct results.
This interface for multiple right-hand sides is indeed very handy.
One question: I used KSPSetMatSolveBlockSize(ksp,1);
I don't know what it does; I only saw it in your example.
Is it mandatory?
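
For reference, here is a minimal, self-contained sketch of the call
sequence I am describing (the 1D Laplacian operator, the sizes, and the
random right-hand sides are only stand-ins for my actual problem, and I
select MUMPS through PCLU here, which may differ from your setup):

#include <petscksp.h>

int main(int argc, char **argv)
{
  KSP            ksp;
  PC             pc;
  Mat            A, B, X;
  PetscInt       i, n = 100, nrhs = 10, Istart, Iend;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  /* stand-in operator: a 1D Laplacian, in place of the real matrix */
  ierr = MatCreate(PETSC_COMM_WORLD, &A); CHKERRQ(ierr);
  ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n); CHKERRQ(ierr);
  ierr = MatSetFromOptions(A); CHKERRQ(ierr);
  ierr = MatSetUp(A); CHKERRQ(ierr);
  ierr = MatGetOwnershipRange(A, &Istart, &Iend); CHKERRQ(ierr);
  for (i = Istart; i < Iend; ++i) {
    if (i > 0)     { ierr = MatSetValue(A, i, i - 1, -1.0, INSERT_VALUES); CHKERRQ(ierr); }
    if (i < n - 1) { ierr = MatSetValue(A, i, i + 1, -1.0, INSERT_VALUES); CHKERRQ(ierr); }
    ierr = MatSetValue(A, i, i, 2.0, INSERT_VALUES); CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
  /* B and X are dense: one column per right-hand side / solution */
  ierr = MatCreateDense(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, n, nrhs, NULL, &B); CHKERRQ(ierr);
  ierr = MatSetRandom(B, NULL); CHKERRQ(ierr);
  ierr = MatDuplicate(B, MAT_DO_NOT_COPY_VALUES, &X); CHKERRQ(ierr);
  ierr = KSPCreate(PETSC_COMM_WORLD, &ksp); CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp, A, A); CHKERRQ(ierr);
  ierr = KSPGetPC(ksp, &pc); CHKERRQ(ierr);
  ierr = PCSetType(pc, PCLU); CHKERRQ(ierr);                          /* direct solve... */
  ierr = PCFactorSetMatSolverType(pc, MATSOLVERMUMPS); CHKERRQ(ierr); /* ...through MUMPS */
  ierr = KSPSetMatSolveBlockSize(ksp, 1); CHKERRQ(ierr);              /* the call I am asking about */
  ierr = KSPSetFromOptions(ksp); CHKERRQ(ierr);
  ierr = KSPMatSolve(ksp, B, X); CHKERRQ(ierr); /* all right-hand sides in one call */
  ierr = MatDestroy(&X); CHKERRQ(ierr);
  ierr = MatDestroy(&B); CHKERRQ(ierr);
  ierr = MatDestroy(&A); CHKERRQ(ierr);
  ierr = KSPDestroy(&ksp); CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}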
Thanks a lot,
Yann

On 9/17/2020 at 6:08 PM, Pierre Jolivet wrote:
> Hello Yann,
> This is probably not fully answering your question, but the proper way 
> to solve a system with N RHS is _not_ to use KSPSolve(), but instead 
> KSPMatSolve(), cf. 
> https://www.mcs.anl.gov/petsc/petsc-dev/docs/manualpages/KSP/KSPMatSolve.html.
> If you are tracking master (from the GitLab repository), it’s available 
> out of the box. If you are using the release tarballs, it will be 
> available in 3.14.0 scheduled to be released in a couple of days.
> If you want to know more about the current status of block solvers in 
> PETSc, please feel free to have a look at this preprint: 
> http://jolivet.perso.enseeiht.fr/article.pdf
> If you are using a specific PC which is not “multi-RHS ready” (see the 
> list at the top of page 5), please let me know and I’ll tell you how 
> easy it is to add support for it.
> Thanks,
> Pierre
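
To make the contrast in the quoted advice concrete: what KSPMatSolve()
replaces is the usual one-KSPSolve()-per-column loop over the right-hand
sides. With B and X as in the sketch earlier, and assuming the
MatDenseGetColumnVecRead()/MatDenseGetColumnVecWrite() accessors of
recent PETSc versions are available, that loop would look roughly like:

  Vec      b, x;
  PetscInt j, N;

  ierr = MatGetSize(B, NULL, &N); CHKERRQ(ierr);
  for (j = 0; j < N; ++j) {
    /* borrow column j of B (read-only) and of X (write-only) as vectors */
    ierr = MatDenseGetColumnVecRead(B, j, &b); CHKERRQ(ierr);
    ierr = MatDenseGetColumnVecWrite(X, j, &x); CHKERRQ(ierr);
    ierr = KSPSolve(ksp, b, x); CHKERRQ(ierr); /* one solve per column */
    ierr = MatDenseRestoreColumnVecWrite(X, j, &x); CHKERRQ(ierr);
    ierr = MatDenseRestoreColumnVecRead(B, j, &b); CHKERRQ(ierr);
  }

KSPMatSolve(ksp, B, X) performs the same N solves in a single call, which
is what allows a backend such as MUMPS to process all columns at once
through MatMatSolve().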

