[petsc-users] 1D parallel split of dense matrix
Yann Jobic
yann.jobic at univ-amu.fr
Thu Sep 17 16:36:05 CDT 2020
Hi Stefano,
You're right, I just switched the rows and columns, and it's working
fine. However, I had done it the other way in order to get better
performance when accessing the rows with MatGetRow. With this ordering, I
need MatGetColumnVector instead, which I believe is rather slow, as stated
in the comments of the implementation (but I couldn't find that comment
again; maybe I was looking at an old implementation?).
I still don't understand why I cannot use the transpose of this matrix,
as I give the right parallel decomposition. It should be a bijective
operation, no?
I think I'll use Pierre's KSPMatSolve, so I don't have to
redevelop this part.
Thanks a lot,
Yann
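For reference, a minimal sketch of the KSPMatSolve workflow discussed in this thread (assuming PETSc >= 3.14; the 1-D Laplacian operator and the sizes below are purely illustrative placeholders, not the actual matrix from this thread):

```c
/* Sketch only: solve A X = B for several right-hand sides at once with
 * KSPMatSolve (PETSc >= 3.14). The 1-D Laplacian and the sizes are
 * illustrative placeholders. */
#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat      A, B, X;
  KSP      ksp;
  PetscInt M = 100, nrhs = 4, rstart, rend, i;

  PetscInitialize(&argc, &argv, NULL, NULL);

  /* Illustrative sparse operator: 1-D Laplacian. */
  MatCreateAIJ(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, M, M,
               3, NULL, 2, NULL, &A);
  MatGetOwnershipRange(A, &rstart, &rend);
  for (i = rstart; i < rend; i++) {
    PetscScalar v[3]    = {-1.0, 2.0, -1.0};
    PetscInt    cols[3] = {i - 1, i, i + 1};
    if (i == 0)          MatSetValues(A, 1, &i, 2, cols + 1, v + 1, INSERT_VALUES);
    else if (i == M - 1) MatSetValues(A, 1, &i, 2, cols, v, INSERT_VALUES);
    else                 MatSetValues(A, 1, &i, 3, cols, v, INSERT_VALUES);
  }
  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

  /* Dense RHS matrix: one right-hand side per COLUMN, rows split across
   * processes with the same layout as the vectors. */
  MatCreateDense(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, M, nrhs,
                 NULL, &B);
  MatSetRandom(B, NULL);
  MatDuplicate(B, MAT_DO_NOT_COPY_VALUES, &X);

  KSPCreate(PETSC_COMM_WORLD, &ksp);
  KSPSetOperators(ksp, A, A);
  KSPSetFromOptions(ksp);
  KSPMatSolve(ksp, B, X); /* all nrhs systems solved in one call */

  KSPDestroy(&ksp);
  MatDestroy(&A); MatDestroy(&B); MatDestroy(&X);
  PetscFinalize();
  return 0;
}
```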
On 9/17/2020 6:29 PM, Stefano Zampini wrote:
> Yann
>
> You want the number of columns to be equal to the number of RHS, not
> the number of rows.
> This is also consistent in terms of logical layouts. Simply create and
> dump the matrix this way, and you can read it in parallel by splitting the
> rows (i.e. the logical size per process of the vectors).
>
> On Thu, 17 Sep 2020 at 18:09, Pierre Jolivet <pierre.jolivet at enseeiht.fr
> <mailto:pierre.jolivet at enseeiht.fr>> wrote:
>
> Hello Yann,
> This is probably not fully answering your question, but the proper
> way to solve a system with N RHS is _not_ to use KSPSolve(), but
> instead KSPMatSolve(), cf.
> https://www.mcs.anl.gov/petsc/petsc-dev/docs/manualpages/KSP/KSPMatSolve.html.
> If you are tracking master (from the GitLab repository), it’s
> available out of the box. If you are using the release tarballs, it
> will be available in 3.14.0 scheduled to be released in a couple of
> days.
> If you want to know more about the current status of block solvers
> in PETSc, please feel free to have a look at this preprint:
> http://jolivet.perso.enseeiht.fr/article.pdf
>     If you are using a specific PC which is not “multi-RHS ready” (see
>     the list at the top of page 5), please let me know and I’ll tell you
>     how easy it is to add support for it.
> Thanks,
> Pierre
>
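Stefano's suggestion above (dump the dense matrix with one right-hand side per column, then read it back with the rows split across processes) could look like the sketch below; the file name "rhs.dat" is hypothetical:

```c
/* Sketch: read back a dense matrix dumped in PETSc binary format.
 * MatLoad splits its rows across the processes of the communicator,
 * matching the logical layout of the solution vectors.
 * "rhs.dat" is a hypothetical file name. */
#include <petscmat.h>

static PetscErrorCode LoadDenseRHS(Mat *B)
{
  PetscViewer viewer;
  PetscInt    rstart, rend;

  PetscViewerBinaryOpen(PETSC_COMM_WORLD, "rhs.dat", FILE_MODE_READ, &viewer);
  MatCreate(PETSC_COMM_WORLD, B);
  MatSetType(*B, MATDENSE);
  MatLoad(*B, viewer);
  PetscViewerDestroy(&viewer);

  /* Locally owned row range: one contiguous block of rows per process. */
  MatGetOwnershipRange(*B, &rstart, &rend);
  return 0;
}
```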