[petsc-users] KSPSolve with MPIAIJ with non-square 'diagonal parts'
Stefano Zampini
stefano.zampini at gmail.com
Mon Aug 30 15:42:21 CDT 2021
What is the error you are getting from the KSP? The default solver in parallel is BlockJacobi+ILU, which does not work for non-square matrices. You do not need to call PCSetFromOptions on the PC; just call KSPSetFromOptions and run with -pc_type none.
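Note that your global matrix is square (8x8); it is each rank's local diagonal block that is rectangular (rows {3,3,2} vs columns {3,2,3}), and that is what breaks the per-block ILU. A minimal sketch of the hardcoded equivalent of -pc_type none (untested; assumes A, b, and x are set up as in your code below, error checking omitted):

    KSP ksp;
    KSPCreate(PETSC_COMM_WORLD, &ksp);
    KSPSetOperators(ksp, A, A);
    PC pc;
    KSPGetPC(ksp, &pc);
    PCSetType(pc, PCNONE);  /* no preconditioner, so no per-rank ILU on the rectangular local blocks */
    KSPSetFromOptions(ksp); /* command-line options can still override */
    KSPSolve(ksp, b, x);

Or keep your code as is, drop the PCSetFromOptions call, and run with

    mpiexec -n 3 ./test -pc_type none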
> On Aug 30, 2021, at 9:17 PM, Olivier Jamond <olivier.jamond at cea.fr> wrote:
>
> Hello,
>
> I am sorry because I am surely missing something, but I cannot manage to solve a problem with an MPIAIJ matrix which has non-square 'diagonal parts'.
>
> I copy/paste at the bottom of this message a very simple piece of code which causes me trouble. In this code I try to do 'x=KSP(A)*b' (with gmres/jacobi), but this fails, whereas a matrix multiplication 'b=A*x' seems to work. Is KSP with such a matrix supposed to work (I can't find anything in the documentation about that, so I guess that it is...)?
>
> Many thanks,
> Olivier
>
> NB: this code should be launched with exactly 3 procs
>
> #include "petscsys.h" /* framework routines */
> #include "petscvec.h" /* vectors */
> #include "petscmat.h" /* matrices */
> #include "petscksp.h"
>
> #include <vector>
> #include <string>
> #include <iostream>
> #include <numeric>
>
> static char help[] = "Trying to solve a linear system on a sub-block of a matrix\n\n";
> int main(int argc, char **argv)
> {
> MPI_Init(NULL, NULL);
> PetscErrorCode ierr;
> ierr = PetscInitialize(&argc, &argv, NULL, help);
> CHKERRQ(ierr);
>
> // clang-format off
> std::vector<std::vector<double>> AA = {
> { 1, 2, 0, /**/ 0, 3, /**/ 0, 0, 4},
> { 0, 5, 6, /**/ 7, 0, /**/ 0, 8, 0},
> { 9, 0, 10, /**/11, 0, /**/ 0, 12, 0},
> //---------------------------------------
> {13, 0, 14, /**/15, 16, /**/17, 0, 0},
> { 0, 18, 0, /**/19, 20, /**/21, 0, 0},
> { 0, 0, 0, /**/22, 23, /**/ 1, 24, 0},
> //--------------------------------------
> {25, 26, 27, /**/ 0, 0, /**/28, 29, 0},
> {30, 0, 0, /**/31, 32, /**/33, 0, 34},
> };
>
>
> std::vector<double> bb = {1.,
> 1.,
> 1.,
> //
> 1.,
> 1.,
> 1.,
> //
> 1.,
> 1.};
>
>
> // local row/column sizes per rank: rows {3,3,2} vs columns {3,2,3},
> // so each rank's local 'diagonal' block is non-square
> std::vector<int> nDofsRow = {3, 3, 2};
> std::vector<int> nDofsCol = {3, 2, 3};
> // clang-format on
>
> int NDofs = std::accumulate(nDofsRow.begin(), nDofsRow.end(), 0);
>
> int pRank, nProc;
> MPI_Comm_rank(PETSC_COMM_WORLD, &pRank);
> MPI_Comm_size(PETSC_COMM_WORLD, &nProc);
>
> if (nProc != 3) {
> std::cerr << "THIS TEST MUST BE LAUNCHED WITH EXACTLY 3 PROCS\n";
> abort();
> }
>
> Mat A;
> MatCreate(PETSC_COMM_WORLD, &A);
> MatSetType(A, MATMPIAIJ);
> MatSetSizes(A, nDofsRow[pRank], nDofsCol[pRank], PETSC_DETERMINE, PETSC_DETERMINE);
> MatMPIAIJSetPreallocation(A, NDofs, NULL, NDofs, NULL); // over-allocate for simplicity
>
> Vec b;
> VecCreate(PETSC_COMM_WORLD, &b);
> VecSetType(b, VECMPI);
> VecSetSizes(b, nDofsRow[pRank], PETSC_DECIDE);
>
> // rank 0 inserts all entries; assembly communicates them to their owners
> if (pRank == 0) {
> for (int i = 0; i < NDofs; ++i) {
> for (int j = 0; j < NDofs; ++j) {
> if (AA[i][j] != 0.) {
> MatSetValue(A, i, j, AA[i][j], ADD_VALUES);
> }
> }
> VecSetValue(b, i, bb[i], ADD_VALUES);
> }
> }
>
> MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
> MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);
> VecAssemblyBegin(b);
> VecAssemblyEnd(b);
>
> PetscViewerPushFormat(PETSC_VIEWER_STDOUT_WORLD, PETSC_VIEWER_ASCII_DENSE);
> MatView(A, PETSC_VIEWER_STDOUT_WORLD);
> VecView(b, PETSC_VIEWER_STDOUT_WORLD);
>
> KSP ksp;
> KSPCreate(PETSC_COMM_WORLD, &ksp);
> KSPSetOperators(ksp, A, A);
> KSPSetFromOptions(ksp);
>
> PC pc;
> KSPGetPC(ksp, &pc);
> PCSetFromOptions(pc);
>
> Vec x;
> MatCreateVecs(A, &x, NULL);
> ierr = KSPSolve(ksp, b, x); // this fails
> MatMult(A, x, b); // whereas this seems to be ok...
>
> VecView(x, PETSC_VIEWER_STDOUT_WORLD);
>
> KSPDestroy(&ksp);
> VecDestroy(&x);
> VecDestroy(&b);
> MatDestroy(&A);
>
> ierr = PetscFinalize();
> CHKERRQ(ierr);
> MPI_Finalize();
>
> return 0;
> }
>