<div dir="ltr"><div><div><div><div><div><div><div>If you don't specify a preconditioner via -pc_type XXX, the default being used is BJacobi-ILU.<br></div>This preconditioner will yield different results on different numbers of MPI processes, and will also yield different results for a fixed number of MPI processes if the matrix partitioning changes. If your operator is singular (or close to singular), ILU is likely to fail.<br></div><div><br></div>To be sure your code is working correctly, test it using a preconditioner which doesn't depend on the partitioning of the matrix. I would use these options:<br></div> -pc_type jacobi<br></div> -ksp_monitor_true_residual<br><br></div>The last option is useful as it reports both the preconditioned residual and the true residual. If the operator is singular, or close to singular, GMRES-ILU or GMRES-BJacobi-ILU can report a preconditioned residual which is small, but orders of magnitude different from the true residual.<br><br></div>Thanks,<br></div> Dave<br><div class="gmail_extra"><br><div class="gmail_quote">On 9 February 2016 at 14:39, Florian Lindner <span dir="ltr"><<a href="mailto:mailinglists@xgm.de" target="_blank">mailinglists@xgm.de</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">Addition. The KSP solver shows very different convergence:<br>
<br>
<br>
WRONG:<br>
<br>
[0] KSPConvergedDefault(): Linear solver has converged. Residual norm<br>
6.832362172732e+06 is less than relative tolerance 1.000000000000e-09<br>
times initial right hand side norm 6.934533099989e+15 at iteration 8447<br>
<br>
RIGHT:<br>
<br>
[0] KSPConvergedDefault(): Linear solver has converged. Residual norm<br>
7.959757133341e-08 is less than relative tolerance 1.000000000000e-09<br>
times initial right hand side norm 1.731788191624e+02 at iteration 9<br>
<br>
Best,<br>
Florian<br>
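As a sanity check on the two logs above: KSPConvergedDefault declares convergence here once the residual norm drops below the relative tolerance times the initial right-hand-side norm. A quick arithmetic check (an editorial illustration in Python, not part of the original mail) shows why the first run "converged" even with a residual of ~6.8e+06 — the initial RHS norm was enormous:<br>

```python
# KSPConvergedDefault's relative test: ||r|| < rtol * ||b||,
# where ||b|| is the initial right-hand-side norm.
rtol = 1e-9

# "WRONG" run: the huge RHS norm (~6.9e+15) lets an absolute residual
# of ~6.8e+06 satisfy the relative tolerance.
assert 6.832362172732e+06 < rtol * 6.934533099989e+15

# "RIGHT" run: the residual is genuinely small in absolute terms too.
assert 7.959757133341e-08 < rtol * 1.731788191624e+02

print("both runs satisfy the relative tolerance test")
```

So the relative test passes in both cases; it is the absolute size of the residual (and the true residual) that distinguishes them, which is why -ksp_monitor_true_residual is recommended.<br>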
<div class="HOEnZb"><div class="h5"><br>
On Tue, 9 Feb 2016 14:06:01 +0100<br>
Florian Lindner <<a href="mailto:mailinglists@xgm.de">mailinglists@xgm.de</a>> wrote:<br>
<br>
> Hello,<br>
><br>
> I use PETSc with 4 MPI processes and I experience different results<br>
> when using different distributions of rows among ranks. The code looks<br>
> like this:<br>
><br>
><br>
> KSPSetOperators(_solver, _matrixC.matrix, _matrixC.matrix);<br>
> // _solverRtol = 1e-9<br>
> KSPSetTolerances(_solver, _solverRtol, PETSC_DEFAULT, PETSC_DEFAULT,<br>
> PETSC_DEFAULT);<br>
> KSPSetFromOptions(_solver);<br>
><br>
> // means: MatGetVecs(matrix, nullptr, &vector);<br>
> petsc::Vector Au(_matrixC, "Au");<br>
> petsc::Vector out(_matrixC, "out");<br>
> petsc::Vector in(_matrixA, "in");<br>
><br>
> // is an identity mapping here<br>
> ierr = VecSetLocalToGlobalMapping(in.vector, _ISmapping);<br>
><br>
> // fill and assemble vector in<br>
><br>
> in.view();<br>
> MatMultTranspose(_matrixA.matrix, in.vector, Au.vector);<br>
> Au.view();<br>
> KSPSolve(_solver, Au.vector, out.vector);<br>
><br>
> out.view();<br>
><br>
> I have experimented with two variants. In the first, non-working one,<br>
> the rows of matrixC are distributed as 3, 4, 2, 2. The other, working,<br>
> one has 4, 3, 2, 2.<br>
><br>
> All input values (_matrixA, _matrixC, in) are identical, and Au is<br>
> identical too, but out differs. See the results from object::view<br>
> below.<br>
><br>
> Could the bad conditioning of matrixC be a problem? I'm not using any<br>
> special KSP options. matrixC is of type MATSBAIJ, matrixA is MATAIJ.<br>
><br>
> Thanks,<br>
> Florian<br>
><br>
> WRONG<br>
> =================<br>
><br>
> Vec Object:in 4 MPI processes<br>
> type: mpi<br>
> Process [0]<br>
> 1<br>
> 2<br>
> Process [1]<br>
> 3<br>
> 4<br>
> Process [2]<br>
> 5<br>
> 6<br>
> Process [3]<br>
> 7<br>
> 8<br>
><br>
> Vec Object:Au 4 MPI processes<br>
> type: mpi<br>
> Process [0]<br>
> 36<br>
> 74<br>
> 20<br>
> Process [1]<br>
> 1.09292<br>
> 2.09259<br>
> 3.18584<br>
> 4.20349<br>
> Process [2]<br>
> 5.29708<br>
> 6.31472<br>
> Process [3]<br>
> 7.24012<br>
> 8.23978<br>
><br>
> // should not be result<br>
> Vec Object:out 4 MPI processes<br>
> type: mpi<br>
> Process [0]<br>
> -1.10633e+07<br>
> 618058<br>
> 9.01497e+06<br>
> Process [1]<br>
> 0.996195<br>
> 1.98711<br>
> 3.01111<br>
> 4.00203<br>
> Process [2]<br>
> 5.00736<br>
> 6.01644<br>
> Process [3]<br>
> 6.98534<br>
> 7.99442<br>
><br>
> Mat Object:C 4 MPI processes<br>
> type: mpisbaij<br>
> row 0: (0, 0) (3, 1) (4, 1) (5, 1) (6, 1) (7, 1) (8, 1) (9,<br>
> 1) (10, 1) row 1: (1, 0) (3, 0) (4, 0) (5, 1) (6, 1) (7, 2)<br>
> (8, 2) (9, 3) (10, 3) row 2: (2, 0) (3, 0) (4, 1) (5, 0) (6,<br>
> 1) (7, 0) (8, 1) (9, 0) (10, 1) row 3: (3, 1) (4, 0.0183156)<br>
> (5, 0.0183156) (6, 0.000335463) (7, 1.12535e-07) (8, 2.06115e-09)<br>
> row 4: (4, 1) (5, 0.000335463) (6, 0.0183156) (7, 2.06115e-09)<br>
> (8, 1.12535e-07) row 5: (5, 1) (6, 0.0183156) (7, 0.0183156) (8,<br>
> 0.000335463) (9, 1.12535e-07) (10, 2.06115e-09) row 6: (6, 1) (7,<br>
> 0.000335463) (8, 0.0183156) (9, 2.06115e-09) (10, 1.12535e-07) row<br>
> 7: (7, 1) (8, 0.0183156) (9, 0.0183156) (10, 0.000335463) row 8:<br>
> (8, 1) (9, 0.000335463) (10, 0.0183156) row 9: (9, 1) (10,<br>
> 0.0183156) row 10: (10, 1)<br>
><br>
> Mat Object:A 4 MPI processes<br>
> type: mpiaij<br>
> 1.00000e+00 0.00000e+00 0.00000e+00 1.00000e+00 1.83156e-02<br>
> 1.83156e-02 3.35463e-04 1.12535e-07 2.06115e-09 0.00000e+00<br>
> 0.00000e+00 1.00000e+00 0.00000e+00 1.00000e+00 1.83156e-02<br>
> 1.00000e+00 3.35463e-04 1.83156e-02 2.06115e-09 1.12535e-07<br>
> 0.00000e+00 0.00000e+00 1.00000e+00 1.00000e+00 0.00000e+00<br>
> 1.83156e-02 3.35463e-04 1.00000e+00 1.83156e-02 1.83156e-02<br>
> 3.35463e-04 1.12535e-07 2.06115e-09 1.00000e+00 1.00000e+00<br>
> 1.00000e+00 3.35463e-04 1.83156e-02 1.83156e-02 1.00000e+00<br>
> 3.35463e-04 1.83156e-02 2.06115e-09 1.12535e-07 1.00000e+00<br>
> 2.00000e+00 0.00000e+00 1.12535e-07 2.06115e-09 1.83156e-02<br>
> 3.35463e-04 1.00000e+00 1.83156e-02 1.83156e-02 3.35463e-04<br>
> 1.00000e+00 2.00000e+00 1.00000e+00 2.06115e-09 1.12535e-07<br>
> 3.35463e-04 1.83156e-02 1.83156e-02 1.00000e+00 3.35463e-04<br>
> 1.83156e-02 1.00000e+00 3.00000e+00 0.00000e+00 0.00000e+00<br>
> 0.00000e+00 1.12535e-07 2.06115e-09 1.83156e-02 3.35463e-04<br>
> 1.00000e+00 1.83156e-02 1.00000e+00 3.00000e+00 1.00000e+00<br>
> 0.00000e+00 0.00000e+00 2.06115e-09 1.12535e-07 3.35463e-04<br>
> 1.83156e-02 1.83156e-02 1.00000e+00<br>
><br>
><br>
><br>
> RIGHT<br>
> =================<br>
><br>
> Vec Object:in 4 MPI processes<br>
> type: mpi<br>
> Process [0]<br>
> 1<br>
> 2<br>
> Process [1]<br>
> 3<br>
> 4<br>
> Process [2]<br>
> 5<br>
> 6<br>
> Process [3]<br>
> 7<br>
> 8<br>
><br>
> Vec Object:Au 4 MPI processes<br>
> type: mpi<br>
> Process [0]<br>
> 36<br>
> 74<br>
> 20<br>
> 1.09292<br>
> Process [1]<br>
> 2.09259<br>
> 3.18584<br>
> 4.20349<br>
> Process [2]<br>
> 5.29708<br>
> 6.31472<br>
> Process [3]<br>
> 7.24012<br>
> 8.23978<br>
><br>
> // should be result<br>
> Vec Object:out 4 MPI processes<br>
> type: mpi<br>
> Process [0]<br>
> 0<br>
> 0<br>
> 0<br>
> 1<br>
> Process [1]<br>
> 2<br>
> 3<br>
> 4<br>
> Process [2]<br>
> 5<br>
> 6<br>
> Process [3]<br>
> 7<br>
> 8<br>
><br>
><br>
> Mat Object:C 4 MPI processes<br>
> type: mpisbaij<br>
> row 0: (0, 0) (3, 1) (4, 1) (5, 1) (6, 1) (7, 1) (8, 1) (9,<br>
> 1) (10, 1) row 1: (1, 0) (3, 0) (4, 0) (5, 1) (6, 1) (7, 2)<br>
> (8, 2) (9, 3) (10, 3) row 2: (2, 0) (3, 0) (4, 1) (5, 0) (6,<br>
> 1) (7, 0) (8, 1) (9, 0) (10, 1) row 3: (3, 1) (4, 0.0183156)<br>
> (5, 0.0183156) (6, 0.000335463) (7, 1.12535e-07) (8, 2.06115e-09)<br>
> row 4: (4, 1) (5, 0.000335463) (6, 0.0183156) (7, 2.06115e-09)<br>
> (8, 1.12535e-07) row 5: (5, 1) (6, 0.0183156) (7, 0.0183156) (8,<br>
> 0.000335463) (9, 1.12535e-07) (10, 2.06115e-09) row 6: (6, 1) (7,<br>
> 0.000335463) (8, 0.0183156) (9, 2.06115e-09) (10, 1.12535e-07) row<br>
> 7: (7, 1) (8, 0.0183156) (9, 0.0183156) (10, 0.000335463) row 8:<br>
> (8, 1) (9, 0.000335463) (10, 0.0183156) row 9: (9, 1) (10,<br>
> 0.0183156) row 10: (10, 1)<br>
><br>
> Mat Object:A 4 MPI processes<br>
> type: mpiaij<br>
> 1.00000e+00 0.00000e+00 0.00000e+00 1.00000e+00 1.83156e-02<br>
> 1.83156e-02 3.35463e-04 1.12535e-07 2.06115e-09 0.00000e+00<br>
> 0.00000e+00 1.00000e+00 0.00000e+00 1.00000e+00 1.83156e-02<br>
> 1.00000e+00 3.35463e-04 1.83156e-02 2.06115e-09 1.12535e-07<br>
> 0.00000e+00 0.00000e+00 1.00000e+00 1.00000e+00 0.00000e+00<br>
> 1.83156e-02 3.35463e-04 1.00000e+00 1.83156e-02 1.83156e-02<br>
> 3.35463e-04 1.12535e-07 2.06115e-09 1.00000e+00 1.00000e+00<br>
> 1.00000e+00 3.35463e-04 1.83156e-02 1.83156e-02 1.00000e+00<br>
> 3.35463e-04 1.83156e-02 2.06115e-09 1.12535e-07 1.00000e+00<br>
> 2.00000e+00 0.00000e+00 1.12535e-07 2.06115e-09 1.83156e-02<br>
> 3.35463e-04 1.00000e+00 1.83156e-02 1.83156e-02 3.35463e-04<br>
> 1.00000e+00 2.00000e+00 1.00000e+00 2.06115e-09 1.12535e-07<br>
> 3.35463e-04 1.83156e-02 1.83156e-02 1.00000e+00 3.35463e-04<br>
> 1.83156e-02 1.00000e+00 3.00000e+00 0.00000e+00 0.00000e+00<br>
> 0.00000e+00 1.12535e-07 2.06115e-09 1.83156e-02 3.35463e-04<br>
> 1.00000e+00 1.83156e-02 1.00000e+00 3.00000e+00 1.00000e+00<br>
> 0.00000e+00 0.00000e+00 2.06115e-09 1.12535e-07 3.35463e-04<br>
> 1.83156e-02 1.83156e-02 1.00000e+00<br>
</div></div></blockquote></div><br></div>
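For reference, the options Dave suggests are runtime options and can be passed on the command line without recompiling (the program name and process count below are placeholders):

```shell
# Partition-independent point-Jacobi preconditioner, plus monitoring of
# the true (unpreconditioned) residual alongside the preconditioned one.
mpiexec -n 4 ./my_solver -pc_type jacobi -ksp_monitor_true_residual
```

This works because the code above calls KSPSetFromOptions, which picks up -pc_type and the monitor options at run time.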