[petsc-users] Different results with different distribution of rows

Matthew Knepley knepley at gmail.com
Tue Feb 9 08:00:54 CST 2016


On Tue, Feb 9, 2016 at 7:06 AM, Florian Lindner <mailinglists at xgm.de> wrote:

> Hello,
>
> I use PETSc with 4 MPI processes and I get different results
> depending on how the rows are distributed among the ranks. The code
> looks like this:
>

The default PC is BJacobi/ILU. This depends on the parallel layout, because
each subdomain block is factored approximately. ILU is not very reliable, so
it can fail as you see below. If you switch the subdomain solver to LU, the
problem should go away:

  -sub_pc_type lu
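
For a quick test from the command line, something along these lines should
show whether the exact subdomain factorization changes the answer (the
executable name below is only a placeholder for your application), and
-ksp_converged_reason reports whether each solve actually converged:

  mpiexec -n 4 ./your_app -sub_pc_type lu -ksp_converged_reason -ksp_monitor_true_residual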

  Thanks,

     Matt
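
P.S. It is also worth querying the convergence reason right after KSPSolve(),
so a breakdown like this does not pass silently. A minimal sketch in C,
reusing the _solver, Au, and out names from your snippet:

  KSPConvergedReason reason;
  ierr = KSPSolve(_solver, Au.vector, out.vector);CHKERRQ(ierr);
  ierr = KSPGetConvergedReason(_solver, &reason);CHKERRQ(ierr);
  if (reason < 0) {
    /* the Krylov solve diverged or broke down; say why */
    ierr = PetscPrintf(PETSC_COMM_WORLD, "KSP diverged: %s\n", KSPConvergedReasons[reason]);CHKERRQ(ierr);
  }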


> KSPSetOperators(_solver, _matrixC.matrix, _matrixC.matrix);
> // _solverRtol = 1e-9
> KSPSetTolerances(_solver, _solverRtol, PETSC_DEFAULT, PETSC_DEFAULT, PETSC_DEFAULT);
> KSPSetFromOptions(_solver);
>
> // means: MatGetVecs(matrix, nullptr, &vector);
> petsc::Vector Au(_matrixC, "Au");
> petsc::Vector out(_matrixC, "out");
> petsc::Vector in(_matrixA, "in");
>
> // is an identity mapping here
> ierr = VecSetLocalToGlobalMapping(in.vector, _ISmapping);
>
> // fill and assemble vector in
>
> in.view();
> MatMultTranspose(_matrixA.matrix, in.vector, Au.vector);
> Au.view();
> KSPSolve(_solver, Au.vector, out.vector);
>
> out.view();
>
> I have experimented with two variants. The first one, which does not work,
> has the matrixC rows distributed as 3, 4, 2, 2. The other one, which works,
> has them distributed as 4, 3, 2, 2.
>
> All input values (_matrixA, _matrixC, in) are identical, and Au is identical
> too, but out differs. See the results from object::view below.
>
> Could the bad conditioning of matrixC be a problem? I'm not using any
> special KSP options. matrixC is of type MATSBAIJ, matrixA is MATAIJ.
>
> Thanks,
> Florian
>
> WRONG
> =================
>
> Vec Object:in 4 MPI processes
>   type: mpi
> Process [0]
> 1
> 2
> Process [1]
> 3
> 4
> Process [2]
> 5
> 6
> Process [3]
> 7
> 8
>
> Vec Object:Au 4 MPI processes
>   type: mpi
> Process [0]
> 36
> 74
> 20
> Process [1]
> 1.09292
> 2.09259
> 3.18584
> 4.20349
> Process [2]
> 5.29708
> 6.31472
> Process [3]
> 7.24012
> 8.23978
>
> // not the expected result
> Vec Object:out 4 MPI processes
>   type: mpi
> Process [0]
> -1.10633e+07
> 618058
> 9.01497e+06
> Process [1]
> 0.996195
> 1.98711
> 3.01111
> 4.00203
> Process [2]
> 5.00736
> 6.01644
> Process [3]
> 6.98534
> 7.99442
>
> Mat Object:C 4 MPI processes
>   type: mpisbaij
> row 0: (0, 0)  (3, 1)  (4, 1)  (5, 1)  (6, 1)  (7, 1)  (8, 1)  (9, 1)  (10, 1)
> row 1: (1, 0)  (3, 0)  (4, 0)  (5, 1)  (6, 1)  (7, 2)  (8, 2)  (9, 3)  (10, 3)
> row 2: (2, 0)  (3, 0)  (4, 1)  (5, 0)  (6, 1)  (7, 0)  (8, 1)  (9, 0)  (10, 1)
> row 3: (3, 1)  (4, 0.0183156)  (5, 0.0183156)  (6, 0.000335463)  (7, 1.12535e-07)  (8, 2.06115e-09)
> row 4: (4, 1)  (5, 0.000335463)  (6, 0.0183156)  (7, 2.06115e-09)  (8, 1.12535e-07)
> row 5: (5, 1)  (6, 0.0183156)  (7, 0.0183156)  (8, 0.000335463)  (9, 1.12535e-07)  (10, 2.06115e-09)
> row 6: (6, 1)  (7, 0.000335463)  (8, 0.0183156)  (9, 2.06115e-09)  (10, 1.12535e-07)
> row 7: (7, 1)  (8, 0.0183156)  (9, 0.0183156)  (10, 0.000335463)
> row 8: (8, 1)  (9, 0.000335463)  (10, 0.0183156)
> row 9: (9, 1)  (10, 0.0183156)
> row 10: (10, 1)
>
> Mat Object:A 4 MPI processes
>   type: mpiaij
>  1.00000e+00  0.00000e+00  0.00000e+00  1.00000e+00  1.83156e-02  1.83156e-02  3.35463e-04  1.12535e-07  2.06115e-09  0.00000e+00  0.00000e+00
>  1.00000e+00  0.00000e+00  1.00000e+00  1.83156e-02  1.00000e+00  3.35463e-04  1.83156e-02  2.06115e-09  1.12535e-07  0.00000e+00  0.00000e+00
>  1.00000e+00  1.00000e+00  0.00000e+00  1.83156e-02  3.35463e-04  1.00000e+00  1.83156e-02  1.83156e-02  3.35463e-04  1.12535e-07  2.06115e-09
>  1.00000e+00  1.00000e+00  1.00000e+00  3.35463e-04  1.83156e-02  1.83156e-02  1.00000e+00  3.35463e-04  1.83156e-02  2.06115e-09  1.12535e-07
>  1.00000e+00  2.00000e+00  0.00000e+00  1.12535e-07  2.06115e-09  1.83156e-02  3.35463e-04  1.00000e+00  1.83156e-02  1.83156e-02  3.35463e-04
>  1.00000e+00  2.00000e+00  1.00000e+00  2.06115e-09  1.12535e-07  3.35463e-04  1.83156e-02  1.83156e-02  1.00000e+00  3.35463e-04  1.83156e-02
>  1.00000e+00  3.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  1.12535e-07  2.06115e-09  1.83156e-02  3.35463e-04  1.00000e+00  1.83156e-02
>  1.00000e+00  3.00000e+00  1.00000e+00  0.00000e+00  0.00000e+00  2.06115e-09  1.12535e-07  3.35463e-04  1.83156e-02  1.83156e-02  1.00000e+00
>
>
>
> RIGHT
> =================
>
> Vec Object:in 4 MPI processes
>   type: mpi
> Process [0]
> 1
> 2
> Process [1]
> 3
> 4
> Process [2]
> 5
> 6
> Process [3]
> 7
> 8
>
> Vec Object:Au 4 MPI processes
>   type: mpi
> Process [0]
> 36
> 74
> 20
> 1.09292
> Process [1]
> 2.09259
> 3.18584
> 4.20349
> Process [2]
> 5.29708
> 6.31472
> Process [3]
> 7.24012
> 8.23978
>
> // the expected result
> Vec Object:out 4 MPI processes
>   type: mpi
> Process [0]
> 0
> 0
> 0
> 1
> Process [1]
> 2
> 3
> 4
> Process [2]
> 5
> 6
> Process [3]
> 7
> 8
>
>
> Mat Object:C 4 MPI processes
>   type: mpisbaij
> row 0: (0, 0)  (3, 1)  (4, 1)  (5, 1)  (6, 1)  (7, 1)  (8, 1)  (9, 1)  (10, 1)
> row 1: (1, 0)  (3, 0)  (4, 0)  (5, 1)  (6, 1)  (7, 2)  (8, 2)  (9, 3)  (10, 3)
> row 2: (2, 0)  (3, 0)  (4, 1)  (5, 0)  (6, 1)  (7, 0)  (8, 1)  (9, 0)  (10, 1)
> row 3: (3, 1)  (4, 0.0183156)  (5, 0.0183156)  (6, 0.000335463)  (7, 1.12535e-07)  (8, 2.06115e-09)
> row 4: (4, 1)  (5, 0.000335463)  (6, 0.0183156)  (7, 2.06115e-09)  (8, 1.12535e-07)
> row 5: (5, 1)  (6, 0.0183156)  (7, 0.0183156)  (8, 0.000335463)  (9, 1.12535e-07)  (10, 2.06115e-09)
> row 6: (6, 1)  (7, 0.000335463)  (8, 0.0183156)  (9, 2.06115e-09)  (10, 1.12535e-07)
> row 7: (7, 1)  (8, 0.0183156)  (9, 0.0183156)  (10, 0.000335463)
> row 8: (8, 1)  (9, 0.000335463)  (10, 0.0183156)
> row 9: (9, 1)  (10, 0.0183156)
> row 10: (10, 1)
>
> Mat Object:A 4 MPI processes
>   type: mpiaij
>  1.00000e+00  0.00000e+00  0.00000e+00  1.00000e+00  1.83156e-02  1.83156e-02  3.35463e-04  1.12535e-07  2.06115e-09  0.00000e+00  0.00000e+00
>  1.00000e+00  0.00000e+00  1.00000e+00  1.83156e-02  1.00000e+00  3.35463e-04  1.83156e-02  2.06115e-09  1.12535e-07  0.00000e+00  0.00000e+00
>  1.00000e+00  1.00000e+00  0.00000e+00  1.83156e-02  3.35463e-04  1.00000e+00  1.83156e-02  1.83156e-02  3.35463e-04  1.12535e-07  2.06115e-09
>  1.00000e+00  1.00000e+00  1.00000e+00  3.35463e-04  1.83156e-02  1.83156e-02  1.00000e+00  3.35463e-04  1.83156e-02  2.06115e-09  1.12535e-07
>  1.00000e+00  2.00000e+00  0.00000e+00  1.12535e-07  2.06115e-09  1.83156e-02  3.35463e-04  1.00000e+00  1.83156e-02  1.83156e-02  3.35463e-04
>  1.00000e+00  2.00000e+00  1.00000e+00  2.06115e-09  1.12535e-07  3.35463e-04  1.83156e-02  1.83156e-02  1.00000e+00  3.35463e-04  1.83156e-02
>  1.00000e+00  3.00000e+00  0.00000e+00  0.00000e+00  0.00000e+00  1.12535e-07  2.06115e-09  1.83156e-02  3.35463e-04  1.00000e+00  1.83156e-02
>  1.00000e+00  3.00000e+00  1.00000e+00  0.00000e+00  0.00000e+00  2.06115e-09  1.12535e-07  3.35463e-04  1.83156e-02  1.83156e-02  1.00000e+00
>



-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener