[petsc-users] How to solve multiple linear systems in parallel, one on each process?
Dave May
dave.mayhem23 at gmail.com
Thu Dec 6 07:58:04 CST 2018
On Thu, 6 Dec 2018 at 13:40, Klaus Burkart <k_burkart at yahoo.com> wrote:
> I was wrong about the decomposed case structure:
>
> In the case of a multicore simulation, I get a decomposed case with local
> data (matrix + rhs + x, and the numbers of the neighbouring processors
> (== ranks)) on each process/rank. There are no halo regions; instead, the
> original application exchanges processor-boundary data via data streams.
> No global addressing is available at this point. The underlying mesh is
> decomposed using Scotch.
>
It sounds like you are thus unable to explicitly assemble a sparse matrix
of type MAT{SEQ,MPI}AIJ.
That severely limits your choice of "out-of-the-box" preconditioners
available through PETSc.
If your code represented the halo parts and you stored the assembled
sub-domain operator, I suspect you could use MATIS, which would give you
access to PCBDDC (roughly along the lines of the sketch below).
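Roughly, that route might look like the fragment below. This is only a sketch
under the assumption that you can provide, per rank, the assembled sub-domain
entries and a local-to-global numbering; N, n_local and l2g_idx are
hypothetical placeholders, and error checking is omitted.

#include <petscksp.h>

ISLocalToGlobalMapping l2g;
Mat                    A;    /* global, unassembled (MATIS) operator */
KSP                    ksp;
PC                     pc;

/* l2g_idx[i] = global index of local index i on this rank (length n_local) */
ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, n_local, l2g_idx,
                             PETSC_COPY_VALUES, &l2g);
MatCreateIS(PETSC_COMM_WORLD, 1, PETSC_DECIDE, PETSC_DECIDE, N, N, l2g, l2g, &A);
/* insert the sub-domain entries with MatSetValuesLocal() (local indices), then: */
MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

KSPCreate(PETSC_COMM_WORLD, &ksp);
KSPSetOperators(ksp, A, A);
KSPGetPC(ksp, &pc);
PCSetType(pc, PCBDDC);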
>
> Is this data useable/sufficient to solve a linear system with petsc?
>
You can wrap your action of y = A x in a MATSHELL and feed that to any
Krylov method.
A Krylov method without a preconditioner is pretty useless, though. At the
very least you should implement MatGetDiagonal; then you could use
GMRES/Jacobi, as sketched below.
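As a minimal sketch of that idea (error checking omitted): UserCtx, MyMult
and MyGetDiagonal are hypothetical placeholders for whatever your application
already uses to apply the operator and access its diagonal.

#include <petscksp.h>

typedef struct {
  /* whatever the application needs to apply A and access its diagonal */
  void *app_data;
} UserCtx;

static PetscErrorCode MyMult(Mat A, Vec x, Vec y)
{
  UserCtx *user;

  PetscFunctionBeginUser;
  MatShellGetContext(A, &user);
  /* call back into the existing code here to compute y = A x */
  PetscFunctionReturn(0);
}

static PetscErrorCode MyGetDiagonal(Mat A, Vec d)
{
  UserCtx *user;

  PetscFunctionBeginUser;
  MatShellGetContext(A, &user);
  /* copy the diagonal of A into d so that Jacobi preconditioning can work */
  PetscFunctionReturn(0);
}

/* n is the local row count; comm could be PETSC_COMM_SELF for one
   independent system per rank */
PetscErrorCode SolveWithShell(MPI_Comm comm, PetscInt n, UserCtx *user, Vec b, Vec x)
{
  Mat A;
  KSP ksp;
  PC  pc;

  PetscFunctionBeginUser;
  MatCreateShell(comm, n, n, PETSC_DETERMINE, PETSC_DETERMINE, user, &A);
  MatShellSetOperation(A, MATOP_MULT, (void (*)(void))MyMult);
  MatShellSetOperation(A, MATOP_GET_DIAGONAL, (void (*)(void))MyGetDiagonal);

  KSPCreate(comm, &ksp);
  KSPSetOperators(ksp, A, A);
  KSPSetType(ksp, KSPGMRES);
  KSPGetPC(ksp, &pc);
  PCSetType(pc, PCJACOBI);
  KSPSolve(ksp, b, x);

  KSPDestroy(&ksp);
  MatDestroy(&A);
  PetscFunctionReturn(0);
}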
Thanks,
Dave
> If so, what data structure would be required by petsc?
>
> Klaus
>
> On Monday, 3 December 2018 at 15:12:58 CET, Dave May <
> dave.mayhem23 at gmail.com> wrote:
>
>
>
>
> On Mon, 3 Dec 2018 at 13:52, Klaus Burkart via petsc-users <
> petsc-users at mcs.anl.gov> wrote:
>
> Hello,
>
> I want to solve a CFD case; after decomposition, a sub-matrix is
> allocated to each process. The example below shows how the data is
> allocated to the processes (the sample data includes only the lower parts
> of the matrices). Row and column addresses are local.
>
> What petsc program setup/concept can be used to solve multiple linear
> systems in parallel, one on each process?
>
>
> Create your KSP, Mat and Vec objects using PETSC_COMM_SELF.
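
For concreteness, a minimal sketch of that setup, assuming each rank
assembles and solves its own sequential system; error checking and the
actual MatSetValues/VecSetValues calls are omitted.

#include <petscksp.h>

/* one independent n x n system per rank: everything lives on PETSC_COMM_SELF */
PetscErrorCode SolveLocalSystem(PetscInt n, Vec *x_out)
{
  Mat A;
  Vec x, b;
  KSP ksp;

  PetscFunctionBeginUser;
  MatCreate(PETSC_COMM_SELF, &A);
  MatSetSizes(A, n, n, n, n);
  MatSetType(A, MATSEQAIJ);
  MatSetUp(A);
  /* MatSetValues(A, ...) with the local row/column indices, then: */
  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

  VecCreateSeq(PETSC_COMM_SELF, n, &b);
  VecDuplicate(b, &x);
  /* VecSetValues(b, ...) with the local rhs, then VecAssemblyBegin/End */

  KSPCreate(PETSC_COMM_SELF, &ksp);
  KSPSetOperators(ksp, A, A);
  KSPSetFromOptions(ksp);  /* choose solver/preconditioner from the command line */
  KSPSolve(ksp, b, x);

  *x_out = x;
  KSPDestroy(&ksp);
  MatDestroy(&A);
  VecDestroy(&b);
  PetscFunctionReturn(0);
}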
>
>
>
>
> Klaus
>
> Sample raw data:
>
> process matrix section value row column
> 0 lower 1,80E-05 1 0
> 0 lower 1,80E-05 5 0
> 0 lower 1,96E-05 2 1
> 0 lower 2,20E-05 6 1
> 0 lower 1,96E-05 3 2
> 0 lower 2,20E-05 7 2
> 0 lower 1,96E-05 4 3
> 0 lower 2,20E-05 8 3
> 0 lower 2,20E-05 9 4
> 0 lower 2,20E-05 6 5
> 0 lower 1,96E-05 10 5
> 0 lower 2,44E-05 7 6
> 0 lower 2,44E-05 11 6
> 0 lower 2,44E-05 8 7
> 0 lower 2,44E-05 12 7
> 0 lower 2,44E-05 9 8
> 0 lower 2,44E-05 13 8
> 0 lower 2,44E-05 14 9
> 0 lower 2,20E-05 11 10
> 0 lower 1,96E-05 15 10
> 0 lower 2,44E-05 12 11
> 0 lower 2,44E-05 16 11
> 0 lower 2,44E-05 13 12
> 0 lower 2,44E-05 17 12
> 0 lower 2,44E-05 14 13
> 0 lower 2,44E-05 18 13
> 0 lower 2,44E-05 19 14
> 0 lower 2,20E-05 16 15
> 0 lower 1,96E-05 20 15
> 0 lower 2,44E-05 17 16
> 0 lower 2,44E-05 21 16
> 0 lower 2,44E-05 18 17
> 0 lower 2,44E-05 22 17
> 0 lower 2,44E-05 19 18
> 0 lower 2,44E-05 23 18
> 0 lower 2,44E-05 24 19
> 0 lower 2,20E-05 21 20
> 0 lower 2,44E-05 22 21
> 0 lower 2,44E-05 23 22
> 0 lower 2,44E-05 24 23
> 1 lower 2,20E-05 1 0
> 1 lower 1,96E-05 5 0
> 1 lower 2,44E-05 2 1
> 1 lower 2,44E-05 6 1
> 1 lower 2,44E-05 3 2
> 1 lower 2,44E-05 7 2
> 1 lower 2,44E-05 4 3
> 1 lower 2,44E-05 8 3
> 1 lower 2,44E-05 9 4
> 1 lower 2,20E-05 6 5
> 1 lower 1,96E-05 10 5
> 1 lower 2,44E-05 7 6
> 1 lower 2,44E-05 11 6
> 1 lower 2,44E-05 8 7
> 1 lower 2,44E-05 12 7
> 1 lower 2,44E-05 9 8
> 1 lower 2,44E-05 13 8
> 1 lower 2,44E-05 14 9
> 1 lower 2,20E-05 11 10
> 1 lower 1,96E-05 15 10
> 1 lower 2,44E-05 12 11
> 1 lower 2,44E-05 16 11
> 1 lower 2,44E-05 13 12
> 1 lower 2,44E-05 17 12
> 1 lower 2,44E-05 14 13
> 1 lower 2,44E-05 18 13
> 1 lower 2,44E-05 19 14
> 1 lower 2,20E-05 16 15
> 1 lower 1,80E-05 20 15
> 1 lower 2,44E-05 17 16
> 1 lower 2,20E-05 21 16
> 1 lower 2,44E-05 18 17
> 1 lower 2,20E-05 22 17
> 1 lower 2,44E-05 19 18
> 1 lower 2,20E-05 23 18
> 1 lower 2,20E-05 24 19
> 1 lower 1,80E-05 21 20
> 1 lower 1,96E-05 22 21
> 1 lower 1,96E-05 23 22
> 1 lower 1,96E-05 24 23
> 2 lower 1,96E-05 1 0
> 2 lower 2,20E-05 5 0
> 2 lower 1,96E-05 2 1
> 2 lower 2,20E-05 6 1
> 2 lower 1,96E-05 3 2
> 2 lower 2,20E-05 7 2
> 2 lower 1,80E-05 4 3
> 2 lower 2,20E-05 8 3
> 2 lower 1,80E-05 9 4
> 2 lower 2,44E-05 6 5
> 2 lower 2,44E-05 10 5
> 2 lower 2,44E-05 7 6
> 2 lower 2,44E-05 11 6
> 2 lower 2,44E-05 8 7
> 2 lower 2,44E-05 12 7
> 2 lower 2,20E-05 9 8
> 2 lower 2,44E-05 13 8
> 2 lower 1,96E-05 14 9
> 2 lower 2,44E-05 11 10
> 2 lower 2,44E-05 15 10
> 2 lower 2,44E-05 12 11
> 2 lower 2,44E-05 16 11
> 2 lower 2,44E-05 13 12
> 2 lower 2,44E-05 17 12
> 2 lower 2,20E-05 14 13
> 2 lower 2,44E-05 18 13
> 2 lower 1,96E-05 19 14
> 2 lower 2,44E-05 16 15
> 2 lower 2,44E-05 20 15
> 2 lower 2,44E-05 17 16
> 2 lower 2,44E-05 21 16
> 2 lower 2,44E-05 18 17
> 2 lower 2,44E-05 22 17
> 2 lower 2,20E-05 19 18
> 2 lower 2,44E-05 23 18
> 2 lower 1,96E-05 24 19
> 2 lower 2,44E-05 21 20
> 2 lower 2,44E-05 22 21
> 2 lower 2,44E-05 23 22
> 2 lower 2,20E-05 24 23
> 3 lower 2,44E-05 1 0
> 3 lower 2,44E-05 5 0
> 3 lower 2,44E-05 2 1
> 3 lower 2,44E-05 6 1
> 3 lower 2,44E-05 3 2
> 3 lower 2,44E-05 7 2
> 3 lower 2,20E-05 4 3
> 3 lower 2,44E-05 8 3
> 3 lower 1,96E-05 9 4
> 3 lower 2,44E-05 6 5
> 3 lower 2,44E-05 10 5
> 3 lower 2,44E-05 7 6
> 3 lower 2,44E-05 11 6
> 3 lower 2,44E-05 8 7
> 3 lower 2,44E-05 12 7
> 3 lower 2,20E-05 9 8
> 3 lower 2,44E-05 13 8
> 3 lower 1,96E-05 14 9
> 3 lower 2,44E-05 11 10
> 3 lower 2,44E-05 15 10
> 3 lower 2,44E-05 12 11
> 3 lower 2,44E-05 16 11
> 3 lower 2,44E-05 13 12
> 3 lower 2,44E-05 17 12
> 3 lower 2,20E-05 14 13
> 3 lower 2,44E-05 18 13
> 3 lower 1,96E-05 19 14
> 3 lower 2,44E-05 16 15
> 3 lower 2,20E-05 20 15
> 3 lower 2,44E-05 17 16
> 3 lower 2,20E-05 21 16
> 3 lower 2,44E-05 18 17
> 3 lower 2,20E-05 22 17
> 3 lower 2,20E-05 19 18
> 3 lower 2,20E-05 23 18
> 3 lower 1,80E-05 24 19
> 3 lower 1,96E-05 21 20
> 3 lower 1,96E-05 22 21
> 3 lower 1,96E-05 23 22
> 3 lower 1,80E-05 24 23
>
>