[petsc-users] Variable Block Row format in PETSc

Satish Balay balay at mcs.anl.gov
Tue May 14 17:25:56 CDT 2013


Sure - the internal representation in the link below is different -
but if the AIJ matrix is assembled as shown below, then the inode
code will detect that structure and use it.

Satish

>>>>>>>>>>>>>>>>>>>>
   0  1    2  3  4    5    6  7  8
  +------+---------+----+-------+
0 | 1  2 |         |  3 |       |
1 | 4  5 |         |  6 |       |
  +------+---------+----+-------+
2 |      | 7  8  9 | 10 |       |
  +------+---------+----+-------+
3 |      |         | 11 | 12 13 |
4 |      |         | 14 | 15 16 |
5 |      |         | 17 | 18 19 |
  +------+---------+----+-------+
6
<<<<<<<<<<<
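
Purely as a sketch (assuming the matrix above is 6 rows by 9 columns, and
standard error checking): the same structure assembled as a plain SeqAIJ
matrix with MatSetValues(). Rows 0-1 and rows 3-5 have identical nonzero
patterns, which is exactly what the inode check groups; running with -info
prints what it found. A second sketch after the quoted thread below outlines
keeping the parmetis row distribution in parallel.

>>>>>>>>>>>>>>>>>>>>
#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat            A;
  PetscErrorCode ierr;
  PetscInt       row;
  PetscInt       cols01[]  = {0, 1, 5};      /* rows 0-1 share this pattern */
  PetscInt       cols2[]   = {2, 3, 4, 5};   /* row 2                       */
  PetscInt       cols345[] = {5, 6, 7};      /* rows 3-5 share this pattern */
  PetscScalar    v[19]     = {1,2,3, 4,5,6, 7,8,9,10, 11,12,13, 14,15,16, 17,18,19};

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);CHKERRQ(ierr);
  ierr = MatCreateSeqAIJ(PETSC_COMM_SELF, 6, 9, 4, NULL, &A);CHKERRQ(ierr);

  row = 0; ierr = MatSetValues(A, 1, &row, 3, cols01,  &v[0],  INSERT_VALUES);CHKERRQ(ierr);
  row = 1; ierr = MatSetValues(A, 1, &row, 3, cols01,  &v[3],  INSERT_VALUES);CHKERRQ(ierr);
  row = 2; ierr = MatSetValues(A, 1, &row, 4, cols2,   &v[6],  INSERT_VALUES);CHKERRQ(ierr);
  row = 3; ierr = MatSetValues(A, 1, &row, 3, cols345, &v[10], INSERT_VALUES);CHKERRQ(ierr);
  row = 4; ierr = MatSetValues(A, 1, &row, 3, cols345, &v[13], INSERT_VALUES);CHKERRQ(ierr);
  row = 5; ierr = MatSetValues(A, 1, &row, 3, cols345, &v[16], INSERT_VALUES);CHKERRQ(ierr);

  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatView(A, PETSC_VIEWER_STDOUT_SELF);CHKERRQ(ierr);  /* run with -info to see the inode report */

  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return 0;
}
<<<<<<<<<<<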


On Tue, 14 May 2013, Jed Brown wrote:

> It's not the same data layout, but I recommend that you just use AIJ and
> let it optimize internally. Partition the graph first, then allocate the
> matrix, then compute the Jacobian entries and use MatSetValues.
> On May 14, 2013 5:11 PM, "Satish Balay" <balay at mcs.anl.gov> wrote:
> 
> > The AIJ matrix format internally supports the VBR layout listed below
> > [called inodes in PETSc].
> >
> > So I'm not sure what problem you are having.
> >
> > Satish
> >
> > On Tue, 14 May 2013, Longxiang Chen wrote:
> >
> > > VBR, as in this link, uses 6 arrays to represent a matrix:
> > >
> > > http://docs.oracle.com/cd/E19061-01/hpc.cluster5/817-0086-10/prog-sparse-support.html
> > >
> > > Each row is a vertex in the graph, and parmetis is used to partition the
> > > graph to minimize the number of cuts between different processors
> > > (reducing communication in the matrix-vector multiply). The matrix comes
> > > from a Jacobian calculation, and A and b are constructed from the
> > > Jacobian result (in VBR).
> > >
> > >
> > > Best regards,
> > > Longxiang Chen
> > >
> > > Do something every day that gets you closer to being done.
> > > --------------------------------------------------------------
> > > 465 Winston Chung Hall
> > > Computer Science Engineering
> > > University of California, Riverside
> > >
> > >
> > >
> > > On Tue, May 14, 2013 at 2:51 PM, Jed Brown <jedbrown at mcs.anl.gov> wrote:
> > >
> > > > What kind of VBR matrix? What are you partitioning using parmetis? A
> > > > mesh? The blocks of the matrix? How do you create the entries in the
> > > > matrix?
> > > > On May 14, 2013 4:36 PM, "Longxiang Chen" <suifengls at gmail.com> wrote:
> > > >
> > > >> To whom it may concern,
> > > >>
> > > >> I use parmetis to partition a mesh for a sparse matrix.
> > > >> Then I distribute the data to the appropriate processors according to
> > > >> the result of partition.
> > > >>
> > > >> The sparse matrix is stored in Variable Block Row(VBR) format.
> > > >> After the distribution, I want to call the PETSc KSP solver to solve
> > > >> Ax = b.
> > > >> I tried to convert VBR to AIJ or CSR format, but the data would be
> > > >> re-distributed.
> > > >>
> > > >> The ideal method is to keep the distribution result from parmetis.
> > > >> For example, after parmetis, processor 0 has 0, 1, 4, and processor 1
> > > >> has 2, 3, 5. I would like PETSc to keep this distribution while it
> > > >> solves Ax = b.
> > > >>
> > > >> Is there any way to call the KSP solver on a VBR-format matrix from PETSc?
> > > >> Or any suggestions for solving Ax = b?
> > > >>
> > > >> Thanks in advance.
> > > >>
> > > >> Regards,
> > > >> Longxiang Chen
> > > >>
> > > >>
> > >
> >
> >
> 
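
For the workflow recommended in the quoted thread (partition first, then
allocate, then compute the Jacobian entries and call MatSetValues), a rough
sketch of keeping the parmetis ownership when assembling in parallel: each
process passes the number of rows it owns as the local size, so PETSc never
redistributes them. The local size, preallocation counts, and entries below
are placeholders for whatever the application computes.

>>>>>>>>>>>>>>>>>>>>
#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat            A;
  Vec            x, b;
  KSP            ksp;
  PetscErrorCode ierr;
  PetscInt       nlocal = 3;   /* placeholder: rows this rank owns, per parmetis */
  PetscInt       rstart, rend, i;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);CHKERRQ(ierr);

  /* The local row count fixes the ownership range; the global size is the sum. */
  ierr = MatCreateAIJ(PETSC_COMM_WORLD, nlocal, nlocal, PETSC_DETERMINE,
                      PETSC_DETERMINE, 3, NULL, 2, NULL, &A);CHKERRQ(ierr);
  ierr = MatGetOwnershipRange(A, &rstart, &rend);CHKERRQ(ierr);

  /* Insert each locally owned row with global indices; a diagonal entry per
     row is used here only so the example runs end to end. */
  for (i = rstart; i < rend; i++) {
    PetscScalar d = 2.0;
    ierr = MatSetValues(A, 1, &i, 1, &i, &d, INSERT_VALUES);CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

  /* Vectors with the same row distribution as the matrix. */
  ierr = VecCreateMPI(PETSC_COMM_WORLD, nlocal, PETSC_DETERMINE, &b);CHKERRQ(ierr);
  ierr = VecDuplicate(b, &x);CHKERRQ(ierr);
  ierr = VecSet(b, 1.0);CHKERRQ(ierr);

  ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp, A, A);CHKERRQ(ierr);  /* PETSc before 3.5 also takes a MatStructure flag */
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
  ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);

  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  ierr = VecDestroy(&x);CHKERRQ(ierr);
  ierr = VecDestroy(&b);CHKERRQ(ierr);
  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return 0;
}
<<<<<<<<<<<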


