[petsc-users] Block matrix data layout and access
Jed Brown
jedbrown at mcs.anl.gov
Thu Oct 25 08:52:23 CDT 2012
On Thu, Oct 25, 2012 at 8:43 AM, <Gregor.Matura at dlr.de> wrote:
> Hi!
>
> I'm rather new to PETSc and have gotten along on my own so far. However, I've
> now run into a situation where the tutorial does not provide enough detail,
> and going through every example that grep turns up for the possibly right
> routine would take almost forever. So I would be very happy to get some hints
> on where to start.
>
> My problem: read in a matrix in a special format, partition and
> redistribute it, solve.
>
> So far, I've read in my matrix, transformed it to match the input of
> MatCreateMPIAIJWithSplitArrays(), and finally, for a start, called KSPSolve().
>
Strongly recommend using MatSetValues[Blocked][Local]() instead of
MatCreateMPIAIJWithSplitArrays(). Just preallocate and set the values from
wherever you know them. This way, the same code will work immediately with
different matrix formats, and it's easier to transition to generating the
matrix on the fly (rather than reading from disk, which is always a
bottleneck).
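
For example, a rough and untested sketch of what that assembly could look like
(the argument names are made up for illustration; d_nnz/o_nnz are the
per-block-row nonzero block counts you would compute while reading your file,
and bi/bj/vals hold the locally owned 5x5 blocks):

#include <petscmat.h>

/* mlocal/Mglobal: local and global number of BLOCK rows (square matrix);
 * d_nnz/o_nnz:    nonzero blocks per local block row, diagonal/off-diagonal part;
 * nblocks,bi,bj:  global block row/column indices of the locally owned blocks;
 * vals:           25 scalars per block, row-major, stored consecutively. */
PetscErrorCode AssembleBlocks(MPI_Comm comm,PetscInt mlocal,PetscInt Mglobal,
                              const PetscInt d_nnz[],const PetscInt o_nnz[],
                              PetscInt nblocks,const PetscInt bi[],const PetscInt bj[],
                              const PetscScalar vals[],Mat *A)
{
  const PetscInt bs = 5;
  PetscInt       k;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = MatCreate(comm,A);CHKERRQ(ierr);
  ierr = MatSetSizes(*A,mlocal*bs,mlocal*bs,Mglobal*bs,Mglobal*bs);CHKERRQ(ierr);
  ierr = MatSetBlockSize(*A,bs);CHKERRQ(ierr);
  ierr = MatSetFromOptions(*A);CHKERRQ(ierr);   /* format chosen at run time */
  /* Preallocation shown for BAIJ; call the matching *SetPreallocation() for
     any other format you want to run with (calls for inactive formats are
     simply ignored). */
  ierr = MatMPIBAIJSetPreallocation(*A,bs,0,d_nnz,0,o_nnz);CHKERRQ(ierr);
  ierr = MatSeqBAIJSetPreallocation(*A,bs,0,d_nnz);CHKERRQ(ierr);
  for (k=0; k<nblocks; k++) {
    ierr = MatSetValuesBlocked(*A,1,&bi[k],1,&bj[k],&vals[25*k],INSERT_VALUES);CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(*A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(*A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}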
>
> Here comes the 'but': my matrix does have a "block" format:
>
> |a ... a|
> |a ... a|
> |b ... b|
> |b ... b|
> |c ... c|
> |c ... c|
>
> The matrix is sparse (by far not every block is set), every a, b, c, ... is a
> small 5x5 block, the 25 double values of each block are stored consecutively,
> processor 1 holds every a, #2 every b, and so on.
>
> The PETSc tutorial says (as far as I understood it) that PETSc's blocks are
> _logically_ different, but are stored in just this way.
>
Not sure which routine you are looking at. MATMPIBAIJ is a good format for
systems like this. When the method you intend to use is supported by BAIJ
matrices, they are generally faster by 30% or more. By using
MatSetValues-style matrix assembly, you can switch formats at run-time.
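
For instance, assuming your assembly code calls MatSetFromOptions() (as in the
sketch above), the same executable can be run with either format; hypothetical
command lines:

  mpiexec -n 4 ./yoursolver -mat_type mpibaij   # blocked 5x5 storage
  mpiexec -n 4 ./yoursolver -mat_type mpiaij    # plain scalar AIJ, for comparison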
> And: in some sense DMDA could be the right way to access the matrix data
> layout the way PETSc uses it (it would be good to achieve the best possible
> performance).
>
DMDA is a structured grid abstraction. Don't confuse 2D structured grids
with matrices.
>
> This said, my precise questions are: Is DMDACreate2d the right way to go?
> Does it match my data layout best with PETSc's layout? Which example file is
> best suited to read up on? Does this really speed up time to solution, or
> should I stick with the transformed non-block variant?
>
> TIA,
>
> Gregor Matura
>
> ——————————————————————————
> Deutsches Zentrum für Luft- und Raumfahrt e.V. (DLR)
> German Aerospace Center
>