[petsc-users] Block matrix data layout and access

Gregor.Matura at dlr.de
Mon Nov 5 07:10:41 CST 2012


 Strongly recommend using MatSetValues[Blocked][Local]() instead of MatCreateMPIAIJWithSplitArrays(). Just preallocate and set the values from wherever you know them. This way, the same code will work immediately with different matrix formats and it's easier to transition to generating the matrix on-the-fly (rather than reading from disk which is always a bottleneck).


Maybe I have not explained my situation well enough: I already have the matrix on disk. Generating the matrix on-the-fly would therefore either burden the file system (too many accesses in a short time) or the memory (data would have to be duplicated). I want to avoid both, so I would like to read the matrix in one go and pass the data to PETSc as conveniently as possible. As far as I can tell, there are two possibilities, each with certain drawbacks; maybe you could shed more light on both (convenience of the data layout, assembly time, solution speed, …).

1.) MatCreateBAIJ (including MatSetBlockSize and MatMPIBAIJSetPreallocation) -> MatSetLocalToGlobalMapping -> MatSetValuesBlockedLocal (a rough sketch follows below)
drawbacks:
- data is copied (thus roughly doubling memory use)
- ALL local rows have to be read in first just to get the information needed for preallocation
- but: MatSetValuesBlockedLocal expects a logically dense ("square") block of values, which is not suited to sparse matrices; in my case this means setting values (block) row by (block) row
- and: the value array layout does not match the expected block structure (cf. the example in the MatSetValuesBlocked documentation): my values are stored sequentially block by block, i.e. v[] = [1,2,5,6,3,4,7,8,...]
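For reference, here is a rough sketch of how I currently picture path 1.). All names (bs, mblocks, nlocalblocks, d_nnz, o_nnz, l2g, brow, bcol, v) are placeholders for whatever my file reader provides, and ISLocalToGlobalMappingCreate is called with the block-size argument of recent PETSc releases, so please correct me where I am wrong:

  Mat                    A;
  ISLocalToGlobalMapping map;
  PetscInt               bs, mblocks, nlocalblocks;   /* from the file header */
  PetscInt              *d_nnz, *o_nnz;               /* per-block-row preallocation, gathered while reading */
  PetscInt              *l2g, *brow, *bcol;           /* block indices from my partitioning */
  PetscScalar           *v;                           /* values, stored block after block (bs*bs each) */

  MatCreateBAIJ(PETSC_COMM_WORLD, bs, mblocks*bs, mblocks*bs,
                PETSC_DETERMINE, PETSC_DETERMINE, 0, d_nnz, 0, o_nnz, &A);

  ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, bs, mblocks, l2g,
                               PETSC_COPY_VALUES, &map);
  MatSetLocalToGlobalMapping(A, map, map);

  /* one block per call: for a single bs x bs block the dense (row-oriented)
     layout PETSc expects coincides with the block itself, so the
     block-sequential storage from the file can be passed directly */
  for (PetscInt k = 0; k < nlocalblocks; k++) {
    MatSetValuesBlockedLocal(A, 1, &brow[k], 1, &bcol[k], &v[k*bs*bs],
                             INSERT_VALUES);
  }
  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

Inserting one block per call at least avoids reshuffling my values into the dense row-oriented layout, but I do not know whether the per-call overhead becomes significant.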

2.) MatCreateMPIAIJWithSplitArrays (see the sketch after this list)
drawbacks:
? does not match PETSc's internal data layout -> slower?
- not blocked (is a conversion to MPIBAIJ possible?)
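And the corresponding sketch for 2.), again with placeholder names: mlocal/nlocal are the local sizes, and (d_i, d_j, d_a) / (o_i, o_j, o_a) are the CSR arrays of the diagonal and off-diagonal parts that I would split off while reading the file, with the index conventions as described in the man page:

  Mat          A, B;
  PetscInt     mlocal, nlocal;                 /* local row/column sizes */
  PetscInt    *d_i, *d_j, *o_i, *o_j;          /* split CSR index arrays */
  PetscScalar *d_a, *o_a;                      /* split CSR value arrays */

  /* the arrays are used directly (no copy), so they must remain valid
     for the lifetime of A */
  MatCreateMPIAIJWithSplitArrays(PETSC_COMM_WORLD, mlocal, nlocal,
                                 PETSC_DETERMINE, PETSC_DETERMINE,
                                 d_i, d_j, d_a, o_i, o_j, o_a, &A);

  /* a blocked copy could perhaps be obtained afterwards, but this again
     duplicates the data (and I am not sure how the block size is handled): */
  MatConvert(A, MATMPIBAIJ, MAT_INITIAL_MATRIX, &B);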

Regards,

Gregor Matura

——————————————————————————
Deutsches Zentrum für Luft- und Raumfahrt e.V. (DLR)
German Aerospace Center

