[petsc-users] Matload with given parallel layout
Karin&NiKo
niko.karin at gmail.com
Wed Oct 25 11:59:31 CDT 2017
Barry, Matt,
Thank you very much for that clarification. My question is mainly motivated
by the need to transfer test matrices from an industrial code to a lighter,
more agile PETSc code, in order to run numerical experiments. Since we are
dealing with DDM (domain decomposition methods), I would like to ensure
that we keep the parallel layout of our matrices.
Regards,
Nicolas
2017-10-25 16:55 GMT+02:00 Barry Smith <bsmith at mcs.anl.gov>:
>
> On Oct 25, 2017, at 5:13 AM, Matthew Knepley <knepley at gmail.com> wrote:
>
> On Wed, Oct 25, 2017 at 6:10 AM, Karin&NiKo <niko.karin at gmail.com> wrote:
> Thank you very much for your answer.
> Is there an example in the PETSc tests that shows how to prescribe the
> layout of the Mat that is passed to MatLoad()? I see how to specify a local
> size with MatSetSizes(), but I do not see how to assign an entry of the Mat
> to a given process...
>
> Mat objects only have a contiguous division of rows across processes, so
> specifying the size of each partition is all you can do.
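> A minimal sketch of prescribing the layout before MatLoad() (the file name
> "matrix.dat" and the local row count are placeholders; your DDM code would
> supply the actual local sizes on each rank):

```c
#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat            A;
  PetscViewer    viewer;
  PetscInt       nlocal = 100;   /* placeholder: this rank's local row count from your DDM layout */
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD, "matrix.dat", FILE_MODE_READ, &viewer);CHKERRQ(ierr);
  ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
  ierr = MatSetType(A, MATAIJ);CHKERRQ(ierr);
  /* Set the local sizes BEFORE MatLoad(); the global sizes are then derived.
     MatLoad() reads rows 0..n-1 in order into this contiguous layout. */
  ierr = MatSetSizes(A, nlocal, nlocal, PETSC_DETERMINE, PETSC_DETERMINE);CHKERRQ(ierr);
  ierr = MatLoad(A, viewer);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}
```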
>
>
> And indeed if you want the same layout as your MatViewed matrix then
> this is exactly what you need. MatView just stores rows 0 to n-1 in
> order and MatLoad reads rows 0 to n-1 in order. If you want a different
> parallel ordering, for example one based on a partitioning, then you load
> the matrix and use MatPartitioningCreate() and then MatCreateSubMatrix() to
> redistribute the matrix (note that in this case the "sub matrix" has the
> same size as the original matrix).
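> A sketch of that redistribution, assuming A is a square MPIAIJ matrix that
> has already been loaded on PETSC_COMM_WORLD and using A's own nonzero
> pattern as the adjacency graph:

```c
/* Sketch: repartition a loaded matrix A and redistribute it into B. */
MatPartitioning part;
IS              is, isrows;
Mat             B;
PetscErrorCode  ierr;

ierr = MatPartitioningCreate(PETSC_COMM_WORLD, &part);CHKERRQ(ierr);
ierr = MatPartitioningSetAdjacency(part, A);CHKERRQ(ierr);   /* use A's graph */
ierr = MatPartitioningSetFromOptions(part);CHKERRQ(ierr);
ierr = MatPartitioningApply(part, &is);CHKERRQ(ierr);        /* is[i] = new owner of local row i */
/* Gather, on each process, the global row indices it will own under the new partition */
ierr = ISBuildTwoSided(is, NULL, &isrows);CHKERRQ(ierr);
/* The "sub" matrix has the same global size as A but the new row distribution */
ierr = MatCreateSubMatrix(A, isrows, isrows, MAT_INITIAL_MATRIX, &B);CHKERRQ(ierr);
ierr = ISDestroy(&is);CHKERRQ(ierr);
ierr = ISDestroy(&isrows);CHKERRQ(ierr);
ierr = MatPartitioningDestroy(&part);CHKERRQ(ierr);
```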
>
> Note we don't recommend saving big matrices to files and then
> reloading them for solves etc. This is not scalable; it is better to write
> your applications so the entire process doesn't require saving and loading
> matrices.
>
> Barry
>
>
>
> Matt
>
> Thanks,
> Nicolas
>
> 2017-10-25 11:40 GMT+02:00 Matthew Knepley <knepley at gmail.com>:
> On Wed, Oct 25, 2017 at 4:32 AM, Karin&NiKo <niko.karin at gmail.com> wrote:
> Dear PETSc team,
>
> I have a code that creates a parallel matrix based on domain
> decomposition. I serialize this matrix with MatView.
> Then, I would like to reload it with MatLoad, but not letting PETSc decide
> the parallel layout, rather using the original distribution of the degrees
> of freedom.
> Does MatLoad help to do that? Shall I use the IS of the local-to-global
> mapping?
>
> You can prescribe the layout of the Mat that is passed to MatLoad().
> However, if you are asking that the original
> layout be stored in the file, we do not do that. It could be made to work
> by writing the layout to the .info file and
> then having an option that calls MatSetSizes() on load. It should not be
> much code.
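> For reference, a sketch of the save side with a binary viewer (the file
> name "matrix.dat" is a placeholder). Note that the parallel layout itself
> is not stored in the file, so the local sizes must be recorded separately
> if you want to restore them at load time:

```c
/* Sketch: write a parallel matrix A to a binary file with MatView().
   A later MatLoad() on the same file reads rows 0..n-1 in order. */
PetscViewer    viewer;
PetscErrorCode ierr;

ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD, "matrix.dat", FILE_MODE_WRITE, &viewer);CHKERRQ(ierr);
ierr = MatView(A, viewer);CHKERRQ(ierr);
ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
```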
>
> Thanks,
>
> Matt
>
> I look forward to hearing from you,
> Nicolas
>
>
>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
>
>
>
>