[petsc-users] Distribution of columns on mutliple ranks
Matthew Knepley
knepley at gmail.com
Tue Apr 14 08:58:37 CDT 2015
On Tue, Apr 14, 2015 at 4:11 AM, Florian Lindner <mailinglists at xgm.de>
wrote:
> Hello,
>
> given I have this piece of python code:
>
> from mpi4py import MPI
> from petsc4py import PETSc
>
> rank = MPI.COMM_WORLD.Get_rank()
> sizes = [4, 5]
> n = sizes[rank]
>
> A = PETSc.Mat()
> A.create()
> # set the local sizes, let PETSc determine the global sizes
> A.setSizes(((n, PETSc.DETERMINE), (n, PETSc.DETERMINE)))
> A.setFromOptions(); A.setUp()
>
> print("Global Size = ", A.getSize())
> print("Local Size = ", A.getLocalSize())
>
>
> and I run it with mpirun -n 2 and -mat_type dense (this, I think, should
> not be relevant). getSize() returns 9x9 for both ranks, of course.
>
> getLocalSize returns 4x4 and 5x5.
>
> I understand that A is distributed like this:
>
> / 4x4  ?  \
> \  ?  5x5 /
>
>
> If I instead gave ( (PETSC_DETERMINE, 9), (PETSC_DETERMINE, 9) ) to
> setSizes, it would look like this:
>
> / 4 x 9 \
> \ 5 x 9 /
>
> because PETSc uses row-based partitioning.
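A short sketch of that setSizes variant (an editorial illustration, assuming petsc4py/mpi4py and a two-rank run; the exact local split is chosen by PETSc):

from mpi4py import MPI
from petsc4py import PETSc

# Fix the global size at 9 x 9 and let PETSc determine the local split.
B = PETSc.Mat()
B.create()
B.setSizes(((PETSc.DETERMINE, 9), (PETSc.DETERMINE, 9)))
B.setFromOptions(); B.setUp()

rank = MPI.COMM_WORLD.Get_rank()
print(rank, "local size", B.getLocalSize())      # PETSc-chosen local rows/cols
print(rank, "owns rows", B.getOwnershipRange())  # contiguous row range owned by this rank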
>
> What happens when I set local rows and local cols, where do the parts that
> I marked with ? live?
>
> Thanks, ;-)
PETSc always uses row partitioning, no matter what arguments are set for the
local columns: each row is stored entirely on the rank that owns it, so the
blocks you marked with ? live on the rank owning their rows. The local column
size only determines the parallel layout of the vectors the matrix can be
applied to.
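A minimal petsc4py sketch of this point (an editorial illustration, assuming mpi4py/petsc4py and a two-rank run): each row is owned entirely by one rank, while the local column size only fixes the layout of the vector the matrix is applied to.

from mpi4py import MPI
from petsc4py import PETSc

rank = MPI.COMM_WORLD.Get_rank()
n = [4, 5][rank]

A = PETSc.Mat()
A.create()
A.setSizes(((n, PETSc.DETERMINE), (n, PETSc.DETERMINE)))
A.setFromOptions(); A.setUp()
A.assemble()  # assemble the (still empty) matrix so vectors can be created from its layout

print(rank, "owns rows", A.getOwnershipRange())   # (0, 4) on rank 0, (4, 9) on rank 1

x = A.createVecRight()  # input of A.mult(): its layout follows the column sizes
y = A.createVecLeft()   # output of A.mult(): its layout follows the row sizes
print(rank, "x local size", x.getLocalSize(), "y local size", y.getLocalSize())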
Thanks,
Matt
>
> Florian
>
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener