[petsc-users] petsc4py: parallel matrix-vector multiplication

Dave May dave.mayhem23 at gmail.com
Sun May 6 07:44:13 CDT 2018


On Sun, 6 May 2018 at 10:40, Robert Speck <r.speck at fz-juelich.de> wrote:

> Hi!
>
> I would like to do a matrix-vector multiplication (besides using linear
> solvers and so on) with petsc4py. I took the matrix from this example
>
> (https://bitbucket.org/petsc/petsc4py/src/master/demo/kspsolve/petsc-mat.py)


This example only produces a matrix, and from the code, the matrix it produces
is identical whether run in serial or in parallel.



> and applied it to a PETSc Vector. All works well in serial, but in
> parallel (in particular if ordering becomes relevant) the resulting
> vector looks very different.


Given this, the way you defined the x vector in y = A x must be different
when running on 1 versus N MPI ranks.
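
For example, here is a minimal, untested sketch of what I mean: create x and y
with parallel layouts compatible with A, and fill x by global index so that
y = A x comes out the same on any number of ranks. (The small tridiagonal
matrix is just a stand-in for the 2D operator assembled in petsc-mat.py, and
the values x_i = i are placeholder data.)

from petsc4py import PETSc

n = 16  # placeholder global size

# stand-in test matrix, assembled in parallel over its owned rows
A = PETSc.Mat().createAIJ([n, n], nnz=(3, 2))
rstart, rend = A.getOwnershipRange()
for i in range(rstart, rend):
    A.setValue(i, i, 2.0)
    if i > 0:
        A.setValue(i, i - 1, -1.0)
    if i < n - 1:
        A.setValue(i, i + 1, -1.0)
A.assemble()

# vectors with parallel layouts compatible with A
# (x lives in the domain of A, y in its range)
x, y = A.createVecs()

# fill x by *global* index so its contents do not depend on
# how the entries are distributed across the ranks
start, end = x.getOwnershipRange()
for i in range(start, end):
    x.setValue(i, float(i))  # placeholder data: x_i = i
x.assemble()

A.mult(x, y)  # y = A x, the same on 1 or N ranks
y.view()      # entries are printed in global order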


Using the shell matrix of this example
> (
> https://bitbucket.org/petsc/petsc4py/src/master/demo/poisson2d/poisson2d.py
> )
> helps, but then I cannot use matrix-based preconditioners for KSP
> directly (right?). I also tried using DMDA for creating vectors and
> matrix and for taking care of their ordering (which seems to be my
> problem here), but that did not help either.
>
> So, my question is this: How do I do easy parallel matrix-vector
> multiplication with petsc4py in a way that allows me to use parallel
> linear solvers etc. later on? I want to deal with spatial decomposition
> as little as possible.


What's the context - are you solving a PDE?

Assuming you are using your own grid object (e.g. as you might if solving a
PDE), and assuming you are not solving a 1D problem, you actually have to
"deal" with the spatial decomposition, otherwise performance can be quite
terrible - even for something as simple as a 5-point Laplacian on a
structured grid in 2D.

> What data structures should I use? DMDA or
> PETSc.Vec() and PETSc.Mat() or something else?


The mat-vec product is not causing your problem. Your issue appears to be
that you do not have a way to label the entries of a vector that is
consistent across different numbers of ranks.

What's the objective? Are you solving a PDE? If yes, structured grid? If
yes again, use the DMDA. It takes care of all the local-to-global and
global-to-local mapping you need.
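
For instance, something along these lines should work (an untested sketch;
the 64x64 grid, the 5-point Laplacian with eliminated boundary points, and
the constant right-hand side are just placeholders): build the operator
through a DMDA, do the mat-vec, and hand the same matrix to KSP.

from petsc4py import PETSc

# 2D structured grid; the DMDA owns the parallel decomposition
da = PETSc.DMDA().create([64, 64], stencil_width=1)
mx, my = da.getSizes()

# matrix and vectors with the DMDA's parallel layout
A = da.createMatrix()
x = da.createGlobalVec()
y = da.createGlobalVec()

# assemble a 5-point Laplacian using (i, j) stencil indexing;
# the DMDA does the local-to-global translation for us
row = PETSc.Mat.Stencil()
col = PETSc.Mat.Stencil()
(xs, xe), (ys, ye) = da.getRanges()
for j in range(ys, ye):
    for i in range(xs, xe):
        row.index = (i, j)
        A.setValueStencil(row, row, 4.0)
        for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            ii, jj = i + di, j + dj
            if 0 <= ii < mx and 0 <= jj < my:  # skip points outside the grid
                col.index = (ii, jj)
                A.setValueStencil(row, col, -1.0)
A.assemble()

# matrix-vector product with the same data structures
x.set(1.0)
A.mult(x, y)  # y = A x

# matrix-based solver on the same operator; solution should be ~1 everywhere
ksp = PETSc.KSP().create()
ksp.setOperators(A)
ksp.setFromOptions()
ksp.solve(y, x)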

Thanks,
  Dave


>
> Thanks!
> -Robert-
>
> --
> Dr. Robert Speck
> Juelich Supercomputing Centre
> Institute for Advanced Simulation
> Forschungszentrum Juelich GmbH
> 52425 Juelich, Germany
>
> Tel: +49 2461 61 1644
> Fax: +49 2461 61 6656
>
> Email:   r.speck at fz-juelich.de
> Website: http://www.fz-juelich.de/ias/jsc/speck_r
> PinT:    http://www.fz-juelich.de/ias/jsc/pint