[petsc-users] petsc4py: parallel matrix-vector multiplication

Robert Speck r.speck at fz-juelich.de
Sun May 6 10:52:39 CDT 2018


OK, thanks, I see. This is basically what happens in the poisson2d.py
example, too, right?

I tried it with the shell matrix (?) used in the poisson2d example and
it works right away, but then I fail to see how to make use of
matrix-based preconditioners for KSP (see my original message).
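
A minimal sketch of one way around that (the class name Laplace2D, the grid
size and the choice of GAMG are illustrative placeholders, not code from the
demos): KSP accepts a second, assembled matrix via setOperators which is used
only for building the preconditioner, so a shell matrix can define the
operator while an assembled DMDA matrix feeds the PC.

from petsc4py import PETSc

class Laplace2D(object):
    """Matrix-free 5-point Laplacian acting on DMDA global vectors."""
    def __init__(self, da):
        self.da = da
    def mult(self, mat, x, y):
        da = self.da
        mx, my = da.getSizes()
        xl = da.createLocalVec()
        da.globalToLocal(x, xl)              # fetch ghost values from neighbors
        xa = da.getVecArray(xl)              # ghosted array, global (i, j) indexing
        ya = da.getVecArray(y)               # owned part of the result
        (xs, xe), (ys, ye) = da.getRanges()
        for i in range(xs, xe):
            for j in range(ys, ye):
                v = 4.0 * xa[i, j]
                if i > 0:      v -= xa[i - 1, j]
                if i < mx - 1: v -= xa[i + 1, j]
                if j > 0:      v -= xa[i, j - 1]
                if j < my - 1: v -= xa[i, j + 1]
                ya[i, j] = v

da = PETSc.DMDA().create([32, 32], stencil_width=1)
xg = da.createGlobalVec()

A = PETSc.Mat().createPython([xg.getSizes(), xg.getSizes()], comm=da.comm)
A.setPythonContext(Laplace2D(da))
A.setUp()

P = da.createMatrix()        # assembled matrix with the same parallel layout
# ... set the same stencil entries in P and call P.assemble() ...

ksp = PETSc.KSP().create(comm=da.comm)
ksp.setOperators(A, P)       # A defines the action, P feeds the preconditioner
ksp.getPC().setType('gamg')  # any matrix-based PC can now work with P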

Thanks again!
-Robert-


On 06.05.18 16:52, Jed Brown wrote:
> Robert Speck <r.speck at fz-juelich.de> writes:
> 
>> Thanks for your reply and help. Yes, this is going to be a PDE solver
>> for structured grids. The first goal would be IDC (or Crank-Nicolson)
>> for the heat equation, which would require both solving a linear system
>> and application of the matrix.
>>
>> The code I wrote for testing parallel matrix-vector multiplication can
>> be found here:
>> https://github.com/Parallel-in-Time/pySDC/blob/petsc/pySDC/playgrounds/PETSc/playground_matmult.py
>>
>> Both vectors and matrix come from the DMDA, but I guess filling them is
>> done in a wrong way? Or do I need to convert global/natural vectors to
>> local ones somewhere?
> 
> Global and Natural are not the same (see user's manual for details).
> The matrix acts on a Global vector.  See
> petsc4py/demo/bratu3d/bratu3d.py for examples of efficiently setting
> values (and computing residuals) using Global vectors.  It should be
> simpler/cleaner code than you currently have.
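
A minimal sketch of that pattern in 2D (grid size and fill values are made
up; bratu3d.py does the same, more completely, in 3D):

from petsc4py import PETSc

da = PETSc.DMDA().create([32, 32], stencil_width=1)

x = da.createGlobalVec()   # PETSc (global) ordering -- what the matrix acts on
b = da.createGlobalVec()

xa = da.getVecArray(x)     # write access using global (i, j) indices
(xs, xe), (ys, ye) = da.getRanges()
for i in range(xs, xe):    # each rank fills only the entries it owns
    for j in range(ys, ye):
        xa[i, j] = 1.0 * i + 2.0 * j

A = da.createMatrix()
# ... set the stencil entries of A and call A.assemble() ...
A.mult(x, b)               # A, x and b share the DMDA's parallel layout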
> 
>>
>> Best
>> -Robert-
>>
>>
>>
>> On 06.05.18 14:44, Dave May wrote:
>>> On Sun, 6 May 2018 at 10:40, Robert Speck <r.speck at fz-juelich.de
>>> <mailto:r.speck at fz-juelich.de>> wrote:
>>>
>>>     Hi!
>>>
>>>     I would like to do a matrix-vector multiplication (besides using linear
>>>     solvers and so on) with petsc4py. I took the matrix from this example
>>>
>>>     (https://bitbucket.org/petsc/petsc4py/src/master/demo/kspsolve/petsc-mat.py)
>>>
>>>
>>> This example only produces a matrix, and from the code the matrix it
>>> produces is identical whether run in serial or in parallel.
>>>
>>>
>>>
>>>     and applied it to a PETSc Vector. All works well in serial, but in
>>>     parallel (in particular if ordering becomes relevant) the resulting
>>>     vector looks very different. 
>>>
>>>
>>> Given this, the way you defined the x vector in y = A x must be
>>> different when run on 1 versus N MPI ranks.
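
(A sketch of one way to guarantee a compatible layout, assuming an assembled
matrix A already exists: let the matrix create its own vectors.)

x, y = A.createVecs()   # x: right vector (A acts on it), y: left vector (result)
x.set(1.0)              # the same mathematical vector on any number of ranks
A.mult(x, y)            # y = A x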
>>>
>>>
>>>     Using the shell matrix of this example
>>>     (https://bitbucket.org/petsc/petsc4py/src/master/demo/poisson2d/poisson2d.py)
>>>     helps, but then I cannot use matrix-based preconditioners for KSP
>>>     directly (right?). I also tried using DMDA for creating vectors and
>>>     matrix and for taking care of their ordering (which seems to be my
>>>     problem here), but that did not help either.
>>>
>>>     So, my question is this: How do I do easy parallel matrix-vector
>>>     multiplication with petsc4py in a way that allows me to use parallel
>>>     linear solvers etc. later on? I want to deal with spatial decomposition
>>>     as little as possible. 
>>>
>>>
>>> What's the context - are you solving a PDE?
>>>
>>> Assuming you are using your own grid object (e.g. as you might have if
>>> solving a PDE), and assuming you are not solving a 1D problem, you
>>> actually have to "deal" with the spatial decomposition; otherwise
>>> performance could be quite terrible, even for something simple like a
>>> 5-point Laplacian on a structured grid in 2D.
>>>
>>>     What data structures should I use? DMDA or
>>>     PETSc.Vec() and PETSc.Mat() or something else?
>>>
>>>
>>> The mat-vec product is not causing you a problem. Your issue appears to
>>> be that you do not have a way to label entries in a vector in a
>>> consistent manner.
>>>
>>> What's the objective? Are you solving a PDE? If yes, structured grid? If
>>> yes again, use the DMDA. It takes care of all the local-to-global and
>>> global-to-local mapping you need.
>>>
>>> Thanks,
>>>   Dave
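
A minimal sketch of how that looks with petsc4py (grid size made up, matrix
entries omitted), including the natural-ordering scatter that is handy when
comparing output across different numbers of ranks:

from petsc4py import PETSc

da = PETSc.DMDA().create([32, 32], stencil_width=1)

A = da.createMatrix()      # parallel layout chosen by the DMDA
x = da.createGlobalVec()
y = da.createGlobalVec()
# ... fill A and x, then call A.assemble() ...

A.mult(x, y)               # y = A x, consistent on 1 or N ranks

# to inspect y in lexicographic (natural) ordering, scatter it first
yn = da.createNaturalVec()
da.globalToNatural(y, yn)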
>>>
>>>
>>>
>>>     Thanks!
>>>     -Robert-
>>>
>>

-- 
Dr. Robert Speck
Juelich Supercomputing Centre
Institute for Advanced Simulation
Forschungszentrum Juelich GmbH
52425 Juelich, Germany

Tel: +49 2461 61 1644
Fax: +49 2461 61 6656

Email:   r.speck at fz-juelich.de
Website: http://www.fz-juelich.de/ias/jsc/speck_r
PinT:    http://www.fz-juelich.de/ias/jsc/pint


