[petsc-users] kronecker products

Aron Ahmadia aron.ahmadia at kaust.edu.sa
Thu Nov 4 10:01:13 CDT 2010


It sounds like you will want to shift to matrix-free methods, but it
is hard to make that assessment without seeing your problem
formulation.  I can tell you that I have seen matrix-free
implementations run more efficiently than keeping the entire assembled
sparse system, but I cannot tell you whether it will be more efficient,
or worth the effort, for your problem.
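
For concreteness, here is a rough (untested) sketch of one way to do that
with a MatShell; the names (KronCtx, KronMult), the dense storage of the
1-D factors, and the serial setting are only illustrative assumptions.
It applies K = Ax \otimes Ay through the identity
(Ax \otimes Ay) vec(X) = vec(Ay * X * Ax^T), so the full nx*ny by nx*ny
operator is never assembled:

#include <petscmat.h>

/* Illustrative context for the shell: only the two 1-D factors are kept.
   Ax is nx-by-nx, Ay is ny-by-ny, both dense and row-major here;
   vectors of length nx*ny are ordered with the y-index varying fastest. */
typedef struct {
  PetscInt     nx,ny;
  PetscScalar *Ax,*Ay;
} KronCtx;

/* y = (Ax \otimes Ay) x, computed as Y = Ay * X * Ax^T */
static PetscErrorCode KronMult(Mat K,Vec x,Vec y)
{
  KronCtx           *ctx;
  const PetscScalar *xa;
  PetscScalar       *ya,*tmp,s;
  PetscInt           i,j,k,nx,ny;
  PetscErrorCode     ierr;

  PetscFunctionBegin;
  ierr = MatShellGetContext(K,(void**)&ctx);CHKERRQ(ierr);
  nx   = ctx->nx; ny = ctx->ny;
  ierr = VecGetArrayRead(x,&xa);CHKERRQ(ierr);
  ierr = VecGetArray(y,&ya);CHKERRQ(ierr);
  ierr = PetscMalloc1(nx*ny,&tmp);CHKERRQ(ierr);
  /* tmp = Ay * X : apply the 1-D operator Ay along each x-column */
  for (j=0; j<nx; j++)
    for (i=0; i<ny; i++) {
      s = 0.0;
      for (k=0; k<ny; k++) s += ctx->Ay[i*ny+k]*xa[j*ny+k];
      tmp[j*ny+i] = s;
    }
  /* Y = tmp * Ax^T : apply the 1-D operator Ax along the x-direction */
  for (j=0; j<nx; j++)
    for (i=0; i<ny; i++) {
      s = 0.0;
      for (k=0; k<nx; k++) s += ctx->Ax[j*nx+k]*tmp[k*ny+i];
      ya[j*ny+i] = s;
    }
  ierr = PetscFree(tmp);CHKERRQ(ierr);
  ierr = VecRestoreArray(y,&ya);CHKERRQ(ierr);
  ierr = VecRestoreArrayRead(x,&xa);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

In practice the 1-D factors would of course stay in their sparse (banded)
form and the loops would touch only the nonzero diagonals; the dense loops
above are just to keep the example short.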

Aron

On Thu, Nov 4, 2010 at 5:58 PM, Benjamin Sanderse <B.Sanderse at cwi.nl> wrote:
> Some matrices are just used to compute matrix-vector products, and some have to be solved for. That's basically it.
> The matrices are really very sparse; on the order of 10-20 diagonals (typically independent of problem size).
> As far as I know, reordering is hardly necessary, because I am using a structured grid.
>
> Ben
>
> ----- Original Message -----
> From: "Jed Brown" <jed at 59A2.org>
> To: "PETSc users list" <petsc-users at mcs.anl.gov>
> Sent: Thursday, November 4, 2010 3:50:43 PM
> Subject: Re: [petsc-users] kronecker products
>
> On Thu, Nov 4, 2010 at 09:43, Benjamin Sanderse <B.Sanderse at cwi.nl> wrote:
>
>> I am working on a CFD code which calculates differences, averages,
>> interpolations, etc. by computing matrix-vector and matrix-matrix products.
>> The basis here is formed by (very) sparse matrices and extension to more
>> dimensions is done with Kronecker products.
>> This works fine on a single processor; however, before implementing things
>> in parallel I have two questions:
>>
>> - Is there a way to compute Kronecker products efficiently in parallel with
>> PETSc?
>>
>
> You could store the constituent pieces in a MatShell.  What do you have to
> do with the result (multiply with, solve with, compute singular values of,
> etc)?  What is the relative size and sparsity of each piece?  Would you be
> willing to reorder unknowns in the vector (perhaps with a scatter) for a
> more efficient implementation?
>
> Jed
>
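
For what it is worth, hooking such a shell up for both the products and the
solves mentioned above could look roughly as follows (again only a sketch
with assumed names, reusing KronCtx and KronMult from the sketch earlier in
the thread, with petscksp.h included; since nothing is assembled, the KSP
needs an unpreconditioned method or a user-supplied PCSHELL):

  Mat            K;
  Vec            x,b;
  KSP            ksp;
  KronCtx        ctx;
  PetscInt       N;
  PetscErrorCode ierr;

  /* ... fill ctx.nx, ctx.ny, ctx.Ax, ctx.Ay ... */
  N = ctx.nx*ctx.ny;

  /* Shell matrix: stores only the 1-D factors, multiplies via KronMult */
  ierr = MatCreateShell(PETSC_COMM_SELF,N,N,N,N,&ctx,&K);CHKERRQ(ierr);
  ierr = MatShellSetOperation(K,MATOP_MULT,(void (*)(void))KronMult);CHKERRQ(ierr);

  ierr = VecCreateSeq(PETSC_COMM_SELF,N,&x);CHKERRQ(ierr);
  ierr = VecDuplicate(x,&b);CHKERRQ(ierr);

  /* Matrix-vector products (differences, averages, interpolations) */
  ierr = MatMult(K,x,b);CHKERRQ(ierr);

  /* Solves: the shell goes straight into a KSP, but preconditioning must
     be matrix-free as well (e.g. -pc_type none or a user PCSHELL). */
  ierr = KSPCreate(PETSC_COMM_SELF,&ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp,K,K);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
  ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);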

