[petsc-users] Galerkin projection using petsc4py

Pierre Jolivet pierre at joliv.et
Thu Oct 5 06:22:11 CDT 2023


How about using ptap which will use MatPtAP?
It will be more efficient (and it will help you bypass the issue).
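
For instance, a minimal sketch with the A and Phi from your MFE below (ptap allocates the result itself, so no pre-created output matrix is needed):

A_prime = A.ptap(Phi)  # computes Phi^T * A * Phi via MatPtAP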

Thanks,
Pierre

> On 5 Oct 2023, at 1:18 PM, Thanasis Boutsikakis <thanasis.boutsikakis at corintis.com> wrote:
> 
> Sorry, forgot function create_petsc_matrix()
> 
> def create_petsc_matrix(input_array, sparse=True):
>     """Create a PETSc matrix from an input_array
> 
>     Args:
>         input_array (np array): 2D input array
>         sparse (bool, optional): Toggle for sparse or dense. Defaults to True.
> 
>     Returns:
>         PETSc mat: PETSc matrix
>     """
>     # The input array must be 2-dimensional
>     assert len(input_array.shape) == 2, "Input array should be 2-dimensional"
>     global_rows, global_cols = input_array.shape
> 
>     # Let PETSc decide both local sizes; fixing the local column size to
>     # global_cols would be invalid on more than one process
>     size = ((None, global_rows), (None, global_cols))
> 
>     # Create a sparse or dense matrix based on the 'sparse' argument
>     if sparse:
>         matrix = PETSc.Mat().createAIJ(size=size, comm=COMM_WORLD)
>     else:
>         matrix = PETSc.Mat().createDense(size=size, comm=COMM_WORLD)
>     matrix.setUp()
> 
>     local_rows_start, local_rows_end = matrix.getOwnershipRange()
> 
>     for i in range(local_rows_start, local_rows_end):
>         # Each process fills only the rows it owns
>         matrix.setValues(i, range(global_cols), input_array[i, :], addv=False)
> 
>     # Assemble the matrix to compute the final structure
>     matrix.assemblyBegin()
>     matrix.assemblyEnd()
> 
>     return matrix
> 
>> On 5 Oct 2023, at 13:09, Thanasis Boutsikakis <thanasis.boutsikakis at corintis.com> wrote:
>> 
>> Hi everyone,
>> 
>> I am trying a Galerkin projection (see MFE below) and I cannot get Phi.transposeMatMult(A, A1) to work. The error is
>> 
>>     Phi.transposeMatMult(A, A1)
>>   File "petsc4py/PETSc/Mat.pyx", line 1514, in petsc4py.PETSc.Mat.transposeMatMult
>> petsc4py.PETSc.Error: error code 56
>> [0] MatTransposeMatMult() at /Users/boutsitron/firedrake/src/petsc/src/mat/interface/matrix.c:10135
>> [0] MatProduct_Private() at /Users/boutsitron/firedrake/src/petsc/src/mat/interface/matrix.c:9989
>> [0] No support for this operation for this object type
>> [0] Call MatProductCreate() first
>> 
>> Do you know if these routines are exposed to petsc4py, or is there another way? I cannot get the MFE to work (neither in sequential nor in parallel).
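>> 
>> Would something like the following, where petsc4py creates the product matrix itself, be the intended usage? Just a sketch of what I mean:
>> 
>> A1 = Phi.transposeMatMult(A)  # let PETSc create A1 instead of passing a pre-created one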
>> 
>> """Experimenting with PETSc mat-mat multiplication"""
>> 
>> import time
>> 
>> import numpy as np
>> from colorama import Fore
>> from firedrake import COMM_SELF, COMM_WORLD
>> from firedrake.petsc import PETSc
>> from mpi4py import MPI
>> from numpy.testing import assert_array_almost_equal
>> 
>> from utilities import (
>>     Print,
>>     create_petsc_matrix,
>> )
>> 
>> nproc = COMM_WORLD.size
>> rank = COMM_WORLD.rank
>> 
>> # --------------------------------------------
>> # EXP: Galerkin projection of an mpi PETSc matrix A with an mpi PETSc matrix Phi
>> #  A' = Phi.T * A * Phi
>> # [k x k] <- [k x m] x [m x m] x [m x k]
>> # --------------------------------------------
>> 
>> m, k = 11, 7
>> # Generate the random numpy matrices
>> np.random.seed(0)  # sets the seed to 0
>> A_np = np.random.randint(low=0, high=6, size=(m, m))
>> Phi_np = np.random.randint(low=0, high=6, size=(m, k))
>> 
>> # Create A as an mpi matrix distributed on each process
>> A = create_petsc_matrix(A_np)
>> 
>> # Create Phi as an mpi matrix distributed on each process
>> Phi = create_petsc_matrix(Phi_np)
>> 
>> A1 = create_petsc_matrix(np.zeros((k, m)))
>> 
>> # A1 should contain the result of Phi^T * A after this call
>> Phi.transposeMatMult(A, A1)
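>> 
>> # Once the call above works, a sequential sanity check against numpy
>> # (getValues reads only locally owned rows, hence the nproc guard)
>> if nproc == 1:
>>     assert_array_almost_equal(A1.getValues(range(k), range(m)), Phi_np.T @ A_np)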
>> 
> 
