[petsc-users] Parallelism of the Mat.convert() function

Miguel Angel Salazar de Troya miguel.salazar at corintis.com
Tue Apr 23 11:13:08 CDT 2024


Hello,

The following code produces a different result depending on how many
processors I use. With one processor, the final MPIAIJ matrix is formed
correctly:

row 0: (0, 1.)  (1, 2.)  (2, 3.)  (3, 4.)  (4, -1.)
row 1: (0, 5.)  (1, 6.)  (2, 7.)  (3, 8.)  (4, -2.)
row 2: (0, 9.)  (1, 10.)  (2, 11.)  (3, 12.)  (4, -3.)
row 3: (0, 13.)  (1, 14.)  (2, 15.)  (3, 16.)  (4, -4.)
row 4: (0, 17.)  (1, 18.)  (2, 19.)  (3, 20.)  (4, -5.)
row 5: (0, 21.)  (1, 22.)  (2, 23.)  (3, 24.)  (4, -6.)
row 6: (0, 25.)  (1, 26.)  (2, 27.)  (3, 28.)  (4, -7.)

With two processors, however, the column matrix is inserted between the columns of the first block:

row 0: (0, 1.)  (1, 2.)  (2, -1.)  (3, 3.)  (4, 4.)
row 1: (0, 5.)  (1, 6.)  (2, -2.)  (3, 7.)  (4, 8.)
row 2: (0, 9.)  (1, 10.)  (2, -3.)  (3, 11.)  (4, 12.)
row 3: (0, 13.)  (1, 14.)  (2, -4.)  (3, 15.)  (4, 16.)
row 4: (0, 17.)  (1, 18.)  (2, -5.)  (3, 19.)  (4, 20.)
row 5: (0, 21.)  (1, 22.)  (2, -6.)  (3, 23.)  (4, 24.)
row 6: (0, 25.)  (1, 26.)  (2, -7.)  (3, 27.)  (4, 28.)

Am I perhaps not building the nested matrix correctly? I am using the
Firedrake PETSc fork. Can you reproduce this?

Thanks,
Miguel


```python
import numpy as np
from petsc4py import PETSc
from petsc4py.PETSc import COMM_WORLD


# 7x4 dense block that becomes the (0, 0) block of the nest
input_array = np.array(
    [
        [1, 2, 3, 4],
        [5, 6, 7, 8],
        [9, 10, 11, 12],
        [13, 14, 15, 16],
        [17, 18, 19, 20],
        [21, 22, 23, 24],
        [25, 26, 27, 28],
    ],
    dtype=np.float64,
)


# Build the 7x4 AIJ block; let PETSc decide the local row/column ownership
n_11_global_rows, n_11_global_cols = input_array.shape
size = ((None, n_11_global_rows), (None, n_11_global_cols))
mat = PETSc.Mat().createAIJ(size=size, comm=COMM_WORLD)
mat.setUp()
mat.setValues(range(n_11_global_rows), range(n_11_global_cols), input_array)
mat.assemblyBegin()
mat.assemblyEnd()
mat.view()

# Build the 7x1 column block; setValues flattens the values row by row,
# so the 1x7 array fills the single column with -1 ... -7
input_array = np.array([[-1, -2, -3, -4, -5, -6, -7]], dtype=np.float64)
global_rows, global_cols = input_array.T.shape
size = ((None, global_rows), (None, global_cols))
mat_2 = PETSc.Mat().createAIJ(size=size, comm=COMM_WORLD)
mat_2.setUp()
mat_2.setValues(range(global_rows), range(global_cols), input_array)
mat_2.assemblyBegin()
mat_2.assemblyEnd()

# Nest the two blocks side by side: N = [mat | mat_2], a 7x5 matrix
N = PETSc.Mat().createNest([[mat, mat_2]], comm=COMM_WORLD)
N.assemblyBegin()
N.assemblyEnd()

PETSc.Sys.Print(f"N sizes: {N.getSize()}")

# Convert the nest to a plain MPIAIJ matrix and print it
N.convert("mpiaij").view()
```
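
In case it helps to pin down where the reordering happens, this is a small diagnostic that could be run right after the script above (just a sketch on my side: I am assuming getNestISs() is the petsc4py call that exposes the index sets MatNest uses for the global row/column numbering):

```python
# Sketch of a diagnostic (not part of the original reproducer), assuming
# Mat.getNestISs() returns the (row, column) index sets that the nest uses
# to map each block into the global numbering. Runs after the script above.
isrows, iscols = N.getNestISs()
for isets, label in ((isrows, "row"), (iscols, "col")):
    for i, iset in enumerate(isets):
        # Each rank prints its local portion of the block's index set
        PETSc.Sys.syncPrint(f"block {i} {label} indices: {iset.getIndices()}")
PETSc.Sys.syncFlush()
```

Comparing this output for one and two ranks should show whether the interleaving already appears in the nest's column index sets or only after convert().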


-- 

Miguel Angel Salazar de Troya
Head of Software Engineering


EPFL Innovation Park Building C
1015 Lausanne
Email: miguel.salazar at corintis.com
Website: http://www.corintis.com