[petsc-users] Parallelism of the Mat.convert() function
Pierre Jolivet
pierre at joliv.et
Tue Apr 23 11:44:16 CDT 2024
The code is behaving as it should, IMHO.
Here is a way to have the Mat stored the same way independently of the number of processes.
[…]
global_rows, global_cols = input_array.T.shape
size = ((None, global_rows), (0 if COMM_WORLD.Get_rank() < COMM_WORLD.Get_size() - 1 else global_cols, global_cols))
[…]
I.e., you want to enforce that the lone column is stored by the last process; otherwise, it will be stored by the first one and interleaved with the rest of the (0,0) block.
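For completeness, a minimal sketch of how that size tuple slots into the construction of mat_2 from your script (variable names are taken from your example; the rank test is just an expanded form of the conditional above):

```python
import numpy as np
from petsc4py import PETSc
from petsc4py.PETSc import COMM_WORLD

# Same 1x7 block as in the original question.
input_array = np.array([[-1, -2, -3, -4, -5, -6, -7]], dtype=np.float64)
global_rows, global_cols = input_array.T.shape  # 7 rows, 1 column

# Put the single column of the (0,1) block on the last rank: the local column
# size is 0 on every process except the last one, which owns the whole column,
# so after convert("mpiaij") it ends up appended after the (0,0) block instead
# of interleaved with it.
on_last_rank = COMM_WORLD.Get_rank() == COMM_WORLD.Get_size() - 1
size = ((None, global_rows), (global_cols if on_last_rank else 0, global_cols))

mat_2 = PETSc.Mat().createAIJ(size=size, comm=COMM_WORLD)
mat_2.setUp()
mat_2.setValues(range(global_rows), range(global_cols), input_array)
mat_2.assemblyBegin()
mat_2.assemblyEnd()
```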
Thanks,
Pierre
> On 23 Apr 2024, at 6:13 PM, Miguel Angel Salazar de Troya <miguel.salazar at corintis.com> wrote:
>
> Hello,
>
> The following code returns a different answer depending on how many processors I use. With one processor, the last MPIAIJ matrix is correctly formed:
>
> row 0: (0, 1.) (1, 2.) (2, 3.) (3, 4.) (4, -1.)
> row 1: (0, 5.) (1, 6.) (2, 7.) (3, 8.) (4, -2.)
> row 2: (0, 9.) (1, 10.) (2, 11.) (3, 12.) (4, -3.)
> row 3: (0, 13.) (1, 14.) (2, 15.) (3, 16.) (4, -4.)
> row 4: (0, 17.) (1, 18.) (2, 19.) (3, 20.) (4, -5.)
> row 5: (0, 21.) (1, 22.) (2, 23.) (3, 24.) (4, -6.)
> row 6: (0, 25.) (1, 26.) (2, 27.) (3, 28.) (4, -7.)
>
> With two processors, though, the column matrix is placed in between:
>
> row 0: (0, 1.) (1, 2.) (2, -1.) (3, 3.) (4, 4.)
> row 1: (0, 5.) (1, 6.) (2, -2.) (3, 7.) (4, 8.)
> row 2: (0, 9.) (1, 10.) (2, -3.) (3, 11.) (4, 12.)
> row 3: (0, 13.) (1, 14.) (2, -4.) (3, 15.) (4, 16.)
> row 4: (0, 17.) (1, 18.) (2, -5.) (3, 19.) (4, 20.)
> row 5: (0, 21.) (1, 22.) (2, -6.) (3, 23.) (4, 24.)
> row 6: (0, 25.) (1, 26.) (2, -7.) (3, 27.) (4, 28.)
>
> Am I not building the nested matrix correctly, perhaps? I am using the Firedrake PETSc fork. Can you reproduce it?
>
> Thanks,
> Miguel
>
>
> ```python
> import numpy as np
> from petsc4py import PETSc
> from petsc4py.PETSc import COMM_WORLD
>
>
> input_array = np.array(
>     [
>         [1, 2, 3, 4],
>         [5, 6, 7, 8],
>         [9, 10, 11, 12],
>         [13, 14, 15, 16],
>         [17, 18, 19, 20],
>         [21, 22, 23, 24],
>         [25, 26, 27, 28],
>     ],
>     dtype=np.float64,
> )
>
>
> n_11_global_rows, n_11_global_cols = input_array.shape
> size = ((None, n_11_global_rows), (None, n_11_global_cols))
> mat = PETSc.Mat().createAIJ(size=size, comm=COMM_WORLD)
> mat.setUp()
> mat.setValues(range(n_11_global_rows), range(n_11_global_cols), input_array)
> mat.assemblyBegin()
> mat.assemblyEnd()
> mat.view()
>
> input_array = np.array([[-1, -2, -3, -4, -5, -6, -7]], dtype=np.float64)
> global_rows, global_cols = input_array.T.shape
> size = ((None, global_rows), (None, global_cols))
> mat_2 = PETSc.Mat().createAIJ(size=size, comm=COMM_WORLD)
> mat_2.setUp()
> mat_2.setValues(range(global_rows), range(global_cols), input_array)
> mat_2.assemblyBegin()
> mat_2.assemblyEnd()
>
> N = PETSc.Mat().createNest([[mat, mat_2]], comm=COMM_WORLD)
> N.assemblyBegin()
> N.assemblyEnd()
>
> PETSc.Sys.Print(f"N sizes: {N.getSize()}")
>
> N.convert("mpiaij").view()
> ```
>
>
> --
>
> Miguel Angel Salazar de Troya
> Head of Software Engineering
>
>
> EPFL Innovation Park Building C
> 1015 Lausanne
> Email: miguel.salazar at corintis.com
> Website: https://www.corintis.com