[petsc-users] Copy dense matrix into half part of another dense matrix
Pierre Jolivet
pierre at joliv.et
Mon Jan 27 13:19:31 CST 2025
Please always keep the list in CC.
The way you create A is not correct; I’ve attached a fixed code (ex1234.c).
If you want to keep your own distribution for A (and not the one associated with R_part), you’ll need to first call https://petsc.org/main/manualpages/Mat/MatCreateSubMatrix/ to redistribute A, and then do a MatCopy() of the resulting Mat into R_part.
Thanks,
Pierre
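For reference, the redistribution path described above could be sketched roughly like this (an untested fragment, not the attached code: error checking and setup are omitted, `Source` is a placeholder for the matrix you want to copy, and the use of NULL for the column IS assumes you want all columns kept):

```c
/* Hypothetical sketch: extract Source with the row layout of R_part,
   then copy the redistributed matrix into R_part.
   Assumes R_part came from MatDenseGetSubMatrix(). */
PetscInt rstart, rend;
IS       rows;
Mat      Redist;

PetscCall(MatGetOwnershipRange(R_part, &rstart, &rend));      /* target row layout */
PetscCall(ISCreateStride(PETSC_COMM_WORLD, rend - rstart, rstart, 1, &rows));
PetscCall(MatCreateSubMatrix(Source, rows, NULL /* all columns */, MAT_INITIAL_MATRIX, &Redist));
PetscCall(MatCopy(Redist, R_part, SAME_NONZERO_PATTERN));
PetscCall(MatDestroy(&Redist));
PetscCall(ISDestroy(&rows));
```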
$ /Volumes/Data/repositories/petsc/arch-darwin-c-debug-real/bin/mpirun -n 4 ./ex1234
Mat Object: 4 MPI processes
type: mpidense
0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00
0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00
0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00
0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00
0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00
0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00
0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00
0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00
Mat Object: 4 MPI processes
type: mpidense
2.6219599187040323e+00 1.9661197867318445e+00 1.5218640363910978e+00
3.5202261875977947e+00 3.6311893358251384e+00 2.2279492868785069e+00
2.7505403755038014e+00 3.1546072728892756e+00 1.8416294994524489e+00
2.4676055638467314e+00 2.3185625557889602e+00 2.0401666986599833e+00
0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00
0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00
0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00
0.0000000000000000e+00 0.0000000000000000e+00 0.0000000000000000e+00

> On 27 Jan 2025, at 6:53 PM, medane.tchakorom at univ-fcomte.fr wrote:
>
> Re:
>
> This is a small reproducible example using MatDenseGetSubMatrix.
>
>
> Command: petscmpiexec -n 4 ./example
>
> ==========================================================
>
> PetscInt nlines = 8; // rows
> PetscInt ncolumns = 3; // columns
> PetscInt random_size = 12;
> PetscMPIInt rank;
> PetscMPIInt size;
>
> // Initialize PETSc
> PetscInitialize(&argc, &args, NULL, NULL);
>
> MPI_Comm_rank(MPI_COMM_WORLD, &rank);
> MPI_Comm_size(MPI_COMM_WORLD, &size);
>
> // R_full with all values to zero
> Mat R_full;
> MatCreateDense(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, nlines, ncolumns, NULL, &R_full);
> MatZeroEntries(R_full);
> MatView(R_full, PETSC_VIEWER_STDOUT_WORLD);
>
> // Creating and setting A and S to rand values
> Mat A, S;
> MatCreateDense(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, nlines / 2, random_size, NULL, &A);
> MatCreateDense(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, random_size, ncolumns, NULL, &S);
> MatSetRandom(A, NULL);
> MatSetRandom(S, NULL);
>
> // Computing R_part
> Mat R_part;
> MatDenseGetSubMatrix(R_full, PETSC_DECIDE, nlines / 2, PETSC_DECIDE, PETSC_DECIDE, &R_part);
> MatMatMult(A, S, MAT_REUSE_MATRIX, PETSC_DEFAULT, &R_part);
>
> // Visualizing R_full
> MatDenseRestoreSubMatrix(R_full, &R_part);
> MatView(R_full, PETSC_VIEWER_STDOUT_WORLD);
>
> // Destroying matrices (R_part is set to NULL by MatDenseRestoreSubMatrix, so it must not be destroyed)
> MatDestroy(&A);
> MatDestroy(&S);
> MatDestroy(&R_full);
>
> PetscFinalize();
> return 0;
>
> ==========================================================
>
>
> Part of the error output contains….:
>
> "Cannot change/reset row sizes to 1 local 4 global after previously setting them to 2 local 4 global …"
>
>
>
>
> ==========================================================
>
> PetscInt nlines = 8; // rows
> PetscInt ncolumns = 3; // columns
> PetscInt random_size = 12;
> PetscMPIInt rank;
> PetscMPIInt size;
>
> // Initialize PETSc
> PetscInitialize(&argc, &args, NULL, NULL);
>
> MPI_Comm_rank(MPI_COMM_WORLD, &rank);
> MPI_Comm_size(MPI_COMM_WORLD, &size);
>
> // R_full with all values to zero
> Mat R_full;
> MatCreateDense(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, nlines, ncolumns, NULL, &R_full);
> MatZeroEntries(R_full);
> MatView(R_full, PETSC_VIEWER_STDOUT_WORLD);
>
> // Creating and setting A and S to rand values
> Mat A, S;
> MatCreateDense(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, nlines / 2, random_size, NULL, &A);
> MatCreateDense(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, random_size, ncolumns, NULL, &S);
> MatSetRandom(A, NULL);
> MatSetRandom(S, NULL);
>
> // Computing R_part
> Mat R_part;
> MatMatMult(A, S, MAT_INITIAL_MATRIX, PETSC_DEFAULT, &R_part);
> MatView(R_part, PETSC_VIEWER_STDOUT_WORLD);
>
> Mat R_sub;
> MatDenseGetSubMatrix(R_full, PETSC_DECIDE, nlines / 2, PETSC_DECIDE, PETSC_DECIDE, &R_sub);
>
> PetscScalar *storage = NULL;
> MatDenseGetArray(R_part, &storage);
> PetscScalar *storage_sub = NULL;
> MatDenseGetArray(R_sub, &storage_sub);
>
> // NB: this copies the *global* entry count and assumes both local arrays share the same leading dimension
> PetscArraycpy(storage_sub, storage, (nlines / 2) * ncolumns);
>
> MatDenseRestoreArray(R_part, &storage);
> MatDenseRestoreArray(R_sub, &storage_sub);
>
> MatDenseRestoreSubMatrix(R_full, &R_sub);
>
> MatView(R_full, PETSC_VIEWER_STDOUT_WORLD);
>
> // Destroying matrices
> MatDestroy(&A);
> MatDestroy(&S);
> MatDestroy(&R_part);
> MatDestroy(&R_full);
>
> PetscFinalize();
> return 0;
> ==========================================================
>
>
> The second example above now uses MatDenseGetArray.
>
>
>
> Please let me know if I need to clarify something.
>
>
>
>
> Thanks
> Medane
>
>
>> On 27 Jan 2025, at 16:26, Pierre Jolivet <pierre at joliv.et> wrote:
>>
>>
>>
>>> On 27 Jan 2025, at 3:52 PM, Matthew Knepley <knepley at gmail.com> wrote:
>>>
>>> On Mon, Jan 27, 2025 at 9:23 AM medane.tchakorom at univ-fcomte.fr wrote:
>>>> Re:
>>>>
>>>> MatDenseGetSubMatrix could in fact be the best alternative (cleaner code), but, as mentioned earlier, I would like the smaller matrix R_part to receive the result of a MatMatMult operation.
>>>>
>>>> MatMatMult(A, S, MAT_INITIAL_MATRIX, PETSC_DECIDE, &R_part);
>>>>
>>>> When trying to use MAT_REUSE_MATRIX, this gave an error (an expected error, I think). On the other hand, the MatDenseGetSubMatrix page mentions that "The output matrix is not redistributed by PETSc". So will R_part be a valid output matrix for MatMatMult?
>>>
>>> I believe it is, but I am not the expert.
>>>
>>> Pierre, can the SubMatrix be used as the output of a MatMatMult()?
>>
>> It should be, but there may be some limitations due to leading dimensions and whatnot.
>> By looking at just the single line of code we got from you, I can see at least one issue: it should be MAT_REUSE_MATRIX, not MAT_INITIAL_MATRIX (assuming you got R_part from MatDenseGetSubMatrix).
>> Feel free to share a (minimal) reproducer.
>>
>> Thanks,
>> Pierre
>>
>>> Thanks,
>>>
>>> Matt
>>>
>>>> Thanks
>>>>
>>>>
>>>>> On 27 Jan 2025, at 14:52, Matthew Knepley <knepley at gmail.com> wrote:
>>>>>
>>>>> On Mon, Jan 27, 2025 at 8:42 AM Pierre Jolivet <pierre at joliv.et> wrote:
>>>>>> > On 27 Jan 2025, at 2:23 PM, medane.tchakorom at univ-fcomte.fr wrote:
>>>>>> >
>>>>>> > Dear PETSc users,
>>>>>> >
>>>>>> > I hope this message finds you well. I don’t know if my question is relevant, but I am currently working with DENSE-type matrices and would like to copy one matrix R_part [n/2 x m] (the result of a MatMatMult operation) into another dense matrix R_full [n x m].
>>>>>> > Both matrices being on the same communicator, I would like to efficiently copy R_part into the first half of R_full.
>>>>>> > I have been using MatSetValues, but for large matrices the subsequent assembly operation is costly.
>>>>>>
>>>>>> Could you please share the output of -log_view as well as a single one of the files generated with -info dump (e.g., dump.0, the file associated with process #0)?
>>>>>> This shouldn’t be that costly, so there may be an option missing, like MAT_NO_OFF_PROC_ENTRIES.
>>>>>> Anyway, if you want to optimize this, the fastest way would be to call MatDenseGetArray[Read,Write]() and then do the necessary PetscArraycpy().
>>>>>
>>>>> The other alternative (which I think makes cleaner code) is to use
>>>>>
>>>>> https://petsc.org/main/manualpages/Mat/MatDenseGetSubMatrix/
>>>>>
>>>>> to create your R_part matrix. Then you are directly acting on the memory you want when assembling the smaller matrix.
>>>>>
>>>>> Thanks,
>>>>>
>>>>> Matt
>>>>>
>>>>>> Thanks,
>>>>>> Pierre
>>>>>>
>>>>>> > Could you please suggest some strategies or functions to do this efficiently?
>>>>>> >
>>>>>> > Thank you for your time and assistance.
>>>>>> >
>>>>>> > Best regards,
>>>>>> > Tchakorom Medane
>>>>>> >
>>>>>>
>>>>>
>>>>>
>>>>>
>>>>> --
>>>>> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
>>>>> -- Norbert Wiener
>>>>>
>>>>> https://www.cse.buffalo.edu/~knepley/
>>>>
>>>
>>>
>>>
>>> --
>>> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
>>> -- Norbert Wiener
>>>
>>> https://www.cse.buffalo.edu/~knepley/
-------------- next part --------------
A non-text attachment was scrubbed...
Name: ex1234.c
Type: application/octet-stream
Size: 1849 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20250127/df33b73a/attachment-0001.obj>