[petsc-users] Load a dense matrix, distribute it over several MPI nodes and retrieve it afterwards
Pierre Jolivet
pierre at joliv.et
Fri Dec 4 06:16:19 CST 2020
> On 4 Dec 2020, at 1:13 PM, Matthew Knepley <knepley at gmail.com> wrote:
>
> On Fri, Dec 4, 2020 at 6:51 AM Roland Richter <roland.richter at ntnu.no> wrote:
> Hei,
>
> is it possible to fill a distributed dense matrix with existing data on rank 0, distribute it to all involved nodes, and afterwards retrieve it such that it can be stored in a single matrix on rank 0 again? The background of this question is the following thought process:
>
> I generate the matrix locally on node 0
> I do not understand why you would ever want to do this.
> I distribute it to all involved nodes
> Just create your distributed matrix, but set the values from rank 0.
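> Roughly, an untested sketch (the names and the value expression are placeholders, not your code):
>
> Mat A;
> PetscInt i, j;
> const PetscInt N = 5; /* global size */
> /* collective: every rank must call the creation and assembly routines */
> MatCreateDense(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, N, N, NULL, &A);
> if (rank == 0) {
>   /* only rank 0 inserts values; assembly ships each entry to its owning rank */
>   for (i = 0; i < N; ++i)
>     for (j = 0; j < N; ++j)
>       MatSetValue(A, i, j, local_mat(i, j), INSERT_VALUES);
> }
> MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
> MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);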
> I use the matrix for data processing/process the matrix data on all nodes
> I assume you change the matrix here.
> To store the data I need it back on node 0 in a single matrix
> You can use https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatCreateRedundantMatrix.html
> to easily create serial matrices from it. You can throw away any you do not want.
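> For your last step that could look something like this (again only a sketch; Fseq is a made-up name):
>
> Mat Fseq;
> /* nsubcomm == size, so every rank ends up with its own sequential copy of F */
> MatCreateRedundantMatrix(F, size, MPI_COMM_NULL, MAT_INITIAL_MATRIX, &Fseq);
> if (rank == 0) {
>   MatView(Fseq, PETSC_VIEWER_STDOUT_SELF); /* or write it to disk */
> }
> MatDestroy(&Fseq);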
>
> Thanks,
>
> Matt
> For testing I wrote the following code (using Armadillo, i.e. the arma namespace, to generate the initial matrices):
>
> Mat C, F;
> Vec x, y, z;
> PetscViewer viewer;
> PetscMPIInt rank, size;
> PetscInitialize(&argc, &args, (char*) 0, help);
>
> MPI_Comm_size(PETSC_COMM_WORLD, &size);
> MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
>
> PetscPrintf(PETSC_COMM_WORLD, "Number of processors = %d, rank = %d\n", size, rank);
>
> PetscViewerCreate(PETSC_COMM_WORLD, &viewer);
> PetscViewerSetType(viewer, PETSCVIEWERASCII);
> arma::cx_mat local_mat, local_zero_mat;
> const size_t matrix_size = 5;
> if(rank == 0) {
>     local_mat = arma::randu<arma::cx_mat>(matrix_size, matrix_size);
>     local_zero_mat = arma::zeros<arma::cx_mat>(matrix_size, matrix_size);
>     MatCreateDense(PETSC_COMM_WORLD, matrix_size, matrix_size, PETSC_DECIDE, PETSC_DECIDE, local_mat.memptr(), &C);
>     MatAssemblyBegin(C, MAT_FINAL_ASSEMBLY);
>     MatAssemblyEnd(C, MAT_FINAL_ASSEMBLY);
> }
> MatCreateDense(PETSC_COMM_WORLD, matrix_size, matrix_size, PETSC_DECIDE, PETSC_DECIDE, NULL, &F);
> if(rank == 0)
>     MatCopy(C, F, SAME_NONZERO_PATTERN);
> MatAssemblyBegin(F, MAT_FINAL_ASSEMBLY);
> MatAssemblyEnd(F, MAT_FINAL_ASSEMBLY);
> if(rank == 0) {
>     MatView(C, viewer);
>     std::cout << local_mat << '\n';
>     std::cout << local_zero_mat << '\n';
> }
> // How can I retrieve the distributed data from F back into C when running on multiple nodes?
>
> if(rank == 0) {
>     MatDestroy(&C);
> }
> MatDestroy(&F);
> PetscViewerDestroy(&viewer);
>
> The code works fine with mpirun -n 1, but with mpirun -n 2 it just stops after PetscPrintf() and hangs.
>
> Moreover, how can I retrieve the data from matrix F afterwards? For vectors there is the VecScatter function/class, but can I apply it to matrices (mpidense) too?
>
With respect to this very last question, this is currently an open issue: https://gitlab.com/petsc/petsc/-/issues/693.
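For Vec the gather you mention does work, e.g. (just a sketch, for reference):

VecScatter ctx;
Vec y_seq;
/* gathers the distributed Vec y into a sequential Vec on rank 0 */
VecScatterCreateToZero(y, &ctx, &y_seq);
VecScatterBegin(ctx, y, y_seq, INSERT_VALUES, SCATTER_FORWARD);
VecScatterEnd(ctx, y, y_seq, INSERT_VALUES, SCATTER_FORWARD);
VecScatterDestroy(&ctx);

but there is no equivalent for (MPI)Dense matrices yet, hence the issue above.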
Thanks,
Pierre
> Thanks!
>
> Cheers,
>
> Roland
>
>
>
> --
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/