[petsc-users] Load a dense matrix, distribute it over several MPI nodes and retrieve it afterwards

Roland Richter roland.richter at ntnu.no
Fri Dec 4 05:51:45 CST 2020


Hi,

is it possible to fill a distributed dense matrix from data that exists
only on rank 0, spread it over all involved nodes, and afterwards
retrieve it again so that it can be stored in a single matrix on rank 0?
The background of this question is the following workflow:

  * I generate the matrix locally on node 0
  * I distribute it to all involved nodes (a sketch of how I imagine
    this step follows this list)
  * I process the matrix data on all nodes
  * To store the data I need it back in a single matrix on node 0
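
For the distribution step, what I have in mind is roughly the sketch
below (only an idea, not tested: it assumes a complex-scalar PETSc build
so that PetscScalar matches arma::cx_double, and it reuses the variables
rank, matrix_size and local_mat from the test code further down). Rank 0
inserts every entry and the collective assembly moves the values to
their owning ranks:

    // Rank 0 pushes all entries of the Armadillo matrix into a parallel
    // dense matrix; the (collective) assembly distributes them.
    Mat F;
    const PetscInt N = matrix_size;
    MatCreateDense(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, N, N, NULL, &F);
    if(rank == 0) {
        for(PetscInt i = 0; i < N; ++i)
            for(PetscInt j = 0; j < N; ++j)
                MatSetValue(F, i, j, local_mat(i, j), INSERT_VALUES);
    }
    MatAssemblyBegin(F, MAT_FINAL_ASSEMBLY);   // called on every rank
    MatAssemblyEnd(F, MAT_FINAL_ASSEMBLY);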

For testing I wrote the following code (using Armadillo, i.e. the arma
namespace, to generate the initial matrices):

    Mat C, F;
    Vec x, y, z;
    PetscViewer viewer;
    PetscMPIInt rank, size;
    PetscInitialize(&argc, &args, (char*) 0, help);

    MPI_Comm_size(PETSC_COMM_WORLD, &size);
    MPI_Comm_rank(PETSC_COMM_WORLD, &rank);

    PetscPrintf(PETSC_COMM_WORLD, "Number of processors = %d, rank = %d\n", size, rank);

    PetscViewerCreate(PETSC_COMM_WORLD, &viewer);
    PetscViewerSetType(viewer, PETSCVIEWERASCII);
    arma::cx_mat local_mat, local_zero_mat;
    const size_t matrix_size = 5;
    if(rank == 0) {
        local_mat = arma::randu<arma::cx_mat>(matrix_size, matrix_size);
        local_zero_mat = arma::zeros<arma::cx_mat>(matrix_size, matrix_size);
        MatCreateDense(PETSC_COMM_WORLD, matrix_size, matrix_size, PETSC_DECIDE, PETSC_DECIDE, local_mat.memptr(), &C);
        MatAssemblyBegin(C, MAT_FINAL_ASSEMBLY);
        MatAssemblyEnd(C, MAT_FINAL_ASSEMBLY);
    }
    MatCreateDense(PETSC_COMM_WORLD, matrix_size, matrix_size, PETSC_DECIDE, PETSC_DECIDE, NULL, &F);
    if(rank == 0)
        MatCopy(C, F, SAME_NONZERO_PATTERN);
    MatAssemblyBegin(F, MAT_FINAL_ASSEMBLY);
    MatAssemblyEnd(F, MAT_FINAL_ASSEMBLY);
    if(rank == 0) {
        MatView(C, viewer);
        std::cout << local_mat << '\n';
        std::cout << local_zero_mat << '\n';
    }
    // How can I retrieve the distributed data from F to C again when running on multiple nodes?

    if(rank == 0) {
        MatDestroy(&C);
    }
    MatDestroy(&F);
    PetscViewerDestroy(&viewer);

Running the code with mpirun -n 1 works fine, but when running it with
mpirun -n 2 it just stops after PetscPrintf() and hangs.

Moreover, how can I retrieve the data from matrix F afterwards? For
vectors there is VecScatter, but can I apply something like that to
matrices (MPIDENSE) as well?
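
One idea I had for the gather step is the following sketch (again
untested and only an idea: it assumes the rows of F are distributed as
contiguous blocks in rank order, which I understand is the PETSc
default, and a complex-scalar build so that MPIU_SCALAR matches
arma::cx_double; it needs <vector> and <numeric>, and the names
local_rows and full_mat are just placeholders). Each rank reads back its
locally owned rows with MatGetValues and rank 0 collects them with
MPI_Gatherv:

    // Gather a distributed dense matrix F (global size N x N) back onto rank 0.
    PetscInt rstart, rend, N, Ncols;
    MatGetSize(F, &N, &Ncols);
    MatGetOwnershipRange(F, &rstart, &rend);
    const PetscInt nlocal = rend - rstart;

    // Column indices 0..N-1 and the indices of the locally owned rows
    std::vector<PetscInt> cols(N), rows(nlocal);
    std::iota(cols.begin(), cols.end(), 0);
    std::iota(rows.begin(), rows.end(), rstart);

    // Locally owned rows, row-major, as returned by MatGetValues
    std::vector<PetscScalar> local_rows(nlocal * N);
    MatGetValues(F, nlocal, rows.data(), N, cols.data(), local_rows.data());

    // Rank 0 collects the row blocks; counts/displacements in units of PetscScalar
    std::vector<int> counts(size), displs(size);
    int my_count = nlocal * N;
    MPI_Gather(&my_count, 1, MPI_INT, counts.data(), 1, MPI_INT, 0, PETSC_COMM_WORLD);
    if(rank == 0)
        for(int p = 0, offset = 0; p < size; ++p) { displs[p] = offset; offset += counts[p]; }

    // full_mat holds the whole matrix row-major on rank 0
    std::vector<PetscScalar> full_mat(rank == 0 ? N * N : 0);
    MPI_Gatherv(local_rows.data(), my_count, MPIU_SCALAR,
                full_mat.data(), counts.data(), displs.data(), MPIU_SCALAR,
                0, PETSC_COMM_WORLD);

Is there a more idiomatic way to do this directly in PETSc for MPIDENSE
matrices?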

Thanks!

Cheers,

Roland
