[petsc-users] Newbie question : sequential vs. parallel and matrix creation

lixin chu lixin_chu at yahoo.com
Sat Feb 18 16:58:20 CST 2017


Hello,
Some newbie questions I have regarding matrix creation; thank you for any help:
1. Is it correct to say that a sequential matrix is created in one process (for example, the root process) and then distributed to all processes with MatAssemblyBegin and MatAssemblyEnd? Or does a sequential matrix only work when running with a single MPI process?
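To make question 1 concrete, here is a rough, untested sketch of the two cases I have in mind (the 100x100 size, the AIJ type and the per-row nonzero guesses are just placeholders):

/* Sketch for question 1: "sequential" vs. "parallel" matrix as I
   currently understand them (error checking omitted). */
#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat Aseq, Ampi;

  PetscInitialize(&argc, &argv, NULL, NULL);

  /* Case A: a sequential matrix created on PETSC_COMM_SELF; I think
     each rank that creates it gets its own independent copy. */
  MatCreateSeqAIJ(PETSC_COMM_SELF, 100, 100, 10, NULL, &Aseq);

  /* Case B: a parallel matrix, one global object whose rows are
     split across the ranks of PETSC_COMM_WORLD. */
  MatCreateAIJ(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, 100, 100,
               10, NULL, 10, NULL, &Ampi);

  MatDestroy(&Aseq);
  MatDestroy(&Ampi);
  PetscFinalize();
  return 0;
}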
2. For parallel matrix creation, each process sets its own values, so do I need to provide the data for each process on every machine?
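For question 2, this is the pattern I think I am supposed to follow (untested sketch; the 100x100 size and the diagonal-only values are made up): each process asks for its ownership range and only inserts the rows it owns.

/* Sketch for question 2: each rank only sets the rows it owns. */
#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat         A;
  PetscInt    rstart, rend, i, col;
  PetscScalar one = 1.0;

  PetscInitialize(&argc, &argv, NULL, NULL);

  MatCreateAIJ(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, 100, 100,
               10, NULL, 10, NULL, &A);

  /* Which global rows does this rank own? */
  MatGetOwnershipRange(A, &rstart, &rend);
  for (i = rstart; i < rend; i++) {
    col = i;                       /* placeholder: diagonal entry only */
    MatSetValues(A, 1, &i, 1, &col, &one, INSERT_VALUES);
  }

  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

  MatDestroy(&A);
  PetscFinalize();
  return 0;
}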
3. MatSetValues(Mat mat, PetscInt m, const PetscInt idxm[], PetscInt n, const PetscInt idxn[], const PetscScalar v[], InsertMode addv). According to the manual (page 59): "This routine inserts or adds a logically dense subblock of dimension m*n into the matrix ..."
    I am not sure whether extracting the non-zero elements and forming a 'dense block' of data from a large sparse matrix is efficient. My original matrix data is column-major, so I am thinking of creating and loading the matrix column by column, with n = 1 and MAT_ROW_ORIENTED set to FALSE (roughly like the sketch after question 4 below). Is that efficient? I think I also need to pre-allocate memory, but the API for a parallel matrix, MatMPIAIJSetPreallocation(), requires the non-zero counts for the DIAGONAL and OFF-DIAGONAL portions separately. This seems to add more work when converting my sparse matrix to the PETSc format...
4. Ideally, I would like to load the matrix data in the main process only, then distribute it to all other processes. What is the best way to do this?
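Here is a rough, untested sketch of the column-by-column loading I have in mind for question 3; the 10x10 size, the preallocation counts and the 1.0 values are placeholders for my real column-major data:

/* Sketch for question 3: insert one column at a time (n = 1) with
   MAT_ROW_ORIENTED switched off; values/counts are placeholders. */
#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat          A;
  PetscInt     N = 10, rstart, rend, m_local, i, j;
  PetscInt    *rows;
  PetscScalar *colvals;

  PetscInitialize(&argc, &argv, NULL, NULL);

  MatCreate(PETSC_COMM_WORLD, &A);
  MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, N, N);
  MatSetType(A, MATMPIAIJ);
  /* This is the part I asked about: the diagonal-block and
     off-diagonal-block counts have to be given separately. */
  MatMPIAIJSetPreallocation(A, 10, NULL, 10, NULL);

  /* The option I mentioned: the value arrays I pass are column oriented. */
  MatSetOption(A, MAT_ROW_ORIENTED, PETSC_FALSE);

  MatGetOwnershipRange(A, &rstart, &rend);
  m_local = rend - rstart;
  PetscMalloc1(m_local, &rows);
  PetscMalloc1(m_local, &colvals);
  for (i = 0; i < m_local; i++) rows[i] = rstart + i;

  for (j = 0; j < N; j++) {
    /* In the real code, colvals[] would hold column j of my
       column-major data, restricted to the locally owned rows. */
    for (i = 0; i < m_local; i++) colvals[i] = 1.0;
    MatSetValues(A, m_local, rows, 1, &j, colvals, INSERT_VALUES);
  }

  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

  PetscFree(rows);
  PetscFree(colvals);
  MatDestroy(&A);
  PetscFinalize();
  return 0;
}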
5. MatView and MatLoad
    MatView seems to create one file with data for all processes (I only tested with one machine):
        Vec Object: 4 MPI processes
          type: mpi
        Process [0]
        0.
        Process [1]
        1.
        Process [2]
        2.
        Process [3]
        3.
    So do I have to manually distribute this file to all machines?
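For questions 4 and 5, this is roughly what I have been trying (untested sketch; 'matrix.dat' is a made-up file name and the diagonal fill is a placeholder): write the assembled matrix with a binary viewer and read it back with MatLoad on PETSC_COMM_WORLD.

/* Sketch for questions 4 and 5: one binary file written collectively,
   then loaded back into a distributed matrix. */
#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat         A, B;
  PetscViewer viewer;
  PetscInt    rstart, rend, i;
  PetscScalar one = 1.0;

  PetscInitialize(&argc, &argv, NULL, NULL);

  /* Build and assemble a small placeholder matrix. */
  MatCreateAIJ(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, 10, 10,
               10, NULL, 10, NULL, &A);
  MatGetOwnershipRange(A, &rstart, &rend);
  for (i = rstart; i < rend; i++)
    MatSetValues(A, 1, &i, 1, &i, &one, INSERT_VALUES);
  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

  /* Write: one binary file, produced collectively. */
  PetscViewerBinaryOpen(PETSC_COMM_WORLD, "matrix.dat", FILE_MODE_WRITE, &viewer);
  MatView(A, viewer);
  PetscViewerDestroy(&viewer);

  /* Read back on PETSC_COMM_WORLD; I assume this is what
     distributes the rows across the ranks? */
  MatCreate(PETSC_COMM_WORLD, &B);
  MatSetType(B, MATMPIAIJ);
  PetscViewerBinaryOpen(PETSC_COMM_WORLD, "matrix.dat", FILE_MODE_READ, &viewer);
  MatLoad(B, viewer);
  PetscViewerDestroy(&viewer);

  MatDestroy(&A);
  MatDestroy(&B);
  PetscFinalize();
  return 0;
}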


Many thanks again!

rgds
lixin

