[petsc-users] storage of parallel dense matrices and (anti)symmetric matrices

Gao Bin bin.gao at uit.no
Thu Mar 22 15:13:49 CDT 2012


Hi there,

I am a beginner with PETSc, and I am confused about the parallel distribution of matrices in PETSc. For instance, regarding the parallel dense matrices, as described in docs/developers.pdf,

"The parallel dense matrices are partitioned by rows across the processors, so that each local rectangular submatrix is stored in the dense format described above."

Does this mean each processor will have several contiguous rows and all columns of the matrix? If yes, why do we need to specify "n" -- the number of local columns -- when calling MatCreateMPIDense?

I am sorry to raise such a simple question; I have read the manual and tutorials, but I have not found a clear answer. The reason I am asking is that I would like to use PETSc for matrix operations, but the elements of the matrices need to be calculated by my own code. If I knew the distribution of the matrix, I could let each processor calculate and set only its local values (the rows and columns owned by that processor) for efficiency.

My second question is whether PETSc provides symmetric and anti-symmetric matrix formats. I have read the manual, and the answer seems to be no. Am I right?

Thank you in advance.

Cheers

Gao
