[petsc-users] storage of parallel dense matrices and (anti)symmetric matrices
Jed Brown
jedbrown at mcs.anl.gov
Thu Mar 22 15:17:47 CDT 2012
2012/3/22 Gao Bin <bin.gao at uit.no>
> "The parallel dense matrices are partitioned by rows across the
> processors, so that each local rectangular submatrix is stored in the dense
> format described above."
>
> Does it mean each processor will have several contiguous rows and all
> columns of the matrix? If yes, why do we need to specify "n" -- the number
> of local columns -- when calling MatCreateMPIDense?
>
Interpret the local column size n as the local size of the Vec that the Mat
will be applied to.
> I am sorry to raise this simple question, since I have read the manual and
> tutorials, but I have not found a clear answer. Moreover, the reason I am
> asking this question is that I would like to use PETSc for matrix
> operations, but the elements of the matrices need to be calculated by my own
> code. If I know the distribution of the matrix, I could let each processor
> calculate and set only its local values (the rows and columns owned by the
> processor itself) for efficiency.
>
> My second question is whether PETSc provides symmetric and anti-symmetric
> matrices. I have read the manual, and the answer seems to be no. Am I right?
See the SBAIJ format (it is sparse).
With a parallel dense matrix, there isn't any point in using a symmetric
format unless you also use a different distribution of the entries.