[petsc-users] mpiaijperm and mpiaijcrl
Gaetan Kenway
kenway at utias.utoronto.ca
Thu Aug 30 15:51:52 CDT 2012
Hi
I was wondering if anyone had any experience with using these new matrix
formats in PETSc. I have a block AIJ matrix (with block size 5) and tried
converting it to either of these types; it just thrashed memory for 10
minutes before I killed it. For reference, matrix assembly takes 13 seconds.
I then tried to construct the matrix directly with "MatCreateMPIAIJPERM"
or "MatCreateMPIAIJCRL". Neither of these functions is wrapped for
Fortran. Also, the documentation for "MatCreateMPIAIJCRL" is extremely
confusing: it says that comm is set to PETSC_COMM_SELF, which defeats the
purpose of an MPI matrix.
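In case it helps, here is roughly what I am attempting from Fortran as a
workaround: build the matrix through the generic interface and select the
specialized format by its type name, since the dedicated constructors are
not wrapped. This is only a sketch under my assumptions (the 'mpiaijcrl'
type string, and placeholder sizes nloc, d_nz, o_nz), not verified code:

```fortran
! Sketch (unverified): create a generic Mat and set the specialized
! type by name, avoiding the unwrapped MatCreateMPIAIJCRL constructor.
! nloc, d_nz, o_nz are placeholders for the local size and the
! diagonal/off-diagonal preallocation counts.
      call MatCreate(PETSC_COMM_WORLD, A, ierr)
      call MatSetSizes(A, nloc, nloc,
     &                 PETSC_DETERMINE, PETSC_DETERMINE, ierr)
      call MatSetType(A, 'mpiaijcrl', ierr)   ! or 'mpiaijperm'
      call MatMPIAIJSetPreallocation(A, d_nz, PETSC_NULL_INTEGER,
     &                               o_nz, PETSC_NULL_INTEGER, ierr)
      ! ... MatSetValues loop here ...
      call MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY, ierr)
      call MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY, ierr)
```

Is this the intended way to get these formats from Fortran, or should the
type instead be chosen at runtime with the -mat_type option?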
Therefore, two questions:
1. Have any users seen a performance benefit of either of these formats (in
parallel)?
2. How does one actually create such matrices efficiently (from Fortran)?
On a side note, would it be possible to indicate on the manual pages which
functions are not actually wrapped for Fortran? It would make it easier to
know which ones you can and can't use, instead of having the linker tell
you it can't find a symbol.
Thanks,
Gaetan