[petsc-users] mpiaijperm and mpiaijcrl

Gaetan Kenway kenway at utias.utoronto.ca
Thu Aug 30 18:56:38 CDT 2012


Good to know.

I did actually manage to get them to run afterwards. I had to convert the
mpibaij matrix to mpiaij and then to one of those formats. They had almost
identical performance on a modern Intel processor.
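
For reference, a rough sketch of that conversion path (in C, since the
PERM/CRL creation routines are not wrapped in Fortran; the helper name
and the lack of error checking are my own, not from PETSc):

  #include <petscmat.h>

  /* Sketch: convert an assembled MPIBAIJ matrix A to AIJ, then to
     AIJPERM, via two MatConvert calls.  Use MATMPIAIJCRL instead of
     MATMPIAIJPERM to get the CRL format. */
  PetscErrorCode ConvertToPerm(Mat A, Mat *Aperm)
  {
    Mat Aaij;

    MatConvert(A, MATMPIAIJ, MAT_INITIAL_MATRIX, &Aaij);        /* BAIJ -> AIJ  */
    MatConvert(Aaij, MATMPIAIJPERM, MAT_INITIAL_MATRIX, Aperm); /* AIJ -> PERM  */
    MatDestroy(&Aaij);
    return 0;
  }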

Thanks for the reply

Gaetan

On Thu, Aug 30, 2012 at 7:21 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:

>
>   You have no reason to use those formats.   They are specialized for the
> Cray X1 vector machine which no longer exists. They will not be faster on
> another machine. Sorry for the confusion.
>
>     Barry
>
> On Aug 30, 2012, at 3:51 PM, Gaetan Kenway <kenway at utias.utoronto.ca>
> wrote:
>
> > Hi
> >
> > I was wondering if anyone had any experience with using these new matrix
> > formats in PETSc. I have a block aij matrix (with block size 5) and tried
> > converting to either of these types; it just thrashed memory for 10
> > minutes before I killed it. Matrix assembly takes 13 seconds, for reference.
> > I then tried to construct the matrix directly with "MatCreateMPIAIJPERM"
> > or "MatCreateMPIAIJCRL". Neither of these functions is wrapped in
> > Fortran. Also, the documentation for "MatCreateMPIAIJCRL" is extremely
> > confusing, as it says that comm is set to PETSC_COMM_SELF, which defeats
> > the purpose of an MPI matrix.
> >
> > Therefore, two questions:
> > 1. Have any users seen a performance benefit from either of these formats
> > (in parallel)?
> > 2. How does one actually efficiently create such matrices (from Fortran)?
> >
> > On a side note, is it possible somehow to indicate on the manual pages
> > all the functions that are not actually wrapped in Fortran? It would make
> > it easier to know which ones you can and can't use, instead of having the
> > compiler tell you it can't find a symbol.
> >
> > Thanks,
> > Gaetan
>
>
>

