[petsc-dev] [petsc-users] MatTransposeMatMult ends up with an MPI error

Jed Brown jedbrown at mcs.anl.gov
Wed Oct 17 17:01:48 CDT 2012


This is why I'm writing a non-one-sided implementation of PetscSF (and that
will become the default).

It will likely be faster even on MPICH due to the way one-sided is actually
implemented.

The only workaround for MatTranspose is to fall back to the old code that didn't preallocate correctly and was much slower.
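For context, the operation under discussion can be sketched as below. This is a minimal illustration of calling MatTransposeMatMult (C = A^T * B), not the failing reproducer from the original report; it assumes a working PETSc build, and the tiny matrix sizes and values are made up for the example:

```c
/* Sketch only: assumes PETSc is installed and configured with MPI.
   Compile with a PETSc makefile; run with mpiexec. */
#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat            A, B, C;
  PetscInt       i, n = 4;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);CHKERRQ(ierr);

  /* Build a small AIJ matrix A (diagonal, for illustration). */
  ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
  ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);CHKERRQ(ierr);
  ierr = MatSetType(A, MATAIJ);CHKERRQ(ierr);
  ierr = MatSetUp(A);CHKERRQ(ierr);
  for (i = 0; i < n; i++) {
    ierr = MatSetValue(A, i, i, (PetscScalar)(i + 1), INSERT_VALUES);CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

  /* B is a copy of A here, just to have a second operand. */
  ierr = MatDuplicate(A, MAT_COPY_VALUES, &B);CHKERRQ(ierr);

  /* The call under discussion: C = A^T * B, with symbolic
     preallocation done internally on the first (MAT_INITIAL_MATRIX) call. */
  ierr = MatTransposeMatMult(A, B, MAT_INITIAL_MATRIX, PETSC_DEFAULT, &C);CHKERRQ(ierr);

  ierr = MatDestroy(&C);CHKERRQ(ierr);
  ierr = MatDestroy(&B);CHKERRQ(ierr);
  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return 0;
}
```

The parallel symbolic phase of this product is where the PetscSF one-sided communication mentioned above comes into play, which is why the reported MPI error surfaces under Open MPI's RMA implementation.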

On Wed, Oct 17, 2012 at 4:57 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:

> I agree, but if you start using SF in "normal" parts of PETSc, OpenMPI is
> constantly going to cause these emails, and that annoys both users and us;
> better to just turn off one-sided for OpenMPI, and you can have configure
> report that in a nice box if you like.

