[petsc-users] [petsc-dev] How to compute RARt with A and R as distributed (MPI) matrices ?

Jed Brown jed at jedbrown.org
Wed Jun 21 22:26:57 CDT 2017


Hong <hzhang at mcs.anl.gov> writes:

> Jed :
>
>>
>> Is it better this way or as a fallback when !A->ops->rart?  MatPtAP
>> handles other combinations like MAIJ.
>>
>
> Do you mean
> if (!A->ops->rart) {
>     Mat Rt;
>     ierr = MatTranspose(R,MAT_INITIAL_MATRIX,&Rt);CHKERRQ(ierr);
>     ierr = MatMatMatMult(R,A,Rt,scall,fill,C);CHKERRQ(ierr);
>     ierr = MatDestroy(&Rt);CHKERRQ(ierr);
> }
> This does NOT work for most matrix formats because we do not have fallbacks
> for MatTranspose() and MatMatMult().

That's fine; they'll trigger an error, and the stack will show that it can be
made to work either by implementing the appropriate MatRARt or by implementing
MatTranspose and MatMatMatMult.
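
For reference, a minimal sketch of the user-level workaround quoted above,
written as a standalone routine (the helper name is hypothetical and not part
of PETSc); it assumes A and R are assembled parallel (e.g. MPIAIJ) matrices
with compatible layouts:

  #include <petscmat.h>

  /* Hypothetical helper: form R^T explicitly and compute C = R*A*R^T
     with MatMatMatMult(), as in the fallback Hong quotes above. */
  PetscErrorCode ComputeRARt_Fallback(Mat A, Mat R, Mat *C)
  {
    PetscErrorCode ierr;
    Mat            Rt;

    PetscFunctionBegin;
    ierr = MatTranspose(R,MAT_INITIAL_MATRIX,&Rt);CHKERRQ(ierr);
    /* C = R * A * R^T; PETSC_DEFAULT lets PETSc estimate the fill ratio. */
    ierr = MatMatMatMult(R,A,Rt,MAT_INITIAL_MATRIX,PETSC_DEFAULT,C);CHKERRQ(ierr);
    ierr = MatDestroy(&Rt);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }

Calling MatPtAP(A,Rt,MAT_INITIAL_MATRIX,PETSC_DEFAULT,C) with the same explicit
Rt would give the same product, since Rt^T*A*Rt = R*A*R^T.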