[petsc-dev] [petsc-users] How to compute RARt with A and R as distributed (MPI) matrices ?
Jed Brown
jed at jedbrown.org
Thu Jun 22 11:17:33 CDT 2017
Hong <hzhang at mcs.anl.gov> writes:
> Jed:
>>
>> >> Is it better this way or as a fallback when !A->ops->rart? MatPtAP
>> >> handles other combinations like MAIJ.
>> >>
>> >
>> > Do you mean
>> > if (!A->ops->rart) {
>> >   Mat Rt;
>> >   ierr = MatTranspose(R,MAT_INITIAL_MATRIX,&Rt);CHKERRQ(ierr);
>> >   ierr = MatMatMatMult(R,A,Rt,scall,fill,C);CHKERRQ(ierr);
>> >   ierr = MatDestroy(&Rt);CHKERRQ(ierr);
>> > }
>> > This does NOT work for most matrix formats because we do not have
>> > fallbacks for MatTranspose() and MatMatMult().
>>
>> That's fine; they'll trigger an error and we'll be able to see from the
>> stack that it can be made to work by either implementing the appropriate
>> MatRARt or MatTranspose and MatMatMatMult.
>>
>
> You prefer adding this default, even though it gives an error in either
> MatTranspose() or MatMatMatMult(), depending on the input matrix format?
Yeah, in the sense that it gives more opportunities to succeed.
> If so, we need to add this type of 'default' to all mat operations --
> currently, all routines do
>   if (!mat->ops-> ) SETERRQ1(PetscObjectComm((PetscObject)mat),PETSC_ERR_SUP,
>     "Mat type %s",((PetscObject)mat)->type_name);
Probably.
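
[Editor's note: for concreteness, a minimal sketch of the fallback under
discussion, assuming the PETSc Mat API of this era; the function name
MatRARt_Basic and the exact dispatch placement are illustrative assumptions,
not committed code.]

/* Hypothetical fallback used when a matrix type provides no specialized
   MatRARt; computes C = R*A*R^T via an explicit transpose.  If the input
   types lack MatTranspose() or MatMatMatMult(), those calls raise
   PETSC_ERR_SUP, and the error stack shows which operation is missing. */
PetscErrorCode MatRARt_Basic(Mat A,Mat R,MatReuse scall,PetscReal fill,Mat *C)
{
  PetscErrorCode ierr;
  Mat            Rt;

  PetscFunctionBegin;
  ierr = MatTranspose(R,MAT_INITIAL_MATRIX,&Rt);CHKERRQ(ierr);
  ierr = MatMatMatMult(R,A,Rt,scall,fill,C);CHKERRQ(ierr);
  ierr = MatDestroy(&Rt);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

MatRARt() itself would then dispatch along the lines of

  if (A->ops->rart) {
    ierr = (*A->ops->rart)(A,R,scall,fill,C);CHKERRQ(ierr);
  } else {
    ierr = MatRARt_Basic(A,R,scall,fill,C);CHKERRQ(ierr);
  }

rather than raising PETSC_ERR_SUP immediately, which is the "more
opportunities to succeed" behavior Jed argues for above.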