[petsc-users] MatMult inside a for loop
Ronal Celaya
ronalcelayavzla at gmail.com
Sun Feb 8 18:14:03 CST 2015
Thank you, Barry.
Is there a way to reuse the vector x? I don't want to gather the vector in
each iteration; I'd rather replicate the vector x on each process.
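Something like this sketch is what I have in mind for reusing the setup
(assuming VecScatterCreateToAll() is the right tool; the variable names
and loop are only illustrative, and the changed values of x would still
have to be communicated each iteration):

  VecScatter     ctx;
  Vec            allx;   /* sequential copy of x held by every process */
  PetscErrorCode ierr;
  PetscInt       i;

  /* build the scatter and the replicated vector once, before the loop */
  ierr = VecScatterCreateToAll(x,&ctx,&allx);CHKERRQ(ierr);
  for (i=0; i<maxits; i++) {
    /* ... update the parallel vector x ... */
    ierr = VecScatterBegin(ctx,x,allx,INSERT_VALUES,SCATTER_FORWARD);CHKERRQ(ierr);
    ierr = VecScatterEnd(ctx,x,allx,INSERT_VALUES,SCATTER_FORWARD);CHKERRQ(ierr);
    /* ... use allx locally on each process ... */
  }
  ierr = VecScatterDestroy(&ctx);CHKERRQ(ierr);
  ierr = VecDestroy(&allx);CHKERRQ(ierr);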
Thanks in advance.
On Sun, Feb 8, 2015 at 7:17 PM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>
> > On Feb 8, 2015, at 5:41 PM, Ronal Celaya <ronalcelayavzla at gmail.com>
> > wrote:
> >
> > Hello
> > If I have a MatMult operation inside a for loop (e.g. the CG algorithm),
> > and the matrix A is MPIAIJ, is the vector x gathered to each local
> > process in every loop iteration?
>
> Yes. Internally MatMult() calls MatMult_MPIAIJ(), which is in
> src/mat/impls/aij/mpi/mpiaij.c and has the following code:
>
> PetscErrorCode MatMult_MPIAIJ(Mat A,Vec xx,Vec yy)
> {
>   Mat_MPIAIJ     *a = (Mat_MPIAIJ*)A->data;
>   PetscErrorCode ierr;
>   PetscInt       nt;
>
>   PetscFunctionBegin;
>   ierr = VecGetLocalSize(xx,&nt);CHKERRQ(ierr);
>   if (nt != A->cmap->n) SETERRQ2(PETSC_COMM_SELF,PETSC_ERR_ARG_SIZ,"Incompatible partition of A (%D) and xx (%D)",A->cmap->n,nt);
>   /* start communicating the needed off-process entries of xx into a->lvec */
>   ierr = VecScatterBegin(a->Mvctx,xx,a->lvec,INSERT_VALUES,SCATTER_FORWARD);CHKERRQ(ierr);
>   /* multiply the local (diagonal-block) part while the messages are in flight */
>   ierr = (*a->A->ops->mult)(a->A,xx,yy);CHKERRQ(ierr);
>   /* finish receiving the off-process entries */
>   ierr = VecScatterEnd(a->Mvctx,xx,a->lvec,INSERT_VALUES,SCATTER_FORWARD);CHKERRQ(ierr);
>   /* multiply the off-diagonal block by the received entries and add into yy */
>   ierr = (*a->B->ops->multadd)(a->B,a->lvec,yy,yy);CHKERRQ(ierr);
>   PetscFunctionReturn(0);
> }
>
> The needed values of x are communicated between the VecScatterBegin() and
> VecScatterEnd() calls. Note that only exactly those values needed by each
> process are communicated in the scatter, so not all values are sent to
> all processes. Since the matrix is (normally) very sparse, only a small
> percentage of the values needs to be communicated.
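>
> A minimal sketch of the usage pattern (illustrative names; the scatter
> context a->Mvctx above is created once when the matrix is assembled, so
> repeated MatMult() calls in a loop reuse it with no new setup cost):
>
>   for (i=0; i<maxits; i++) {
>     ierr = MatMult(A,x,y);CHKERRQ(ierr); /* scatter + local multiplies */
>     /* ... CG updates of x, r, p ... */
>   }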
>
> Barry
>
> >
> > I'm sorry for my English.
> >
> > Regards,
> >
> > --
> > Ronal Celaya
>
>
--
Ronal Celaya