[petsc-users] -log_summary for MatMult

Matthew Knepley knepley at gmail.com
Fri Jun 15 09:14:28 CDT 2012


On Fri, Jun 15, 2012 at 9:18 PM, Alexander Grayver
<agrayver at gfz-potsdam.de> wrote:

> On 15.06.2012 14:46, Matthew Knepley wrote:
>
> On Fri, Jun 15, 2012 at 8:31 PM, Alexander Grayver <
> agrayver at gfz-potsdam.de> wrote:
>
>>  Matt,
>>
>> According to that code:
>>
>> PetscErrorCode MatMult_MPIDense(Mat mat,Vec xx,Vec yy)
>> {
>>   Mat_MPIDense   *mdn = (Mat_MPIDense*)mat->data;
>>
>>   VecScatterBegin(mdn->Mvctx,xx,mdn->lvec,INSERT_VALUES,SCATTER_FORWARD);
>>   VecScatterEnd(mdn->Mvctx,xx,mdn->lvec,INSERT_VALUES,SCATTER_FORWARD);
>>   MatMult_SeqDense(mdn->A,mdn->lvec,yy);
>>   return(0);
>> }
>>
>>
>> Each process has its own local copy of the vector?
>>
>
>  I am not sure what your point is. VecScatter is just an interface that
> has many implementations.
>
>
> I'm trying to estimate the amount of data that needs to be communicated
> over all processes during this operation.
> In the debugger I see that the VecScatter from the code above reduces to
> MPI_Allgatherv and results in (assuming the vector is distributed uniformly)
>
> bytes_send_received = num_of_proc * ((num_of_proc - 1) * vec_size_local) *
> 2 * sizeof(PetscScalar)
>
> Does that look reasonable?
>

This is not really a useful exercise, since

  a) PETSc does not currently have an optimized parallel dense
     implementation

  b) We are implementing an Elemental interface this summer. You can try
     it out in petsc-dev

  c) Elemental is much more efficient than our simple implementation, and
     uses a unique approach to communication (all reductions)

I would take the computation and communication estimates from Jack's
slides on Elemental.
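
For reference, a minimal back-of-envelope sketch of the formula quoted
above, assuming a uniformly distributed vector. This is plain C rather
than PETSc API, and the sizes in main() are made-up examples:

    /* Rough estimate of the data moved by an Allgatherv-based scatter:
       every rank sends its N/P local entries to the other P-1 ranks and
       receives the remaining (P-1)*N/P entries, summed over all P ranks. */
    #include <stdio.h>
    #include <stddef.h>

    static size_t allgatherv_bytes(size_t N, size_t P, size_t scalar_size)
    {
      size_t vec_size_local = N / P;               /* uniform distribution  */
      size_t per_rank = (P - 1) * vec_size_local;  /* sent + received       */
      return P * per_rank * 2 * scalar_size;       /* summed over all ranks */
    }

    int main(void)
    {
      /* Example: global size 100000, 16 ranks, double-precision scalars. */
      printf("%zu bytes\n", allgatherv_bytes(100000, 16, sizeof(double)));
      return 0;
    }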

   Matt


> Thanks.
>
> --
> Regards,
> Alexander
>
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener