Scattering context options

Matthew Knepley knepley at gmail.com
Fri Jul 27 08:11:21 CDT 2007


Maybe you can give more high-level motivation for this setup. Without
knowing anything else, it appears that you are communicating almost every
entry of your matrix, sometimes more than once. This will definitely be
very, very slow. Parallel computing (at least scalable computing) relies
on a model where most work and memory access is local. It's not clear
to me that you have this. If not, parallel computing will not help very much.
However, a reorganization might help considerably. For one, fewer, larger
communications are always better than many small ones.
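For example, you could gather all the lookup-table entries your local rows need in a single scatter before the assembly loop, instead of creating and destroying a scatter per row. The following is only a sketch: nTotal, allCols, and the per-row bookkeeping are hypothetical names (you would build them by calling getrow() for every local row first and concatenating the colCOO arrays), and error checking is omitted.

```fortran
! Sketch: one scatter for the whole local row range, then purely
! local work.  nTotal = total number of requested entries over all
! local rows; allCols(1:nTotal) = concatenated 1-based column lists.

! Pull every needed entry of vlist into outvec in ONE communication:
call ISCreateGeneral(PETSC_COMM_WORLD,nTotal,allCols-1,from,ierr)
call ISCreateStride(PETSC_COMM_SELF,nTotal,0,1,to,ierr)
call VecScatterCreate(vlist,from,outvec,to,scatter,ierr)
call VecScatterBegin(scatter,vlist,outvec,INSERT_VALUES,SCATTER_FORWARD,ierr)
call VecScatterEnd(scatter,vlist,outvec,INSERT_VALUES,SCATTER_FORWARD,ierr)
call VecScatterDestroy(scatter,ierr)
call ISDestroy(from,ierr)
call ISDestroy(to,ierr)

! Then loop over the local rows and call MatSetValues() using only
! the relevant slice of outvec for each row; no further communication
! happens inside the loop.
```

The point is that the communication cost is paid once, and its setup (the index sets and the scatter) is also created once rather than Iend-Istart times.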

    Matt

On 7/27/07, Graham Kells <gkells at thphys.nuim.ie> wrote:
> Hi ,
>
> I'm scattering from a global vector (used as a kind of  look up table)
> to a bunch of local vectors and then using the values in the local
> vectors as indices to populate a global matrix.
>
> The basic population routine goes something like this
>
>   do myrow=Istart,Iend-1
>
>
>     ! Returns column indices of full matrix for rowindex list(myrow+1)
>     call getrow(list(myrow+1),rowCOO,colCOO,valCOO)
>
>
>     call ISCreateGeneral(PETSC_COMM_WORLD,n,(colCOO-1),from,ierr)
>     call ISCreateGeneral(PETSC_COMM_SELF,n,idx_to-1,to,ierr)
>     call VecScatterCreate(vlist,from,outvec,to,scatter,ierr)
>     call VecScatterBegin(scatter,vlist,outvec,INSERT_VALUES,SCATTER_FORWARD,ierr)
>     call VecScatterEnd(scatter,vlist,outvec,INSERT_VALUES,SCATTER_FORWARD,ierr)
>     call VecScatterDestroy(scatter,ierr)
>     call ISDestroy(from,ierr)
>     call ISDestroy(to,ierr)
>
>     call VecGetValues(outvec,n,idx_to-1,idx_real,ierr)
>     idx_from=int(idx_real)
>
>
>     call MatSetValues(A,1,myrow,n,idx_from-1,valCOO,INSERT_VALUES,ierr)
>
>  end do
>
> While this works, it is prohibitively slow. Any ideas on why this is?
> Of course if you can suggest a better way of doing this that would be great.
>
> Supposing you can't, and I want to experiment with different MPI
> communication modes within the scatter context: how do I specify the
> MPI_Ssend option, for example?
>
> Thanks in advance,
>
> Graham


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which
their experiments lead.
-- Norbert Wiener




More information about the petsc-users mailing list