redistribution of vectors

Barry Smith bsmith at mcs.anl.gov
Tue Feb 10 17:08:43 CST 2009


On Feb 10, 2009, at 5:01 PM, Stephan Kramer wrote:

>>

>>>
>>>
>>
>> You can certainly use a VecScatter to redistribute a vector. You
>> already have the
>> indices from MatGetSubMatrix() so it should be easy.
>>
>> However, you can also use VecLoadIntoVector() and provide an initial
>> partitioning.
>>
>>  Matt
>>
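
A minimal sketch of the VecScatter approach described above, assuming
isrow is the index set of rows passed to MatGetSubMatrix() (the global
indices each process should own in the new layout) and xold is the
vector in the old layout; exact calling sequences differ slightly
between PETSc versions:

#include <petscvec.h>

/* Sketch only: pull the entries of xold listed in isrow into a new
   vector whose local size matches the local size of isrow.  Details
   such as whether the destroy routines take the object or its address
   differ between PETSc versions. */
PetscErrorCode RedistributeVec(Vec xold, IS isrow, Vec *xnew)
{
  MPI_Comm       comm;
  VecScatter     scatter;
  PetscInt       nlocal;
  PetscErrorCode ierr;

  ierr = PetscObjectGetComm((PetscObject)xold, &comm);CHKERRQ(ierr);
  ierr = ISGetLocalSize(isrow, &nlocal);CHKERRQ(ierr);
  ierr = VecCreateMPI(comm, nlocal, PETSC_DETERMINE, xnew);CHKERRQ(ierr);

  /* NULL for the destination index set means: fill *xnew in its natural order */
  ierr = VecScatterCreate(xold, isrow, *xnew, NULL, &scatter);CHKERRQ(ierr);
  ierr = VecScatterBegin(scatter, xold, *xnew, INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);
  ierr = VecScatterEnd(scatter, xold, *xnew, INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);
  ierr = VecScatterDestroy(&scatter);CHKERRQ(ierr);
  return 0;
}
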
> Ah brilliant, I must have overlooked these options (there are so
> many in PETSc!). In that case, is there also something like
> VecLoadIntoVector() for MatLoad(), i.e. where I specify the
> distribution beforehand? Otherwise I'll just go for
> MatGetSubMatrix() + VecScatter().

    We don't currently have a MatLoadIntoMatrix(). The current plan is
to remove VecLoadIntoVector() and have both VecLoad() and MatLoad() be
essentially "load into" operations, where one is free to define the
layout of the vector or matrix beforehand (or not), thus getting the
full spectrum of possibilities in one clean interface. Sadly we haven't
done this reorganization yet, due to having many other things to do.
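
The usage pattern that merged interface would aim at might look like
the following. This is a hypothetical sketch: the reorganized
VecLoad()/MatLoad() described above does not exist yet, so the calling
sequences below are assumptions about what it could look like, not the
current API.

#include <petscmat.h>

/* Hypothetical sketch of the planned "load into" style: define the
   layout first, then let the load fill the object.  These calling
   sequences are assumptions about the future interface. */
PetscErrorCode LoadWithMyLayout(PetscViewer viewer, PetscInt mlocal, PetscInt nlocal,
                                Mat *A, Vec *b)
{
  PetscErrorCode ierr;

  ierr = MatCreate(PETSC_COMM_WORLD, A);CHKERRQ(ierr);
  ierr = MatSetSizes(*A, mlocal, nlocal, PETSC_DETERMINE, PETSC_DETERMINE);CHKERRQ(ierr);
  ierr = MatLoad(*A, viewer);CHKERRQ(ierr);   /* hypothetical load-into form */

  ierr = VecCreate(PETSC_COMM_WORLD, b);CHKERRQ(ierr);
  ierr = VecSetSizes(*b, mlocal, PETSC_DETERMINE);CHKERRQ(ierr);
  ierr = VecLoad(*b, viewer);CHKERRQ(ierr);   /* hypothetical load-into form */
  return 0;
}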
>
>
> Just a small related question: for MatGetSubMatrix(), where I ask for
> the rows owned by each process but all columns, should I really
> assemble an index set ranging over all global indices? This seems a
> bit wasteful and non-scalable. Or should I put in the effort and,
> using the information I have about my halo regions, work out which
> columns might have nonzeros for these rows? Or am I missing
> something and is there an easier short-cut?

    It is true that it is not totally scalable, but then dumping some
huge honking sparse matrix to a file and reading it in is not scalable
either. We've never had a problem with this yet for reasonable matrix
sizes. Rather than fixing MatGetSubMatrix() for your case, a better
MatLoad() merged with MatLoadIntoMatrix() would be the thing I
recommend optimizing. But frankly, until you get up to 1 billion
unknowns, what is there now is probably good enough.

    Barry
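
As a small illustration of the index set in question, the all-columns
IS can at least be described as a stride rather than an explicit array
of every global index. The MatGetSubMatrix() arguments and the exact
meaning of the column IS vary between PETSc versions, so the sketch
below is indicative only:

#include <petscmat.h>

/* Sketch: extract the rows in isrow together with all columns of A.
   ISCreateStride() avoids storing N explicit indices, though, as noted
   above, the extraction itself is still not fully scalable.  The
   MatGetSubMatrix() calling sequence differs between PETSc versions. */
PetscErrorCode ExtractMyRowsAllCols(Mat A, IS isrow, Mat *Asub)
{
  PetscInt       M, N;
  IS             iscol;
  PetscErrorCode ierr;

  ierr = MatGetSize(A, &M, &N);CHKERRQ(ierr);
  /* all global columns 0..N-1, described as a stride */
  ierr = ISCreateStride(PETSC_COMM_SELF, N, 0, 1, &iscol);CHKERRQ(ierr);
  ierr = MatGetSubMatrix(A, isrow, iscol, MAT_INITIAL_MATRIX, Asub);CHKERRQ(ierr);
  ierr = ISDestroy(&iscol);CHKERRQ(ierr);
  return 0;
}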

>
>
> Thanks a lot
> Stephan
>
>>
>>> Cheers
>>> Stephan
>>>
>


