[petsc-dev] Vector operations on ghosted vectors

Matthew Knepley knepley at gmail.com
Sat Mar 2 07:06:15 CST 2013


On Fri, Mar 1, 2013 at 12:40 PM, John Mousel <elafint.john at gmail.com> wrote:

> So am I right in reading this as you suggesting to:
>
> 1. Compute VecAXPY on all the u(n), u(n-1), ... locally
> 2. Scatter the result of step (1)
>

Why is this here? You don't need that at all since the local space is
"ghosted".

    Matt


> 3. Perform the divergence on the resulting ghosted vector from (2).
>
> I don't see how that is either cleaner or more efficient than just summing
> up the ghosted portions of the vector, since they have not been altered
> since the solve at the previous time step.
> To be clear, what I am proposing is just redundant computation to avoid
> communication.
>
>
> On Fri, Mar 1, 2013 at 11:31 AM, Matthew Knepley <knepley at gmail.com> wrote:
>
>> On Fri, Mar 1, 2013 at 10:37 AM, John Mousel <elafint.john at gmail.com> wrote:
>>
>>> Let me be even more specific. I'm implementing a velocity correction
>>> scheme in rotational form (reference given below). In the first step,
>>> setting aside the sub-iteration loop for now, a projection pressure is
>>> solved for from
>>>
>>> L(p(n+1)) = -div(beta*u(n) + gamma*u(n-1) + exci*u(n-2) + H(n+1) - H(n)
>>> + f(n+1) - f(n))
>>>
>>> Here, L and div are the Laplacian and divergence operators, and H(n) and
>>> f(n) are the non-linear term and body force at time level n. In my mind, to form
>>> part of the RHS, I can use VecAXPY on all the terms in the div(), and then
>>> perform the divergence on that. To avoid communication, I can just apply the
>>> VecAXPY to the ghosted portion as well; since I've copied the entire ghosted
>>> vector during the VecCopy(u(n),u(n-1)) operation, the ghost values are still
>>> valid. Is that not a wise thing to do?
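>>> Roughly what I have in mind for the div() argument (just a fragment of a
>>> sketch; w is a work vector from VecDuplicate(), and all the other names
>>> are illustrative, not my actual code):
>>>
>>>     /* Sketch: build w = beta*u(n) + gamma*u(n-1) + exci*u(n-2)
>>>                        + H(n+1) - H(n) + f(n+1) - f(n)              */
>>>     ierr = VecCopy(u_n, w);CHKERRQ(ierr);          /* w  = u(n)         */
>>>     ierr = VecScale(w, beta);CHKERRQ(ierr);        /* w  = beta*u(n)    */
>>>     ierr = VecAXPY(w, gamma, u_nm1);CHKERRQ(ierr); /* w += gamma*u(n-1) */
>>>     ierr = VecAXPY(w, exci,  u_nm2);CHKERRQ(ierr); /* w += exci*u(n-2)  */
>>>     ierr = VecAXPY(w,  1.0, H_np1);CHKERRQ(ierr);  /* w += H(n+1)       */
>>>     ierr = VecAXPY(w, -1.0, H_n);CHKERRQ(ierr);    /* w -= H(n)         */
>>>     ierr = VecAXPY(w,  1.0, f_np1);CHKERRQ(ierr);  /* w += f(n+1)       */
>>>     ierr = VecAXPY(w, -1.0, f_n);CHKERRQ(ierr);    /* w -= f(n)         */
>>>     /* then my divergence stencil is applied to w to get the RHS,
>>>        which needs valid ghost values of w */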
>>>
>>
>> You have some communication of the ghosts before the solve. Why not work
>> completely in the local space for this assembly, and then completely in the
>> global space for the solve? We usually organize this way.
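>> The usual organization is roughly the following (sketch only; u, u_local,
>> rhs, p, and ksp are placeholder names; with a DM you would use
>> DMGlobalToLocalBegin/End() instead of the ghost update):
>>
>>     /* Refresh ghosts once before assembly, assemble from the local
>>        form, then solve in the global space. */
>>     ierr = VecGhostUpdateBegin(u, INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);
>>     ierr = VecGhostUpdateEnd(u, INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);
>>
>>     ierr = VecGhostGetLocalForm(u, &u_local);CHKERRQ(ierr);
>>     /* ... assemble the RHS from u_local (owned + ghost entries) ... */
>>     ierr = VecGhostRestoreLocalForm(u, &u_local);CHKERRQ(ierr);
>>
>>     ierr = KSPSolve(ksp, rhs, p);CHKERRQ(ierr); /* solve in the global space */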
>>
>>   Matt
>>
>>
>>> John
>>>
>>> J. L. Guermond, P. Minev, and J. Shen. An overview of projection methods
>>> for incompressible flows. Computer Methods in Applied Mechanics and
>>> Engineering, 195(44-47):6011–6045, 2006.
>>>
>>>
>>> On Fri, Mar 1, 2013 at 9:28 AM, Matthew Knepley <knepley at gmail.com> wrote:
>>>
>>>> On Fri, Mar 1, 2013 at 10:24 AM, John Mousel <elafint.john at gmail.com> wrote:
>>>>
>>>>> I use the data in the vectors for things besides pure vector
>>>>> operations. In the context of a sub-iteration scheme for an incompressible
>>>>> solver, I want to use the values stored in the vector at the previous
>>>>> iteration to construct the non-linear term. This requires copying not
>>>>> just the local portion of the vector in the u(k)->u(k-1) operation, but the
>>>>> ghosts as well. Doesn't copying the entire ghosted vector make sense in
>>>>> this context?
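>>>>> Concretely, something like this is what I do now (sketch; u_k and u_km1
>>>>> are placeholder names for my ghosted vectors):
>>>>>
>>>>>     /* u(k) -> u(k-1) copy that also carries the ghost entries,
>>>>>        by copying the local (owned + ghost) forms. */
>>>>>     ierr = VecGhostGetLocalForm(u_k,   &u_k_l);CHKERRQ(ierr);
>>>>>     ierr = VecGhostGetLocalForm(u_km1, &u_km1_l);CHKERRQ(ierr);
>>>>>     ierr = VecCopy(u_k_l, u_km1_l);CHKERRQ(ierr); /* ghosts copied too */
>>>>>     ierr = VecGhostRestoreLocalForm(u_k,   &u_k_l);CHKERRQ(ierr);
>>>>>     ierr = VecGhostRestoreLocalForm(u_km1, &u_km1_l);CHKERRQ(ierr);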
>>>>>
>>>>
>>>> It seems like a strange workflow. Since none of the global vector operations
>>>> touch the ghost entries, you will be getting stale values there anyway. It
>>>> would make more sense to me if you were using GlobalToLocal() during the
>>>> iteration.
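>>>>
>>>> For a ghosted Vec, I believe the equivalent is just a ghost update in each
>>>> sub-iteration (sketch; u_k is a placeholder name):
>>>>
>>>>     /* Refresh the ghost entries of u_k from their owning processes
>>>>        once per sub-iteration, then read them via the local form. */
>>>>     ierr = VecGhostUpdateBegin(u_k, INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);
>>>>     ierr = VecGhostUpdateEnd(u_k, INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);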
>>>>
>>>>    Matt
>>>>
>>>>
>>>>> John
>>>>>
>>>>>
>>>>> On Fri, Mar 1, 2013 at 8:09 AM, Jed Brown <jedbrown at mcs.anl.gov> wrote:
>>>>>
>>>>>> On Wed, Feb 27, 2013 at 4:56 PM, John Mousel <john.mousel at gmail.com> wrote:
>>>>>>
>>>>>>> Is there a possibility of adding a wrapper function around a few
>>>>>>> basic vector operations such as VecCopy, VecAXPY, VecAXPBYPCZ, ... to operate
>>>>>>> on ghosted vectors? I perform a lot of vector operations including the
>>>>>>> ghost region to avoid communication.
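>>>>>>> Something like this hypothetical wrapper is what I keep writing by hand
>>>>>>> (VecGhostAXPY is not a PETSc function, just an illustration of the idea):
>>>>>>>
>>>>>>>   /* Hypothetical wrapper, not part of PETSc: apply VecAXPY to the
>>>>>>>      local (owned + ghost) forms of two ghosted vectors. */
>>>>>>>   PetscErrorCode VecGhostAXPY(Vec y, PetscScalar alpha, Vec x)
>>>>>>>   {
>>>>>>>     Vec            xl, yl;
>>>>>>>     PetscErrorCode ierr;
>>>>>>>
>>>>>>>     PetscFunctionBegin;
>>>>>>>     ierr = VecGhostGetLocalForm(x, &xl);CHKERRQ(ierr);
>>>>>>>     ierr = VecGhostGetLocalForm(y, &yl);CHKERRQ(ierr);
>>>>>>>     ierr = VecAXPY(yl, alpha, xl);CHKERRQ(ierr);
>>>>>>>     ierr = VecGhostRestoreLocalForm(x, &xl);CHKERRQ(ierr);
>>>>>>>     ierr = VecGhostRestoreLocalForm(y, &yl);CHKERRQ(ierr);
>>>>>>>     PetscFunctionReturn(0);
>>>>>>>   }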
>>>>>>>
>>>>>>
>>>>>> Can you explain the context where operating with local forms in this
>>>>>> way makes sense? Usually you would either be working locally, in which
>>>>>> case you copy between local forms (or purely local work vectors), or you
>>>>>> are operating globally, in which case there is nothing to gain by applying
>>>>>> operations to the local form.
>>>>>>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener