[petsc-dev] VecNestSetSubVec for VecNest.

Jed Brown jedbrown at mcs.anl.gov
Mon Oct 17 21:25:12 CDT 2011


We discussed Vec local/global semantics on the libMesh list a year ago or
more. I didn't think there was a failing in the current system, but libMesh
imposes some stricter consistency on local vectors, which sometimes causes
unnecessary communication. I don't recall all the details, but we can dig up
the thread.
On Oct 17, 2011 9:17 PM, "Vijay S. Mahadevan" <vijay.m at gmail.com> wrote:

> > I would not try to simultaneously change the distribution of the Vec.
> > That's what VecScatter is for. VecConvert() would keep the same
> > distribution and give you back a semantically identical vector of a
> > different type.
>
> Well, I implied changing the parallel layout because of the code I've
> seen in, say, libMesh and other packages that use PETSc. Their idea of
> localize() is often to convert an MPI vector to a local serial vector,
> with or without ghost nodes. I see your point about using VecScatter,
> so VECSEQ can still be disallowed, but some form of ghosted parallel
> vector conversion would still be useful.
>
> On Mon, Oct 17, 2011 at 9:00 PM, Jed Brown <jedbrown at mcs.anl.gov> wrote:
> > On Mon, Oct 17, 2011 at 20:55, Vijay S. Mahadevan <vijay.m at gmail.com>
> > wrote:
> >>
> >> Actually, that is quite consistent in philosophy with the merge
> >> operation I'm trying to perform. SEQ->MPI might still be an invalid
> >> operation for Vec, though. Perhaps with PETSC_DECIDE for the local
> >> size it could still be relevant? You can definitely specialize this
> >> for MPI->SEQ and nest vectors with a new VecReuse enum with
> >> appropriately named values.
> >
> > I would not try to simultaneously change the distribution of the Vec.
> > That's what VecScatter is for. VecConvert() would keep the same
> > distribution and give you back a semantically identical vector of a
> > different type.
>
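For readers unfamiliar with the pattern Jed is pointing at ("that's what
VecScatter is for"), here is a minimal sketch of the MPI-to-serial gather
that libMesh's localize() effectively performs, using the standard PETSc
Vec/VecScatter API. The variable names, the global size of 100, and the
choice of VecScatterCreateToAll() rather than a hand-built scatter are
illustrative, not anything prescribed in the thread.

#include <petscvec.h>

int main(int argc, char **argv)
{
  PetscErrorCode ierr;
  Vec            gvec, lvec;   /* distributed vector and its serial copy */
  VecScatter     ctx;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;

  /* A distributed vector with 100 global entries; PETSc picks the local split. */
  ierr = VecCreateMPI(PETSC_COMM_WORLD, PETSC_DECIDE, 100, &gvec);CHKERRQ(ierr);
  ierr = VecSet(gvec, 1.0);CHKERRQ(ierr);

  /* Gather every entry onto every process: lvec is a VECSEQ of global length. */
  ierr = VecScatterCreateToAll(gvec, &ctx, &lvec);CHKERRQ(ierr);
  ierr = VecScatterBegin(ctx, gvec, lvec, INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);
  ierr = VecScatterEnd(ctx, gvec, lvec, INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);

  ierr = VecScatterDestroy(&ctx);CHKERRQ(ierr);
  ierr = VecDestroy(&lvec);CHKERRQ(ierr);
  ierr = VecDestroy(&gvec);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

VecScatterCreateToZero() is the analogous call when only rank 0 needs the
whole vector.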
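Vijay's "ghosted parallel vector conversion" is usually covered by PETSc's
ghosted vectors rather than by a type conversion. The sketch below, with
made-up local sizes and ghost indices and intended to run on two or more MPI
processes, shows the existing VecCreateGhost()/VecGhostUpdate pattern; it
illustrates what is already available rather than anything proposed in the
thread.

#include <petscvec.h>

int main(int argc, char **argv)
{
  PetscErrorCode ierr;
  Vec            gvec, lform;
  PetscMPIInt    rank, size;
  PetscInt       nlocal = 4, ghosts[1];

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
  ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRQ(ierr);
  ierr = MPI_Comm_size(PETSC_COMM_WORLD, &size);CHKERRQ(ierr);

  /* Each process owns nlocal entries and ghosts the first entry owned by its
     right neighbour; run with at least two MPI processes. */
  ghosts[0] = nlocal * ((rank + 1) % size);

  ierr = VecCreateGhost(PETSC_COMM_WORLD, nlocal, PETSC_DETERMINE, 1, ghosts, &gvec);CHKERRQ(ierr);
  ierr = VecSet(gvec, (PetscScalar)rank);CHKERRQ(ierr);

  /* Move owned values into the ghost slots on the neighbouring processes. */
  ierr = VecGhostUpdateBegin(gvec, INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);
  ierr = VecGhostUpdateEnd(gvec, INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);

  /* The local form is a sequential vector: owned entries followed by ghosts. */
  ierr = VecGhostGetLocalForm(gvec, &lform);CHKERRQ(ierr);
  /* ... read lform with VecGetArrayRead() as a purely local object ... */
  ierr = VecGhostRestoreLocalForm(gvec, &lform);CHKERRQ(ierr);

  ierr = VecDestroy(&gvec);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}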
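VecConvert() itself does not exist in PETSc at the time of this thread; Jed
is proposing it by analogy with MatConvert(). Purely as an illustration of
"same distribution, different type", here is a hypothetical helper,
VecConvertSketch(), assembled only from existing calls. The name is invented,
and the final VecCopy() assumes both types support the default array-based
copy (VECNEST, for instance, likely does not).

#include <petscvec.h>

/* Hypothetical helper: create a vector of a different type with the same
   parallel layout as X and copy the values over. This only sketches what the
   VecConvert() discussed above might do; it is not a PETSc function. */
PetscErrorCode VecConvertSketch(Vec X, VecType newtype, Vec *Y)
{
  PetscErrorCode ierr;
  PetscInt       n, N;

  PetscFunctionBeginUser;
  ierr = VecGetLocalSize(X, &n);CHKERRQ(ierr);   /* keep the same local size ... */
  ierr = VecGetSize(X, &N);CHKERRQ(ierr);        /* ... and the same global size */
  ierr = VecCreate(PetscObjectComm((PetscObject)X), Y);CHKERRQ(ierr);
  ierr = VecSetSizes(*Y, n, N);CHKERRQ(ierr);
  ierr = VecSetType(*Y, newtype);CHKERRQ(ierr);
  ierr = VecCopy(X, *Y);CHKERRQ(ierr);           /* assumes both types support this copy */
  PetscFunctionReturn(0);
}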