common nodal points on neighbouring subdomains

Matthew Knepley knepley at gmail.com
Wed May 17 06:13:37 CDT 2006


It is unclear how you have actually mapped unknowns into your linear
system. However, a global PETSc Mat stores only one copy of each
unknown, so the duplicated unknowns on neighbouring subdomains would
indeed be out of sync after the solve. You have to do extra work to
communicate the solved values back to those duplicates.
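That extra work is essentially a scatter from the global solution
vector into each subdomain's local storage. A minimal sketch, assuming
you create the solution vector as a ghosted vector (nlocal, nghost and
ghost_global_ids are placeholders for your own data; PetscInitialize,
the KSP setup and error checking are omitted):

  #include "petscvec.h"

  Vec      x, xlocal;
  PetscInt nlocal;            /* unknowns owned by this process         */
  PetscInt nghost;            /* duplicated nodes owned by a neighbour  */
  PetscInt *ghost_global_ids; /* their global indices                   */

  VecCreateGhost(PETSC_COMM_WORLD, nlocal, PETSC_DECIDE,
                 nghost, ghost_global_ids, &x);

  /* ... KSPSolve(ksp, b, x) ... */

  /* Push the owned values into the ghost slots on every neighbour */
  VecGhostUpdateBegin(x, INSERT_VALUES, SCATTER_FORWARD);
  VecGhostUpdateEnd(x, INSERT_VALUES, SCATTER_FORWARD);

  /* The local form holds owned + ghost entries in local numbering */
  VecGhostGetLocalForm(x, &xlocal);
  /* ... read the solution at all local nodes, including shared ones ... */
  VecGhostRestoreLocalForm(x, &xlocal);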

However, all this is handled for you if you use the DA object. It only
handles logically Cartesian grids, but does all the ghosting. We will
soon have an unstructured counterpart.
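For illustration, a minimal sketch of that ghosting with the DA
interface (written against the DA names of this PETSc release, which
have since changed; grid sizes are placeholders and error checking is
omitted):

  #include "petscda.h"

  DA  da;
  Vec xglobal, xlocal;

  /* 2d structured grid, 1 dof per node, stencil width 1 */
  DACreate2d(PETSC_COMM_WORLD, DA_NONPERIODIC, DA_STENCIL_STAR,
             64, 64, PETSC_DECIDE, PETSC_DECIDE, 1, 1,
             PETSC_NULL, PETSC_NULL, &da);

  DACreateGlobalVector(da, &xglobal); /* one copy of each unknown        */
  DACreateLocalVector(da, &xlocal);   /* owned + ghost nodes, local nums */

  /* ... solve into xglobal ... */

  /* Fill each subdomain's ghost nodes from the global solution */
  DAGlobalToLocalBegin(da, xglobal, INSERT_VALUES, xlocal);
  DAGlobalToLocalEnd(da, xglobal, INSERT_VALUES, xlocal);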

   Matt

On 5/17/06, Thomas Geenen <geenen at gmail.com> wrote:
>
> Dear PETSc users,
>
> I am solving a system of equations Ax=b resulting from a finite element
> package. The parallelization strategy for this program is to minimize
> communication; therefore each subdomain has its own copy of the nodal
> points it shares with neighbouring subdomains (local numbering). This
> strategy is already implemented and I don't want to change it.
>
> I fill the matrix and right-hand side with calls to MatSetValuesLocal and
> VecSetValuesLocal. Some entries are filled on different subdomains (of
> course with the same value).
>
> The solution for the first subdomain is correct. However, the solution
> vector on the second subdomain does not contain the solution at the
> common nodal points it shares with subdomain 1. This is probably a
> feature, but it is not consistent with my program setup. The question
> is: how can I tell PETSc to return a solution for all the positions I
> filled with VecSetValuesLocal?
>
> Thanks in advance
> Thomas Geenen
> PS: I got the impression that the assembly routines also exchange common
> nodal point info. It would be nice if this could be skipped as well.
> Of course some extra communication could solve my problem, but I would
> prefer a more elegant solution if one exists.
>
>


-- 
"Failure has a thousand explanations. Success doesn't need one" -- Sir Alec
Guinness