[petsc-users] AOApplicationToPetscIS
Hui Zhang
mike.hui.zhang at hotmail.com
Mon Apr 15 14:10:19 CDT 2013
On Apr 15, 2013, at 5:55 PM, Matthew Knepley wrote:
> On Mon, Apr 15, 2013 at 10:52 AM, Hui Zhang <mike.hui.zhang at hotmail.com> wrote:
> I'm implementing a domain decomposition preconditioner. The dofs are ordered by my application, and I use an AO (plus a LocalToGlobalMapping for assembly) to map to the PETSc ordering.
> The task at hand is building VecScatters from the subdomains to the global domain. My program proceeds as follows:
>
> I do not understand why you would need AOs for this. They are for global reordering, whereas you seem to only
> need a local mapping here.
Regarding

ISLocalToGlobalMappingCreate(MPI_Comm cm,PetscInt n,const PetscInt indices[],PetscCopyMode mode,ISLocalToGlobalMapping *mapping)

the manual page says "Not Collective, but communicator may have more than one process". What is the purpose of using a communicator other than PETSC_COMM_SELF? Will the input indices[] be gathered over the communicator 'cm'?
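To make the question concrete, here is a minimal self-contained sketch of the kind of call I mean (the index values are made up, and each process passes only its own indices[]):

#include <petscis.h>
#include <petscviewer.h>

int main(int argc,char **argv)
{
  PetscErrorCode         ierr;
  PetscInt               i,nlocal = 4,globalIdx[4];
  PetscMPIInt            rank;
  ISLocalToGlobalMapping ltog;

  ierr = PetscInitialize(&argc,&argv,NULL,NULL);CHKERRQ(ierr);
  ierr = MPI_Comm_rank(PETSC_COMM_WORLD,&rank);CHKERRQ(ierr);
  /* made-up table: local index i on this rank maps to global index rank*nlocal+i */
  for (i=0; i<nlocal; i++) globalIdx[i] = rank*nlocal + i;

  /* each rank supplies only its own table; does anything get gathered over 'cm',
     or does PETSC_COMM_WORLD here behave just like PETSC_COMM_SELF on each rank? */
  ierr = ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD,nlocal,globalIdx,
                                      PETSC_COPY_VALUES,&ltog);CHKERRQ(ierr);
  ierr = ISLocalToGlobalMappingView(ltog,PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);
  ierr = ISLocalToGlobalMappingDestroy(&ltog);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return 0;
}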
Now I understand why AOCreateBasic is not scalable, but I still need it at the beginning to construct the LocalToGlobalMapping. How did you implement the LocalToGlobalMapping for an element-based decomposition? Did you manage to avoid using any AO?
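For reference, this is roughly how I construct the LocalToGlobalMapping from the AO right now (just a sketch; the function and array names are made up):

#include <petscao.h>

/* naturalIdx[i] is the global *natural* index of this process's i-th local dof;
   the AO (built with AOCreateBasic) translates it into the PETSc ordering */
static PetscErrorCode BuildLocalToGlobal(AO ao,PetscInt nlocal,const PetscInt naturalIdx[],
                                         ISLocalToGlobalMapping *ltog)
{
  PetscErrorCode ierr;
  PetscInt       *petscIdx,i;

  PetscFunctionBegin;
  ierr = PetscMalloc(nlocal*sizeof(PetscInt),&petscIdx);CHKERRQ(ierr);
  for (i=0; i<nlocal; i++) petscIdx[i] = naturalIdx[i];
  ierr = AOApplicationToPetsc(ao,nlocal,petscIdx);CHKERRQ(ierr);     /* natural -> PETSc, in place */
  ierr = ISLocalToGlobalMappingCreate(PETSC_COMM_SELF,nlocal,petscIdx,
                                      PETSC_COPY_VALUES,ltog);CHKERRQ(ierr);
  ierr = PetscFree(petscIdx);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

It is the AOCreateBasic call that produces 'ao' which I would eventually like to avoid.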
>
> Matt
>
> Step 1. I can map the subdomain PETSc ordering to the subdomain natural ordering.
>
> Step 2. I can also map the subdomain natural ordering to the global domain natural ordering.
>
> Step 3. I have an AO that maps the global domain natural ordering to the PETSc ordering.
>
> Since each subdomain is defined on a sub-communicator of the global domain's communicator, my question is about
>
> AOApplicationToPetscIS(AO ao,IS is)
>
> can the ao and the is have different communicators? Will this approach perform badly for large problems? How would you do it?
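Concretely, the routine I have in mind looks roughly like this (just a sketch; all names are hypothetical, and the communicator mismatch I am asking about is marked in the comments):

#include <petscvec.h>
#include <petscao.h>

/* 'aoWorld' is the AO from Step 3, living on the global communicator; 'subNatural'
   holds this subdomain's dofs in the global natural ordering (Steps 1 and 2) */
static PetscErrorCode BuildSubToGlobalScatter(AO aoWorld,Vec globalVec,Vec subVec,
                                              PetscInt nsub,const PetscInt subNatural[],
                                              VecScatter *scat)
{
  PetscErrorCode ierr;
  IS             isGlobal,isLocal;

  PetscFunctionBegin;
  /* the IS lives on the subdomain side (here PETSC_COMM_SELF), but the AO lives on
     the global communicator -- this is the mismatch I am asking about */
  ierr = ISCreateGeneral(PETSC_COMM_SELF,nsub,subNatural,PETSC_COPY_VALUES,&isGlobal);CHKERRQ(ierr);
  ierr = AOApplicationToPetscIS(aoWorld,isGlobal);CHKERRQ(ierr);  /* natural -> PETSc, in place */

  ierr = ISCreateStride(PETSC_COMM_SELF,nsub,0,1,&isLocal);CHKERRQ(ierr);
  ierr = VecScatterCreate(globalVec,isGlobal,subVec,isLocal,scat);CHKERRQ(ierr);

  ierr = ISDestroy(&isGlobal);CHKERRQ(ierr);
  ierr = ISDestroy(&isLocal);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

With this scatter, VecScatterBegin/VecScatterEnd with SCATTER_FORWARD would pull the subdomain's entries out of the global vector, and SCATTER_REVERSE would push them back.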
>
>
>
> --
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener