[petsc-users] multiblock question

Shao-Ching Huang sch at ucla.edu
Sat Oct 29 23:55:39 CDT 2011


Matt, thanks again for the helpful comments. Really appreciate it. I
will now work on putting things together.

Shao-Ching

On Sat, Oct 29, 2011 at 7:03 PM, Matthew Knepley <knepley at gmail.com> wrote:
> On Sat, Oct 29, 2011 at 6:58 PM, Shao-Ching Huang <huangsc at gmail.com> wrote:
>>
>> Hi, I have additional questions:
>>
>> 1. Suppose I create DMDA0 on two processes {0,1} (communicator comm0)
>> and DMDA1 on another two processes {2,3} (communicator comm1). Can I
>> DMCompositeAddDM() them into a DMComposite (created using communicator
>> MPI_COMM_WORLD, containing all 4 processes, 0-3)?
>
> No. This is due to the complexity of MPI for collectives, etc., between
> communicators. Instead, you should use the full communicator for both, but
> give no vertices to the ranks you want to leave out. This means you will
> have to partition the DMDA yourself, but this is straightforward. There are
> no performance hits when communicating ghost values, and the reductions
> inside the solve would need all the procs anyway.
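
A minimal sketch of this approach (in C, against the current PETSc API): both
block DMDAs are created on the full communicator, and the lx/ly arguments of
DMDACreate2d() spell out how many grid columns each rank owns. The grid sizes
and ownership shares are hypothetical (run on 4 ranks); per the advice above,
the ranks meant to sit out a block would be given no vertices at all, but the
sketch keeps a token column on every rank to stay conservative.

  #include <petscdmda.h>

  int main(int argc, char **argv)
  {
    DM       da0, da1;
    /* Hand-specified ownership of the 32 grid columns over 4 ranks:
       block 0 is weighted onto ranks 0-1, block 1 onto ranks 2-3. */
    PetscInt lx0[4] = {14, 14, 2, 2};
    PetscInt lx1[4] = {2, 2, 14, 14};
    PetscInt ly[1]  = {32}; /* no decomposition in y */

    PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
    PetscCall(DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                           DMDA_STENCIL_STAR, 32, 32, 4, 1, 1, 1, lx0, ly, &da0));
    PetscCall(DMSetUp(da0));
    PetscCall(DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                           DMDA_STENCIL_STAR, 32, 32, 4, 1, 1, 1, lx1, ly, &da1));
    PetscCall(DMSetUp(da1));
    /* both DMDAs can now be added to one DMComposite on PETSC_COMM_WORLD */
    PetscCall(DMDestroy(&da0));
    PetscCall(DMDestroy(&da1));
    PetscCall(PetscFinalize());
    return 0;
  }
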
>
>>
>> 2. Suppose DMDA0 and DMDA1 are 2D Cartesian domains, and that the
>> right-hand side of DMDA0 is "connected" to DMDA1 (just like the
>> subdomains within a regular DMDA). Which API should I use to tell
>> DMComposite that the "right side" (say i=Nx, all j) of DMDA0 is
>> connected to the "left side" (say i=0, all j) of DMDA1? I suppose I
>> need to use an IS index set somewhere.
>
> This is more complicated. All our examples (like SNES ex28) involve pieces
> that are either not grids or are scalars, and they are not coupled in this
> way. You would need to construct the LocalToGlobal mapping for this
> collection of grids (which is a set of two ISes). Here is the current code:
>   http://www.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-current/src/dm/impls/composite/pack.c.html#DMCompositeGetISLocalToGlobalMappings
> Notice that the mapping for each DMDA is just concatenated. Your code would
> look similar, except that you would knit together one edge.
> Since we have never had anyone ask for this, the interface is still
> primitive. If you have a suggestion for a nice way to construct this IS,
> please let us know.
>   Thanks,
>       Matt
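
For reference, a small sketch (hypothetical helper name, current PETSc API) of
pulling those per-DM mappings out of a DMComposite with the routine linked
above. It only views the maps; the edge coupling itself would amount to
building modified maps or ISes in which the ghost column just outside DMDA0's
right edge points at the global indices of DMDA1's left-edge column, and vice
versa.

  #include <petscdmcomposite.h>
  #include <petscviewer.h>

  /* Inspect the composite's per-DM local-to-global maps, the structure one
     would modify to knit two DMDA edges together. */
  static PetscErrorCode InspectCompositeMaps(DM pack)
  {
    ISLocalToGlobalMapping *ltogs;
    PetscInt                ndm;

    PetscFunctionBeginUser;
    PetscCall(DMCompositeGetNumberDM(pack, &ndm));
    PetscCall(DMCompositeGetISLocalToGlobalMappings(pack, &ltogs));
    for (PetscInt i = 0; i < ndm; i++) {
      PetscCall(ISLocalToGlobalMappingView(ltogs[i], PETSC_VIEWER_STDOUT_WORLD));
      PetscCall(ISLocalToGlobalMappingDestroy(&ltogs[i]));
    }
    PetscCall(PetscFree(ltogs));
    PetscFunctionReturn(PETSC_SUCCESS);
  }
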
>
>>
>> Thanks,
>>
>> Shao-Ching
>>
>> On Fri, Oct 28, 2011 at 5:17 PM, Matthew Knepley <knepley at gmail.com>
>> wrote:
>> > On Fri, Oct 28, 2011 at 11:48 PM, Shao-Ching Huang <huangsc at gmail.com>
>> > wrote:
>> >>
>> >> Hi
>> >>
>> >> We are planning a new (finite volume) multiblock code, in which each
>> >> block has a logically structured mesh. We plan to create one DMDA per
>> >> block (each block could span one or more processes; we already have
>> >> this code). What would be the recommended PETSc way to couple these
>> >> blocks together for implicit solves? We also need a ghosted region
>> >> between two connected blocks (just like the ghost regions among the
>> >> subdomains within a DMDA) for interpolation.
>> >
>> > I think the idea here is to use a DMComposite to couple together these
>> > DMDAs. You would have to specify the coupling explicitly since we have
>> > no way of knowing how they are connected, but after that, the
>> > GlobalToLocal() should work just the same.
>> >   Thanks,
>> >      Matt
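
A bare-bones sketch of that pattern (hypothetical function name, current PETSc
API): the two block DMDAs, created elsewhere on the shared communicator, go
into one DMComposite, a packed global vector is built from it, and
DMCompositeScatter() fills each block's own ghosted local vector. The coupling
across the block interface still has to be specified by the application, as
noted above.

  #include <petscdmcomposite.h>

  static PetscErrorCode ScatterBlocks(DM da0, DM da1)
  {
    DM  pack;
    Vec X, xl0, xl1;

    PetscFunctionBeginUser;
    PetscCall(DMCompositeCreate(PETSC_COMM_WORLD, &pack));
    PetscCall(DMCompositeAddDM(pack, da0));
    PetscCall(DMCompositeAddDM(pack, da1));
    PetscCall(DMCreateGlobalVector(pack, &X));
    /* ... fill X with the packed solution here ... */
    PetscCall(DMCompositeGetLocalVectors(pack, &xl0, &xl1));
    PetscCall(DMCompositeScatter(pack, X, xl0, xl1));
    /* xl0/xl1 now hold each block's ghosted data; interface values between
       the blocks are not filled here and must be handled separately. */
    PetscCall(DMCompositeRestoreLocalVectors(pack, &xl0, &xl1));
    PetscCall(VecDestroy(&X));
    PetscCall(DMDestroy(&pack));
    PetscFunctionReturn(PETSC_SUCCESS);
  }
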
>> >
>> >>
>> >> Thanks.
>> >>
>> >> Shao-Ching
>> >
>> >
>> >
>> >
>
>
>
> --
> What most experimenters take for granted before they begin their experiments
> is infinitely more interesting than any results to which their experiments
> lead.
> -- Norbert Wiener
>

