[petsc-users] multiblock question
Shao-Ching Huang
huangsc at gmail.com
Sat Oct 29 13:58:45 CDT 2011
Hi, I have additional questions:
1. Suppose I create DMDA0 on two processes {0,1} (communicator comm0)
and DMDA1 on another two processes {2,3} (communicator comm1). Can I
add them with DMCompositeAddDM() to a DMComposite created on
communicator MPI_COMM_WORLD (containing all 4 processes, 0-3)?
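To make question 1 concrete, here is a minimal sketch of what I have in
mind (written against the current DMDACreate2d/DMSetUp calls, which may
differ slightly between versions; error checking omitted). Whether the
DMCompositeAddDM() call at the end is allowed for a DM that lives on a
sub-communicator is exactly what I am asking:

#include <petscdmda.h>
#include <petscdmcomposite.h>

int main(int argc, char **argv)
{
  DM          da = NULL, pack;
  MPI_Comm    subcomm;
  PetscMPIInt rank;

  PetscInitialize(&argc, &argv, NULL, NULL);
  MPI_Comm_rank(PETSC_COMM_WORLD, &rank);

  /* comm0 = ranks {0,1}, comm1 = ranks {2,3} */
  MPI_Comm_split(PETSC_COMM_WORLD, rank / 2, rank, &subcomm);

  /* each group of two ranks creates its own 2D DMDA on its
     sub-communicator (DMDA0 on comm0, DMDA1 on comm1) */
  DMDACreate2d(subcomm, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DMDA_STENCIL_BOX,
               64, 64, PETSC_DECIDE, PETSC_DECIDE, 1, 1, NULL, NULL, &da);
  DMSetUp(da);

  /* the composite lives on all 4 processes; is this AddDM allowed? */
  DMCompositeCreate(PETSC_COMM_WORLD, &pack);
  DMCompositeAddDM(pack, da);

  DMDestroy(&da);
  DMDestroy(&pack);
  MPI_Comm_free(&subcomm);
  PetscFinalize();
  return 0;
}
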
2. Suppose DMDA0 and DMDA1 are 2D Cartesian domains, and that the
right-hand side of DMDA0 is "connected" to DMDA1 (just like the
subdomains within a regular DMDA). Which API should I use to tell
DMComposite that the "right side" (say i=Nx, all j) of DMDA0 is
connected to the "left side" (say i=0, all j) of DMDA1? I suppose I
need to use an IS (index set) somewhere.
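For question 2, this is roughly what I imagine the coupling could look
like if I have to set it up myself (error checking omitted). The index
arrays rightEdgeOfBlock0 and leftEdgeOfBlock1 are hypothetical: they
would hold the global indices, in the composite global vector, of the
i=Nx column of DMDA0 and of the ghost slots that the i=0 column of
DMDA1 should read from. If DMComposite already provides something for
this, that is what I am looking for:

#include <petscis.h>
#include <petscvec.h>

PetscErrorCode CoupleBlockEdges(Vec Xglobal, Vec ghostBuf, PetscInt n,
                                const PetscInt rightEdgeOfBlock0[],
                                const PetscInt leftEdgeOfBlock1[])
{
  IS         isFrom, isTo;
  VecScatter scatter;

  PetscFunctionBeginUser;
  /* source: right-edge entries of DMDA0 in the composite global vector */
  ISCreateGeneral(PETSC_COMM_WORLD, n, rightEdgeOfBlock0, PETSC_COPY_VALUES, &isFrom);
  /* destination: the ghost slots that DMDA1's left edge reads from */
  ISCreateGeneral(PETSC_COMM_WORLD, n, leftEdgeOfBlock1, PETSC_COPY_VALUES, &isTo);

  VecScatterCreate(Xglobal, isFrom, ghostBuf, isTo, &scatter);
  VecScatterBegin(scatter, Xglobal, ghostBuf, INSERT_VALUES, SCATTER_FORWARD);
  VecScatterEnd(scatter, Xglobal, ghostBuf, INSERT_VALUES, SCATTER_FORWARD);

  VecScatterDestroy(&scatter);
  ISDestroy(&isFrom);
  ISDestroy(&isTo);
  PetscFunctionReturn(0);
}
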
Thanks,
Shao-Ching
On Fri, Oct 28, 2011 at 5:17 PM, Matthew Knepley <knepley at gmail.com> wrote:
> On Fri, Oct 28, 2011 at 11:48 PM, Shao-Ching Huang <huangsc at gmail.com>
> wrote:
>>
>> Hi
>>
>> We are planning a new (finite volume) multiblock code, in which each
>> block has a logically structured mesh. We plan to create one DMDA per
>> block (which could span one or more processes; we already have code
>> for this). What would be the recommended PETSc way to couple these
>> blocks together for implicit solves? We also need ghosted regions
>> between two connected blocks (just like the ghost regions among the
>> subdomains within a DMDA) for interpolation.
>
> I think the idea here is to use a DMComposite to couple together these
> DMDAs. You would have to specify the coupling explicitly since we have
> no way of knowing how they are connected, but after that, the
> GlobalToLocal() should work just the same.
> Thanks,
> Matt
>
>>
>> Thanks.
>>
>> Shao-Ching
>
>
>
> --
> What most experimenters take for granted before they begin their experiments
> is infinitely more interesting than any results to which their experiments
> lead.
> -- Norbert Wiener
>
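P.S. For the record, here is how I understand the workflow Matt
describes, assuming for the moment that da0 and da1 are two DMDAs
created on PETSC_COMM_WORLD (the sub-communicator case is what question
1 above is about; error checking omitted). The block-to-block ghost
exchange would still need the explicit coupling from question 2:

#include <petscdmcomposite.h>

PetscErrorCode CompositeSetupSketch(DM da0, DM da1)
{
  DM  pack;
  Vec X, Xloc0, Xloc1;

  PetscFunctionBeginUser;
  DMCompositeCreate(PETSC_COMM_WORLD, &pack);
  DMCompositeAddDM(pack, da0);
  DMCompositeAddDM(pack, da1);

  /* one global vector covering both blocks */
  DMCreateGlobalVector(pack, &X);

  /* ghosted local vectors for each block, filled from the packed global vector */
  DMCompositeGetLocalVectors(pack, &Xloc0, &Xloc1);
  DMCompositeScatter(pack, X, Xloc0, Xloc1);

  /* ... evaluate the residual block by block using Xloc0 and Xloc1;
     ghost values coming from the *other* block still have to be filled
     by the explicit coupling discussed in question 2 ... */

  DMCompositeRestoreLocalVectors(pack, &Xloc0, &Xloc1);
  VecDestroy(&X);
  DMDestroy(&pack);
  PetscFunctionReturn(0);
}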