On Fri, Dec 2, 2011 at 10:43 PM, Barry Smith <bsmith@mcs.anl.gov> wrote:
>
> On Dec 2, 2011, at 4:52 PM, Tim Gallagher wrote:
>
> > There are only unknowns at the cell centers. The coordinates of the nodes may be used during computation, though.
> >
> > I'd have to think about whether it's possible to specify the nodes as dofs of each cell center... but that may duplicate a lot of data and make it convoluted to understand what is going on in the code.
>
> Tim,
>
> So the issue is that there is no way to ensure that two different DAs with different dimensions get the same parallel layout, so that the coordinate information you need is always easily available with the cell data.
>
> Someday we hope to have support for staggered grids with DAs, and that would make your task much, much easier. Sorry it doesn't exist now.
>
> Here is what I would do now, because it is so easy to do and the parallelism becomes trivial. Create 2 DMDAs, both of the same size, but have the first use a dof equal to your number of equations and the second use a dof of 8 (for 2d problems) or 24 (for 3d problems). Both DMDAs will have the same parallel layout, and the ghosted versions will also have the same layout. Now store the x,y coordinates of all four vertices of each cell (in 2d) into the 8 locations of a vector associated with the second DMDA. You will then access the coordinates of the cell vertices with the same indexing that you use for the equations (after calling DMDAVecGetArray()). The code will be very clean.
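
For concreteness, here is a minimal 2d sketch of the two-DMDA layout described above. The grid size, the number of equations per cell, and the placeholder (uniform) coordinates are purely illustrative, and it is written against a recent PETSc calling sequence (PetscCall, DM_BOUNDARY_NONE), whose names differ a little from the API that was current when this thread was written:

#include <petscdmda.h>

int main(int argc, char **argv)
{
  DM            da_sol, da_coord;
  Vec           U, C;
  PetscInt      mx = 8, my = 8;   /* number of cells in each direction (illustrative) */
  PetscInt      neq = 5;          /* number of equations per cell (illustrative)      */
  PetscInt      i, j, xs, ys, xm, ym;
  PetscScalar ***coord;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));

  /* cell-centered DMDA that holds the unknowns */
  PetscCall(DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                         DMDA_STENCIL_BOX, mx, my, PETSC_DECIDE, PETSC_DECIDE,
                         neq, 1, NULL, NULL, &da_sol));
  PetscCall(DMSetUp(da_sol));

  /* same global size, hence same parallel layout, but dof = 8:
     the x,y coordinates of the 4 vertices of each cell */
  PetscCall(DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                         DMDA_STENCIL_BOX, mx, my, PETSC_DECIDE, PETSC_DECIDE,
                         8, 1, NULL, NULL, &da_coord));
  PetscCall(DMSetUp(da_coord));

  PetscCall(DMCreateGlobalVector(da_sol, &U));
  PetscCall(DMCreateGlobalVector(da_coord, &C));

  /* fill the vertex coordinates; the indexing matches the solution DMDA */
  PetscCall(DMDAVecGetArrayDOF(da_coord, C, &coord));
  PetscCall(DMDAGetCorners(da_coord, &xs, &ys, NULL, &xm, &ym, NULL));
  for (j = ys; j < ys + ym; j++) {
    for (i = xs; i < xs + xm; i++) {
      /* a uniform grid is used only as a placeholder for a real non-uniform grid */
      coord[j][i][0] = i;     coord[j][i][1] = j;     /* vertex (i  ,j  ) */
      coord[j][i][2] = i + 1; coord[j][i][3] = j;     /* vertex (i+1,j  ) */
      coord[j][i][4] = i;     coord[j][i][5] = j + 1; /* vertex (i  ,j+1) */
      coord[j][i][6] = i + 1; coord[j][i][7] = j + 1; /* vertex (i+1,j+1) */
    }
  }
  PetscCall(DMDAVecRestoreArrayDOF(da_coord, C, &coord));

  PetscCall(VecDestroy(&U));
  PetscCall(VecDestroy(&C));
  PetscCall(DMDestroy(&da_sol));
  PetscCall(DMDestroy(&da_coord));
  PetscCall(PetscFinalize());
  return 0;
}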

This is one option. For slightly less clean code that uses less memory, you can adopt the convention in SNES ex30. There, the staggered grid unknowns are all kept on the same DA, enough ghosting is used to make sure you have everything you need, and a convention tells you which quantity you are addressing at each location. It was done by a user who is capable of keeping a lot of complexity in his head, so it's not necessarily the easiest way to do it, but it's possible.

   Matt
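
To make that convention concrete, here is a rough sketch of one way such a single-DMDA layout can be addressed. This is not the actual data layout of SNES ex30; the field names, the dof count of 5, and the particular staggering are illustrative, and it again assumes a recent PETSc calling sequence:

#include <petscdmda.h>

/* By convention, the slot at (i,j) stores the cell-centered unknown of
   cell (i,j) together with quantities that logically live on that cell's
   lower-left vertex and its left/bottom faces. */
typedef struct {
  PetscScalar p;    /* cell-centered unknown of cell (i,j)              */
  PetscScalar u;    /* by convention: value on the left face of (i,j)   */
  PetscScalar v;    /* by convention: value on the bottom face of (i,j) */
  PetscScalar x, y; /* by convention: coordinates of vertex (i,j)       */
} Field;

/* Example access: the cell-centered average of u for cell (i,j).
   "da" must have been created with dof = 5 and a box stencil of width >= 1,
   and "Uloc" must be a ghosted local vector filled with
   DMGlobalToLocalBegin/End so that neighbouring slots are available. */
static PetscErrorCode CellAverageU(DM da, Vec Uloc, PetscInt i, PetscInt j, PetscScalar *avg)
{
  Field **f;

  PetscFunctionBeginUser;
  PetscCall(DMDAVecGetArray(da, Uloc, &f));
  /* the face between cells (i,j) and (i+1,j) belongs, by convention, to (i+1,j) */
  *avg = 0.5 * (f[j][i].u + f[j][i + 1].u);
  PetscCall(DMDAVecRestoreArray(da, Uloc, &f));
  PetscFunctionReturn(PETSC_SUCCESS);
}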

> The only drawback is the extra storage of the coordinates, but frankly that is likely to be a relatively small amount of additional memory compared to everything else needed in the simulation.
>
> Note: do not use the DMDACoordinates interfaces; they are not appropriate for what you need. Just keep the coordinates as I described above.
>
> Barry
>
> >
> > Tim
> >
> > ----- Original Message -----
> > From: "Barry Smith" <bsmith@mcs.anl.gov>
> > To: gtg085x@mail.gatech.edu, "PETSc users list" <petsc-users@mcs.anl.gov>
> > Sent: Friday, December 2, 2011 5:14:59 PM
> > Subject: Re: [petsc-users] DMDA Question
> >
> >
> > On Dec 2, 2011, at 1:19 PM, Tim Gallagher wrote:
> >
> >> Yes, I would like two DMDAs, one for the nodes and the other for the cell centers. The grid is structured, but non-uniform.
> >
> > Do you have degrees of freedom on the "nodes" and cell centers, or are the only degrees of freedom on the cell centers, with the nodes used for intermediate computations? That is, does your nonlinear system involve unknowns on both the nodes and centers?
> >
> > Barry
> >
> >
> >>
> >> I think I want that anyway; is that the best structure for a case like this?
> >>
> >> Thanks,
> >>
> >> Tim
> >>
> >> ----- Original Message -----
> >> From: "Barry Smith" <bsmith@mcs.anl.gov>
> >> To: gtg085x@mail.gatech.edu, "PETSc users list" <petsc-users@mcs.anl.gov>
> >> Sent: Friday, December 2, 2011 2:12:54 PM
> >> Subject: Re: [petsc-users] DMDA Question
> >>
> >>
> >> Tim,
> >>
> >> Unfortunately the DMDA was not designed with the idea of staggered grids (or mixed discretizations) in mind, so they are not as straightforward as we would like them to be.
> >>
> >> Do I understand correctly that you want two DMDAs:
> >>
> >> 1) one that has locations for cell vertices (or face centers)?
> >>
> >> 2) and one that has locations for cell centers?
> >>
> >> Once we understand what you want, we may be able to make a suggestion.
> >>
> >> Barry
> >>
> >>
> >> On Dec 2, 2011, at 1:08 PM, Tim Gallagher wrote:
> >>
> >>> Hi,
> >>>
> >>> I'm trying to create some grids for a finite volume simulation and am a little stuck on the best way to tackle it. The approach now is to use DMDACreate3d with a known number of I, J, K points and PETSC_DECIDE for the processors. Then I am trying to create another DMDA with the same distribution to store the cell centers and the actual solution vector. The problem with this is the following:
> >>>
> >>> Consider 9 points and two processors. Processor one gets points [1,5] with point 6 as a ghost, and processor two gets points [6,9] with point 5 as a ghost. But now, to create/store cell centers, processor two needs point 4 to construct a cell center for its ghost.
> >>>
> >>> I can certainly fetch that point and do the calculation, but I feel like there is a more elegant way to do this. Has anybody used a DMDA to create a dual DMDA?
> >>>
> >>> Any advice would be appreciated,
> >>>
> >>> Tim
> >>
> >
>

--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener