[petsc-users] Mapping between application ordering and Petsc ordering

Matthew Knepley knepley at gmail.com
Sat Aug 8 14:48:43 CDT 2015


On Sat, Aug 8, 2015 at 2:45 PM, Mani Chandra <mc0710 at gmail.com> wrote:

> Thanks. Any suggestions for a fix?
>

You have to deal with the right part of the domain in your application
code. I have no idea how you are handling this, and it's not in the code below.

   Matt
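
One way to do this, as a minimal sketch: let PETSc's decomposition decide
which global block the application handles on each rank. Here
fillApplicationPatch() is a hypothetical placeholder for however the
application sets up its local NX1 x NX2 patch.

   PetscInt iStart, jStart, kStart, iSize, jSize, kSize;

   DMDAGetCorners(dmda, &iStart, &jStart, &kStart, &iSize, &jSize, &kSize);
   /* This rank owns global indices [iStart, iStart+iSize) x [jStart, jStart+jSize).
      If the application fills exactly that block, the copy loop quoted below
      matches element for element. */
   fillApplicationPatch(arrayApplication, iStart, jStart, iSize, jSize);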


> Reorder the indices in arrayApplication?
>
> On Sat, Aug 8, 2015 at 2:19 PM, Matthew Knepley <knepley at gmail.com> wrote:
>
>> On Sat, Aug 8, 2015 at 1:52 PM, Mani Chandra <mc0710 at gmail.com> wrote:
>>
>>> Hi,
>>>
>>> I'm having trouble interfacing PETSc to an application; I think the problem
>>> is related to the ordering of the nodes. Here's what I'm trying to do:
>>>
>>> The application uses a structured grid with a global array having
>>> dimensions N1 x N2, which is then decomposed into a local array with
>>> dimensions NX1 x NX2.
>>>
>>> I create a PETSc DMDA using
>>>
>>>     DMDACreate2d(MPI_COMM_WORLD,
>>>                  DM_BOUNDARY_PERIODIC, DM_BOUNDARY_PERIODIC,
>>>                  DMDA_STENCIL_BOX,
>>>                  N1, N2,
>>>                  N1/NX1, N2/NX2,
>>>                  1, nghost, PETSC_NULL, PETSC_NULL,
>>>                  &dmda);
>>>
>>> and then use this to create a vec:
>>>
>>>   DMCreateGlobalVector(dmda, &vec);
>>>
>>> Now I copy the local contents of the application array into the PETSc
>>> array using the following:
>>>
>>> Let i, j be the application indices and iPetsc, jPetsc be PETSc's
>>> indices; then:
>>>
>>> DMDAGetCorners(dmda, &iStart, &jStart, &kStart,
>>>                &iSize, &jSize, &kSize);
>>>
>>>
>>> double **arrayPetsc;
>>> DMDAVecGetArray(dmda, vec, &arrayPetsc);
>>>
>>> for (int j=0, jPetsc=jStart; j<NX2 && jPetsc<jStart+jSize; j++, jPetsc++)
>>> {
>>>   for (int i=0, iPetsc=iStart; i<NX1 && iPetsc<iStart+iSize; i++, iPetsc++)
>>>   {
>>>     arrayPetsc[jPetsc][iPetsc] = arrayApplication[j][i];
>>>   }
>>> }
>>>
>>> DMDAVecRestoreArray(dmda, vec, &arrayPetsc);
>>>
>>> Now if I VecView(vec, viewer) and look at the data that PETSc has, it
>>> looks right when run with 1 proc, but if I use 4 procs it's all messed up
>>> (see attached plots).
>>>
>>> I should probably be using the AO object, but it's not clear how. Could
>>> you help me out?
>>>
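
A minimal sketch of using the DMDA's AO for this mapping: the AO converts
global indices in the natural (application, row-major) ordering into PETSc's
per-rank ordering, after which VecSetValues() can place the local values.
This assumes dof = 1; appIStart and appJStart are hypothetical placeholders
for this rank's offset in the application's own decomposition.

   AO          ao;
   PetscInt    *idx;
   PetscScalar *vals;

   DMDAGetAO(dmda, &ao);
   PetscMalloc1(NX1*NX2, &idx);
   PetscMalloc1(NX1*NX2, &vals);
   for (int j = 0; j < NX2; j++) {
     for (int i = 0; i < NX1; i++) {
       /* global index in the application's natural (row-major) ordering */
       idx[j*NX1 + i]  = (appJStart + j)*N1 + (appIStart + i);
       vals[j*NX1 + i] = arrayApplication[j][i];
     }
   }
   AOApplicationToPetsc(ao, NX1*NX2, idx);  /* convert to PETSc ordering */
   VecSetValues(vec, NX1*NX2, idx, vals, INSERT_VALUES);
   VecAssemblyBegin(vec);
   VecAssemblyEnd(vec);
   PetscFree(idx);
   PetscFree(vals);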
>>
>> It looks like you have the global order of processes reversed, meaning
>> you have
>>
>>   1   3
>>
>>   0   2
>>
>> and it should be
>>
>>   2  3
>>
>>   0  1
>>
>>   Thanks,
>>
>>       Matt
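
A quick way to check the layout described above, as a sketch: print the
corners each rank owns and compare them with the application's assignment.

   PetscMPIInt rank;
   PetscInt    iStart, jStart, kStart, iSize, jSize, kSize;

   MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
   DMDAGetCorners(dmda, &iStart, &jStart, &kStart, &iSize, &jSize, &kSize);
   PetscSynchronizedPrintf(PETSC_COMM_WORLD,
                           "rank %d owns i = [%D, %D), j = [%D, %D)\n",
                           rank, iStart, iStart + iSize, jStart, jStart + jSize);
   PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT);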
>>
>>
>>> Thanks,
>>> Mani
>>>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener