[petsc-users] DMDAGetElements returns unexpected values
Mikhail Artemyev
artemiev.mikhail at gmail.com
Thu Jul 16 09:05:31 CDT 2015
Barry,
Thank you for the explanation! I finally see where I went wrong - I mixed
up ghost nodes and cells: trying to have 0 ghost cells, I ended up setting
0 ghost nodes, which, as you pointed out, is wrong.
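
For the archives, the fix in my example below is just the stencil width
argument of DMDACreate1d (0 -> 1). A minimal sketch of the changed call:

    /* dof = 1, stencil width = 1: each process now has the ghost vertex
       it needs to describe its last element */
    DM da;
    DMDACreate1d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, -9, 1, 1, NULL, &da);
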
Best regards,
Mikhail
On 07/15/2015 07:19 PM, Barry Smith wrote:
> Mikhail
>
> DMDAGetElements - Gets an array containing the indices (in local coordinates)
> of all the local elements
>
> Since it returns the "local indices" of the vertices of the elements, it does not make sense for a stencil width of zero (with a parallel DMDA): the last vertex of the last element on each process is actually owned by the next process, so on this process it is a ghost vertex. With a stencil width of zero there are no ghost vertices, hence no "local index" to represent that vertex.
>
> I have changed the code to return an error instead of "garbage" values if the stencil width is zero.
>
> So you should just use a stencil width of 1 if you want to do finite elements (or finite differences actually also :-). Stencil width of 0 is only for strange situations and doesn't make much sense for PDEs.
>
> Thanks for reporting the problem.
>
> Barry
>
>
>> On Jul 15, 2015, at 10:46 AM, Mikhail Artemyev <artemiev.mikhail at gmail.com> wrote:
>>
>> Dear all,
>>
>> Here is a minimal working example that I'm testing:
>>
>> #include "petscsys.h"
>> #include "petscdmda.h"
>>
>> int main(int argc, char **argv)
>> {
>> PetscInitialize(&argc, &argv, 0, 0);
>>
>> PetscMPIInt rank;   /* MPI_Comm_rank expects a plain int */
>> MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
>>
>> DM da;
>> /* 9 grid points (negative => changeable via options), dof = 1, stencil width = 0 */
>> DMDACreate1d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, -9, 1, 0, NULL, &da);
>>
>> PetscInt n_local_cells, n_cell_nodes;
>> const PetscInt *local_indices;
>> DMDAGetElements(da, &n_local_cells, &n_cell_nodes, &local_indices);
>>
>> PetscSynchronizedPrintf(PETSC_COMM_WORLD,
>> "rank %d n_local_cells %d n_cell_nodes %d\n",
>> rank, (int)n_local_cells, (int)n_cell_nodes);
>> PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT);
>>
>> DMDARestoreElements(da, &n_local_cells, &n_cell_nodes, &local_indices);
>> DMDestroy(&da);
>>
>> PetscFinalize();
>> return 0;
>> }
>>
>>
>> I believe it creates a 1D DM object, and outputs the number of local cells assigned to each process.
>>
>> Here is what I have as an output:
>>
>> $ mpirun -np 1 ./test
>> rank 0 n_local_cells 8 n_cell_nodes 2 // OK
>>
>> $ mpirun -np 2 ./test
>> rank 0 n_local_cells 4 n_cell_nodes 2 // OK
>> rank 1 n_local_cells 3 n_cell_nodes 2 // I expected 4 local cells here
>>
>> $ mpirun -np 4 ./test
>> rank 0 n_local_cells 2 n_cell_nodes 2 // OK
>> rank 1 n_local_cells 1 n_cell_nodes 2 // I expected 2 local cells here
>> rank 2 n_local_cells 1 n_cell_nodes 2 // I expected 2 local cells here
>> rank 3 n_local_cells 1 n_cell_nodes 2 // I expected 2 local cells here
>>
>>
>> What am I missing?
>>
>> Thank you.
>>
>> Best,
>> Mikhail
>>
>>
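
To make Barry's point about the ghost vertex concrete, here is a sketch of
the same example with stencil width 1 that also prints, for each local
element, its local vertex indices and the corresponding global vertices
(local index i maps to global vertex gx + i, where gx is the first ghosted
vertex reported by DMDAGetGhostCorners). This is only a sketch written
against the PETSc version used above; the expected picture, following
Barry's explanation, is that the last element on every rank except the last
one refers to a ghost vertex owned by the next rank.

    #include "petscsys.h"
    #include "petscdmda.h"

    int main(int argc, char **argv)
    {
      PetscInitialize(&argc, &argv, 0, 0);

      PetscMPIInt rank;
      MPI_Comm_rank(PETSC_COMM_WORLD, &rank);

      /* Same 1D mesh as before, but with stencil width 1. */
      DM da;
      DMDACreate1d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, -9, 1, 1, NULL, &da);

      /* gx is the global index of the first (possibly ghost) vertex stored
         locally, so local vertex i corresponds to global vertex gx + i. */
      PetscInt gx;
      DMDAGetGhostCorners(da, &gx, NULL, NULL, NULL, NULL, NULL);

      PetscInt n_local_cells, n_cell_nodes;
      const PetscInt *local_indices;
      DMDAGetElements(da, &n_local_cells, &n_cell_nodes, &local_indices);

      PetscInt e;
      for (e = 0; e < n_local_cells; ++e) {
        PetscSynchronizedPrintf(PETSC_COMM_WORLD,
            "rank %d element %d: local (%d, %d), global (%d, %d)\n",
            rank, (int)e,
            (int)local_indices[2*e], (int)local_indices[2*e+1],
            (int)(gx + local_indices[2*e]), (int)(gx + local_indices[2*e+1]));
      }
      PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT);

      DMDARestoreElements(da, &n_local_cells, &n_cell_nodes, &local_indices);
      DMDestroy(&da);

      PetscFinalize();
      return 0;
    }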