[petsc-users] Error: DM global to natural SF was not created when DMSetUseNatural has already been called
Matthew Knepley
knepley at gmail.com
Wed Nov 28 21:34:46 CST 2018
On Wed, Nov 28, 2018 at 8:58 PM Danyang Su via petsc-users
<petsc-users at mcs.anl.gov> wrote:
> Dear All,
>
> I got the following error when using DMPlexGlobalToNatural function
> using 1 processor.
>
We do not create that mapping on 1 proc because the orderings are the same.
Reordering happens when we redistribute.
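
For example, a caller can guard the natural scatter so the same code runs in
serial and in parallel. This is a minimal sketch, assuming the dmda_flow%da,
vec_global, and vec_natural objects from the code below; the nprocs variable
and the VecCopy fallback are illustrative, relying only on the serial global
and natural orderings being identical as described above:

    ! nprocs is a local PetscMPIInt (hypothetical variable name)
    call MPI_Comm_size(PETSC_COMM_WORLD,nprocs,ierr)
    if (nprocs > 1) then
       ! the natural SF exists: it was built during DMPlexDistribute()
       call DMPlexGlobalToNaturalBegin(dmda_flow%da,vec_global, &
                                       vec_natural,ierr)
       CHKERRQ(ierr)
       call DMPlexGlobalToNaturalEnd(dmda_flow%da,vec_global, &
                                     vec_natural,ierr)
       CHKERRQ(ierr)
    else
       ! serial run: the orderings coincide, so a plain copy suffices
       call VecCopy(vec_global,vec_natural,ierr)
       CHKERRQ(ierr)
    end if
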
Thanks,
Matt
> [0]PETSC ERROR: --------------------- Error Message
> --------------------------------------------------------------
> [0]PETSC ERROR: Object is in wrong state
> [0]PETSC ERROR: DM global to natural SF was not created.
> You must call DMSetUseNatural() before DMPlexDistribute().
>
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html
> for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.10.2, Oct, 09, 2018
>
> The same code does not return an error when using more than one
> processor; however, vec_natural is always zero after calling
> DMPlexGlobalToNaturalEnd.
>
>
> DMSetUseNatural() is already called before DMPlexDistribute(). The code
> section looks like the following:
>
> if (rank == 0) then
>
>   !c use PETSC_TRUE instead of PETSC_FALSE below to create
>   !c intermediate mesh entities (faces, edges)
>   call DMPlexCreateFromCellList(PETSC_COMM_WORLD,ndim,0,0, &
>                                 num_nodes_per_cell, &
>                                 PETSC_FALSE,dmplex_cells,ndim, &
>                                 dmplex_verts,dmda_flow%da,ierr)
>   CHKERRQ(ierr)
>
> end if
>
> !c Set the flag for creating a mapping to the natural order on distribution
> call DMSetUseNatural(dmda_flow%da,PETSC_TRUE,ierr)
> CHKERRQ(ierr)
>
> !c distribute mesh over processes
> call DMPlexDistribute(dmda_flow%da,stencil_width, &
>                       PETSC_NULL_SF,distributedMesh,ierr)
> CHKERRQ(ierr)
>
> !c destroy original global mesh after distribution
> if (distributedMesh /= PETSC_NULL_DM) then
> call DMDestroy(dmda_flow%da,ierr)
> CHKERRQ(ierr)
> !c set the global mesh as distributed mesh
> dmda_flow%da = distributedMesh
> end if
>
> ...
>
> call DMPlexCreateSection(dmda_flow%da,dmda_flow%dim, &
> numFields,pNumComp,pNumDof, &
> numBC,pBcField, &
> pBcCompIS,pBcPointIS, &
> PETSC_NULL_IS, &
> section,ierr)
> CHKERRQ(ierr)
>
> call PetscSectionSetFieldName(section,0,'flow',ierr)
> CHKERRQ(ierr)
>
>
> call DMSetSection(dmda_flow%da,section,ierr)
> CHKERRQ(ierr)
>
> call PetscSectionDestroy(section,ierr)
> CHKERRQ(ierr)
>
> call DMSetUp(dmda_flow%da,ierr)
> CHKERRQ(ierr)
>
> ...
>
>
> !c global - natural order
>
> call DMCreateLocalVector(dmda_flow%da,vec_loc,ierr)
> CHKERRQ(ierr)
>
> call DMCreateGlobalVector(dmda_flow%da,vec_global,ierr)
> CHKERRQ(ierr)
>
> call DMCreateGlobalVector(dmda_flow%da,vec_natural,ierr)
> CHKERRQ(ierr)
>
> !c zero entries
> call VecZeroEntries(vec_loc,ierr)
> CHKERRQ(ierr)
>
> !Get a pointer to vector data when you need access to the array
> call VecGetArrayF90(vec_loc,vecpointer,ierr)
> CHKERRQ(ierr)
>
> !c vector values use the PETSc global order; negative ghost indices
> !c have been reversed in node_idx_lg2pg
> do inode = 1, num_nodes
>    vecpointer(inode) = node_idx_lg2pg(inode)
> end do
>
> !Restore the vector when you no longer need access to the array
> call VecRestoreArrayF90(vec_loc,vecpointer,ierr)
> CHKERRQ(ierr)
>
> !Insert values into global vector
> call DMLocalToGlobalBegin(dmda_flow%da,vec_loc,INSERT_VALUES, &
> vec_global,ierr)
> CHKERRQ(ierr)
>
> call DMLocalToGlobalEnd(dmda_flow%da,vec_loc,INSERT_VALUES, &
> vec_global,ierr)
> CHKERRQ(ierr)
>
>
> !c global to natural ordering
> call DMPlexGlobalToNaturalBegin(dmda_flow%da,vec_global, &
> vec_natural,ierr)
> CHKERRQ(ierr)
>
> call DMPlexGlobalToNaturalEnd(dmda_flow%da,vec_global, &
> vec_natural,ierr)
> CHKERRQ(ierr)
>
>
> Is there anything missing in the code that would cause
> DMPlexGlobalToNatural... to not work properly?
>
> Thanks,
>
> Danyang
>
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/