[petsc-users] Add unstructured grid capability to existing structured grid code
Matthew Knepley
knepley at gmail.com
Wed Feb 14 16:26:53 CST 2018
On Wed, Feb 14, 2018 at 1:47 PM, Danyang Su <danyang.su at gmail.com> wrote:
> Dear All,
>
> I have a reactive transport code that was first developed using a
> structured grid and parallelized using PETSc. Both the sequential version
> (with or without PETSc) and the parallel version work fine. Recently I
> finished the unstructured grid capability for the sequential version. The
> next step is to modify the necessary parts so that the unstructured grid
> code also runs in parallel.
>
> The structured grid code follows these steps (a minimal sketch in C
> follows the outline):
>
> !domain decomposition
>
> DMDACreate3d()
>
> DMDAGetInfo()
>
> DMDAGetCorners()
>
> !timeloop begins
>
> !calculate matrix entry and rhs
> ...
> Solve Ax=b using PETSc
>
> DMGlobalToLocalBegin()
> DMGlobalToLocalEnd()
> ...
> !end of timeloop
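>
> For concreteness, here is a minimal sketch of that setup in C (the
> Fortran interface mirrors the same calls); the grid sizes, dof count,
> and stencil width are placeholders for the application's real values,
> and error checking is omitted for brevity:
>
> #include <petscdmda.h>
>
> int main(int argc, char **argv)
> {
>   DM       da;
>   Vec      x, xlocal;
>   PetscInt xs, ys, zs, xm, ym, zm;
>
>   PetscInitialize(&argc, &argv, NULL, NULL);
>   /* 3D structured grid: star stencil, 1 dof per node, stencil width 1 */
>   DMDACreate3d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
>                DM_BOUNDARY_NONE, DMDA_STENCIL_STAR, 10, 10, 10,
>                PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE, 1, 1,
>                NULL, NULL, NULL, &da);
>   DMSetUp(da);
>   /* corner and extent of the subdomain owned by this rank */
>   DMDAGetCorners(da, &xs, &ys, &zs, &xm, &ym, &zm);
>
>   DMCreateGlobalVector(da, &x);
>   DMCreateLocalVector(da, &xlocal);
>   /* inside the time loop: refresh ghost values after each solve */
>   DMGlobalToLocalBegin(da, x, INSERT_VALUES, xlocal);
>   DMGlobalToLocalEnd(da, x, INSERT_VALUES, xlocal);
>
>   VecDestroy(&xlocal);
>   VecDestroy(&x);
>   DMDestroy(&da);
>   PetscFinalize();
>   return 0;
> }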
>
>
> As far as I know, the domain decomposition part needs to be modified. I
> plan to use the PETSc DMPlex class for this job. Is this the best way to
> port the code?
>
Yes, this is basically correct.
Thanks,
Matt
>
> DMPlexCreateFromFile()
>
> DMPlexDistribute()
>
> !timeloop begins
>
> !calculate matrix entry and rhs
> ...
> Solve Ax=b using PETSc
>
> DMGlobalToLocalBegin()
> DMGlobalToLocalEnd()
> ...
> !end of timeloop
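>
> A rough sketch of this new setup in C (the mesh file name and the
> one-unknown-per-cell layout are placeholders for the real
> discretization; error checking is again omitted):
>
> #include <petscdmplex.h>
>
> int main(int argc, char **argv)
> {
>   DM           dm, dmDist;
>   PetscSection s;
>   Vec          x, xlocal;
>   PetscInt     pStart, pEnd, cStart, cEnd, c;
>
>   PetscInitialize(&argc, &argv, NULL, NULL);
>   /* read the mesh file and interpolate edges/faces */
>   DMPlexCreateFromFile(PETSC_COMM_WORLD, "mesh.msh", PETSC_TRUE, &dm);
>   /* distribute cells across ranks: 0 = no overlap, NULL = discard SF */
>   DMPlexDistribute(dm, 0, NULL, &dmDist);
>   if (dmDist) { DMDestroy(&dm); dm = dmDist; }
>
>   /* describe the data layout: here one unknown per cell */
>   DMPlexGetChart(dm, &pStart, &pEnd);
>   DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd);
>   PetscSectionCreate(PETSC_COMM_WORLD, &s);
>   PetscSectionSetChart(s, pStart, pEnd);
>   for (c = cStart; c < cEnd; ++c) PetscSectionSetDof(s, c, 1);
>   PetscSectionSetUp(s);
>   DMSetDefaultSection(dm, s); /* DMSetLocalSection() in later PETSc */
>   PetscSectionDestroy(&s);
>
>   DMCreateGlobalVector(dm, &x);
>   DMCreateLocalVector(dm, &xlocal);
>   /* inside the time loop: ghost update is identical to the DMDA case */
>   DMGlobalToLocalBegin(dm, x, INSERT_VALUES, xlocal);
>   DMGlobalToLocalEnd(dm, x, INSERT_VALUES, xlocal);
>
>   VecDestroy(&xlocal);
>   VecDestroy(&x);
>   DMDestroy(&dm);
>   PetscFinalize();
>   return 0;
> }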
>
> Thanks,
>
> Danyang
>
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/