[petsc-users] Using dmplexdistribute do parallel FEM code.

Matthew Knepley knepley at gmail.com
Wed May 17 18:30:31 CDT 2023


On Wed, May 17, 2023 at 6:58 PM neil liu <liufield at gmail.com> wrote:

> Dear Petsc developers,
>
> I am writing my own code to calculate the FEM matrix. The following is my
> general framework,
>
> DMPlexCreateGmsh();
> MPI_Comm_rank (Petsc_comm_world, &rank);
> DMPlexDistribute (.., .., &dmDist);
>
> dm = dmDist;
> // This creates a separate DM for each processor (with reordering).
>
> MatCreate (Petsc_comm_world, &A)
> // Loop over every tetrahedral element to compute its element matrix, so that
> each processor builds its own local matrix A.
>
> My question is: it seems we should build a global matrix B (assembling all
> the As from each partition) and then pass B to KSP. KSP will then handle the
> parallelization correctly, right?
>

I would not suggest this. The more common strategy is to assemble each
element matrix directly into the global matrix B, by mapping the cell's
local indices to global indices (rather than to local indices in a matrix A).
You can do this in two stages: first create an ISLocalToGlobalMapping in
PETSc that maps every local index to a global index, then assemble into B
exactly as you would assemble into A, by calling MatSetValuesLocal().
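Roughly, that looks like the sketch below (untested, error checking omitted;
nOwnedDofs, nLocalDofs, localToGlobal, nLocalCells, nDofPerCell, elemIdx, and
elemMat are placeholders for quantities your code already has, not PETSc names):

    /* #include <petscmat.h>; assumes PetscInitialize() has been called.
       Placeholders:
         nOwnedDofs      - dofs owned by this rank
         nLocalDofs      - owned + ghost dofs on this rank
         localToGlobal[] - global index of each local dof
         nLocalCells, nDofPerCell, elemIdx[], elemMat[] - from your element loop */
    ISLocalToGlobalMapping ltog;
    Mat                    B;

    ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, nLocalDofs, localToGlobal,
                                 PETSC_COPY_VALUES, &ltog);

    MatCreate(PETSC_COMM_WORLD, &B);
    MatSetSizes(B, nOwnedDofs, nOwnedDofs, PETSC_DETERMINE, PETSC_DETERMINE);
    MatSetFromOptions(B);
    MatSetLocalToGlobalMapping(B, ltog, ltog);
    MatSetUp(B);

    for (PetscInt c = 0; c < nLocalCells; ++c) {
      /* fill elemIdx (local dof indices) and elemMat (dense element matrix) for
         cell c; the mapping translates local indices into global rows/columns */
      MatSetValuesLocal(B, nDofPerCell, elemIdx, nDofPerCell, elemIdx, elemMat,
                        ADD_VALUES);
    }
    MatAssemblyBegin(B, MAT_FINAL_ASSEMBLY);
    MatAssemblyEnd(B, MAT_FINAL_ASSEMBLY);

In practice you would also preallocate B (e.g. with MatXAIJSetPreallocation())
before the element loop, or the insertion will be slow.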

DMPlex handles these mappings for you automatically, but I realize that is a
lot of machinery to buy into.
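For example, if the DM carries a PetscSection describing your dofs (set with
DMSetLocalSection(), or via PetscFECreateDefault() and DMCreateDS()), the
assembly loop might look like this sketch, with elemMat again standing in for
your dense element matrix, ordered to match the closure of each cell:

    Mat      B;
    PetscInt cStart, cEnd;

    DMCreateMatrix(dm, &B);                         /* preallocated, mapping already attached */
    DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd);  /* height-0 points are the cells */
    for (PetscInt c = cStart; c < cEnd; ++c) {
      /* elemMat ordered to match DMPlexGetClosureIndices() for cell c */
      DMPlexMatSetClosure(dm, NULL, NULL, B, c, elemMat, ADD_VALUES);
    }
    MatAssemblyBegin(B, MAT_FINAL_ASSEMBLY);
    MatAssemblyEnd(B, MAT_FINAL_ASSEMBLY);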

  Thanks,

     Matt


> If that is right, I should define a whole-domain matrix B before the
> partitioning (MatCreate(Petsc_comm_world, &B);), and then use a
> local-to-global map (which PETSc function should I use? Do you have any
> examples?) to add A to B at the right positions (MatSetValues)?
>
> Does that make sense?
>
> Thanks,
>
> Xiaodong
>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/