[petsc-users] Combining DMDA and DMStag

Patrick Sanan patrick.sanan at gmail.com
Mon Apr 25 01:47:52 CDT 2022


In that case, from your original message, it seems like the issue might be
in the

    // Populate edofF, edofT

step.

Or, perhaps, it's simply due to the fact that the numbering/ordering
depends on the number of ranks. This is illustrated in this image in the
manual: https://petsc.org/release/docs/manual/vec/#fig-daao

This is for DMDA but DMStag does something similar. There is a "natural"
ordering, which is the ordering you'd get with a single rank. There's also
"PETSc ordering" which depends on the number of ranks - in this case each
rank's portion of the vector is a contiguous range.

If this might be the explanation for your problem, note that for DMDA there
are already some helper functions which can map between the "PETSc" and
"natural" orderings. These require a lot of communication between ranks, so
you'd want to avoid them in production runs, but they are useful when you
want to directly compare vectors on different numbers of ranks for
diagnostic or I/O purposes.

E.g. see this man page:
https://petsc.org/release/docs/manualpages/DMDA/DMDANaturalToGlobalBegin.html
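
For instance, something along these lines (an untested sketch; dmthermal and
Tglobal stand in for whatever your DMDA and global temperature vector are
called in your code):

    Vec Tnatural;
    DMDACreateNaturalVector(dmthermal, &Tnatural);
    DMDAGlobalToNaturalBegin(dmthermal, Tglobal, INSERT_VALUES, Tnatural);
    DMDAGlobalToNaturalEnd(dmthermal, Tglobal, INSERT_VALUES, Tnatural);
    VecView(Tnatural, PETSC_VIEWER_STDOUT_WORLD); /* same ordering for any number of ranks */
    VecDestroy(&Tnatural);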

I haven't implemented the equivalent for DMStag, but if it's needed I can
look at adding it.

On Fri, 22 Apr 2022 at 17:28, Carl-Johan Thore <carl-johan.thore at liu.se>
wrote:

> Thanks for the quick replies!
>
> Using DMStag for the temperature, or even for both fields, is a possibility
> of course. Preferably, however, I'd like to keep the temperature code as it
> is and be able to quickly switch between different methods for the flow,
> based on DMDA, DMStag and so on, so I'll try a bit more with the DMDA-DMStag
> approach first.
>
>
>
>
>
> ------------------------------
> *From:* Patrick Sanan <patrick.sanan at gmail.com>
> *Sent:* 22 April 2022 17:04
> *To:* Barry Smith <bsmith at petsc.dev>
> *Cc:* Carl-Johan Thore <carl-johan.thore at liu.se>; petsc-users at mcs.anl.gov
> <petsc-users at mcs.anl.gov>
> *Subject:* Re: [petsc-users] Combining DMDA and DMStag
>
>
>
> On Fri, 22 Apr 2022 at 16:04, Barry Smith <bsmith at petsc.dev> wrote:
>
>
>    We would need more details as to exactly what goes wrong to determine
> any kind of fix; my guess would be that the layout of the velocity vectors
> and temperature vectors is slightly different since the DMDA uses nelx+1
> while the stag uses nelx and may split things up slightly differently in
> parallel. You could try very small problems with say 2 or 4 ranks and put
> known values into the vectors and look at the ghost point update locations
> and exact locations in the local and global vectors to make sure everything
> is where you expect it to be.
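>
>    A minimal diagnostic of that sort might look like this (an untested
> sketch, using the names from your snippet below):
>
>    Vec g, l;
>    DMCreateGlobalVector(dmflow, &g);
>    DMCreateLocalVector(dmflow, &l);
>    /* fill each global entry with an easily recognized value, e.g. its global index */
>    PetscInt rstart, rend;
>    VecGetOwnershipRange(g, &rstart, &rend);
>    for (PetscInt i = rstart; i < rend; ++i) VecSetValue(g, i, (PetscScalar)i, INSERT_VALUES);
>    VecAssemblyBegin(g); VecAssemblyEnd(g);
>    /* update ghosts and inspect where each value ends up in the local vector */
>    DMGlobalToLocalBegin(dmflow, g, INSERT_VALUES, l);
>    DMGlobalToLocalEnd(dmflow, g, INSERT_VALUES, l);
>    VecView(l, PETSC_VIEWER_STDOUT_SELF);
>    VecDestroy(&l); VecDestroy(&g);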
>
>   There is also the possibility of using a DMStag for the temperature
> instead of a DMDA, since DMDA provides essentially a subset of the
> functionality of DMStag; this would give a more consistent layout of the
> unknowns in the two vectors.
>
>
> This was going to be my first suggestion as well - one way to ensure
> compatibility would be to use DMStag for everything. E.g. you could create
> your temperature DMStag with only vertex/corner (dof0) degrees of freedom,
> and then create one or more "compatible" DMStags (same elements on each
> MPI rank) for your other fields, using DMStagCreateCompatibleDMStag() .
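>
> Roughly like this (an untested sketch, reusing the names from your snippet
> and assuming 1 temperature DOF and 3 velocity DOFs per vertex):
>
>    DM dmtemp, dmvel;
>    /* temperature: 1 DOF on vertices (dof0), nothing on edges/faces/elements */
>    DMStagCreate3d(PETSC_COMM_WORLD, bx, by, bz, nelx, nely, nelz,
>                   PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE,
>                   1, 0, 0, 0, DMSTAG_STENCIL_BOX, stencilWidth,
>                   NULL, NULL, NULL, &dmtemp);
>    DMSetUp(dmtemp);
>    /* velocity: 3 DOFs on vertices, same element decomposition as dmtemp */
>    DMStagCreateCompatibleDMStag(dmtemp, 3, 0, 0, 0, &dmvel);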
>
>   Finally, one could use a single DMStag with all the unknowns but treat
> subvectors of the unknowns differently in your discretization and solve
> process. So assign velocity values (I guess) to the cell faces and
> temperature to the cell vertices. This will give a consistent parallel
> decomposition of values, but you will have to manage the fact that the
> unknowns are interlaced into a single vector, so the solver portions of
> your code may need to "pull out" appropriate subvectors.
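>
>   One helper for pulling the values of a single field out of such a
> combined vector is DMStagVecSplitToDMDA(), which copies the values at one
> location/component into a new DMDA and DMDA vector (mainly useful for
> diagnostics and I/O, since it copies rather than aliasing). An untested
> sketch, with dmboth and x standing in for the combined DMStag and its
> global vector, and temperature assumed to live on the vertices
> (DMSTAG_BACK_DOWN_LEFT in 3D):
>
>    DM  dmT;
>    Vec T;
>    DMStagVecSplitToDMDA(dmboth, x, DMSTAG_BACK_DOWN_LEFT, 0, &dmT, &T);
>    /* ... inspect or output T ... */
>    VecDestroy(&T);
>    DMDestroy(&dmT);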
>
>   Barry
>
>
> On Apr 22, 2022, at 9:45 AM, Carl-Johan Thore <carl-johan.thore at liu.se>
> wrote:
>
> Hi!
>
> I'm working on a convection-diffusion heat transfer problem. The
> temperature
> is discretized using standard Q1 elements and a DMDA. The flow is modelled
> using a stabilized Q1-Q0 method for which DMStag seemed like a good
> choice. The codes for the temperature
> and flow work fine separately (both in serial and parallel), but when
> combined and running
> in parallel, a problem sometimes arises in the assembly of the thermal
> system matrix.
> Here’s a rough sketch of the combined code:
>
> // Create dmda for thermal problem and dmstag for flow problem
> DMDACreate3d(PETSC_COMM_WORLD, bx, by, bz, DMDA_STENCIL_BOX,
>              nelx+1, nely+1, nelz+1,
>              PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE,
>              1, stencilWidth, 0, 0, 0, &dmthermal);
>
> // A bit of code to adjust Lx,Ly,Lz so that dmthermal and dmflow are
> // compatible in the sense of having the same local elements
>
> DMStagCreate3d(PETSC_COMM_WORLD, bx, by, bz, nelx, nely, nelz, md, nd, pd,
>                3, 0, 0, 0, DMSTAG_STENCIL_BOX, stencilWidth,
>                Lx, Ly, Lz, &dmflow);
>
>
> PetscInt edofT[8];        // 8-noded element with 1 temperature DOF per node
> DMStagStencil edofF[24];  // 8 nodes with 3 velocity DOFs each
>
> // Assemble thermal system matrix K
> for (PetscInt e=0 ...)    // Loop over local elements
> {
>     // Populate edofF, edofT
>
>     // Get element velocities in ue from local velocity vector uloc
>     DMStagVecGetValuesStencil(dmflow, uloc, 24, edofF, ue);
>     ...
>     Ke = Ke_diffusion + Ke_convection(ue)
>     ...
>     MatSetValuesLocal(K, 8, edofT, 8, edofT, Ke, ADD_VALUES);
> }
>
> This always works fine in serial, but depending on the mesh and the number
> of ranks, we don't always get the correct values in the element velocity
> vector ue. I suspect this has something to do with the ordering of the
> elements and/or the DOFs, because the entries of the global velocity vector
> are always the same but their order may change (judging from the output of
> VecView at least).
>
> Is it possible to ensure compatibility between the DMs, or to find some
> kind of mapping between them, so that something along the lines of the code
> above always works?
>
> Kind regards,
> Carl-Johan
>
>
>