[petsc-users] TS scheme with different DAs

Manuel Valera mvalera-w at sdsu.edu
Thu Sep 26 16:27:22 CDT 2019


Hi all,

Just following up in case my mail was lost. Does my problem make enough
sense? I can try to be clearer if you like. I tried using TSGetStages()
in the middle of the RHS function, but that didn't work. Maybe we can find
a different approach for this?
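
In case it helps, this is roughly what I tried inside my RHSFunction (just
a sketch; ts is the TS handed to the callback):

    PetscInt ns;
    Vec      *Y;
    ierr = TSGetStages(ts,&ns,&Y);CHKERRQ(ierr);
    /* I then tried to read the intermediate T,S from the stage vectors
       Y[0..ns-1] to recompute rho, but the values were not what I expected */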

Thanks,

On Tue, Sep 24, 2019 at 12:58 PM Manuel Valera <mvalera-w at sdsu.edu> wrote:

> Hello all,
>
> I finally implemented the TS routine operating on several DAs at the same
> time, hacking it as you suggested. I still have a problem with my
> algorithm, though it is not DMDA related.
>
> My algorithm needs to update u,v,w with information from the updated
> T,S,rho. My problem, or what I don't understand yet, is how to operate on
> the intermediate Runge-Kutta time-integration states inside the
> RHSFunction.
>
> To be clearer: I need the intermediate T,S states to obtain an updated
> rho (density) and, in turn, the correct intermediate velocities, keeping
> the loop going. As I understand it right now, the RHS vector is different
> from this intermediate state; it is only the RHS input to the loop, so
> operating on it would be incorrect.
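>
> Schematically, the dependency I need at each stage looks like this (the
> linear equation of state and its constants below are only placeholders
> for my actual density update):
>
>     /* placeholder linear EOS, just to illustrate the T,S -> rho -> u,v,w chain */
>     static PetscScalar EOSLinear(PetscScalar T, PetscScalar S)
>     {
>       const PetscScalar rho0 = 1027.0, T0 = 10.0, S0 = 35.0;
>       const PetscScalar alpha = 1.7e-4, beta = 7.6e-4; /* made-up coefficients */
>       return rho0*(1.0 - alpha*(T - T0) + beta*(S - S0));
>     }
>
> with rho then entering the momentum RHS for the intermediate u,v,w.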
>
> As of now, my algorithm still creates artifacts because of this lack of
> information to update all of the variables accurately at the same time.
> The problem occurs in serial as well.
>
> Thanks for your help,
>
>
> On Wed, Sep 18, 2019 at 4:36 AM Matthew Knepley <knepley at gmail.com> wrote:
>
>> On Tue, Sep 17, 2019 at 8:27 PM Smith, Barry F. <bsmith at mcs.anl.gov>
>> wrote:
>>
>>>
>>>   Don't be too quick to dismiss switching to DMStag; you may find
>>> that it actually takes little time to convert, and then you have a much
>>> less cumbersome process for managing the staggered grid. Take a look at
>>> src/dm/impls/stag/examples/tutorials/ex2.c, where
>>>
>>>     const PetscInt dof0 = 0, dof1 = 1, dof2 = 1; /* 1 dof on each edge and element center */
>>>     const PetscInt stencilWidth = 1;
>>>     ierr = DMStagCreate2d(PETSC_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,7,9,PETSC_DECIDE,PETSC_DECIDE,dof0,dof1,dof2,DMSTAG_STENCIL_BOX,stencilWidth,NULL,NULL,&dmSol);CHKERRQ(ierr);
>>>
>>> BOOM, it has set up a staggered grid with 1 cell-centered variable and 1
>>> on each edge. Adding more at the cell centers, vertices, or edges is trivial.
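>>>
>>> For example (untested sketch), swapping in
>>>
>>>     const PetscInt dof0 = 1, dof1 = 2, dof2 = 3; /* 1 per vertex, 2 per edge, 3 per element */
>>>
>>> in the same DMStagCreate2d() call gives you 1 unknown on each vertex, 2 on
>>> each edge, and 3 in each element center, with everything else unchanged.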
>>>
>>>   If you want to stick with DMDA, you
>>> "cheat". Depending on exactly what staggering you have, you make the DMDA
>>> for the "smaller problem" as large as the other ones and just track zeros
>>> in those locations. For example, if velocities are on "edges" and T, S are
>>> on cells, make your "cells" DMDA one extra grid width wide in all three
>>> dimensions. You may need to be careful on the boundaries depending on the
>>> types of boundary conditions.
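>>>
>>>   To make two DMDAs distribute the same way, an untested sketch: create
>>> one first, then feed its sizes and ownership ranges to the others (daCell,
>>> daVel, dof, and sw below are placeholders for your own DAs and parameters):
>>>
>>> PetscInt M,N,P,m,n,p;
>>> const PetscInt *lx,*ly,*lz;
>>> ierr = DMDAGetInfo(daCell,NULL,&M,&N,&P,&m,&n,&p,NULL,NULL,NULL,NULL,NULL,NULL);CHKERRQ(ierr);
>>> ierr = DMDAGetOwnershipRanges(daCell,&lx,&ly,&lz);CHKERRQ(ierr);
>>> /* same global sizes and the same lx,ly,lz give an identical parallel layout */
>>> ierr = DMDACreate3d(PETSC_COMM_WORLD,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DM_BOUNDARY_NONE,DMDA_STENCIL_BOX,M,N,P,m,n,p,dof,sw,lx,ly,lz,&daVel);CHKERRQ(ierr);
>>> ierr = DMSetUp(daVel);CHKERRQ(ierr);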
>>>
>>
>> Yes, SNES ex30 does exactly this. However, I still recommend looking at
>> DMStag. Patrick created it because managing the DMDA
>> became such a headache.
>>
>>   Thanks,
>>
>>     Matt
>>
>>
>>> > On Sep 17, 2019, at 7:04 PM, Manuel Valera via petsc-users <
>>> petsc-users at mcs.anl.gov> wrote:
>>> >
>>> > Thanks, Matthew, but my code is too complicated to be redone on DMStag
>>> > now after spending a long time using DMDAs.
>>> >
>>> > Is there a way to ensure PETSc distributes several DAs in the same
>>> > way, besides manually distributing the points?
>>> >
>>> > Thanks,
>>> >
>>> > On Tue, Sep 17, 2019 at 3:28 PM Matthew Knepley <knepley at gmail.com>
>>> wrote:
>>> > On Tue, Sep 17, 2019 at 6:15 PM Manuel Valera via petsc-users <
>>> petsc-users at mcs.anl.gov> wrote:
>>> > Hello, petsc users,
>>> >
>>> > I have integrated the TS routines in my code, but I just noticed I
>>> > didn't do it optimally. I was using 3 different TS objects to integrate
>>> > velocities, temperature, and salinity, and it works, but only for small
>>> > DTs. I suspect the intermediate Runge-Kutta states are out of phase, and
>>> > this creates the discrepancy for larger time steps, so I need to
>>> > integrate the 3 quantities in the same routine.
>>> >
>>> > I tried to do this by using a 5-DOF distributed array for the RHS,
>>> > where I store the velocities in the first 3 and then temperature and
>>> > salinity in the rest. The problem is that I use a staggered grid, and
>>> > T,S are located in a different DA layout than the velocities. This is
>>> > creating problems for me since I can't find a way to communicate the
>>> > information from the result of the TS integration back to the
>>> > respective DAs of each variable.
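>>> >
>>> > Schematically, the copy I cannot get right looks something like this (a
>>> > sketch that assumes both DAs share the same local corners, which may not
>>> > hold on my staggered grid; daPack/daT/Xpack/Tvec are placeholder names):
>>> >
>>> > PetscScalar ****xp; PetscScalar ***t;
>>> > PetscInt i,j,k,xs,ys,zs,xm,ym,zm;
>>> > ierr = DMDAVecGetArrayDOF(daPack,Xpack,&xp);CHKERRQ(ierr);
>>> > ierr = DMDAVecGetArray(daT,Tvec,&t);CHKERRQ(ierr);
>>> > ierr = DMDAGetCorners(daT,&xs,&ys,&zs,&xm,&ym,&zm);CHKERRQ(ierr);
>>> > for (k=zs; k<zs+zm; k++)
>>> >   for (j=ys; j<ys+ym; j++)
>>> >     for (i=xs; i<xs+xm; i++)
>>> >       t[k][j][i] = xp[k][j][i][3]; /* component 3 = temperature */
>>> > ierr = DMDAVecRestoreArray(daT,Tvec,&t);CHKERRQ(ierr);
>>> > ierr = DMDAVecRestoreArrayDOF(daPack,Xpack,&xp);CHKERRQ(ierr);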
>>> >
>>> > Is there a way to communicate across DAs? Or can you suggest an
>>> > alternative solution to this problem?
>>> >
>>> > If you have a staggered discretization on a structured grid, I would
>>> recommend checking out DMStag.
>>> >
>>> >   Thanks,
>>> >
>>> >      Matt
>>> >
>>> > Thanks,
>>> >
>>> >
>>> >
>>> > --
>>> > What most experimenters take for granted before they begin their
>>> experiments is infinitely more interesting than any results to which their
>>> experiments lead.
>>> > -- Norbert Wiener
>>> >
>>> > https://www.cse.buffalo.edu/~knepley/
>>>
>>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/
>>
>