[petsc-users] PetscFV and TS implicit
Thibault Bridel-Bertomeu
thibault.bridelbertomeu at gmail.com
Fri Aug 21 10:58:24 CDT 2020
Thank you Barry for the tip ! I’ll make sure to do that when everything is
set.
What I also meant is : will there be no more direct way to set the
preconditioner than to go through SNESSetJacobian after having assembled
everything by hand ? In my case, or in the more general case of the fluid
dynamics equations, the preconditioner is not a fun matrix to assemble,
because for every cell the derivative of the physical flux (the flux
Jacobian) has to be computed and put in the right block of the matrix -
finite element style if you will. Is there a way to do that with PETSc
methods, maybe short-circuiting the FEM-based methods ?
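To make this concrete, here is roughly what I have in mind with the DMPlex
machinery - only the diagonal block of each cell is shown, the face-neighbor
blocks would be added the same way. This is just a sketch of my intent, not
working code, and dFluxdU() is a placeholder for the exact per-cell 4x4 flux
Jacobian of the Euler equations :

#include <petscdmplex.h>

/* Placeholder for the exact 4x4 flux Jacobian dF/dU evaluated in one cell */
extern void dFluxdU(const PetscScalar *u, PetscScalar *J, void *ctx);

/* Sketch (untested) : assemble the approximate Jacobian  shift*I - dF/dU  cell
   by cell, assuming B was created with DMCreateMatrix() so that the
   local-to-global mapping and the block size (nvars) are already attached, and
   assuming the local vector locU stores nvars values per cell in cell order. */
PetscErrorCode AssembleApproxIJacobian(DM dm, Vec locU, PetscReal shift, Mat B, void *ctx)
{
  PetscInt           cStart, cEnd, c, i, nvars = 4;
  PetscScalar        block[16];
  const PetscScalar *u;
  PetscErrorCode     ierr;

  PetscFunctionBegin;
  ierr = DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd);CHKERRQ(ierr); /* cells */
  ierr = VecGetArrayRead(locU, &u);CHKERRQ(ierr);
  for (c = cStart; c < cEnd; ++c) {
    PetscInt row = c - cStart;                     /* local block row of this cell */
    dFluxdU(&u[nvars*row], block, ctx);            /* block <- dF/dU in this cell  */
    for (i = 0; i < nvars*nvars; ++i) block[i] = -block[i];
    for (i = 0; i < nvars; ++i) block[i*nvars+i] += shift;  /* block <- shift*I - dF/dU */
    ierr = MatSetValuesBlockedLocal(B, 1, &row, 1, &row, block, INSERT_VALUES);CHKERRQ(ierr);
  }
  ierr = VecRestoreArrayRead(locU, &u);CHKERRQ(ierr);
  ierr = MatAssemblyBegin(B, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(B, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}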
Thanks !
Thibault
On Fri, Aug 21, 2020 at 17:22, Barry Smith <bsmith at petsc.dev> wrote:
>
>
> On Aug 21, 2020, at 8:35 AM, Thibault Bridel-Bertomeu <
> thibault.bridelbertomeu at gmail.com> wrote:
>
>
>
> On Fri, Aug 21, 2020 at 15:23, Matthew Knepley <knepley at gmail.com>
> wrote:
>
>> On Fri, Aug 21, 2020 at 9:10 AM Thibault Bridel-Bertomeu <
>> thibault.bridelbertomeu at gmail.com> wrote:
>>
>>> Sorry, I sent too soon, I hit the wrong key.
>>>
>>> I wanted to say that context.npoints is the local number of cells.
>>>
>>> PetscRHSFunctionImpl generates the hyperbolic part of the right-hand
>>> side.
>>> Then we have :
>>>
>>> PetscErrorCode PetscIJacobian(
>>>   TS        ts,   /*!< Time stepping object (see PETSc TS) */
>>>   PetscReal t,    /*!< Current time */
>>>   Vec       Y,    /*!< Solution vector */
>>>   Vec       Ydot, /*!< Time-derivative of solution vector */
>>>   PetscReal a,    /*!< Shift */
>>>   Mat       A,    /*!< Jacobian matrix */
>>>   Mat       B,    /*!< Preconditioning matrix */
>>>   void      *ctxt /*!< Application context */
>>> )
>>> {
>>>   PETScContext *context = (PETScContext*) ctxt;
>>>   HyPar        *solver  = context->solver;
>>>   _DECLARE_IERR_;
>>>
>>>   PetscFunctionBegin;
>>>   solver->count_IJacobian++;
>>>   context->shift = a;
>>>   context->waqt  = t;
>>>   /* Construct the preconditioning matrix */
>>>   if (context->flag_use_precon) { IERR PetscComputePreconMatImpl(B,Y,context); CHECKERR(ierr); }
>>>
>>>   PetscFunctionReturn(0);
>>> }
>>>
>>> and PetscJacobianFunction_JFNK, which I bind to the matrix shell,
>>> computes the action of the Jacobian on a vector : say U0 is the
>>> reference state and Y the vector to which the JFNK method is applied,
>>> then PetscJacobianFunction_JFNK returns
>>> shift * Y - 1/epsilon * (F(U0 + epsilon*Y) - F(U0)),
>>> where F evaluates the hyperbolic flux (shift comes from the TS).
>>> The preconditioning matrix I compute as an approximation to the actual
>>> Jacobian, that is shift * Identity - Derivative(dF/dU), where dF/dU is,
>>> in each cell, a 4x4 matrix that is known exactly for the system of
>>> equations I am solving, i.e. the Euler equations. For the structured
>>> grid, I can loop over the cells and compute that 'Derivative' at first
>>> order by simply taking a finite-difference-like approximation with the
>>> neighboring cells, Derivative(phi) = phi_i - phi_{i-1}, and I assemble
>>> the B matrix block by block (JFunction is the dF/dU) :
>>>
>>> /* diagonal element */
>>> for (v=0; v<nvars; v++) { rows[v] = nvars*pg + v; cols[v] = nvars*pg + v; }
>>> ierr = solver->JFunction(values,(u+nvars*p),solver->physics,dir,0);
>>> _ArrayScale1D_(values,(dxinv*iblank),(nvars*nvars));
>>> ierr = MatSetValues(Pmat,nvars,rows,nvars,cols,values,ADD_VALUES); CHKERRQ(ierr);
>>>
>>> /* left neighbor */
>>> if (pgL >= 0) {
>>>   for (v=0; v<nvars; v++) { rows[v] = nvars*pg + v; cols[v] = nvars*pgL + v; }
>>>   ierr = solver->JFunction(values,(u+nvars*pL),solver->physics,dir,1);
>>>   _ArrayScale1D_(values,(-dxinv*iblank),(nvars*nvars));
>>>   ierr = MatSetValues(Pmat,nvars,rows,nvars,cols,values,ADD_VALUES); CHKERRQ(ierr);
>>> }
>>>
>>> /* right neighbor */
>>> if (pgR >= 0) {
>>>   for (v=0; v<nvars; v++) { rows[v] = nvars*pg + v; cols[v] = nvars*pgR + v; }
>>>   ierr = solver->JFunction(values,(u+nvars*pR),solver->physics,dir,-1);
>>>   _ArrayScale1D_(values,(-dxinv*iblank),(nvars*nvars));
>>>   ierr = MatSetValues(Pmat,nvars,rows,nvars,cols,values,ADD_VALUES); CHKERRQ(ierr);
>>> }
>>>
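>>> For reference, the shell multiply itself looks roughly like this (a
>>> simplified sketch of my routine; FluxRHS, U0, F0, Utmp and Ftmp stand
>>> for the actual function and work vectors stored in my context) :
>>>
>>> PetscErrorCode PetscJacobianFunction_JFNK(Mat A, Vec Y, Vec JY)
>>> {
>>>   PETScContext   *context;
>>>   PetscReal      eps;
>>>   PetscErrorCode ierr;
>>>
>>>   PetscFunctionBegin;
>>>   ierr = MatShellGetContext(A,&context); CHKERRQ(ierr);
>>>   eps  = context->jfnk_eps;
>>>   /* JY <- shift*Y - ( F(U0 + eps*Y) - F(U0) ) / eps */
>>>   ierr = VecWAXPY(context->Utmp,eps,Y,context->U0); CHKERRQ(ierr);    /* Utmp = U0 + eps*Y  */
>>>   ierr = FluxRHS(context->Utmp,context->Ftmp,context); CHKERRQ(ierr); /* Ftmp = F(U0+eps*Y) */
>>>   ierr = VecAXPY(context->Ftmp,-1.0,context->F0); CHKERRQ(ierr);      /* Ftmp -= F(U0)      */
>>>   ierr = VecAXPBYPCZ(JY,context->shift,-1.0/eps,0.0,Y,context->Ftmp); CHKERRQ(ierr);
>>>   PetscFunctionReturn(0);
>>> }
>>>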
>>> I do not know if I am clear here ...
>>> Anyway, I am trying to figure out how to do this shell matrix and this
>>> preconditioner using all the FV and DMPlex artillery.
>>>
>>
>> Okay, that is very clear. We should be able to get the JFNK just with
>> -snes_mf_operator, and put the approximate J construction in
>> DMPlexComputeJacobian_Internal().
>> There is an FV section already, and we could just add this. I would need
>> to understand those entries in the pointwise Riemann sense in which the
>> other stuff is now expressed.
>>
>
> OK, I had a quick look and, if I understood correctly, it would do the
> job. Setting the -snes_mf_operator flag would mean, however, that we have
> to go through SNESSetJacobian to set the Jacobian and the preconditioning
> matrix, wouldn't it ?
>
>
> Thibault,
>
> Since the TS implicit methods end up using SNES internally, the option
> should be available to you without requiring you to call the SNES
> routines directly.
>
> Once you have finalized your approach, and if for the implicit case you
> always work in the SNES mf-operator mode, you can hardwire
>
> TSGetSNES(ts,&snes);
> SNESSetUseMatrixFree(snes,PETSC_TRUE,PETSC_FALSE);
>
> in your code so you don't need to always provide the option
> -snes_mf_operator.
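>
> For instance (a sketch, assuming you keep the PetscIJacobian callback
> shown above to assemble B; with the matrix-free operator the shell matrix
> A is no longer needed, and B is used only to build the preconditioner):
>
>   TSGetSNES(ts,&snes);
>   SNESSetUseMatrixFree(snes,PETSC_TRUE,PETSC_FALSE);
>   TSSetIJacobian(ts,B,B,PetscIJacobian,&context);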
>
> Barry
>
>
>
>
> There might be calls to the Riemann solver to evaluate the dRHS/dU part,
> yes, but maybe it is possible to re-use what was already computed for
> RHS^n ?
> In the FV section the Jacobian is set to the identity, which I had missed
> before, but that could explain why, when I used the following
>
> TSSetType(ts, TSBEULER);
> DMTSSetIFunctionLocal(dm, DMPlexTSComputeIFunctionFEM, &ctx);
> DMTSSetIJacobianLocal(dm, DMPlexTSComputeIJacobianFEM, &ctx);
>
> with my FV discretization, nothing happened, right ?
>
> Thank you,
>
> Thibault
>
> Thanks,
>>
>> Matt
>>
>>
>>> On Fri, Aug 21, 2020 at 14:55, Thibault Bridel-Bertomeu <
>>> thibault.bridelbertomeu at gmail.com> wrote:
>>>
>>>> Hi,
>>>>
>>>> Thanks Matthew and Jed for your input.
>>>> I indeed envision an implicit solver in the sense Jed mentioned - Jiri
>>>> Blazek's book is a nice intro to this concept.
>>>>
>>>> Matthew, I do not know exactly what to change right now because,
>>>> although I understand globally what the DMPlexComputeXXXX_Internal
>>>> methods do, I cannot say for sure, line by line, what is happening.
>>>> In a structured code, I have an implicit FVM solver with PETSc, but I
>>>> do not use any of the FV structure, not even a DM - I just use C arrays
>>>> that I transform into PETSc Vec and Mat, build my IJacobian and my
>>>> preconditioner, and give all of that to a TS, and it runs. I cannot
>>>> figure out how to do it with the FV and the DM and all the underlying
>>>> "shortcuts" that I want to use.
>>>>
>>>> Here is the top method for the structured code :
>>>>
>>>> int total_size = context.npoints * solver->nvars;
>>>> ierr = TSSetRHSFunction(ts,PETSC_NULL,PetscRHSFunctionImpl,&context); CHKERRQ(ierr);
>>>>
>>>> SNES     snes;
>>>> KSP      ksp;
>>>> PC       pc;
>>>> SNESType snestype;
>>>> ierr = TSGetSNES(ts,&snes); CHKERRQ(ierr);
>>>> ierr = SNESGetType(snes,&snestype); CHKERRQ(ierr);
>>>>
>>>> /* Shell matrix for the matrix-free Jacobian action */
>>>> flag_mat_a = 1;
>>>> ierr = MatCreateShell(MPI_COMM_WORLD,total_size,total_size,PETSC_DETERMINE,
>>>>                       PETSC_DETERMINE,&context,&A); CHKERRQ(ierr);
>>>> context.jfnk_eps = 1e-7;
>>>> ierr = PetscOptionsGetReal(NULL,NULL,"-jfnk_epsilon",&context.jfnk_eps,NULL); CHKERRQ(ierr);
>>>> ierr = MatShellSetOperation(A,MATOP_MULT,(void (*)(void))PetscJacobianFunction_JFNK); CHKERRQ(ierr);
>>>> ierr = MatSetUp(A); CHKERRQ(ierr);
>>>>
>>>> context.flag_use_precon = 0;
>>>> ierr = PetscOptionsGetBool(PETSC_NULL,PETSC_NULL,"-with_pc",
>>>>                            (PetscBool*)(&context.flag_use_precon),PETSC_NULL); CHKERRQ(ierr);
>>>>
>>>> /* Set up preconditioner matrix */
>>>> flag_mat_b = 1;
>>>> ierr = MatCreateAIJ(MPI_COMM_WORLD,total_size,total_size,PETSC_DETERMINE,PETSC_DETERMINE,
>>>>                     (solver->ndims*2+1)*solver->nvars,NULL,
>>>>                     2*solver->ndims*solver->nvars,NULL,&B); CHKERRQ(ierr);
>>>> ierr = MatSetBlockSize(B,solver->nvars); CHKERRQ(ierr);
>>>> /* Set the IJacobian function for TS */
>>>> ierr = TSSetIJacobian(ts,A,B,PetscIJacobian,&context); CHKERRQ(ierr);
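>>>>
>>>> (For illustration, this setup would be driven with command-line options
>>>> along these lines; -jfnk_epsilon and -with_pc are the options registered
>>>> above, the others are standard PETSc options, and ./solver stands for
>>>> the actual executable :)
>>>>
>>>> mpiexec -n 4 ./solver -ts_type beuler -snes_rtol 1e-6 \
>>>>         -ksp_type gmres -pc_type bjacobi -with_pc -jfnk_epsilon 1e-7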
>>>>
>>>> Thibault Bridel-Bertomeu
>>>> —
>>>> Eng, MSc, PhD
>>>> Research Engineer
>>>> CEA/CESTA
>>>> 33114 LE BARP
>>>> Tel.: (+33)557046924
>>>> Mob.: (+33)611025322
>>>> Mail: thibault.bridelbertomeu at gmail.com
>>>>
>>>>
>>>> On Thu, Aug 20, 2020 at 18:43, Jed Brown <jed at jedbrown.org> wrote:
>>>>
>>>>> Matthew Knepley <knepley at gmail.com> writes:
>>>>>
>>>>> > I could never get the FVM stuff to make sense to me for implicit
>>>>> > methods. Here is my problem understanding. If you have an FVM method,
>>>>> > it decides to move "stuff" from one cell to its neighboring cells
>>>>> > depending on the solution to the Riemann problem on each face, which
>>>>> > computed the flux. This is fine unless the timestep is so big that
>>>>> > material can flow through into the cells beyond the neighbor. Then I
>>>>> > should have considered the effect of the Riemann problem for those
>>>>> > interfaces. That would be in the Jacobian, but I don't know how to
>>>>> > compute that Jacobian. I guess you could do everything matrix-free,
>>>>> > but without a preconditioner it seems hard.
>>>>>
>>>>> So long as we're using method of lines, the flux is just instantaneous
>>>>> flux, not integrated over some time step. It has the same meaning for
>>>>> implicit and explicit.
>>>>>
>>>>> An explicit method would be unstable if you took such a large time
>>>>> step (CFL) and an implicit method will not simultaneously be SSP and
>>>>> higher than first order, but it's still a consistent discretization of
>>>>> the problem.
>>>>>
>>>>> It's common (done in FUN3D and others) to precondition with a
>>>>> first-order method, where gradient reconstruction/limiting is skipped.
>>>>> That's what I'd recommend because limiting creates nasty nonlinearities
>>>>> and the resulting discretizations lack h-ellipticity which makes them
>>>>> very hard to solve.
>>>>>
>>>>
>>>>
>>>
>>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/
>>
>>
>>
>
>
> --
Thibault Bridel-Bertomeu
—
Eng, MSc, PhD
Research Engineer
CEA/CESTA
33114 LE BARP
Tel.: (+33)557046924
Mob.: (+33)611025322
Mail: thibault.bridelbertomeu at gmail.com