[petsc-users] Recommendations of possible preconditioner/ksp solver/snes solver for three phase flow reservoir simulator
Matthew Knepley
knepley at gmail.com
Mon Dec 17 09:54:51 CST 2018
On Sun, Dec 16, 2018 at 10:12 PM Chukwudi Chukwudozie via petsc-users <
petsc-users at mcs.anl.gov> wrote:
> Barry,
>
> Thanks for the reply. I have attached two files with the information.
>
> On Sat, Dec 15, 2018 at 8:58 PM Smith, Barry F. <bsmith at mcs.anl.gov>
> wrote:
>
>>
>> Chuks,
>>
>> 1) Please run a successful case with -snes_view and send the output so
>> we can see the exact solver options being used.
>>
>> 2) Run a less successful case with -snes_monitor
>> -ksp_monitor_true_residual -ksp_converged_reason
>> -fieldsplit_ksp_monitor_true_residual -fieldsplit_ksp_converged_reason so
>> we can see how the linear solver (and the sub linear solves) are (not)
>> converging.
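>>
>> For example, the run would look something like this on the command line
>> (the executable name here is only a placeholder for your own binary and
>> its usual arguments):
>>
>>   ./your_simulator -snes_monitor -ksp_monitor_true_residual \
>>     -ksp_converged_reason -fieldsplit_ksp_monitor_true_residual \
>>     -fieldsplit_ksp_converged_reason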
>>
>
The first thing I would do is put Krylov wrappers around the ILU on the
subblocks. ILU could be good enough for your simple case, but I bet the
blocks are not being accurately solved in the non-convergent case. You
could turn on the monitors for the blocks to confirm this:
-P_fieldsplit_0_ksp_type gmres
-P_fieldsplit_0_ksp_monitor_true_residual
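For reference, here is a minimal sketch of doing the same thing
programmatically, assuming kspP and pcP are the outer KSP and fieldsplit PC
from the setup quoted below (the names nsplits, subksp, and subpc are just
illustrative):

  KSP      *subksp;
  PetscInt  nsplits, i;
  PC        subpc;

  /* KSPSetUp() must be called before PCFieldSplitGetSubKSP() */
  ierr = KSPSetUp(kspP);CHKERRQ(ierr);
  ierr = PCFieldSplitGetSubKSP(pcP,&nsplits,&subksp);CHKERRQ(ierr);
  for (i = 0; i < nsplits; i++) {
    /* Krylov wrapper around the block solve */
    ierr = KSPSetType(subksp[i],KSPGMRES);CHKERRQ(ierr);
    ierr = KSPGetPC(subksp[i],&subpc);CHKERRQ(ierr);
    ierr = PCSetType(subpc,PCILU);CHKERRQ(ierr);
    /* tighten the block tolerance so each split is solved accurately */
    ierr = KSPSetTolerances(subksp[i],1.e-10,PETSC_DEFAULT,PETSC_DEFAULT,PETSC_DEFAULT);CHKERRQ(ierr);
  }
  /* free only the array, not the KSPs it contains */
  ierr = PetscFree(subksp);CHKERRQ(ierr);

The command-line route above is usually easier, since you can experiment
without recompiling.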
Thanks,
Matt
>
>> Barry
>>
>>
>> > On Dec 15, 2018, at 7:54 PM, Chukwudi Chukwudozie via petsc-users <
>> petsc-users at mcs.anl.gov> wrote:
>> >
>> > Hi,
>> >
>> > I am developing a fully implicit control volume finite element black
>> oil reservoir simulator using PETSc DMPlex and SNES. So far, for 2 phase
>> flow problems (a 3 by 3 block for pressure, saturation, and bottomhole
>> pressure), my simulator runs fine for small time step sizes, on the order
>> of less than 5 days. However, when I increase the time step size, it does
>> not converge, with a diverged reason of -3 (sometimes -5). Considering
>> that this is a fully implicit (though nonlinear) problem, I expect
>> unconditional stability for reasonable time step sizes.
>> >
>> > The background DM for the 3 block problem is a composite one, with the
>> first two being DMPlexes (bag->plexScalNode for pressure and saturation)
>> and the last one, bag->WellRedun, a redundant DM for the wellbore
>> pressures. See below.
>> >
>> > ierr = DMRedundantCreate(PETSC_COMM_WORLD,0,bag->numWells,&bag->WellRedun);CHKERRQ(ierr);
>> > ierr = DMCompositeCreate(PETSC_COMM_WORLD,&bag->MultiPhasePacker);CHKERRQ(ierr);
>> > ierr = DMCompositeAddDM(bag->MultiPhasePacker,bag->plexScalNode);CHKERRQ(ierr);
>> > ierr = DMCompositeAddDM(bag->MultiPhasePacker,bag->plexScalNode);CHKERRQ(ierr);
>> > ierr = DMCompositeAddDM(bag->MultiPhasePacker,bag->WellRedun);CHKERRQ(ierr);
>> >
>> > I have used nested matrices as shown below. In addition, I used the
>> fieldsplit (composite multiplicative) preconditioner with FGMRES and least
>> squares for the Newton solver. See below.
>> >
>> > ierr = MatCreateNest(PETSC_COMM_WORLD,Msize,is,Msize,is,&bK[0],K);CHKERRQ(ierr);
>> >
>> > ierr = SNESGetKSP(bag->snesP,&kspP);CHKERRQ(ierr);
>> > ierr = KSPSetTolerances(kspP,1.e-8,1.e-8,PETSC_DEFAULT,PETSC_DEFAULT);CHKERRQ(ierr);
>> > ierr = KSPSetType(kspP,KSPFGMRES);CHKERRQ(ierr);
>> > ierr = KSPSetFromOptions(kspP);CHKERRQ(ierr);
>> > ierr = KSPGetPC(kspP,&pcP);CHKERRQ(ierr);
>> > ierr = PCSetType(pcP, PCFIELDSPLIT);CHKERRQ(ierr);
>> > ierr = DMCompositeGetGlobalISs(bag->MultiPhasePacker,&is);CHKERRQ(ierr);
>> > for(i = 0; i < Msize; i++) ierr = PCFieldSplitSetIS(pcP,NULL,is[i]);CHKERRQ(ierr);
>> > ierr = PCFieldSplitSetType(pcP,PC_COMPOSITE_MULTIPLICATIVE);CHKERRQ(ierr);
>> >
>> > Any ideas on possible preconditioner options, solver options, or
>> combinations to improve stability will be appreciated.
>> >
>> > Chuks
>> >
>>
>>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/