[petsc-users] How to improve this solver for Navier-Stokes equation (based on fieldsplit and lsc)

Barry Smith bsmith at mcs.anl.gov
Thu Jun 14 17:07:21 CDT 2012


On Jun 14, 2012, at 9:07 AM, Thomas Witkowski wrote:

> On 14.06.2012 15:06, Matthew Knepley wrote:
>> On Thu, Jun 14, 2012 at 9:01 PM, Thomas Witkowski <thomas.witkowski at tu-dresden.de> wrote:
>> I am trying to implement an efficient solver for my FEM-based unsteady Navier-Stokes code. The scenario I consider is really simple: 2D FEM with Taylor-Hood elements, an equispaced grid, simple channel flow with prescribed inflow boundary conditions for the velocity, and no boundary conditions for the pressure. Because I want to
>> 
>> With no pressure BC, you need to specify the null space. Look at SNES ex62.
> As so often, I had forgotten about this :) Since I have a free outflow boundary for the velocity (so no Dirichlet boundary there), I think the null space must be something other than just a constant on all pressure nodes. So I went back to the even simpler case, the driven cavity scenario. Here I have Dirichlet BC on all velocity components on all boundaries and no pressure BC. Whether or not I set the null space has no influence on the solver (I checked with -ksp_view that a null space is set). Only when going from fgmres to gmres for the outer solver does setting the null space make some (positive) difference in the convergence.
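(For reference, a minimal sketch of one way to attach the constant pressure null space to the Schur complement block in C. It assumes a PETSc version with MatSetNullSpace() -- older releases used KSPSetNullSpace() on the pressure sub-KSP instead -- and the variable names (pc, subksp) are illustrative only; error checking is omitted.)

  KSP          *subksp;     /* sub-KSPs of the fieldsplit: [velocity, pressure] */
  PetscInt      nsplits;
  Mat           S;          /* Schur complement operator of the pressure solve */
  MatNullSpace  nsp;

  /* must be called after the fieldsplit PC has been set up */
  PCFieldSplitGetSubKSP(pc, &nsplits, &subksp);
  KSPGetOperators(subksp[1], &S, NULL);
  /* constant null space: has_cnst = PETSC_TRUE, no extra basis vectors */
  MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0, NULL, &nsp);
  MatSetNullSpace(S, nsp);
  MatNullSpaceDestroy(&nsp);
  PetscFree(subksp);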

  Since you have a gmres in your pressure solver, you must use fgmres for the outer solver; you cannot use gmres inside gmres (the inner gmres solve makes the preconditioner change from one outer iteration to the next, and plain gmres requires a fixed preconditioner).

  Also, you can use -fieldsplit_pressure_ksp_monitor_true_residual, -fieldsplit_pressure_ksp_converged_reason, etc. to track the inner solves as well and get a more complete picture of what is happening in terms of convergence. Might be useful.
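For example (a sketch; the rest of the options from the command below stay the same):

-ksp_type fgmres \
-fieldsplit_pressure_ksp_monitor_true_residual \
-fieldsplit_pressure_ksp_converged_reason \
-fieldsplit_pressure_lsc_ksp_converged_reason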


   Barry

> 
>>  
>> solve unsteady flows, at each time step an Oseen problem is solved (implicit Euler time discretization, "trivial" linearization). With PETSc, I created a fieldsplit preconditioner that is configured in the following way:
>> 
>> mpirun -np 2 ./navier_stokes init/channel_parallel.dat.2d \
>> -ksp_type fgmres \
>> -pc_type fieldsplit \
>> -pc_fieldsplit_type SCHUR \
>> -pc_fieldsplit_schur_fact_type FULL \
>> -ksp_converged_reason \
>> -ksp_monitor_true_residual \
>> -fieldsplit_velocity_ksp_type preonly \
>> -fieldsplit_velocity_pc_type lu \
>> -fieldsplit_velocity_pc_factor_mat_solver_package mumps \
>> -fieldsplit_pressure_ksp_type gmres \
>> -fieldsplit_pressure_ksp_max_it 10 \
>> -fieldsplit_pressure_ksp_rtol 1.0e-2 \
>> -fieldsplit_pressure_pc_type lsc \
>> 
>> This makes no sense unless you specify an auxiliary operator. Just leave it at jacobi. When you use LU
>> for velocity, it will converge in 1 iterate. Since it doesn't, it means you have a problem with the null space.
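(A minimal sketch in C of what "specify an auxiliary operator" means for PCLSC: the application composes its own operators -- e.g. a pressure Laplacian it has assembled itself -- with the Schur complement matrix under the names "LSC_L" and "LSC_Lp". The matrix L, the PC handle pc, and the sub-KSP access below are illustrative assumptions, not part of the original code.)

  KSP      *subksp;
  PetscInt  nsplits;
  Mat       S;                       /* Schur complement of the fieldsplit */

  PCFieldSplitGetSubKSP(pc, &nsplits, &subksp);   /* after PC setup */
  KSPGetOperators(subksp[1], &S, NULL);
  PetscObjectCompose((PetscObject)S, "LSC_L",  (PetscObject)L);
  PetscObjectCompose((PetscObject)S, "LSC_Lp", (PetscObject)L);
  PetscFree(subksp);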
> So even for the driven cavity, with the null space set and the same solver as above, it needs something like 10 or 15 iterations. I don't understand your argument that, using LU for the velocity part, this solver should converge in 1 iteration. It still has an inexact solver for the Schur complement, and the solver for the composite matrix in LSC is also inexact.
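(For context, the "converges in 1 iteration" claim is the standard block-factorization argument, and it assumes that both the velocity solve and the Schur complement solve are exact. With

K = \begin{pmatrix} A & B \\ B^T & 0 \end{pmatrix}
  = \begin{pmatrix} I & 0 \\ B^T A^{-1} & I \end{pmatrix}
    \begin{pmatrix} A & 0 \\ 0 & S \end{pmatrix}
    \begin{pmatrix} I & A^{-1} B \\ 0 & I \end{pmatrix},
\qquad S = -B^T A^{-1} B,

the FULL Schur factorization applies exactly this factorization, so if A^{-1} and S^{-1} are applied exactly the preconditioner equals K^{-1} and the outer Krylov method converges in one iteration. With the Schur complement solved only to rtol 1.0e-2 and at most 10 iterations, as in the options above, more outer iterations are indeed expected.)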
> 
> Thomas
> 
>> 
>>    Matt
>>  
>> -fieldsplit_pressure_lsc_ksp_type bcgs \
>> -fieldsplit_pressure_lsc_ksp_max_it 10 \
>> -fieldsplit_pressure_lsc_ksp_rtol 1.0e-2 \
>> -fieldsplit_pressure_lsc_pc_type hypre
>> 
>> The direct solver for the velocity part is just for debugging and should be replaced once everything else works fine. The point is that I found this configuration to be not very efficient. It takes around 20 to 30 iterations, which takes around 30 seconds, for a very small problem size (around 20000 global unknowns for each velocity component and 5000 global unknowns for the pressure) on a very fast desktop CPU (a new Intel Xeon with 4 cores). Any hints for improvements?
>> 
>> Thomas
>> 
>> 
>> 
>> -- 
>> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
>> -- Norbert Wiener
> 


