[petsc-users] How to improve this solver for Navier-Stokes equation (based on fieldsplit and lsc)
Thomas Witkowski
thomas.witkowski at tu-dresden.de
Thu Jun 14 08:01:33 CDT 2012
I am trying to implement an efficient solver for my FEM-based unsteady Navier-
Stokes code. The scenario I consider is really simple: 2D FEM with
Taylor-Hood elements, an equi-spaced grid, simple channel flow with
prescribed inflow boundary conditions for the velocity and no boundary
conditions for the pressure. Because I want to solve unsteady flows, an
Oseen problem is solved at each time step (implicit Euler time
discretization, "trivial" linearization). Using PETSc, I created a
fieldsplit preconditioner, which is configured in the following way:
mpirun -np 2 ./navier_stokes init/channel_parallel.dat.2d \
-ksp_type fgmres \
-pc_type fieldsplit \
-pc_fieldsplit_type SCHUR \
-pc_fieldsplit_schur_fact_type FULL \
-ksp_converged_reason \
-ksp_monitor_true_residual \
-fieldsplit_velocity_ksp_type preonly \
-fieldsplit_velocity_pc_type lu \
-fieldsplit_velocity_pc_factor_mat_solver_package mumps \
-fieldsplit_pressure_ksp_type gmres \
-fieldsplit_pressure_ksp_max_it 10 \
-fieldsplit_pressure_ksp_rtol 1.0e-2 \
-fieldsplit_pressure_pc_type lsc \
-fieldsplit_pressure_lsc_ksp_type bcgs \
-fieldsplit_pressure_lsc_ksp_max_it 10 \
-fieldsplit_pressure_lsc_ksp_rtol 1.0e-2 \
-fieldsplit_pressure_lsc_pc_type hypre
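For context, the discrete system has the usual saddle-point block structure
(standard notation, F the velocity convection-diffusion block, B the discrete
divergence; the (2,2) block is zero for unstabilized Taylor-Hood), and, as far
as I understand the PCLSC man page, the -fieldsplit_pressure_lsc_* options
control the inner solves with B B^T in the least-squares commutator
approximation of the Schur complement:

\begin{pmatrix} F & B^{T} \\ B & 0 \end{pmatrix}
\begin{pmatrix} u \\ p \end{pmatrix}
=
\begin{pmatrix} f \\ 0 \end{pmatrix},
\qquad
S = B F^{-1} B^{T} \ (\text{up to sign}),
\qquad
S^{-1} \approx (B B^{T})^{-1} \, B F B^{T} \, (B B^{T})^{-1}.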
The direct solver for the velocity part is just for debugging and should
be replaced once everything else works fine. The problem is that I found
this configuration to be not very efficient. It takes around 20 to 30
outer iterations, which take around 30 seconds, on a very small problem
(around 20000 global unknowns for each velocity component and 5000 global
unknowns for the pressure) on a very fast desktop CPU (a recent 4-core
Intel Xeon). Any hints for improvements?
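For reference, the splits themselves are defined from index sets in the
application code; a minimal sketch of that setup (the names solve_oseen,
is_u and is_p are placeholders here, not my actual code) looks like this:

#include <petscksp.h>

/* Minimal sketch: register the "velocity" and "pressure" splits so the
   -fieldsplit_* options above act on the right blocks. A, b, x, is_u and
   is_p are assumed to be assembled/created elsewhere. */
PetscErrorCode solve_oseen(Mat A, Vec b, Vec x, IS is_u, IS is_p)
{
  KSP            ksp;
  PC             pc;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = KSPCreate(PETSC_COMM_WORLD, &ksp); CHKERRQ(ierr);
  /* older PETSc versions take an extra MatStructure argument here */
  ierr = KSPSetOperators(ksp, A, A); CHKERRQ(ierr);
  ierr = KSPGetPC(ksp, &pc); CHKERRQ(ierr);
  ierr = PCSetType(pc, PCFIELDSPLIT); CHKERRQ(ierr);
  ierr = PCFieldSplitSetIS(pc, "velocity", is_u); CHKERRQ(ierr);
  ierr = PCFieldSplitSetIS(pc, "pressure", is_p); CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp); CHKERRQ(ierr);  /* reads -fieldsplit_* options */
  ierr = KSPSolve(ksp, b, x); CHKERRQ(ierr);
  ierr = KSPDestroy(&ksp); CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

KSPSetFromOptions then picks up all of the -ksp_*, -pc_fieldsplit_* and
-fieldsplit_* options from the command line above.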
Thomas