On Thu, Jun 14, 2012 at 9:01 PM, Thomas Witkowski <thomas.witkowski@tu-dresden.de> wrote:

> I am trying to implement an efficient solver for my FEM-based unsteady Navier-Stokes code. The scenario I consider is really simple: 2D FEM with Taylor-Hood elements, an equispaced grid, and simple channel flow with prescribed inflow boundary conditions for the velocity and no boundary conditions for the pressure. Because I want to

With no pressure BC, you need to specify the null space. Look at SNES ex62.

> solve unsteady flows, at each time point an Oseen problem is solved (implicit Euler time discretization, "trivial" linearization). To use PETSc, I created a fieldsplit preconditioner that is configured in the following way:
>
> mpirun -np 2 ./navier_stokes init/channel_parallel.dat.2d \
>   -ksp_type fgmres \
>   -pc_type fieldsplit \
>   -pc_fieldsplit_type SCHUR \
>   -pc_fieldsplit_schur_fact_type FULL \
>   -ksp_converged_reason \
>   -ksp_monitor_true_residual \
>   -fieldsplit_velocity_ksp_type preonly \
>   -fieldsplit_velocity_pc_type lu \
>   -fieldsplit_velocity_pc_factor_mat_solver_package mumps \
>   -fieldsplit_pressure_ksp_type gmres \
>   -fieldsplit_pressure_ksp_max_it 10 \
>   -fieldsplit_pressure_ksp_rtol 1.0e-2 \
>   -fieldsplit_pressure_pc_type lsc \

This makes no sense unless you specify an auxiliary operator. Just leave it at jacobi. When you use LU for velocity, it will converge in 1 iterate. Since it doesn't, it means you have a problem with the null space.

   Matt

>   -fieldsplit_pressure_lsc_ksp_type bcgs \
>   -fieldsplit_pressure_lsc_ksp_max_it 10 \
>   -fieldsplit_pressure_lsc_ksp_rtol 1.0e-2 \
>   -fieldsplit_pressure_lsc_pc_type hypre
>
> The direct solver for the velocity part is just for debugging and should be replaced once everything else works. The point is that I have found this configuration to be not very efficient: it takes around 20 to 30 iterations, which takes around 30 seconds, on a very small problem (around 20000 global unknowns per velocity component and 5000 global unknowns for the pressure) on a very fast desktop CPU (a new 4-core Intel Xeon). Any hints for improvements?
>
> Thomas

-- 
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
   -- Norbert Wiener
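
Regarding the pressure null space mentioned above, here is a minimal sketch of one way to attach a constant-pressure null space to the coupled system matrix, assuming a recent PETSc (older versions spell some of these calls differently, e.g. MatGetVecs or KSPSetNullSpace). The names A and is_pressure are assumptions about the application code: the assembled coupled matrix and the same pressure index set that is passed to PCFieldSplitSetIS().

#include <petscksp.h>

/* Sketch: build a null-space basis vector that is constant on the pressure
   unknowns and zero on the velocity unknowns, then attach it to the coupled
   system matrix so the Krylov solver can project it out.  A and is_pressure
   are placeholders for the system matrix and the pressure index set. */
PetscErrorCode AttachPressureNullSpace(Mat A, IS is_pressure)
{
  Vec             nvec;
  MatNullSpace    nsp;
  const PetscInt *idx;
  PetscInt        i, n;
  PetscErrorCode  ierr;

  ierr = MatCreateVecs(A, &nvec, NULL);CHKERRQ(ierr);
  ierr = VecSet(nvec, 0.0);CHKERRQ(ierr);

  /* put a 1 in every pressure row; velocity rows stay 0 */
  ierr = ISGetLocalSize(is_pressure, &n);CHKERRQ(ierr);
  ierr = ISGetIndices(is_pressure, &idx);CHKERRQ(ierr);
  for (i = 0; i < n; i++) {
    ierr = VecSetValue(nvec, idx[i], 1.0, INSERT_VALUES);CHKERRQ(ierr);
  }
  ierr = ISRestoreIndices(is_pressure, &idx);CHKERRQ(ierr);
  ierr = VecAssemblyBegin(nvec);CHKERRQ(ierr);
  ierr = VecAssemblyEnd(nvec);CHKERRQ(ierr);

  /* MatNullSpaceCreate expects an orthonormal basis */
  ierr = VecNormalize(nvec, NULL);CHKERRQ(ierr);
  ierr = MatNullSpaceCreate(PetscObjectComm((PetscObject)A), PETSC_FALSE, 1, &nvec, &nsp);CHKERRQ(ierr);
  ierr = MatSetNullSpace(A, nsp);CHKERRQ(ierr);

  ierr = MatNullSpaceDestroy(&nsp);CHKERRQ(ierr);
  ierr = VecDestroy(&nvec);CHKERRQ(ierr);
  return 0;
}

Whether attaching the null space to the outer system is enough, or whether the Schur-complement solve also has to be told about it, depends on the PETSc version; SNES ex62, pointed to above, shows a complete setup.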