[petsc-users] PF+Navier stokes

Matthew Knepley knepley at gmail.com
Mon Mar 22 13:02:36 CDT 2021


On Mon, Mar 22, 2021 at 2:00 PM Sepideh Kavousi <skavou1 at lsu.edu> wrote:

> I found some discussions in the following link,
> https://lists.mcs.anl.gov/pipermail/petsc-users/2010-May/006422.html, and
> in the following paper:
> https://www.sciencedirect.com/science/article/pii/S0021999107004330
> But I am an engineer and the discussions are confusing to me. Would you
> please tell me the correct path to follow? Should I go ahead and apply the
> SIMPLER algorithm for this problem, or should I learn to apply fieldsplit
> to set individual preconditioners on each unknown?
>

If you implement SIMPLER, you will have done all the work you need to use
PCFIELDSPLIT, but FieldSplit has a wide array of solvers you can try.
Thus, I would first make the ISes that split your system into fields.
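
For concreteness, here is a minimal sketch of building those ISes and
handing them to PCFIELDSPLIT. It assumes the unknowns are interlaced per
grid point as (u, v, p, phi), i.e. block size 4; the layout, the split
names, and the helper's signature are hypothetical, so adapt them to your
actual ordering:

  #include <petscksp.h>

  /* Sketch: split an interlaced (u,v,p,phi) system into a Navier-Stokes
     block and a phase-field block. rstart/nlocal would come from, e.g.,
     MatGetOwnershipRange() on the Jacobian. */
  PetscErrorCode SetupSplits(KSP ksp, PetscInt rstart, PetscInt nlocal)
  {
    PC             pc;
    IS             isns, isphi;
    PetscInt       nb = nlocal / 4, *idx, i;   /* local grid points */
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
    ierr = PCSetType(pc, PCFIELDSPLIT);CHKERRQ(ierr);
    /* Navier-Stokes unknowns: u, v, p at each grid point */
    ierr = PetscMalloc1(3 * nb, &idx);CHKERRQ(ierr);
    for (i = 0; i < nb; i++) {
      idx[3 * i + 0] = rstart + 4 * i + 0;  /* u */
      idx[3 * i + 1] = rstart + 4 * i + 1;  /* v */
      idx[3 * i + 2] = rstart + 4 * i + 2;  /* p */
    }
    ierr = ISCreateGeneral(PetscObjectComm((PetscObject)ksp), 3 * nb, idx,
                           PETSC_OWN_POINTER, &isns);CHKERRQ(ierr);
    ierr = PCFieldSplitSetIS(pc, "ns", isns);CHKERRQ(ierr);
    ierr = ISDestroy(&isns);CHKERRQ(ierr);
    /* Phase-field unknowns: every 4th entry, offset 3 */
    ierr = ISCreateStride(PetscObjectComm((PetscObject)ksp), nb,
                          rstart + 3, 4, &isphi);CHKERRQ(ierr);
    ierr = PCFieldSplitSetIS(pc, "phi", isphi);CHKERRQ(ierr);
    ierr = ISDestroy(&isphi);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }

Once the splits are named, the sub-solvers can be controlled from the
command line with the -fieldsplit_ns_* and -fieldsplit_phi_* option
prefixes. If your unknowns live on a DMDA, you can often skip the
hand-built ISes entirely and just use, e.g., -pc_type fieldsplit
-pc_fieldsplit_0_fields 0,1,2 -pc_fieldsplit_1_fields 3.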

  Thanks,

     Matt


> Best,
> Sepideh
> ------------------------------
> *From:* Matthew Knepley <knepley at gmail.com>
> *Sent:* Monday, March 22, 2021 9:15 AM
> *To:* Sepideh Kavousi <skavou1 at lsu.edu>
> *Cc:* petsc-users at mcs.anl.gov <petsc-users at mcs.anl.gov>
> *Subject:* Re: [petsc-users] PF+Navier stokes
>
> On Mon, Mar 22, 2021 at 10:04 AM Sepideh Kavousi <skavou1 at lsu.edu> wrote:
>
> Hello,
> I want to solve PF (phase-field) solidification + Navier-Stokes using the
> finite difference method, and I have a strange problem. My code runs fine
> for some system sizes and fails for others. When I run with the following
> options:
> mpirun -np 2 ./one.out -ts_monitor -snes_fd_color -ts_max_snes_failures -1
>  -ts_type bdf -ts_bdf_adapt -pc_type bjacobi  -snes_linesearch_type l2
> -snes_type ksponly -ksp_type gmres -ksp_gmres_restart 1001 -sub_pc_type ilu
> -sub_ksp_type preonly -snes_monitor -ksp_monitor -snes_linesearch_monitor
> -ksp_monitor_true_residual -ksp_converged_reason -log_view
>
>     0 SNES Function norm 1.465357113711e+01
>     0 SNES Function norm 1.465357113711e+01
>     Linear solve did not converge due to DIVERGED_PC_FAILED iterations 0
>                    PC_FAILED due to SUBPC_ERROR
>     Linear solve did not converge due to DIVERGED_PC_FAILED iterations 0
>                    PC_FAILED due to SUBPC_ERROR
>     0 SNES Function norm 1.465357113711e+01
>     0 SNES Function norm 1.465357113711e+01
>     Linear solve did not converge due to DIVERGED_PC_FAILED iterations 0
>                    PC_FAILED due to SUBPC_ERROR
>     Linear solve did not converge due to DIVERGED_PC_FAILED iterations 0
>                    PC_FAILED due to SUBPC_ERROR
>     0 SNES Function norm 1.465357113711e+01
>     0 SNES Function norm 1.465357113711e+01
> ^C    Linear solve did not converge due to DIVERGED_PC_FAILED iterations 0
>                    PC_FAILED due to SUBPC_ERROR
>     0 SNES Function norm 1.465357113711e+01
>     Linear solve did not converge due to DIVERGED_PC_FAILED iterations 0
>                    PC_FAILED due to SUBPC_ERROR
>     0 SNES Function norm 1.465357113711e+01
>
> Even setting pc_type to LU does not solve the problem.
> 0 TS dt 0.0001 time 0.
> copy!
> copy!
> Write output at step= 0!
> Write output at step= 0!
>     0 SNES Function norm 1.465357113711e+01
>     0 SNES Function norm 1.465357113711e+01
>     Linear solve did not converge due to DIVERGED_PC_FAILED iterations 0
>                    PC_FAILED due to FACTOR_NUMERIC_ZEROPIVOT
>     Linear solve did not converge due to DIVERGED_PC_FAILED iterations 0
>                    PC_FAILED due to FACTOR_NUMERIC_ZEROPIVOT
>
> I guess the problem is that in the mass conservation equation I used
> forward discretization for u (velocity in x), and in the momentum equation
> in x I used forward discretization for p (pressure), to ensure non-zero
> terms on the diagonal of the matrix. I tried to run it with valgrind but
> it did not output anything.
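>
> (Written out, the one-sided differences described above would be, assuming
> uniform spacing $\Delta x$ on a collocated grid -- this is a reading of the
> description, not the actual code:
> mass conservation: $\partial u/\partial x|_{i,j} \approx (u_{i+1,j} - u_{i,j})/\Delta x$,
> x-momentum: $\partial p/\partial x|_{i,j} \approx (p_{i+1,j} - p_{i,j})/\Delta x$.)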
>
> Does anyone have suggestions on how to solve this issue?
>
>
> Your subproblems in block Jacobi are singular. With multiphysics problems
> like this, the definition of the blocks can be tricky. I would first try
> to find a good preconditioner for this system in the literature, and then
> we can help you try it out.
>
>   Thanks,
>
>       Matt
>
>
>
> Best,
> Sepideh
>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/