[petsc-users] KSP linear solver returns inf
Barry Smith
bsmith at mcs.anl.gov
Thu Mar 26 11:14:59 CDT 2015
> On Mar 26, 2015, at 10:51 AM, Manav Bhatia <bhatiamanav at gmail.com> wrote:
>
> Barry,
>
> On a related note, I have another elasticity problem that I am trying to solve with HEX8 elements. It is an isotropic solid structure. Do you have a recommended preconditioner for this problem?
Yes, this one clearly requires PCGAMG. Make sure you read all the docs on PCGAMG; you will need to supply either the coordinates with PCSetCoordinates() or the near null space of the operator. Unfortunately our documentation for this sucks and everyone refuses to improve it.
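
For example, a minimal sketch (untested; A, b, x, nlocal, and coords are placeholders for your matrix, right-hand-side/solution vectors, local node count, and interlaced local nodal coordinates):

    #include <petscksp.h>
    /* ... assemble A, b, x as usual ... */
    KSP ksp;
    PC  pc;
    KSPCreate(PETSC_COMM_WORLD,&ksp);
    KSPSetOperators(ksp,A,A);
    KSPGetPC(ksp,&pc);
    PCSetType(pc,PCGAMG);
    /* Option 1: hand GAMG the nodal coordinates (dim = 3 for a HEX8 solid) */
    PCSetCoordinates(pc,3,nlocal,coords);
    /* Option 2: instead build the rigid-body modes from a coordinate Vec and
       attach them to the operator as its near null space:
         MatNullSpace nullsp;
         MatNullSpaceCreateRigidBody(coord_vec,&nullsp);
         MatSetNearNullSpace(A,nullsp);
         MatNullSpaceDestroy(&nullsp);
    */
    KSPSetFromOptions(ksp);
    KSPSolve(ksp,b,x);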
Barry
>
> The problem has about 770K dofs and the default ILU(0) has not done well. I have also tried ILU(1), and that too has been unhelpful. I am observing stagnation of the residuals after a drop of a couple of orders of magnitude.
>
> Any recommendations would be greatly appreciated.
>
> Thanks,
> Manav
>
>
>
>
>
>
>> On Mar 26, 2015, at 10:35 AM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>>
>>
>>> On Mar 26, 2015, at 10:19 AM, Manav Bhatia <bhatiamanav at gmail.com> wrote:
>>>
>>> Thanks, Barry. I will try that.
>>>
>>> This is Euler flow equations discretized with SUPG. The mesh is made of 4-noded tetrahedra. The flow parameters correspond to transonic flow.
>>
>> Yes, ILU could easily fail on this and really isn't appropriate. Likely you should be using PCFIELDSPLIT for preconditioning.
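>>
>> For instance (an untested sketch; A, b, x, and an interlaced 5-dofs-per-node ordering of the conservative variables are assumptions about your SUPG discretization):
>>
>>     KSP ksp;
>>     PC  pc;
>>     KSPCreate(PETSC_COMM_WORLD,&ksp);
>>     KSPSetOperators(ksp,A,A);
>>     KSPGetPC(ksp,&pc);
>>     PCSetType(pc,PCFIELDSPLIT);
>>     /* one split per conservative variable in the interlaced ordering */
>>     PCFieldSplitSetBlockSize(pc,5);
>>     PCFieldSplitSetType(pc,PC_COMPOSITE_MULTIPLICATIVE);
>>     KSPSetFromOptions(ksp);  /* -fieldsplit_*_ksp_type / -fieldsplit_*_pc_type options still apply */
>>     KSPSolve(ksp,b,x);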
>>
>> Barry
>>
>>>
>>> -Manav
>>>
>>>
>>>> On Mar 26, 2015, at 10:17 AM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>>>>
>>>>
>>>> The default preconditioner with ILU(0) on each process is not appropriate for your problem and is producing overflow. Try -sub_pc_type lu and see if that produces a different result.
>>>>
>>>> Is this a Stokes-like problem?
>>>>
>>>> Barry
>>>>
>>>>> On Mar 26, 2015, at 10:10 AM, Manav Bhatia <bhatiamanav at gmail.com> wrote:
>>>>>
>>>>> Thanks, Matt.
>>>>>
>>>>> Following is the output with: -ksp_monitor_lg_residualnorm -ksp_log -ksp_view -ksp_monitor_true_residual -ksp_converged_reason
>>>>>
>>>>> 0 KSP preconditioned resid norm inf true resid norm 2.709083260443e+06 ||r(i)||/||b|| 1.000000000000e+00
>>>>> Linear solve did not converge due to DIVERGED_NANORINF iterations 0
>>>>> KSP Object: 12 MPI processes
>>>>> type: gmres
>>>>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
>>>>> GMRES: happy breakdown tolerance 1e-30
>>>>> maximum iterations=1000
>>>>> tolerances: relative=1e-10, absolute=1e-50, divergence=10000
>>>>> left preconditioning
>>>>> using nonzero initial guess
>>>>> using PRECONDITIONED norm type for convergence test
>>>>> PC Object: 12 MPI processes
>>>>> type: bjacobi
>>>>> block Jacobi: number of blocks = 12
>>>>> Local solve is same for all blocks, in the following KSP and PC objects:
>>>>> KSP Object: (sub_) 1 MPI processes
>>>>> type: preonly
>>>>> maximum iterations=10000, initial guess is zero
>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000
>>>>> left preconditioning
>>>>> using NONE norm type for convergence test
>>>>> PC Object: (sub_) 1 MPI processes
>>>>> type: ilu
>>>>> ILU: out-of-place factorization
>>>>> 0 levels of fill
>>>>> tolerance for zero pivot 2.22045e-14
>>>>> using diagonal shift on blocks to prevent zero pivot [INBLOCKS]
>>>>> matrix ordering: natural
>>>>> factor fill ratio given 1, needed 1
>>>>> Factored matrix follows:
>>>>> Mat Object: 1 MPI processes
>>>>> type: seqaij
>>>>> rows=667070, cols=667070
>>>>> package used to perform factorization: petsc
>>>>> total: nonzeros=4.6765e+07, allocated nonzeros=4.6765e+07
>>>>> total number of mallocs used during MatSetValues calls =0
>>>>> using I-node routines: found 133414 nodes, limit used is 5
>>>>> linear system matrix = precond matrix:
>>>>> Mat Object: () 1 MPI processes
>>>>> type: seqaij
>>>>> rows=667070, cols=667070
>>>>> total: nonzeros=4.6765e+07, allocated nonzeros=5.473e+07
>>>>> total number of mallocs used during MatSetValues calls =0
>>>>> using I-node routines: found 133414 nodes, limit used is 5
>>>>> linear system matrix = precond matrix:
>>>>> Mat Object: () 12 MPI processes
>>>>> type: mpiaij
>>>>> rows=6723030, cols=6723030
>>>>> total: nonzeros=4.98852e+08, allocated nonzeros=5.38983e+08
>>>>> total number of mallocs used during MatSetValues calls =0
>>>>> using I-node (on process 0) routines: found 133414 nodes, limit used is 5
>>>>>
>>>>>
>>>>> Anything jumps out at you as odd?
>>>>>
>>>>> -Manav
>>>>>
>>>>>
>>>>>
>>>>>> On Mar 26, 2015, at 9:34 AM, Matthew Knepley <knepley at gmail.com> wrote:
>>>>>>
>>>>>> On Thu, Mar 26, 2015 at 9:21 AM, Manav Bhatia <bhatiamanav at gmail.com> wrote:
>>>>>> Hi,
>>>>>>
>>>>>> I am using the KSP linear solver for my system of equations, without any command line options at this point. I have checked that the L1 norms of my system matrix and the force vector are finite values, but the KSP solver is returning an “inf” residual in the very first iteration.
>>>>>>
>>>>>> The problem has 6.7M dofs and I have tried this on multiple machines with different numbers of nodes, with the same result.
>>>>>>
>>>>>> Is there a reason why the solver would return after the first iteration with an inf?
>>>>>>
>>>>>> I am not sure where to start debugging this case, so I would appreciate any pointers.
>>>>>>
>>>>>> For all solver questions, we want to see the output of
>>>>>>
>>>>>> -ksp_view -ksp_monitor_true_residual -ksp_converged_reason
>>>>>>
>>>>>> The problem here would be that there is an error, so we would never see the output
>>>>>> of -ksp_view and know what solver you are using. If you are using something complex,
>>>>>> can you try using
>>>>>>
>>>>>> -pc_type jacobi
>>>>>>
>>>>>> and send the output from the options above? Then we can figure out why the other solver
>>>>>> gets an inf.
>>>>>>
>>>>>> Thanks,
>>>>>>
>>>>>> Matt
>>>>>>
>>>>>> Thanks,
>>>>>> Manav
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> --
>>>>>> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
>>>>>> -- Norbert Wiener
>>>>>
>>>>
>>>
>>
>