[petsc-users] KSP linear solver returns inf

Manav Bhatia bhatiamanav at gmail.com
Thu Mar 26 11:31:38 CDT 2015


Ok, so I ran my fluids problem with -pc_type jacobi. This time it did not return with “inf”, but there was no convergence. 

Here are the first 10 iterations: 
  0 KSP preconditioned resid norm 7.840061446913e+07 true resid norm 2.709083260443e+06 ||r(i)||/||b|| 1.000000000000e+00
  1 KSP preconditioned resid norm 7.838673732620e+07 true resid norm 2.709686618724e+06 ||r(i)||/||b|| 1.000222716773e+00
  2 KSP preconditioned resid norm 7.838673500462e+07 true resid norm 2.709685645990e+06 ||r(i)||/||b|| 1.000222357709e+00
  3 KSP preconditioned resid norm 7.838080292706e+07 true resid norm 2.709691690751e+06 ||r(i)||/||b|| 1.000224589003e+00
  4 KSP preconditioned resid norm 7.837871329477e+07 true resid norm 2.709550983943e+06 ||r(i)||/||b|| 1.000172650101e+00
  5 KSP preconditioned resid norm 7.837788814104e+07 true resid norm 2.709528734769e+06 ||r(i)||/||b|| 1.000164437296e+00
  6 KSP preconditioned resid norm 7.837690087318e+07 true resid norm 2.709494763671e+06 ||r(i)||/||b|| 1.000151897594e+00
  7 KSP preconditioned resid norm 7.836061163366e+07 true resid norm 2.709622438419e+06 ||r(i)||/||b|| 1.000199025989e+00
  8 KSP preconditioned resid norm 7.835981946243e+07 true resid norm 2.709587889985e+06 ||r(i)||/||b|| 1.000186273176e+00
  9 KSP preconditioned resid norm 7.828072351639e+07 true resid norm 2.710062032107e+06 ||r(i)||/||b|| 1.000361292574e+00
 10 KSP preconditioned resid norm 7.828054329393e+07 true resid norm 2.710056988152e+06 ||r(i)||/||b|| 1.000359430706e+00

and here are the last 10 iterations: 

991 KSP preconditioned resid norm 7.801323853477e+07 true resid norm 2.710182819026e+06 ||r(i)||/||b|| 1.000405878475e+00
992 KSP preconditioned resid norm 7.801323853477e+07 true resid norm 2.710182817684e+06 ||r(i)||/||b|| 1.000405877980e+00
993 KSP preconditioned resid norm 7.801323853476e+07 true resid norm 2.710182817272e+06 ||r(i)||/||b|| 1.000405877828e+00
994 KSP preconditioned resid norm 7.801323853470e+07 true resid norm 2.710182825637e+06 ||r(i)||/||b|| 1.000405880916e+00
995 KSP preconditioned resid norm 7.801323853470e+07 true resid norm 2.710182825705e+06 ||r(i)||/||b|| 1.000405880941e+00
996 KSP preconditioned resid norm 7.801323853470e+07 true resid norm 2.710182826222e+06 ||r(i)||/||b|| 1.000405881131e+00
997 KSP preconditioned resid norm 7.801323853469e+07 true resid norm 2.710182826506e+06 ||r(i)||/||b|| 1.000405881236e+00
998 KSP preconditioned resid norm 7.801323853424e+07 true resid norm 2.710182832779e+06 ||r(i)||/||b|| 1.000405883552e+00
999 KSP preconditioned resid norm 7.801323852813e+07 true resid norm 2.710182815446e+06 ||r(i)||/||b|| 1.000405877154e+00
1000 KSP preconditioned resid norm 7.801323817948e+07 true resid norm 2.710182749096e+06 ||r(i)||/||b|| 1.000405852662e+00

Any recommendations? 

Thanks,
Manav


> On Mar 26, 2015, at 11:28 AM, Barry Smith <bsmith at mcs.anl.gov> wrote:
> 
> 
>   This is fine. PCGAMG does algebraic multigrid so the mesh doesn't matter in its use.
> 
>  Barry
> 
>> On Mar 26, 2015, at 11:20 AM, Manav Bhatia <bhatiamanav at gmail.com> wrote:
>> 
>> Thanks. 
>> Quick question (out of ignorance): does it matter that the HEX8 elements may still be arranged in an unstructured fashion? Meaning that although I use brick elements, my grid does not have a structured-grid appearance.
>> 
>> -Manav
>> 
>> 
>>> On Mar 26, 2015, at 11:14 AM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>>> 
>>> 
>>>> On Mar 26, 2015, at 10:51 AM, Manav Bhatia <bhatiamanav at gmail.com> wrote:
>>>> 
>>>> Barry, 
>>>> 
>>>> On a related note, I have another elasticity problem that I am trying to solve with HEX8 elements. It is an isotropic solid structure. Do you have a recommended preconditioner for this problem? 
>>> 
>>> Yes, this one clearly requires PCGAMG. Make sure you read all the docs on PCGAMG; you will need to supply either the coordinates with PCSetCoordinates() or the near null space of the operator. Unfortunately our documentation for this sucks and everyone refuses to improve it.
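
For illustration, a minimal sketch of supplying the rigid-body near-null space and selecting GAMG in code (not from the thread): it assumes a hypothetical Vec "coords" holding the interleaved nodal coordinates with the same parallel layout as the solution vector, and the assembled stiffness matrix A. The same information can alternatively be passed through PCSetCoordinates().

    #include <petscksp.h>

    /* Sketch: attach the rigid-body near-null space (3 translations + 3 rotations)
       built from nodal coordinates, then select algebraic multigrid.
       "coords" and "A" are assumed inputs, not taken from the thread. */
    PetscErrorCode SetupGAMGForElasticity(KSP ksp, Mat A, Vec coords)
    {
      PetscErrorCode ierr;
      MatNullSpace   nullsp;
      PC             pc;

      ierr = MatNullSpaceCreateRigidBody(coords, &nullsp);CHKERRQ(ierr);
      ierr = MatSetNearNullSpace(A, nullsp);CHKERRQ(ierr);   /* GAMG reads this from the operator */
      ierr = MatNullSpaceDestroy(&nullsp);CHKERRQ(ierr);

      ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
      ierr = PCSetType(pc, PCGAMG);CHKERRQ(ierr);            /* or -pc_type gamg on the command line */
      return 0;
    }

Running with -ksp_view afterwards should confirm that GAMG picked up the near-null space.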
>>> 
>>> Barry
>>> 
>>> 
>>> 
>>>> 
>>>> The problem has about 770K dofs and the default ILU(0) has not done well. I have also tried ILU(1), and that too has been unhelpful. I am observing stagnation of the residuals after a drop of a couple of orders of magnitude. 
>>>> 
>>>> Any recommendations would be greatly appreciated. 
>>>> 
>>>> Thanks,
>>>> Manav
>>>> 
>>>> 
>>>> 
>>>> 
>>>> 
>>>> 
>>>>> On Mar 26, 2015, at 10:35 AM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>>>>> 
>>>>> 
>>>>>> On Mar 26, 2015, at 10:19 AM, Manav Bhatia <bhatiamanav at gmail.com> wrote:
>>>>>> 
>>>>>> Thanks, Barry. I will try that. 
>>>>>> 
>>>>>> This is Euler flow equations discretized with SUPG. The mesh is made of 4-noded tetrahedra. The flow parameters correspond to transonic flow. 
>>>>> 
>>>>> Yes, ILU could easily fail on this and really isn't appropriate.  Likely you should be using PCFIELDSPLIT for preconditioning.
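
For reference, a minimal sketch of what switching to PCFIELDSPLIT might look like in code, assuming the unknowns are interleaved per node with 5 dofs (density, three momenta, energy) so the splits can be defined by block size; that block size is an assumption about the dof ordering, not something stated in the thread. The same setup is available at runtime with -pc_type fieldsplit -pc_fieldsplit_block_size 5.

    #include <petscksp.h>

    /* Sketch: split the interleaved Euler unknowns into fields by block size
       so each field can get its own sub-preconditioner. The block size of 5
       (rho, rho*u, rho*v, rho*w, rho*E) is an assumption about the dof ordering. */
    PetscErrorCode UseFieldSplit(KSP ksp)
    {
      PetscErrorCode ierr;
      PC             pc;

      ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
      ierr = PCSetType(pc, PCFIELDSPLIT);CHKERRQ(ierr);
      ierr = PCFieldSplitSetBlockSize(pc, 5);CHKERRQ(ierr);
      ierr = PCFieldSplitSetType(pc, PC_COMPOSITE_MULTIPLICATIVE);CHKERRQ(ierr);
      return 0;
    }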
>>>>> 
>>>>> Barry
>>>>> 
>>>>>> 
>>>>>> -Manav
>>>>>> 
>>>>>> 
>>>>>>> On Mar 26, 2015, at 10:17 AM, Barry Smith <bsmith at mcs.anl.gov> wrote:
>>>>>>> 
>>>>>>> 
>>>>>>> The default preconditioner with ILU(0) on each process is not appropriate for your problem and is producing overflow. Try -sub_pc_type lu and see if that produces a different result. 
>>>>>>> 
>>>>>>> Is this a Stokes-like problem?
>>>>>>> 
>>>>>>> Barry
>>>>>>> 
>>>>>>>> On Mar 26, 2015, at 10:10 AM, Manav Bhatia <bhatiamanav at gmail.com> wrote:
>>>>>>>> 
>>>>>>>> Thanks, Matt. 
>>>>>>>> 
>>>>>>>> Following is the output with: -ksp_monitor_lg_residualnorm -ksp_log -ksp_view -ksp_monitor_true_residual -ksp_converged_reason
>>>>>>>> 
>>>>>>>> 0 KSP preconditioned resid norm            inf true resid norm 2.709083260443e+06 ||r(i)||/||b|| 1.000000000000e+00
>>>>>>>> Linear solve did not converge due to DIVERGED_NANORINF iterations 0
>>>>>>>> KSP Object: 12 MPI processes
>>>>>>>> type: gmres
>>>>>>>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
>>>>>>>> GMRES: happy breakdown tolerance 1e-30
>>>>>>>> maximum iterations=1000
>>>>>>>> tolerances:  relative=1e-10, absolute=1e-50, divergence=10000
>>>>>>>> left preconditioning
>>>>>>>> using nonzero initial guess
>>>>>>>> using PRECONDITIONED norm type for convergence test
>>>>>>>> PC Object: 12 MPI processes
>>>>>>>> type: bjacobi
>>>>>>>> block Jacobi: number of blocks = 12
>>>>>>>> Local solve is same for all blocks, in the following KSP and PC objects:
>>>>>>>> KSP Object:  (sub_)   1 MPI processes
>>>>>>>> type: preonly
>>>>>>>> maximum iterations=10000, initial guess is zero
>>>>>>>> tolerances:  relative=1e-05, absolute=1e-50, divergence=10000
>>>>>>>> left preconditioning
>>>>>>>> using NONE norm type for convergence test
>>>>>>>> PC Object:  (sub_)   1 MPI processes
>>>>>>>> type: ilu
>>>>>>>> ILU: out-of-place factorization
>>>>>>>> 0 levels of fill
>>>>>>>> tolerance for zero pivot 2.22045e-14
>>>>>>>> using diagonal shift on blocks to prevent zero pivot [INBLOCKS]
>>>>>>>> matrix ordering: natural
>>>>>>>> factor fill ratio given 1, needed 1
>>>>>>>>  Factored matrix follows:
>>>>>>>>    Mat Object:           1 MPI processes
>>>>>>>>      type: seqaij
>>>>>>>>      rows=667070, cols=667070
>>>>>>>>      package used to perform factorization: petsc
>>>>>>>>      total: nonzeros=4.6765e+07, allocated nonzeros=4.6765e+07
>>>>>>>>      total number of mallocs used during MatSetValues calls =0
>>>>>>>>        using I-node routines: found 133414 nodes, limit used is 5
>>>>>>>> linear system matrix = precond matrix:
>>>>>>>> Mat Object:    ()     1 MPI processes
>>>>>>>> type: seqaij
>>>>>>>> rows=667070, cols=667070
>>>>>>>> total: nonzeros=4.6765e+07, allocated nonzeros=5.473e+07
>>>>>>>> total number of mallocs used during MatSetValues calls =0
>>>>>>>>  using I-node routines: found 133414 nodes, limit used is 5
>>>>>>>> linear system matrix = precond matrix:
>>>>>>>> Mat Object:  ()   12 MPI processes
>>>>>>>> type: mpiaij
>>>>>>>> rows=6723030, cols=6723030
>>>>>>>> total: nonzeros=4.98852e+08, allocated nonzeros=5.38983e+08
>>>>>>>> total number of mallocs used during MatSetValues calls =0
>>>>>>>> using I-node (on process 0) routines: found 133414 nodes, limit used is 5
>>>>>>>> 
>>>>>>>> 
>>>>>>>> Anything jumps out at you as odd? 
>>>>>>>> 
>>>>>>>> -Manav
>>>>>>>> 
>>>>>>>> 
>>>>>>>> 
>>>>>>>>> On Mar 26, 2015, at 9:34 AM, Matthew Knepley <knepley at gmail.com> wrote:
>>>>>>>>> 
>>>>>>>>> On Thu, Mar 26, 2015 at 9:21 AM, Manav Bhatia <bhatiamanav at gmail.com> wrote:
>>>>>>>>> Hi,
>>>>>>>>> 
>>>>>>>>> I am using the KSP linear solver for my system of equations, without any command line options at this point. I have checked that the L1 norms of my system matrix and the force vector are finite values, but the KSP solver is returning with an “inf” residual in the very first iteration.
>>>>>>>>> 
>>>>>>>>> The problem has 6.7M dofs and I have tried this on multiple machines with different number of nodes with the same result.
>>>>>>>>> 
>>>>>>>>> Is there a reason why the solver would return after the first iteration with an inf?
>>>>>>>>> 
>>>>>>>>> I am not sure on where to start debugging this case, so I would appreciate any pointers.
>>>>>>>>> 
>>>>>>>>> For all solver questions, we want to see the output of
>>>>>>>>> 
>>>>>>>>> -ksp_view -ksp_monitor_true_residual -ksp_converged_reason
>>>>>>>>> 
>>>>>>>>> The problem here would be that there is an error, so we would never see the output
>>>>>>>>> of -ksp_view and know what solver you are using. If you are using something complex,
>>>>>>>>> can you try using
>>>>>>>>> 
>>>>>>>>> -pc_type jacobi
>>>>>>>>> 
>>>>>>>>> and send the output from the options above? Then we can figure out why the other solver
>>>>>>>>> gets an inf.
>>>>>>>>> 
>>>>>>>>> Thanks,
>>>>>>>>> 
>>>>>>>>> Matt
>>>>>>>>> 
>>>>>>>>> Thanks,
>>>>>>>>> Manav
>>>>>>>>> 
>>>>>>>>> 
>>>>>>>>> 
>>>>>>>>> 
>>>>>>>>> 
>>>>>>>>> -- 
>>>>>>>>> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
>>>>>>>>> -- Norbert Wiener
>>>>>>>> 
>>>>>>> 
>>>>>> 
>>>>> 
>>>> 
>>> 
>> 
> 
