[petsc-users] KSPCG solution blow up

Matthew Knepley knepley at gmail.com
Mon Apr 15 16:12:16 CDT 2013


On Mon, Apr 15, 2013 at 4:08 PM, Hugo Gagnon <
opensource.petsc at user.fastmail.fm> wrote:

> For the problem I'm describing, my serial in-house solver does not work
> with ILU(0) but works with ILU(3).  I have no option to run Jacobi.  When I
> apply the same problem to PETSc's CG solver with ILU(3) in serial I
> get KSP_DIVERGED_INDEFINITE_PC on the first iteration (in parallel the
> solution does converge, but only very slowly).
>

As Mark said, ILU(3) does not preserve either symmetry or definiteness.
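
If the system is symmetric positive definite, incomplete Cholesky (PCICC)
at least preserves symmetry, unlike ILU.  A minimal sketch, reusing the Ppc
handle from the code below:

   call KSPGetPC(Pksp,Ppc,Pierr)
   call PCSetType(Ppc,PCICC,Pierr)      !-- incomplete Cholesky keeps the factorization symmetric
   call PCFactorSetLevels(Ppc,3,Pierr)  !-- same fill level as the ILU(3) attempt

Even ICC can lose positive definiteness on some matrices, so check
-ksp_converged_reason again after switching.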

   Matt


> call KSPGetPC(Pksp,Ppc,Pierr)
> call PCSetType(Ppc,PCILU,Pierr)
> call PCFactorSetLevels(Ppc,3,Pierr)
>
> This effectively changes the fill level from 0 to 3, right?
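> (Presumably the runtime equivalent, given the KSPSetFromOptions call,
> would be -pc_type ilu -pc_factor_levels 3.)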
>
> --
>   Hugo Gagnon
>
> On 2013-04-15, at 4:35 PM, Mark F. Adams <mark.adams at columbia.edu> wrote:
>
> ILU is not guaranteed to produce a positive definite preconditioner, and
> CG requires one.  PETSc's default ILU is ILU(0).  If your in-house PCG
> solver works with ILU(0) then there is probably a bug in your construction
> of the PETSc matrix.  If your in-house code does not use ILU(0) then I
> would try PCJACOBI and verify that you get the same residual history as
> your in-house solver (assuming that you can do Jacobi).
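>
> Since your code calls KSPSetFromOptions, you can make that comparison
> without recompiling.  A minimal sketch, with "mysolver" standing in for
> your executable:
>
>   mpiexec -n 1 ./mysolver -ksp_type cg -pc_type jacobi -ksp_monitor_true_residual
>
> With the same matrix, right-hand side, and initial guess, the true
> residual history should match your in-house PCG iteration for iteration.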
>
>
> On Apr 15, 2013, at 4:04 PM, Jed Brown <jedbrown at mcs.anl.gov> wrote:
>
> Hugo Gagnon <opensource.petsc at user.fastmail.fm> writes:
>
> Hi,
>
> I hope you don't mind me writing to you off list; I just don't think
> my problem would be helpful to others anyway.
>
>
> I'd rather keep discussions on the list.  We (PETSc developers) cannot
> scale to have private conversations with all users.
>
> I'm trying to converge a problem with KSPCG, and I know for sure
> that both A and b, as in Ax=b, are correct.  I can successfully
> converge this particular problem using our own serial in-house PCG
> solver, whereas with PETSc the solution blows up at the first
> iteration with error code -8.
>
>
> -ksp_converged_reason prints why, or use KSPConvergedReasons to get the
> string.
>
>              KSP_DIVERGED_INDEFINITE_PC       = -8,
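>
> A minimal sketch of testing for that reason in Fortran, reusing the Pconv
> and Pierr variables from your code below:
>
>   call KSPGetConvergedReason(Pksp,Pconv,Pierr)
>   if (Pconv == KSP_DIVERGED_INDEFINITE_PC) then
>     !-- the factorization produced an indefinite preconditioner;
>     !-- CG requires a symmetric positive definite one
>     print *, 'indefinite preconditioner, reason = ', Pconv
>   end if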
>
>
> I've included parts of my code below.  Again, I've triple-checked that
> I build A and b correctly.  Being a beginner with PETSc and parallel
> solvers in general, I'd appreciate it if you could give me some simple
> pointers as to why my problem is blowing up and how I can improve my
> code.
>
>
>
> http://scicomp.stackexchange.com/questions/513/why-is-my-iterative-linear-solver-not-converging
>
> Note that this code does work for easier problems, although I've
> noticed that in general PETSc takes way more iterations to converge
> than our in-house solver.  Would you have an idea as to why this is
> so?
>
>
> You are either doing a different algorithm or evaluating convergence
> differently.  What preconditioner are you using in serial?  Why aren't
> you using PETSc's multigrid?  Send the diagnostics described in the link
> if you want more help.
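>
> For reference, the diagnostics described there mostly amount to running
> with
>
>   -ksp_view -ksp_monitor_true_residual -ksp_converged_reason
>
> which print the full solver configuration, the true residual at each
> iteration, and the reason the solve stopped.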
>
> I know I'm asking quite a bit, but I've been struggling with this
> problem for what feels like far too long!
>
> Thank you for your help,
> --
> Hugo Gagnon
>
>   !-- initialize petsc with the current communicator
>   PETSC_COMM_WORLD = COMM_CURRENT
>   call PetscInitialize(PETSC_NULL_CHARACTER,Pierr)
>
>   !-- instantiate the rhs and solution vectors
>   call VecCreate(PETSC_COMM_WORLD,Pb,Pierr)
>   call VecSetSizes(Pb,PETSC_DECIDE,numVar,Pierr)
>   call VecSetType(Pb,VECSTANDARD,Pierr)
>   call VecDuplicate(Pb,Psol,Pierr)
>
>   !-- instantiate the lhs matrix
>   call MatCreate(PETSC_COMM_WORLD,Paa,Pierr)
>   call MatSetSizes(Paa,PETSC_DECIDE,PETSC_DECIDE,numVar,numVar,Pierr)
>   call MatSetType(Paa,MATAIJ,Pierr) !-- csr format
>
>   !-- preallocate memory
>   mnnz = 81 !-- knob
>   call MatSeqAIJSetPreallocation(Paa,mnnz,PETSC_NULL_INTEGER,Pierr)
>   call MatMPIAIJSetPreallocation(Paa,mnnz,PETSC_NULL_INTEGER,mnnz,PETSC_NULL_INTEGER,Pierr)
>
>   !-- instantiate the linear solver context
>   call KSPCreate(PETSC_COMM_WORLD,Pksp,Pierr)
>   call KSPSetType(Pksp,KSPCG,Pierr)
>
>   !-- use a custom monitor function
>   call KSPMonitorSet(Pksp,KSPMonitorPETSc,PETSC_NULL_OBJECT,PETSC_NULL_FUNCTION,Pierr)
>
>   !-- instantiate a scatterer so that root has access to full Psol
>   call VecScatterCreateToZero(Psol,Psols,Psol0,Pierr)
>
>   call MatZeroEntries(Paa,Pierr)
>
>   *** build Psol, Pb and Paa ****
>
>   !-- assemble (distribute) the initial solution, the rhs and the lhs matrix
>   call VecAssemblyBegin(Psol,Pierr)
>   call VecAssemblyBegin(Pb,Pierr)
>   call MatAssemblyBegin(Paa,MAT_FINAL_ASSEMBLY,Pierr)
>   call VecAssemblyEnd(Psol,Pierr)
>   call VecAssemblyEnd(Pb,Pierr)
>   call MatAssemblyEnd(Paa,MAT_FINAL_ASSEMBLY,Pierr)
>   call MatSetOption(Paa,MAT_NEW_NONZERO_LOCATIONS,PETSC_FALSE,Pierr)
>
>   !-- set up cg, using the lhs also as the preconditioning matrix
>   call KSPSetOperators(Pksp,Paa,Paa,SAME_NONZERO_PATTERN,Pierr)
>   call KSPSetTolerances(Pksp,tol,PETSC_DEFAULT_DOUBLE_PRECISION,PETSC_DEFAULT_DOUBLE_PRECISION,grid%pcg_its,Pierr)
>
>   !-- setup the preconditioner with the desired fill level
>   call KSPGetPC(Pksp,Ppc,Pierr)
>   if (nProc > 1) then
>     call PCSetType(Ppc,PCBJACOBI,Pierr)
>     call KSPSetUp(Pksp,Pierr)
>     call PCBJacobiGetSubKSP(Ppc,j,j,Psubksp,Pierr)
>     call KSPGetPC(Psubksp(1),Ppc,Pierr)
>   end if
>   call PCSetType(Ppc,PCILU,Pierr)
>   call PCFactorSetLevels(Ppc,grid%pcg_fill,Pierr)
>
>   !-- do not zero out the solution vector
>   call KSPSetInitialGuessNonzero(Pksp,PETSC_TRUE,Pierr)
>   call KSPSetFromOptions(Pksp,Pierr)
>
>   !-- let petsc do its magic
>   call KSPSolve(Pksp,Pb,Psol,Pierr)
>   call KSPGetConvergedReason(Pksp,Pconv,Pierr)
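>
>   !-- note: each PETSc call above should really check Pierr for errors;
>   !-- in preprocessed Fortran (.F) source this can be done by following
>   !-- every call with the CHKERRQ(Pierr) macro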
>


-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

