[petsc-users] Diagnosing Poisson Solver Behavior
Matthew Knepley
knepley at gmail.com
Mon Oct 12 08:39:55 CDT 2015
On Sat, Oct 10, 2015 at 7:56 PM, K. N. Ramachandran <knram06 at gmail.com>
wrote:
> Sorry, some more questions.
>
> 3) Also, for Dirichlet bc, I specify the value through Identity rows, i.e.
> A_ii = 1 and the rhs value would correspond to the Dirichlet condition. I
> am specifying it this way for my convenience. I am aware that
> MatZeroRowsColumns might help here, but would keeping it this way be
> detrimental?
>
That is fine. However, you would want to scale these entries to be
approximately the same size as the other diagonal entries.
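For instance, with a 7-point stencil you could use the typical interior
diagonal as the scale. A minimal sketch (variable names are illustrative
and error checking is omitted):

  /* h is the grid spacing; for a 7-point Laplacian the interior
     diagonal entry is about 6/h^2, so use that instead of 1 */
  PetscScalar alpha = 6.0/(h*h);
  MatSetValue(A, row, row, alpha, INSERT_VALUES);
  /* scale the rhs entry by the same factor so the Dirichlet
     value itself is unchanged */
  VecSetValue(b, row, alpha*bcval, INSERT_VALUES);

That way the identity rows do not sit orders of magnitude below the rest
of the diagonal.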
> 4) Can I expect symmetric matrices to perform better, i.e. if I eliminate
> the Dirichlet rows? But I would still be left with Neumann boundary
> conditions, where I use the second-order formulation. If I used the
> first-order formulation and made the matrix symmetric, would that be an
> advantage? I tried the latter, but I didn't see the condition number
> change much.
>
Will not matter for MG.
Matt
>
>
> On Sat, Oct 10, 2015 at 8:51 PM, K. N. Ramachandran <knram06 at gmail.com>
> wrote:
>
>> Hello all,
>>
>> I am a graduate student pursuing my Master's, and I am trying to benchmark
>> a previous work by using PETSc to solve Poisson's equation.
>>
>> I am starting off with a serial code, and I am trying to keep my code
>> modular, i.e. I generate the matrix in a standard sparse format and send
>> it to PETSc or any other solver. So I haven't built my code from the
>> ground up on PETSc's native data structures.
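>>
>> For concreteness, the hand-off is roughly the following (a sketch with
>> illustrative names; ia, ja, and vals are the CSR arrays my code builds):
>>
>>   Mat A;
>>   /* wrap the existing CSR arrays (PetscInt ia[], ja[];
>>      PetscScalar vals[]) without copying them */
>>   MatCreateSeqAIJWithArrays(PETSC_COMM_SELF, nrows, nrows,
>>                             ia, ja, vals, &A);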
>>
>> I am having trouble understanding the behavior of the solver and would
>> like your thoughts on what I can do better. I have both Dirichlet and
>> Neumann boundary conditions, and my matrix has around a million rows but
>> is very sparse (~7 nonzeros per row), as can be expected from a
>> finite-difference discretization of Poisson's equation.
>>
>> I tried the methods outlined here
>> <http://scicomp.stackexchange.com/questions/513/why-is-my-iterative-linear-solver-not-converging?rq=1>
>> and here
>> <http://scicomp.stackexchange.com/questions/34/how-can-i-estimate-the-condition-number-of-a-large-sparse-matrix-using-petsc>.
>> Reverting to a 41^3 grid, I got the approximate condition number (using
>> -ksp_monitor_singular_value -ksp_type gmres -ksp_gmres_restart 1000
>> -pc_type none) as ~9072, which seems pretty large. Larger grids give an
>> even larger condition number.
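>>
>> Concretely, that estimate came from a run along the lines of (the
>> program name here is illustrative):
>>
>>   ./solver -pc_type none -ksp_type gmres -ksp_gmres_restart 1000 \
>>     -ksp_monitor_singular_value
>>
>> reading the ratio of the largest to smallest singular value off the
>> final iteration.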
>>
>> 1) My best-performing solver + preconditioner combination is bcgs+ilu(0)
>> (on the 1e6-row problem), which solves in around 32 seconds and 196
>> iterations. How can I get a sense of what the lower bound on the running
>> time could be?
>>
>> 2) Initially -pc_type hypre just diverged, and I was never able to use
>> it. Looking at this thread
>> <http://lists.mcs.anl.gov/pipermail/petsc-users/2013-October/019127.html>,
>> I tried the options suggested there; it no longer diverges, but the
>> residual norm decreases and then hovers around a constant value. How do I
>> work with hypre to get a useful preconditioner?
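>>
>> (To make question 2 concrete, the knobs I mean are the BoomerAMG options;
>> as an illustrative example, something along the lines of
>>
>>   -pc_type hypre -pc_hypre_type boomeramg
>>   -pc_hypre_boomeramg_strong_threshold 0.7
>>   -pc_hypre_boomeramg_coarsen_type HMIS
>>
>> where the particular values are examples, not necessarily the ones I
>> ran.)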
>>
>> Initially I solve Laplace's equation, so the mesh spacing has no effect,
>> and even when I solve Poisson's equation the spacing is carried over to
>> the RHS. So I am pretty sure the spacing is not affecting the condition
>> number calculation.
>>
>> Hope this helps. Please let me know if you need more information.
>>
>> Thanking You,
>> K.N.Ramachandran
>> Ph: 814-441-4279
>>
>
>
> Thanking You,
> K.N.Ramachandran
> Ph: 814-441-4279
>
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener