[petsc-users] 2D Poisson on nonuniform meshes
Filippo Leonardi
filippo.leon at gmail.com
Wed Sep 2 01:52:21 CDT 2015
So I spent a lot of time debugging my code for this one. It turns out that in my debug run, where I
solve the same system multiple times, reusing the preconditioner when possible, the rhs is
sometimes 0. It has nothing to do with scaling: when I did the debug run I had just reworked the
code to use nonuniform meshes, so I incorrectly assumed, hey, there is an issue with scaling.
I do not perform checks for an incompatible rhs in this case, despite knowing that PETSc breaks
on it (I knew that, but I forgot). I have now introduced a check in my debug build that tests
whether the rhs is zero or not.
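A minimal sketch of what that guard looks like, assuming a standard KSP setup (variable names
are illustrative, not my actual code):

  PetscReal      bnorm;
  PetscErrorCode ierr;

  ierr = VecNorm(b, NORM_2, &bnorm);CHKERRQ(ierr);
  if (bnorm == 0.0) {
    /* Ax = 0: the solution (orthogonal to the nullspace) is x = 0,
       so skip the Krylov solve entirely. */
    ierr = VecSet(x, 0.0);CHKERRQ(ierr);
  } else {
    ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);
  }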
Long story short:
I suspect that PETSc could be improved in this respect: I am not an expert, but it should either
not break or at least warn the user (even if he is doing something "nonsensical" like solving
Ax = 0). Could it be that PETSc breaks if I ask it to solve a system with rhs 0 (maybe in
combination with a multiple solve)? What I do is the following:
I approximate the Leray projection operator with Laplace, so I solve Ax = b multiple times,
reusing the matrix/preconditioner, as sketched below. When I debug the code, I sometimes have b = 0.
(In fact, this was my first mistake:
http://lists.mcs.anl.gov/pipermail/petsc-users/2014-November/023478.html
)
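A sketch of the solve loop just described, assuming the operator A stays fixed across steps
(names are illustrative):

  PetscErrorCode ierr;
  PetscInt       step;

  /* Set the operators once; ask KSP to keep the preconditioner. */
  ierr = KSPSetOperators(ksp, A, A);CHKERRQ(ierr);
  ierr = KSPSetReusePreconditioner(ksp, PETSC_TRUE);CHKERRQ(ierr);
  for (step = 0; step < nsteps; ++step) {
    /* ... assemble the rhs b for this time step ... */
    ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);  /* b may be identically zero here */
  }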
Maybe it is because the relative error test cannot possibly work with a zero rhs? If so, may I
suggest inserting some sort of check for idiots like me :) (I have repeated this mistake twice,
knowing what the issue was)?
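For reference, the nullspace setup and the kernel check that come up in the exchange quoted
below look roughly like this (a sketch; A is my periodic Laplacian, v and w are scratch vectors):

  MatNullSpace nsp;
  PetscReal    nrm;

  /* Tell the solver that constants span ker(A). */
  ierr = MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_TRUE, 0, NULL, &nsp);CHKERRQ(ierr);
  ierr = MatSetNullSpace(A, nsp);CHKERRQ(ierr);
  ierr = MatNullSpaceDestroy(&nsp);CHKERRQ(ierr);

  /* Check that constants really are in ker(A): A*1 should vanish. */
  ierr = VecSet(v, 1.0);CHKERRQ(ierr);
  ierr = MatMult(A, v, w);CHKERRQ(ierr);
  ierr = VecNorm(w, NORM_INFINITY, &nrm);CHKERRQ(ierr);
  /* nrm should be at machine precision for the periodic Laplacian. */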
On Tuesday 01 September 2015 12:13:34 Matthew Knepley wrote:
> On Tue, Sep 1, 2015 at 6:53 AM, Filippo Leonardi <filippo.leon at gmail.com>
> wrote:
> > On Tuesday 01 September 2015 06:32:38 you wrote:
> > > > > 1) The KSP view does not say it is shifting. Are you using the latest
> > > > > release?
> > > >
> > > > yes, 3.6. Does PETSc warn for that even if I set the nullspace? I can also
> > > > check MUMPS or something else.
> > >
> > > I am not sure what you think PETSc does here. If shifting were enabled, it
> > > would add some diagonal matrix to the input matrix and continue the
> > > factorization. This would mean that the factors were not, in fact, a
> > > factorization of the input matrix, and you would not get the exact
> > > solution in one iterate.
> >
> > I thought PETSc would have replaced my pivots with a small eps, which is
> > actually not a problem in my case.
> >
> > > > > 2) If it shifted, it would not solve in a single iterate.
> > > >
> > > > even with preonly?
> > >
> > > You would have a large residual. Do you?
> >
> > Actually, I get a perfect solution.
> >
> > > > > 3) Your GAMG results imply that something is wrong with the coarse
> > > > > solve. This is exactly what would happen if that problem was not
> > > > > solved accurately (it is off by > 10 orders of magnitude).
> > > >
> > > > yes, but GAMG builds its own coarse solvers, so either the problem is
> > > > already in the definition of A and b (likely) or it is a bug in GAMG.
> > >
> > > Yes. GAMG uses the constants to build the basis, on the assumption that
> > > they are in the (near) nullspace of the operator with no boundary
> > > conditions. Since this is far off, I think this must not be true for your A.
> > >
> > > > > It sounds like your operator is not singular, and it is not the
> > > > > Laplacian, since it does not look like the Neumann version, which
> > > > > has constants as a null space.
> > > >
> > > > I'm using periodic boundaries, and constants are in ker(A).
> > >
> > > Did you check?
> >
> > I checked with VecSet and MatMult just in case; I get a constant vector at
> > machine eps.
>
> Okay, it seems that we have the best chance of figuring out the problem by