[petsc-users] Nullspaces

Mark Adams mfadams at lbl.gov
Mon Jan 3 08:50:01 CST 2022


I have not looked at your code, but as a general observation you want to
have some sort of memory checker, like valgrind for CPUs, in your workflow.
It is the fastest way to find some classes of bugs.
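
For example, on a PETSc/MPI application a typical memcheck run (executable
name and rank count here are placeholders) looks something like

  mpiexec -n 2 valgrind --tool=memcheck -q --num-callers=20 \
      --log-file=valgrind.%p.log ./your_app -malloc off [app options]

where -malloc off turns off PETSc's own allocation tracking so that it does
not obscure valgrind's reports.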

On Mon, Jan 3, 2022 at 8:47 AM Marco Cisternino <marco.cisternino at optimad.it>
wrote:

> Are you talking about the code that produces the linear system or about the
> tiny code that tests the null space?
> In the first case, it is absolutely possible, but I would expect no
> problem in the tiny code, do you agree?
> It is important to remark that the real code and the tiny one behave in
> the same way when testing the null space of the operator. I can analyze
> with valgrind and I will, but I would not expect great insights.
>
>
>
> Thanks,
>
>
>
> Marco Cisternino, PhD
> marco.cisternino at optimad.it
>
> ______________________
>
> Optimad Engineering Srl
>
> Via Bligny 5, Torino, Italia.
> +3901119719782
> www.optimad.it
>
>
>
> *From:* Mark Adams <mfadams at lbl.gov>
> *Sent:* Monday, January 3, 2022 14:42
> *To:* Marco Cisternino <marco.cisternino at optimad.it>
> *Cc:* Matthew Knepley <knepley at gmail.com>; petsc-users <
> petsc-users at mcs.anl.gov>
> *Subject:* Re: [petsc-users] Nullspaces
>
>
>
> There could be a memory bug that does not cause a noticeable problem until
> it hits some vital data and valgrind might find it on a small problem.
>
>
>
> However, you might instead have something like a hardwired buffer size that
> overflows, which does not manifest until you get to this large size; in
> that case valgrind would need to be run on the large case and would have a
> good chance of finding it.
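>
> A contrived sketch (purely illustrative, not taken from your code) of the
> kind of bug I mean:
>
>   // Fixed-size work array: harmless on small meshes, but it silently
>   // overflows (possibly corrupting unrelated data) once the local
>   // problem exceeds the hardwired limit.
>   static const int MAX_LOCAL_ROWS = 100000;
>   static double work[MAX_LOCAL_ROWS];
>
>   void storeRowValue(int localRow, double value)
>   {
>     // No bounds check: undefined behaviour when localRow >= MAX_LOCAL_ROWS.
>     work[localRow] = value;
>   }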
>
>
>
>
>
> On Mon, Jan 3, 2022 at 4:42 AM Marco Cisternino <
> marco.cisternino at optimad.it> wrote:
>
> My comments are interleaved with Mark’s lines and they start with “#”
>
>
>
> Marco Cisternino
>
>
>
> *From:* Mark Adams <mfadams at lbl.gov>
> *Sent:* Saturday, December 25, 2021 14:59
> *To:* Marco Cisternino <marco.cisternino at optimad.it>
> *Cc:* Matthew Knepley <knepley at gmail.com>; petsc-users <
> petsc-users at mcs.anl.gov>
> *Subject:* Re: [petsc-users] Nullspaces
>
>
>
> If  "triggering the issue" requires a substantial mesh, that makes me
> think there is a logic bug somewhere. Maybe use valgrind.
>
>
>
> # Are you suggesting using valgrind on this tiny toy code or on the
> original one? However, considering the purpose of the tiny code, i.e.
> testing the constant null space, why should there be a logic bug? Case 1
> passes and case 2 should be exactly the same, shouldn’t it?
>
>
>
> Also, you say you divide by the cell volume. Maybe I am not understanding
> this, but that is basically diagonal scaling, and that will change the null
> space (i.e., not a constant anymore).
>
>
>
> # I agree on this, but it raises a question: why does case 1 pass the
> test?
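>
> # (For what it is worth, writing the volume scaling as a left multiplication
> by D^{-1} with D = diag(V_i): if A x = 0 for the constant vector x, then
> (D^{-1} A) x = D^{-1} (A x) = 0 as well, so a constant should remain a right
> null vector of the scaled operator; what the scaling does change is the left
> null space, i.e. the compatibility condition on the right-hand side, whose
> null vector becomes proportional to the cell volumes rather than constant.)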
>
> # Thank you, Mark.
>
>
>
> On Thu, Dec 16, 2021 at 11:11 AM Marco Cisternino <
> marco.cisternino at optimad.it> wrote:
>
> Hello Matthew,
>
> as promised, I prepared a minimal example (112960 rows; I am not able to
> produce anything smaller than this that still triggers the issue) of the
> behavior I was talking about some days ago.
>
> What I did is to produce the matrix, the right-hand side and the initial
> solution of the linear system.
>
>
>
> As I told you before, this linear system is the discretization of the
> pressure equation of a predictor-corrector method for the NS equations in
> the framework of the finite volume method.
>
> This case has homogeneous Neumann boundary conditions. The computational
> domain has two independent and separated sub-domains.
>
> I discretize the weak formulation and I divide every row of the linear
> system by the volume of the corresponding cell.
>
> The underlying mesh is not uniform, therefore cells have different
> volumes.
>
> The issue I’m going to explain does not show up if the mesh is uniform,
> i.e. if all cells have the same volume.
>
>
>
> I usually build the null space sub-domain by sub-domain with
>
> MatNullSpaceCreate(getCommunicator(), PETSC_FALSE, nConstants, constants,
> &nullspace);
>
> where nConstants = 2 and constants contains two normalized vectors with
> constant values on the degrees of freedom of the associated sub-domain and
> zeros elsewhere.
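>
> In rough outline the construction looks like the sketch below;
> subdomainIdOfRow() is only a placeholder for however the sub-domain of a
> row is determined, not an actual function of my code.
>
>   Vec *constants;
>   PetscInt rStart, rEnd;
>   VecDuplicateVecs(solution, 2, &constants);
>   VecSet(constants[0], 0.0);
>   VecSet(constants[1], 0.0);
>   VecGetOwnershipRange(solution, &rStart, &rEnd);
>   for (PetscInt row = rStart; row < rEnd; ++row) {
>     VecSetValue(constants[subdomainIdOfRow(row)], row, 1.0, INSERT_VALUES);
>   }
>   for (int d = 0; d < 2; ++d) {
>     VecAssemblyBegin(constants[d]);
>     VecAssemblyEnd(constants[d]);
>     VecNormalize(constants[d], nullptr); // disjoint supports -> orthonormal set
>   }
>   MatNullSpaceCreate(getCommunicator(), PETSC_FALSE, 2, constants, &nullspace);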
>
>
>
> However, as a test I tried the constant over the whole domain using 2
> alternatives that should produce the same null space:
>
>    1. MatNullSpaceCreate(getCommunicator(), PETSC_TRUE, 0, nullptr,
>       &nullspace);
>    2. Vec* nsp;
>       VecDuplicateVecs(solution, 1, &nsp);
>       VecSet(nsp[0], 1.0);
>       VecNormalize(nsp[0], nullptr);
>       MatNullSpaceCreate(getCommunicator(), PETSC_FALSE, 1, nsp, &nullspace);
>
>
>
> Once I have created the null space, I test it using:
>
> MatNullSpaceTest(nullspace, m_A, &isNullSpaceValid);
>
>
>
> Case 1 passes the test while case 2 does not.
>
>
>
> I have a small code for matrix loading, null space creation and testing.
>
> Unfortunately, I cannot implement a small code able to produce that linear
> system.
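>
> In outline, the test code does something like the following (the file names
> are placeholders for the ones in the archive):
>
>   Mat A;
>   Vec solution;
>   Vec *nsp;
>   MatNullSpace nullspace;
>   PetscViewer viewer;
>   PetscBool isNullSpaceValid;
>
>   PetscViewerBinaryOpen(PETSC_COMM_WORLD, "matrix.bin", FILE_MODE_READ, &viewer);
>   MatCreate(PETSC_COMM_WORLD, &A);
>   MatLoad(A, viewer);
>   PetscViewerDestroy(&viewer);
>
>   PetscViewerBinaryOpen(PETSC_COMM_WORLD, "solution.bin", FILE_MODE_READ, &viewer);
>   VecCreate(PETSC_COMM_WORLD, &solution);
>   VecLoad(solution, viewer);
>   PetscViewerDestroy(&viewer);
>
>   /* case 2: explicit normalized constant vector */
>   VecDuplicateVecs(solution, 1, &nsp);
>   VecSet(nsp[0], 1.0);
>   VecNormalize(nsp[0], nullptr);
>   MatNullSpaceCreate(PETSC_COMM_WORLD, PETSC_FALSE, 1, nsp, &nullspace);
>   MatNullSpaceTest(nullspace, A, &isNullSpaceValid);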
>
>
>
> Attached you can find an archive containing the matrix, the initial
> solution (used to manually build the null space) and the rhs (not used in
> the test code) in binary format.
>
> You can also find the testing code in the same archive.
>
> I used PETSc 3.12 (gcc + OpenMPI) and PETSc 3.15.2 (Intel oneAPI), with the
> same results.
>
> If the attachment is not delivered, I can share a link to it.
>
>
>
> Thanks for any help.
>
>
>
> Marco Cisternino
>
>
>
>
>
> Marco Cisternino, PhD
> marco.cisternino at optimad.it
>
> ______________________
>
> Optimad Engineering Srl
>
> Via Bligny 5, Torino, Italia.
> +3901119719782
> www.optimad.it
>
>
>
> *From:* Marco Cisternino <marco.cisternino at optimad.it>
> *Sent:* Tuesday, December 7, 2021 19:36
> *To:* Matthew Knepley <knepley at gmail.com>
> *Cc:* petsc-users <petsc-users at mcs.anl.gov>
> *Subject:* Re: [petsc-users] Nullspaces
>
>
>
> I will, as soon as possible...
>
>
>
> Get Outlook for Android <https://aka.ms/AAb9ysg>
> ------------------------------
>
> *From:* Matthew Knepley <knepley at gmail.com>
> *Sent:* Tuesday, December 7, 2021 7:25:43 PM
> *To:* Marco Cisternino <marco.cisternino at optimad.it>
> *Cc:* petsc-users <petsc-users at mcs.anl.gov>
> *Subject:* Re: [petsc-users] Nullspaces
>
>
>
> On Tue, Dec 7, 2021 at 11:19 AM Marco Cisternino <
> marco.cisternino at optimad.it> wrote:
>
> Good morning,
>
> I’m still struggling with the Poisson equation with Neumann BCs.
>
> I discretize the equation by the finite volume method and I divide every
> row of the linear system by the volume of the cell. I could avoid this
> division, but I’m trying to understand.
>
> My mesh is not uniform, i.e. cells have different volumes (it is an octree
> mesh).
>
> Moreover, in my computational domain there are 2 separated sub-domains.
>
> I build the null space and then I use MatNullSpaceTest to check it.
>
>
>
> If I do this:
>
> MatNullSpaceCreate(getCommunicator(), PETSC_TRUE, 0, nullptr, &nullspace);
>
> It works
>
>
>
> This produces the normalized constant vector.
>
>
>
> If I do this:
>
> Vec nsp;
>
> VecDuplicate(m_rhs, &nsp);
>
> VecSet(nsp,1.0);
>
> VecNormalize(nsp, nullptr);
>
> MatNullSpaceCreate(getCommunicator(), PETSC_FALSE, 1, &nsp, &nullspace);
>
> It does not work
>
>
>
> This is also the normalized constant vector.
>
>
>
> So you are saying that these two vectors give different results with
> MatNullSpaceTest()?
>
> Something must be wrong in the code. Can you send a minimal example of
> this? I will go through and debug it.
>
>
>
>   Thanks,
>
>
>
>      Matt
>
>
>
> Probably I have wrong expectations, but shouldn’t it be the same?
>
>
>
> Thanks
>
>
>
> Marco Cisternino, PhD
> marco.cisternino at optimad.it
>
> ______________________
>
> Optimad Engineering Srl
>
> Via Bligny 5, Torino, Italia.
> +3901119719782
> www.optimad.it
>
>
>
>
>
>
> --
>
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
>
>
> https://www.cse.buffalo.edu/~knepley/
> <http://www.cse.buffalo.edu/~knepley/>
>
>