[petsc-users] DMPlex problem

Morten Nobel-Jørgensen mono at dtu.dk
Mon Sep 26 08:41:26 CDT 2016


Hi Matt

We are trying to do a simple FE assembly using DMPlex, but when assembling the global stiffness matrix we run into problems when running with NP>1 - that is, the global matrix differs when we move to a distributed system, where it should not.

In pseudo-code our CreateGlobalStiffnessMatrix does the following (a rough C sketch of the loop follows below):
create a local stiffness matrix ke with some values
for each local cell/element e (using the result of DMPlexGetHeightStratum(..,0,..))
    for each of its vertices (using DMPlexGetTransitiveClosure(..,e,..))
        store the local/global mapping in edof
update the global stiffness matrix K using the local mapping edof and the values ke
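
Roughly, in C (this is only my sketch, not the code we sent; the function signature, the dof handling and the array sizes are placeholders, and it assumes the DM already has a default PetscSection and local-to-global mapping set up):

#include <petscdmplex.h>

/* Sketch only: builds K from a per-element matrix ke; the dof lookup is elided. */
static PetscErrorCode CreateGlobalStiffnessMatrix(DM dm, Mat K, const PetscScalar ke[])
{
  PetscInt       cStart, cEnd, c;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd);CHKERRQ(ierr);      /* local cells */
  for (c = cStart; c < cEnd; ++c) {
    PetscInt *closure = NULL, closureSize, p, nedof = 0, edof[24];
    ierr = DMPlexGetTransitiveClosure(dm, c, PETSC_TRUE, &closureSize, &closure);CHKERRQ(ierr);
    for (p = 0; p < closureSize*2; p += 2) {   /* closure stores (point, orientation) pairs */
      const PetscInt point = closure[p];
      /* if 'point' is a vertex, append its local dof indices to edof[] and count them in nedof */
    }
    ierr = DMPlexRestoreTransitiveClosure(dm, c, PETSC_TRUE, &closureSize, &closure);CHKERRQ(ierr);
    ierr = MatSetValuesLocal(K, nedof, edof, nedof, edof, ke, ADD_VALUES);CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(K, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(K, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}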

The code we have sent is a simplified version, which just builds a dummy stiffness matrix - but we believe this matrix should still be the same independent of NP. (That is why we use the trace.)

I'm not familiar with MatSetValuesClosure(). Is that the missing piece?

Kind regards,
Morten


________________________________
From: Matthew Knepley [knepley at gmail.com]
Sent: Monday, September 26, 2016 2:19 PM
To: Morten Nobel-Jørgensen
Cc: PETSc [petsc-users at mcs.anl.gov]
Subject: Re: [petsc-users] DMPlex problem

On Mon, Sep 26, 2016 at 7:00 AM, Morten Nobel-Jørgensen <mono at dtu.dk> wrote:
Hi Matthew

It seems like the problem is not fully fixed. I have changed the code to now run with 2, 3 and 4 cells. When I run the code using NP = 1..3 I get different results between NP=1 and NP=2/3 when the cell count is larger than 2.

Do you mean the trace? I have no idea what you are actually putting in.

I have a lot of debugging when you use MatSetValuesClosure(), but when you directly use
MatSetValuesLocal(), you are handling things yourself.
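
(For reference, the closure-based path would look roughly like the sketch below. This assumes that MatSetValuesClosure() corresponds to DMPlexMatSetClosure(), that the DM has its default sections set up, and that dm, K, cell and elemMat are variables from the surrounding assembly loop; passing NULL for the sections means "use the DM's default local and global sections".)

  ierr = DMPlexMatSetClosure(dm, NULL, NULL, K, cell, elemMat, ADD_VALUES);CHKERRQ(ierr);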

  Thanks,

     Matt

Kind regards,
Morten
____
mpiexec -np 1 ./ex18k
cells 2
Loc size: 36
Trace of matrix: 132.000000
cells 3
Loc size: 48
Trace of matrix: 192.000000
cells 4
Loc size: 60
Trace of matrix: 258.000000
mpiexec -np 2 ./ex18k
cells 2
Loc size: 24
Loc size: 24
Trace of matrix: 132.000000
cells 3
Loc size: 36
Loc size: 24
Trace of matrix: 198.000000
cells 4
Loc size: 36
Loc size: 36
Trace of matrix: 264.000000
mpiexec -np 3 ./ex18k
cells 2
Loc size: 24
Loc size: 24
Loc size: 0
Trace of matrix: 132.000000
cells 3
Loc size: 24
Loc size: 24
Loc size: 24
Trace of matrix: 198.000000
cells 4
Loc size: 36
Loc size: 24
Loc size: 24
Trace of matrix: 264.000000




________________________________
From: petsc-users-bounces at mcs.anl.gov [petsc-users-bounces at mcs.anl.gov] on behalf of Morten Nobel-Jørgensen [mono at dtu.dk]
Sent: Sunday, September 25, 2016 11:15 AM
To: Matthew Knepley
Cc: PETSc [petsc-users at mcs.anl.gov]
Subject: Re: [petsc-users] DMPlex problem

Hi Matthew

Thank you for the bug-fix :) I can confirm that it works :)

And thanks for your hard work on PETSc - your work is very much appreciated!

Kind regards,
Morten
________________________________
From: Matthew Knepley [knepley at gmail.com]
Sent: Friday, September 23, 2016 2:46 PM
To: Morten Nobel-Jørgensen
Cc: PETSc [petsc-users at mcs.anl.gov]
Subject: Re: [petsc-users] DMPlex problem

On Fri, Sep 23, 2016 at 7:45 AM, Matthew Knepley <knepley at gmail.com> wrote:
On Fri, Sep 23, 2016 at 3:48 AM, Morten Nobel-Jørgensen <mono at dtu.dk> wrote:
Dear PETSc developers

Any update on this issue regarding DMPlex? Or is there any obvious workaround that we are unaware of?

I have fixed this bug. It did not come up in nightly tests because we are not using MatSetValuesLocal(). Instead we
use MatSetValuesClosure() which translates differently.

Here is the branch

  https://bitbucket.org/petsc/petsc/branch/knepley/fix-dm-ltog-bs

and I have merged it to next. It will go to master in a day or two.

Also, here is the cleaned up source with no memory leaks.

  Matt

Also, should we additionally register the issue on Bitbucket, or is reporting it on the mailing list enough?

Normally we are faster, but the start of the semester was hard this year.

  Thanks,

     Matt

Kind regards,
Morten

________________________________
From: Matthew Knepley [knepley at gmail.com]
Sent: Friday, September 09, 2016 12:21 PM
To: Morten Nobel-Jørgensen
Cc: PETSc [petsc-users at mcs.anl.gov]
Subject: Re: [petsc-users] DMPlex problem

On Fri, Sep 9, 2016 at 4:04 AM, Morten Nobel-Jørgensen <mono at dtu.dk> wrote:
Dear PETSc developers and users,

Last week we posted a question regarding an error with DMPlex and multiple dofs and have not gotten any feedback yet. This is uncharted territory for us, since we have gotten used to extremely fast feedback from the PETSc crew. So - at the risk of sounding impatient and ungrateful - we would like to hear if anybody has any ideas that could point us in the right direction?

This is my fault. You have not gotten a response because everyone else was waiting for me, and I have been
slow because I just moved houses at the same time as term started here. Sorry about that.

The example ran for me and I saw your problem. The local-to-global map is missing for some reason.
I am tracking it down now. It should be made by DMCreateMatrix(), so this is mysterious. I hope to have
this fixed by early next week.
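
(As a quick check on your side - just a sketch using standard calls, with dm being your DM - you can ask the matrix coming out of DMCreateMatrix() whether it carries a mapping at all:)

  Mat                    K;
  ISLocalToGlobalMapping rmap, cmap;

  ierr = DMCreateMatrix(dm, &K);CHKERRQ(ierr);
  ierr = MatGetLocalToGlobalMapping(K, &rmap, &cmap);CHKERRQ(ierr);
  if (!rmap || !cmap) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_PLIB, "Matrix from DMCreateMatrix() has no local-to-global mapping");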

  Thanks,

    Matt

We have created a small example problem that demonstrates the error in the matrix assembly.

Thanks,
Morten





--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener




