[petsc-users] Distribution of DMPlex for FEM

Morten Nobel-Jørgensen mono at dtu.dk
Tue Aug 30 14:43:11 CDT 2016


We have now hit a related problem. If we change the number of dofs per vertex from 1 to 3, we get the following error message. I'm using the next branch (pulled from git today).

mpiexec -np 2 ./ex18k
[1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[1]PETSC ERROR: Argument out of range
[1]PETSC ERROR: Row too large: row 36 max 35
[1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[1]PETSC ERROR: Petsc Development GIT revision: v3.7.3-1857-g5b40c63  GIT Date: 2016-08-29 22:13:25 -0500
[1]PETSC ERROR: ./ex18k on a arch-linux2-c-debug named morten-VirtualBox by morten Tue Aug 30 21:31:36 2016
[1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-fblaslapack --download-mpich --download-netcdf --download-hdf5 --download-exodusii
[1]PETSC ERROR: #1 MatSetValues_MPIAIJ() line 556 in /home/morten/petsc/src/mat/impls/aij/mpi/mpiaij.c
[1]PETSC ERROR: #2 MatSetValues() line 1239 in /home/morten/petsc/src/mat/interface/matrix.c
[1]PETSC ERROR: #3 MatSetValuesLocal() line 2102 in /home/morten/petsc/src/mat/interface/matrix.c
[1]PETSC ERROR: #4 CreateGlobalStiffnessMatrix() line 82 in /home/morten/topop_in_petsc_unstruct/ex18k.cc
[1]PETSC ERROR: #5 main() line 113 in /home/morten/topop_in_petsc_unstruct/ex18k.cc
[1]PETSC ERROR: No PETSc Option Table entries

We are not fully sure whether this comes from the way we use DMPlex or from a bug in our own code; with 3 dofs on each of the 12 vertices the matrix has 36 global rows, so the offending index 36 is exactly one past the valid range 0 to 35. Any help or tips would be appreciated :)
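
For context, the section setup is essentially the following (a condensed sketch with a hypothetical helper, not the attached ex18k.cc; DMSetDefaultSection is the PETSc 3.7 name, later renamed DMSetLocalSection). With dofPerVertex = 1 everything works; with 3 we get the error above:

  /* Hypothetical helper sketching how we assign dofs to the mesh vertices */
  static PetscErrorCode SetupSection(DM dm, PetscInt dofPerVertex)
  {
    PetscSection   s;
    PetscInt       pStart, pEnd, vStart, vEnd, v;
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = PetscSectionCreate(PetscObjectComm((PetscObject)dm), &s);CHKERRQ(ierr);
    ierr = DMPlexGetChart(dm, &pStart, &pEnd);CHKERRQ(ierr);
    ierr = PetscSectionSetChart(s, pStart, pEnd);CHKERRQ(ierr);
    ierr = DMPlexGetDepthStratum(dm, 0, &vStart, &vEnd);CHKERRQ(ierr); /* depth 0 = vertices */
    for (v = vStart; v < vEnd; ++v) {
      ierr = PetscSectionSetDof(s, v, dofPerVertex);CHKERRQ(ierr);
    }
    ierr = PetscSectionSetUp(s);CHKERRQ(ierr);
    ierr = DMSetDefaultSection(dm, s);CHKERRQ(ierr); /* DMSetLocalSection in newer PETSc */
    ierr = PetscSectionDestroy(&s);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }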

Kind regards,
Morten


________________________________
From: Morten Nobel-Jørgensen
Sent: Thursday, July 14, 2016 9:45 AM
To: Matthew Knepley
Cc: petsc-users at mcs.anl.gov
Subject: Re: [petsc-users] Distribution of DMPlex for FEM

Hi Matthew

Thanks for your answer and your fix. It works :)))

Kind regards,
Morten


From: Matthew Knepley <knepley at gmail.com>
Date: Thursday, 14 July 2016 at 00:03
To: Morten Nobel-Joergensen <mono at dtu.dk>
Cc: petsc-users at mcs.anl.gov
Subject: Re: [petsc-users] Distribution of DMPlex for FEM

On Wed, Jul 13, 2016 at 3:57 AM, Morten Nobel-Jørgensen <mono at dtu.dk> wrote:
I’m having problems distributing a simple FEM model using DMPlex. As a test case I use a 1x1x2 box of hex elements (cells) with 12 vertices, where each vertex has one DOF.
When I distribute the system to two processes, each gets a single element, and the local vector has size 8 (one DOF for each vertex of a hex element), as expected.
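
Roughly, the mesh setup looks like this (again a condensed sketch rather than the exact attached code; DMPlexCreateHexBoxMesh is the PETSc 3.7 constructor, replaced in newer releases by DMPlexCreateBoxMesh with simplex = PETSC_FALSE, and SetupSection is the hypothetical helper sketched above):

  DM       dm, dmDist = NULL;
  PetscInt cells[3] = {1, 1, 2};  /* 1x1x2 hex box: 2 cells, 12 vertices */
  Vec      loc;
  PetscInt n;

  ierr = DMPlexCreateHexBoxMesh(PETSC_COMM_WORLD, 3, cells, DM_BOUNDARY_NONE,
                                DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, &dm);CHKERRQ(ierr);
  ierr = DMPlexDistribute(dm, 0, NULL, &dmDist);CHKERRQ(ierr);  /* overlap 0 */
  if (dmDist) {ierr = DMDestroy(&dm);CHKERRQ(ierr); dm = dmDist;}
  ierr = SetupSection(dm, 1);CHKERRQ(ierr);                     /* one dof per vertex */
  ierr = DMCreateLocalVector(dm, &loc);CHKERRQ(ierr);
  ierr = VecGetSize(loc, &n);CHKERRQ(ierr);                     /* 8 on each of the two ranks */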

My problem is that when I manually assemble the global stiffness matrix (a 12x12 matrix) it seems like my ghost values are ignored. I’m sure that I’m missing something obvious but cannot see what it is.

In the attached example, I’m assembling the global stiffness matrix using a simple local stiffness matrix of all ones, which makes it very easy to see whether the matrix is assembled correctly. If I run it on one process, the global stiffness matrix consists of 0’s, 1’s and 2’s and its trace is 16.0. But if I run it distributed on two processes, it consists only of 0’s and 1’s and its trace is 12.0, so the doubled contributions on the four shared vertices are lost.
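
Condensed, the assembly does the following (a sketch of the approach, not the attached ex18k.cc itself; DMGetDefaultSection is the PETSc 3.7 name, now DMGetLocalSection):

  static PetscErrorCode CreateGlobalStiffnessMatrix(DM dm, Mat *K)
  {
    PetscSection   section;
    PetscInt       cStart, cEnd, c, i;
    PetscScalar    vals[8*8];
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = DMGetDefaultSection(dm, &section);CHKERRQ(ierr);
    ierr = DMCreateMatrix(dm, K);CHKERRQ(ierr);
    for (i = 0; i < 8*8; ++i) vals[i] = 1.0;          /* local stiffness matrix of ones */
    ierr = DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd);CHKERRQ(ierr); /* height 0 = cells */
    for (c = cStart; c < cEnd; ++c) {
      PetscInt *closure = NULL, closureSize, idx[8], nIdx = 0, p;

      ierr = DMPlexGetTransitiveClosure(dm, c, PETSC_TRUE, &closureSize, &closure);CHKERRQ(ierr);
      for (p = 0; p < closureSize*2; p += 2) {        /* (point, orientation) pairs */
        PetscInt dof, off;

        ierr = PetscSectionGetDof(section, closure[p], &dof);CHKERRQ(ierr);
        ierr = PetscSectionGetOffset(section, closure[p], &off);CHKERRQ(ierr);
        if (dof) idx[nIdx++] = off;                   /* only vertices carry dofs here */
      }
      ierr = DMPlexRestoreTransitiveClosure(dm, c, PETSC_TRUE, &closureSize, &closure);CHKERRQ(ierr);
      ierr = MatSetValuesLocal(*K, nIdx, idx, nIdx, idx, vals, ADD_VALUES);CHKERRQ(ierr);
    }
    ierr = MatAssemblyBegin(*K, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    ierr = MatAssemblyEnd(*K, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }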

I hope that somebody can spot my mistake and help me in the right direction :)

This is my fault, and Stefano Zampini had already tried to tell me this was broken. I normally use DMPlexMatSetClosure(), which handles global indices correctly.
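
Schematically, the per-cell loop then collapses to this (a sketch assuming the same 8x8 element block of ones as above; passing NULL for both sections tells DMPlexMatSetClosure() to use the DM's default local and global sections):

  Mat         K;
  PetscInt    cStart, cEnd, c, i;
  PetscScalar vals[8*8];

  for (i = 0; i < 8*8; ++i) vals[i] = 1.0;  /* unit element matrix, closure ordering */
  ierr = DMCreateMatrix(dm, &K);CHKERRQ(ierr);
  ierr = DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd);CHKERRQ(ierr);
  for (c = cStart; c < cEnd; ++c) {
    ierr = DMPlexMatSetClosure(dm, NULL, NULL, K, c, vals, ADD_VALUES);CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(K, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(K, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);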
I have fixed this in the branch

  knepley/fix-plex-l2g

which is also merged to 'next'. I am attaching a version of your sample where all objects are freed correctly. Let me know if that works for you.

  Thanks,

     Matt

Kind regards,
Morten



--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener
[Attachment: ex18k.cc, 4631 bytes: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20160830/9d4244c0/attachment.obj>]

