[petsc-users] Error in DMFieldComputeFaceData_DS with hex elements

rickcha at googlemail.com rickcha at googlemail.com
Tue Jan 22 04:18:44 CST 2019



> On 18. Jan 2019, at 18:19, rickcha at googlemail.com wrote:
> 
> 
> 
>> On 18. Jan 2019, at 16:09, Matthew Knepley <knepley at gmail.com> wrote:
>> 
>> On Fri, Jan 18, 2019 at 7:54 AM rickcha--- via petsc-users <petsc-users at mcs.anl.gov> wrote:
>> Dear petsc-team,
>> 
>> I ran into an error when using petscFE in combination with a hex mesh constructed with gmsh. The error message reads:
>> 
>> Yes, I am guessing that GMsh put some crazy crap in the mesh file. Can you test it out on a hex mesh from DMPlexCreateBoxMesh()
> 
> I tested DMPlexCreateBoxMesh() and it throws me the same error message. I tried running example 77 with -simplex 0, but it seems that tensor product elements are not available for this example:
> 
> 
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [0]PETSC ERROR: Invalid argument
> [0]PETSC ERROR: No stored partition matching run parameters
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [0]PETSC ERROR: Petsc Development GIT revision: v3.10.3-1227-gcd612eedf5  GIT Date: 2019-01-17 05:31:34 +0100
> [0]PETSC ERROR: ./ex77 on a arch-linux2-intel named iket127138 by hartig Fri Jan 18 18:00:33 2019
> [0]PETSC ERROR: Configure options PETSC_ARCH=arch-linux2-intel --with-debugging=false --with-cc=/home/hartig/intel/bin/icc --with-cxx=/home/hartig/intel/bin/icpc --with-fc=/home/hartig/intel/bin/ifort --download-mpich --download-ml --download-parmetis --download-metis --download-hypre --download-mumps --download-scalapack --download-triangle --download-hdf5 --download-ptscotch --download-chaco
> [0]PETSC ERROR: #1 CreateMesh() line 443 in /home/hartig/petsc/src/snes/examples/tutorials/ex77.c
> [0]PETSC ERROR: #2 main() line 594 in /home/hartig/petsc/src/snes/examples/tutorials/ex77.c
> [0]PETSC ERROR: PETSc Option Table entries:
> [0]PETSC ERROR: -bc_fixed 1
> [0]PETSC ERROR: -bc_pressure 2
> [0]PETSC ERROR: -def_petscspace_degree 2
> [0]PETSC ERROR: -dim 3
> [0]PETSC ERROR: -dm_refine 0
> [0]PETSC ERROR: -elastMat_petscspace_degree 0
> [0]PETSC ERROR: -fieldsplit_deformation_ksp_type preonly
> [0]PETSC ERROR: -fieldsplit_deformation_pc_type lu
> [0]PETSC ERROR: -fieldsplit_pressure_ksp_rtol 1e-10
> [0]PETSC ERROR: -fieldsplit_pressure_pc_type jacobi
> [0]PETSC ERROR: -interpolate 1
> [0]PETSC ERROR: -ksp_converged_reason
> [0]PETSC ERROR: -ksp_monitor_short
> [0]PETSC ERROR: -ksp_rtol 1e-10
> [0]PETSC ERROR: -ksp_type fgmres
> [0]PETSC ERROR: -pc_fieldsplit_schur_factorization_type upper
> [0]PETSC ERROR: -pc_fieldsplit_type schur
> [0]PETSC ERROR: -pc_type fieldsplit
> [0]PETSC ERROR: -pres_petscspace_degree 1
> [0]PETSC ERROR: -run_type full
> [0]PETSC ERROR: -show_solution 0
> [0]PETSC ERROR: -simplex 0
> [0]PETSC ERROR: -snes_converged_reason
> [0]PETSC ERROR: -snes_monitor_short
> [0]PETSC ERROR: -snes_rtol 1e-05
> [0]PETSC ERROR: -test_partition
> [0]PETSC ERROR: -wall_pres_petscspace_degree 0
> [0]PETSC ERROR: -wall_pressure 0.4
> [0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------
> application called MPI_Abort(MPI_COMM_WORLD, 62) - process 0
> [unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=62
> :
> system msg for write_line failure : Bad file descriptor
> 
> 
>> just to make sure your physics works? Then we can look at the GMsh file (I hate that program).
>>  
>> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>> [0]PETSC ERROR: Petsc has generated inconsistent data
>> [0]PETSC ERROR: Not implemented yet
>> 
>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
>> [0]PETSC ERROR: Petsc Development GIT revision: v3.10.3-1227-gcd612eedf5  GIT Date: 2019-01-17 05:31:34 +0100
>> [0]PETSC ERROR: ./standaloneFEM on a arch-linux2-intel-debug named iket127138 by hartig Fri Jan 18 10:49:52 2019
>> [0]PETSC ERROR: Configure options PETSC_ARCH=arch-linux2-intel-debug --with-debugging=true --with-cc=/home/hartig/intel/bin/icc --with-cxx=/home/hartig/intel/bin/icpc --with-fc=/home/hartig/intel/bin/ifort --download-mpich --download-ml --download-parmetis --download-metis --download-hypre --download-mumps --download-scalapack --download-triangle --download-hdf5 --download-ptscotch --download-chaco
>> [0]PETSC ERROR: #1 DMFieldComputeFaceData_DS() line 943 in /home/hartig/petsc/src/dm/field/impls/ds/dmfieldds.c
>> [0]PETSC ERROR: #2 DMFieldCreateFEGeom() line 537 in /home/hartig/petsc/src/dm/field/interface/dmfield.c
>> [0]PETSC ERROR: #3 DMSNESGetFEGeom() line 2598 in /home/hartig/petsc/src/dm/impls/plex/plexfem.c
>> [0]PETSC ERROR: #4 DMPlexComputeBdResidual_Single_Internal() line 1520 in /home/hartig/petsc/src/snes/utils/dmplexsnes.c
>> [0]PETSC ERROR: #5 DMPlexComputeBdResidual_Internal() line 1649 in /home/hartig/petsc/src/snes/utils/dmplexsnes.c
>> [0]PETSC ERROR: #6 DMPlexComputeResidual_Internal() line 1946 in /home/hartig/petsc/src/snes/utils/dmplexsnes.c
>> [0]PETSC ERROR: #7 DMPlexTSComputeIFunctionFEM() line 206 in /home/hartig/petsc/src/ts/utils/dmplexts.c
>> [0]PETSC ERROR: #8 ComputeI2FunctionWithInertia() line 1284 in /home/hartig/cimply/femanalysis.c
>> [0]PETSC ERROR: #9 TSComputeI2Function() line 1707 in /home/hartig/petsc/src/ts/interface/ts.c
>> [0]PETSC ERROR: #10 SNESTSFormFunction_Alpha() line 339 in /home/hartig/petsc/src/ts/impls/implicit/alpha/alpha2.c
>> [0]PETSC ERROR: #11 SNESTSFormFunction() line 4694 in /home/hartig/petsc/src/ts/interface/ts.c
>> [0]PETSC ERROR: #12 SNESComputeFunction() line 2301 in /home/hartig/petsc/src/snes/interface/snes.c
>> [0]PETSC ERROR: #13 SNESSolve_NEWTONLS() line 175 in /home/hartig/petsc/src/snes/impls/ls/ls.c
>> [0]PETSC ERROR: #14 SNESSolve() line 4454 in /home/hartig/petsc/src/snes/interface/snes.c
>> [0]PETSC ERROR: #15 TSAlpha_SNESSolve() line 101 in /home/hartig/petsc/src/ts/impls/implicit/alpha/alpha2.c
>> [0]PETSC ERROR: #16 TSAlpha_Restart() line 143 in /home/hartig/petsc/src/ts/impls/implicit/alpha/alpha2.c
>> [0]PETSC ERROR: #17 TSStep_Alpha() line 217 in /home/hartig/petsc/src/ts/impls/implicit/alpha/alpha2.c
>> [0]PETSC ERROR: #18 TSStep() line 3664 in /home/hartig/petsc/src/ts/interface/ts.c
>> [0]PETSC ERROR: #19 TSSolve() line 3847 in /home/hartig/petsc/src/ts/interface/ts.c
>> 
>> Upon investigation, I found that PETSc throws the error message because coneSize has a value of six. This is the first thing that puzzles me. I would have expected a cone size of 4, since we are dealing with faces of hex elements. Is it possible that we are on the wrong “level” of the mesh graph somehow? Printing “numFaces” in the debugger, I get the total number of elements (not 2D faces) in my mesh for the hex case.
>> 
>> I ran the same case with a simplex mesh and found another thing I don’t quite understand: both times I ran with -petscspace_degree 2. Yet in the case of the simplex mesh, I get a polynomial degree of 1 in DMFieldComputeFaceData_DS (“maxDegree”)
>> 
>> I think you are looking at the coordinate space. By default, simplex meshes use affine coordinate spaces.
>>  
>> and in the case of the hex mesh, I get a polynomial degree of 3.
>> 
>> That is strange. It should be 2.
> 
> I can’t quite follow you here. Do you refer to the local coordinate system of the element used to define the shape functions? Then a degree of 1 would mean two non-collinear vectors e_1 and e_2?
> 
> Thanks,
> Max
> 
> 
>> 
>>   Thanks,
>> 
>>     Matt
>>  
>> In both cases I would have expected the degree to be 2, corresponding to Tet10 and Hex20 elements. I might be confusing two different concepts here, but this is how I understood the documentation until now.
>> 
>> Thanks,
>> Max
>> 
>> 
>> -- 
>> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
>> -- Norbert Wiener
>> 
>> https://www.cse.buffalo.edu/~knepley/


I believe I have found the issue in dmfieldds.c:
While the first part of the code (starting at line 785 of dmfieldds.c) references the cell discretisation, the second part references the face discretisation. However, the reference cell / reference DM (variable “K” in the code) is never updated. The sorting of the quadrature points looks correct to me, but instead of a coneSize of 4, as one would expect for a quad face, we get a coneSize of 6 because the reference DM is still the hex cell.
Hope this makes sense.
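For context, this is roughly the kind of query that I believe produces that value. It is only a minimal sketch in my own words: the helper name GetReferenceConeSize is made up, and the real code in dmfieldds.c does not look exactly like this.

#include <petscfe.h>
#include <petscdmplex.h>

/* Sketch only: query the cone size of the reference cell behind a PetscFE.
   This is my reading of why the switch in DMFieldComputeFaceData_DS sees 6. */
static PetscErrorCode GetReferenceConeSize(PetscFE fe, PetscInt *coneSize)
{
  PetscDualSpace dsp;
  DM             K;
  PetscErrorCode ierr;

  PetscFunctionBegin;
  ierr = PetscFEGetDualSpace(fe, &dsp);CHKERRQ(ierr);
  ierr = PetscDualSpaceGetDM(dsp, &K);CHKERRQ(ierr);
  /* Point 0 of the interpolated reference DM is the cell itself: for the hex
     cell discretisation this gives 6 (its faces), whereas on a quad face
     reference DM it would give 4 (its edges). */
  ierr = DMPlexGetConeSize(K, 0, coneSize);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}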

The following modifications made the code run correctly (as far as I have tested it), but they are quite a hack: the first hunk makes the quad-face branch accept the coneSize of 6 that actually arrives (because K is still the hex cell), and the second adds a missing break so the inner switch no longer falls through into the outer default and its "Not implemented yet" error.


@@ -908,7 +908,7 @@ static PetscErrorCode DMFieldComputeFaceData_DS(DMField field, IS pointIS, Petsc
               orientPoints[o][2 * q + 1] = -(lambdao[0] + lambdao[1]) + lambdao[2];
             }
             break;
-          case 4:
+          case 6:
             for (q = 0; q < Nq; q++) {
               PetscReal xi[2], xio[2];
               PetscInt oabs = (orient >= 0) ? orient : -(orient + 1);
@@ -942,6 +942,7 @@ static PetscErrorCode DMFieldComputeFaceData_DS(DMField field, IS pointIS, Petsc
           default:
             SETERRQ(PETSC_COMM_SELF,PETSC_ERR_PLIB,"Not implemented yet\n");
           }
+         break;
         default:
           SETERRQ(PETSC_COMM_SELF,PETSC_ERR_PLIB,"Not implemented yet\n");
         }
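To make explicit why the added break matters, here is an illustrative skeleton of the nested switch. The variable and function names are placeholders of mine, not the actual variables in DMFieldComputeFaceData_DS; only the control-flow shape is meant to match.

#include <petscsys.h>

/* Illustrative skeleton: without the added break, control falls out of the
   inner switch straight into the outer default and hits the
   "Not implemented yet" SETERRQ even when the inner case handled the point. */
static PetscErrorCode SketchNestedSwitch(PetscInt faceDim, PetscInt coneSize)
{
  PetscFunctionBegin;
  switch (faceDim) {
  case 2:
    switch (coneSize) {
    case 6:            /* hex cell still used as reference, hence 6 and not 4 */
      /* ... reorder the quadrature points for this orientation ... */
      break;
    default:
      SETERRQ(PETSC_COMM_SELF,PETSC_ERR_PLIB,"Not implemented yet\n");
    }
    break;             /* <-- the break added by the second hunk */
  default:
    SETERRQ(PETSC_COMM_SELF,PETSC_ERR_PLIB,"Not implemented yet\n");
  }
  PetscFunctionReturn(0);
}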


Unfortunately, my understanding is too limited to fix this properly. Probably one should swap the reference DM from the cell to the face somewhere along the way, but I am not sure where; a rough sketch of the direction I mean follows below.
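If it helps, this is the direction I have in mind, as an untested sketch with a made-up helper name (GetFaceReferenceDM). Whether PetscFEGetHeightSubspace is really the right entry point here, and where exactly in DMFieldComputeFaceData_DS the swap should happen, I do not know.

#include <petscfe.h>

/* Untested sketch: obtain the reference DM of the face (height-1) trace space
   of the cell FE, so that a hex cell yields a quad face reference DM. */
static PetscErrorCode GetFaceReferenceDM(PetscFE cellFE, DM *Kface)
{
  PetscFE        faceFE;
  PetscDualSpace dsp;
  PetscErrorCode ierr;

  PetscFunctionBegin;
  /* Trace space of the cell FE on points of height 1 (the faces). */
  ierr = PetscFEGetHeightSubspace(cellFE, 1, &faceFE);CHKERRQ(ierr);
  ierr = PetscFEGetDualSpace(faceFE, &dsp);CHKERRQ(ierr);
  /* Reference DM of the face discretisation: DMPlexGetConeSize on it
     should then give 4 for a quad face instead of 6 for the hex cell. */
  ierr = PetscDualSpaceGetDM(dsp, Kface);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}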


Thanks,
Max


