<div dir="ltr"><div class="gmail_quote"><div dir="ltr">On Thu, Oct 11, 2018 at 4:39 PM Ellen M. Price <<a href="mailto:ellen.price@cfa.harvard.edu">ellen.price@cfa.harvard.edu</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">I was working with a DMPLEX and FEM, following SNES example 12. I get<br>
the following error when I call DMProjectFunction, but I don't know what<br>
it means. Can anyone explain where I might have gone wrong, or at least<br>
what this error is telling me? I think the point closure size is<br>
correct, since my mesh is a 3D simplex,</blockquote><div><br></div><div>Yes, if you have a 3D simplex mesh and are using P1 elements, then you would have</div><div>4 dofs in the closure of a cell. The dual space dimension is the number of dual space</div><div>basis vectors assigned to points in the closure. Since it is 1, it looks like you have a P0</div><div>dual space. I assume you changed something in ex12?</div><div><br></div><div> Thanks,</div><div><br></div><div> Matt</div><div> </div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"> but what is the dual space<br>
dimension, and where might I have set it incorrectly?<br>
<br>
[0]PETSC ERROR: Nonconforming object sizes<br>
[0]PETSC ERROR: The section point closure size 4 != dual space dimension 1<br>
[0]PETSC ERROR: See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html" rel="noreferrer" target="_blank">http://www.mcs.anl.gov/petsc/documentation/faq.html</a><br>
for trouble shooting.<br>
[0]PETSC ERROR: Petsc Release Version 3.9.2, May, 20, 2018<br>
...<br>
[0]PETSC ERROR: Configure options<br>
--prefix=/home/eprice/software/petsc-opt --with-hdf5=1<br>
--with-hdf5-dir=/home/eprice/software/hdf5-parallel --with-mpe=1<br>
--with-mpe-dir=/home/eprice/software/mpe --with-debugging=0<br>
LDFLAGS="-pthread -lz" COPTFLAGS="-O3 -march=native -mtune=native"<br>
CXXOPTFLAGS="-O3 -march=native -mtune=native" FOPTFLAGS="-O3<br>
-march=native -mtune=native" --with-mpi=1<br>
--with-mpi-dir=/home/eprice/software/mpich --with-mumps=1<br>
--with-mumps-dir=/home/eprice/software/mumps --with-parmetis=1<br>
--with-parmetis-dir=/home/eprice/software/parmetis --with-metis=1<br>
--with-metis-dir=/home/eprice/software/parmetis --with-ptscotch=1<br>
--with-ptscotch-dir=/home/eprice/software/scotch --with-scalapack=1<br>
--with-scalapack-dir=/home/eprice/software/scalapack<br>
[0]PETSC ERROR: #1 DMProjectLocal_Generic_Plex() line 347 in<br>
/h/sabriel0/src/petsc-3.9.2/src/dm/impls/plex/plexproject.c<br>
[0]PETSC ERROR: #2 DMProjectFunctionLocal_Plex() line 428 in<br>
/h/sabriel0/src/petsc-3.9.2/src/dm/impls/plex/plexproject.c<br>
[0]PETSC ERROR: #3 DMProjectFunctionLocal() line 6265 in<br>
/h/sabriel0/src/petsc-3.9.2/src/dm/interface/dm.c<br>
[0]PETSC ERROR: #4 DMProjectFunction() line 6250 in<br>
/h/sabriel0/src/petsc-3.9.2/src/dm/interface/dm.c<br>
...<br>
<br>
(I know this is an optimized PETSc build, but I get the same error from<br>
my debug build; it's just much slower.)<br>
<br>
Ellen<br>
</blockquote></div><br clear="all"><div><br></div>-- <br><div dir="ltr" class="gmail_signature" data-smartmail="gmail_signature"><div dir="ltr"><div><div dir="ltr"><div><div dir="ltr"><div>What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>-- Norbert Wiener</div><div><br></div><div><a href="http://www.cse.buffalo.edu/~knepley/" target="_blank">https://www.cse.buffalo.edu/~knepley/</a><br></div></div></div></div></div></div></div></div>