[petsc-users] strange error when changing the number of cells.
Yann JOBIC
yann.jobic at univ-amu.fr
Tue Dec 18 07:52:17 CST 2018
On 18/12/2018 at 14:26, Matthew Knepley wrote:
> On Tue, Dec 18, 2018 at 5:23 AM Yann JOBIC via petsc-users
> <petsc-users at mcs.anl.gov> wrote:
>
> On 18/12/2018 at 03:01, Matthew Knepley wrote:
>> On Mon, Dec 17, 2018 at 4:24 PM Yann Jobic via petsc-users
>> <petsc-users at mcs.anl.gov> wrote:
>>
>> Dear petsc users,
>>
>> I'm using the PETSc FEM framework with p4est. I'm getting a
>> strange error when changing the number of points in
>> DMPlexCreateBoxMesh() (2D: 20x20 is OK, 22x22 is not):
>>
>> [0]PETSC ERROR: Nonconforming object sizes
>> [0]PETSC ERROR: The section point closure size 20 != dual
>> space dimension 18
>>
>>
>> Okay, here is what the nomenclature means. The closure size is
>> the size I get from the Section
>> when I take the closure of the given point. So for example, if we
>> have quads and Q2 elements,
>> that would have a closure size of
>>
>> 3 x 3 = 9
>>
>> per component. So a 2-component Q2 element would have closure
>> size 18. The dual space
>> dimension is the number of dual space basis vectors. If we use a
>> Lagrange element, there
>> would be 1 dof per vertex, 1 per edge and 1 per cell for
>> quadratic fields, so on a quad that is
>> 9 dofs, and for 2 component fields that is 18. This is why the
>> two numbers are designed to match.
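>>
>> For example, the two numbers can be compared directly with something
>> like this (a sketch using the public Plex/FE API; the cell number,
>> field 0, and the variable names are just for illustration):
>>
>> PetscDS        ds;
>> PetscFE        fe;
>> PetscDualSpace dsp;
>> PetscSection   s;
>> PetscInt       cell = 0, dualDim, clSize = 0, nPts, *pts = NULL, p, dof;
>>
>> ierr = DMGetDS(dm, &ds);CHKERRQ(ierr);
>> ierr = PetscDSGetDiscretization(ds, 0, (PetscObject *) &fe);CHKERRQ(ierr);
>> ierr = PetscFEGetDualSpace(fe, &dsp);CHKERRQ(ierr);
>> ierr = PetscDualSpaceGetDimension(dsp, &dualDim);CHKERRQ(ierr);
>> ierr = DMGetSection(dm, &s);CHKERRQ(ierr);
>> ierr = DMPlexGetTransitiveClosure(dm, cell, PETSC_TRUE, &nPts, &pts);CHKERRQ(ierr);
>> for (p = 0; p < nPts*2; p += 2) { /* closure comes as (point, orientation) pairs */
>>   ierr = PetscSectionGetDof(s, pts[p], &dof);CHKERRQ(ierr);
>>   clSize += dof;
>> }
>> ierr = DMPlexRestoreTransitiveClosure(dm, cell, PETSC_TRUE, &nPts, &pts);CHKERRQ(ierr);
>> ierr = PetscPrintf(PETSC_COMM_SELF, "closure size %D, dual space dim %D\n", clSize, dualDim);CHKERRQ(ierr);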
>> Could you give
>>
>> -petscds_view
>>
>> so I can see the element you are using, and also
>> PetscSectionView() on the DM section (or use
>> -petscsection_view although that catches all sections, so it's hard
>> to find the right one).
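>>
>> For example (the -cells option is an assumption about your code's
>> command line):
>>
>> mpiexec -n 2 ./gauss -cells 6,6 -petscds_view -petscsection_view -dm_view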
>>
>> Thanks,
> The problem arises in parallel, with a 6x6 grid size. I'm doing
> something wrong, but I don't know where. It works fine in an
> older PETSc version.
> You will find the log files attached.
>
> The code for the section is the following:
> ierr = DMGetSection(dmAux, &section);CHKERRQ(ierr);
> PetscSectionView(section, PETSC_VIEWER_STDOUT_WORLD);
>
> Thanks for the help!
>
>
> On 1 proc, you have 120 cells, but on 2 procs you have 126? Can you
> also send the output with -dm_view?
> I can't see what is going on with the parallelization. Are you
> specifying some overlap in the partition?
I don't have an overlap. My DM is created by:
ierr = DMPlexCreateBoxMesh(comm, dim, user->simplex, user->cells,
DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, interpolate,
dm);CHKERRQ(ierr);
I then distribute the DM:
ierr = DMPlexDistribute(*dm, 0, NULL, &distributedMesh);CHKERRQ(ierr);
And finally I convert the resulting DM to DMP4EST.
I copy this mesh to create the auxiliary DM for the velocity
field.
Is that right?
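Putting it together, the whole sequence is roughly the following
(a condensed sketch; the DMConvert/DMClone calls stand in for what my
code actually does, and error checking is trimmed):

DM dm, dmDist = NULL, dmForest, dmAux;

ierr = DMPlexCreateBoxMesh(comm, dim, user->simplex, user->cells,
          DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
          interpolate, &dm);CHKERRQ(ierr);
ierr = DMPlexDistribute(dm, 0, NULL, &dmDist);CHKERRQ(ierr); /* overlap = 0 */
if (dmDist) {ierr = DMDestroy(&dm);CHKERRQ(ierr); dm = dmDist;}
ierr = DMConvert(dm, DMP4EST, &dmForest);CHKERRQ(ierr);      /* Plex -> p4est */
ierr = DMClone(dmForest, &dmAux);CHKERRQ(ierr);              /* same mesh for the velocity field */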
Thanks,
PS: sorry for the multiple sends; I forgot to send this email to the list.
>
> Thanks,
>
> Matt
>
> Yann
>
>>
>> Matt
>>
>> [0]PETSC ERROR: See
>> http://www.mcs.anl.gov/petsc/documentation/faq.html
>> for trouble shooting.
>> [0]PETSC ERROR: Petsc Release Version 3.10.2, Oct, 09, 2018
>> [0]PETSC ERROR: ./gauss on a named crabe by yann Mon Dec 17
>> 22:17:38 2018
>> [0]PETSC ERROR: Configure options
>> --prefix=/local/lib/petsc/3.10/p2/gcc/openmpi_gcc_all
>> --with-single-library=0 --with-debugging=1
>> --download-scalapack=1
>> --download-metis=1 --download-parmetis=1 --download-ptscotch=1
>> --download-mumps=1 --download-hypre=1 --download-superlu=1
>> --download-superlu_dist=1 --download-fblaslapack=1
>> --download-metis=1
>> --download-ml=1 --download-p4est=1 --download-netcdf=1
>> --download-pragmatic=1 --with-cxx-dialect=C++11
>> --download-eigen=1
>> --download-parms=1 --download-triangle=1 --download-hdf5=1
>> --with-zlib=1
>> --download-szlib=1 --download-suitesparse=1
>> --with-shared-libraries=0
>> PETSC_ARCH=openmpi_gcc_all
>> [0]PETSC ERROR: #1 DMProjectLocal_Generic_Plex() line 400 in
>> /local/yann/petsc-3.10.2/src/dm/impls/plex/plexproject.c
>> [0]PETSC ERROR: #2 DMProjectFunctionLocal_Plex() line 530 in
>> /local/yann/petsc-3.10.2/src/dm/impls/plex/plexproject.c
>> [0]PETSC ERROR: #3 DMProjectFunctionLocal() line 6320 in
>> /local/yann/petsc-3.10.2/src/dm/interface/dm.c
>> [0]PETSC ERROR: #4 DMProjectFunctionLocal_p4est() line 4393 in
>> /local/yann/petsc-3.10.2/src/dm/impls/forest/p4est/pforest.c
>> [0]PETSC ERROR: #5 DMProjectFunctionLocal() line 6320 in
>> /local/yann/petsc-3.10.2/src/dm/interface/dm.c
>> [0]PETSC ERROR: #6 SetupVelocity() line 332 in
>> /local/yann/fe-utils/petsc/3.10/gauss.c
>> [0]PETSC ERROR: #7 SetupVelBC() line 359 in
>> /local/yann/fe-utils/petsc/3.10/gauss.c
>> [0]PETSC ERROR: #8 main() line 652 in
>> /local/yann/fe-utils/petsc/3.10/gauss.c
>>
>> I'm using petscspace_degree == 2 for the primary field, and also
>> for the auxiliary one (velocity). The error occurs when I project
>> the velocity function into the auxiliary field.
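>>
>> Concretely, the projection step looks like this (a sketch of what
>> SetupVelocity() does; the velocity function body and the names are
>> illustrative, not my actual code):
>>
>> static PetscErrorCode velocity(PetscInt dim, PetscReal time, const PetscReal x[],
>>                                PetscInt Nc, PetscScalar *u, void *ctx)
>> {
>>   u[0] = x[1]; u[1] = -x[0];  /* placeholder 2-component field */
>>   return 0;
>> }
>>
>> PetscErrorCode (*funcs[1])(PetscInt, PetscReal, const PetscReal [],
>>                            PetscInt, PetscScalar *, void *) = {velocity};
>> Vec locVel;
>>
>> ierr = DMCreateLocalVector(dmAux, &locVel);CHKERRQ(ierr);
>> ierr = DMProjectFunctionLocal(dmAux, 0.0, funcs, NULL, INSERT_ALL_VALUES,
>>                               locVel);CHKERRQ(ierr);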
>>
>> Do you see where the problem could come from?
>>
>> Best regards,
>>
>> Yann
>>
>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which
> their experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
--
___________________________
Yann JOBIC
HPC engineer
Polytech Marseille DME
IUSTI-CNRS UMR 6595
Technopôle de Château Gombert
5 rue Enrico Fermi
13453 Marseille cedex 13
Tel : (33) 4 91 10 69 39
ou (33) 4 91 10 69 43
Fax : (33) 4 91 10 69 69
Attachments (the log files mentioned above):
oneproc.log <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20181218/d6d62e25/attachment-0002.bin>
twoproc.log <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20181218/d6d62e25/attachment-0003.bin>