[petsc-users] ex12 with Neumann BC

Olivier Bonnefon olivier.bonnefon at avignon.inra.fr
Thu Oct 3 04:48:13 CDT 2013


Hello,

Thank you for your answer, I am now able to run ex12.c with Neumann
BC (options -bc_type neumann -interpolate 1).

I have adapted ex12.c for the 2D system:

-\Delta u + u = 0
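
For reference, writing g for the prescribed Neumann datum on \partial \Omega
(a symbol introduced here only for the boundary data), the weak form of this
system is

    \int_\Omega ( \nabla u \cdot \nabla v + u v ) dx = \int_{\partial \Omega} g v ds    for all test functions v,

which is where the boundary integral I ask about below comes from.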

It consists of adapting the fem->f0Funcs[0] function and adding the Jacobian
function fem->g0Funcs[0] (a sketch of these pointwise functions follows below).
My implementation works for Dirichlet BC.
With Neumann BC (options -bc_type neumann -interpolate 1), the line search
fails. I think my Jacobian functions are correct, because the option
"-snes_mf_operator" leads to the same behavior.
Do you know what I have missed?
In the Neumann case, where is the 1D integral along \partial \Omega added?
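
Here is a minimal sketch of the adapted pointwise functions mentioned above,
assuming the (u, gradU, x, output) prototype that the 3.4-era ex12.c uses for
its f0/g0 callbacks; the exact signatures may differ in other versions, so this
only illustrates which term goes where. The f1/g3 (Laplacian) callbacks of
ex12.c are left unchanged, and the PETSc headers already included by ex12.c
provide PetscScalar and PetscReal:

    /* f0: non-gradient part of the residual; for -\Delta u + u = 0 this is
       the reaction term u (the source term is zero). Prototype assumed from
       the 3.4-era ex12.c. */
    void f0_u(const PetscScalar u[], const PetscScalar gradU[],
              const PetscReal x[], PetscScalar f0[])
    {
      f0[0] = u[0];
    }

    /* g0: derivative of f0 with respect to u, here d(u)/du = 1 */
    void g0_uu(const PetscScalar u[], const PetscScalar gradU[],
               const PetscReal x[], PetscScalar g0[])
    {
      g0[0] = 1.0;
    }

These are hooked up as fem->f0Funcs[0] = f0_u and fem->g0Funcs[0] = g0_uu.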


Thanks,
Olivier Bonnefon







On 09/26/2013 06:54 PM, Matthew Knepley wrote:
> On Thu, Sep 26, 2013 at 6:04 AM, Olivier Bonnefon
> <olivier.bonnefon at avignon.inra.fr> wrote:
>
>     Hello,
>
>     I have implemented my own system from ex12.c. It works with
>     Dirichlet BC, but fails with the Neumann one.
>
>     So, I came back to the example
>     src/snes/examples/tutorials/ex12.c, and I tried it with Neumann BC:
>
>     ./ex12 -bc_type NEUMANN
>
>
> Here is the full list of tests I run (just checked that it passes in 
> 'next'):
>
> https://bitbucket.org/petsc/petsc/src/f34a81fe8510aa025c9247a5b14f0fe30e3c0bed/config/builder.py?at=master#cl-175
>
> Make sure you use an interpolated mesh with Neumann conditions since 
> you need faces.
>
>    Matt
>
>     This leads to the following crash:
>
>     [0]PETSC ERROR: --------------------- Error Message
>     ------------------------------------
>     [0]PETSC ERROR: No support for this operation for this object type!
>     [0]PETSC ERROR: Unsupported number of vertices 0 in cell 8 for
>     element geometry computation!
>     [0]PETSC ERROR:
>     ------------------------------------------------------------------------
>     [0]PETSC ERROR: Petsc Release Version 3.4.2, Jul, 02, 2013
>     [0]PETSC ERROR: See docs/changes/index.html for recent updates.
>     [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
>     [0]PETSC ERROR: See docs/index.html for manual pages.
>     [0]PETSC ERROR:
>     ------------------------------------------------------------------------
>     [0]PETSC ERROR: ./ex12 on a arch-linux2-c-debug named pcbiom38 by
>     olivierb Thu Sep 26 14:53:32 2013
>     [0]PETSC ERROR: Libraries linked from
>     /home/olivierb/SOFT/petsc-3.4.2/arch-linux2-c-debug/lib
>     [0]PETSC ERROR: Configure run at Thu Sep 26 14:44:42 2013
>     [0]PETSC ERROR: Configure options --with-debugging=1
>     --download-fiat --download-scientificpython --download-generator
>     --download-triangle --download-ctetgen --download-chaco
>     --download-netcdf --download-hdf5
>     [0]PETSC ERROR:
>     ------------------------------------------------------------------------
>     [0]PETSC ERROR: DMPlexComputeCellGeometry() line 732 in
>     /home/olivierb/SOFT/petsc-3.4.2/src/dm/impls/plex/plexgeometry.c
>     [0]PETSC ERROR: DMPlexComputeResidualFEM() line 558 in
>     /home/olivierb/SOFT/petsc-3.4.2/src/dm/impls/plex/plexfem.c
>     [0]PETSC ERROR: SNESComputeFunction_DMLocal() line 75 in
>     /home/olivierb/SOFT/petsc-3.4.2/src/snes/utils/dmlocalsnes.c
>     [0]PETSC ERROR: SNESComputeFunction() line 1988 in
>     /home/olivierb/SOFT/petsc-3.4.2/src/snes/interface/snes.c
>     [0]PETSC ERROR: SNESSolve_NEWTONLS() line 162 in
>     /home/olivierb/SOFT/petsc-3.4.2/src/snes/impls/ls/ls.c
>     [0]PETSC ERROR: SNESSolve() line 3636 in
>     /home/olivierb/SOFT/petsc-3.4.2/src/snes/interface/snes.c
>     [0]PETSC ERROR: main() line 582 in
>     "unknowndirectory/"/home/olivierb/solvers/trunk/SandBox/PETSC/LANDSCAPE/REF/ex12.c
>     --------------------------------------------------------------------------
>     MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
>     with errorcode 56.
>
>     NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
>     You may or may not see output from other processes, depending on
>     exactly when Open MPI kills them.
>     --------------------------------------------------------------------------
>
>     With gdb, I saw that the cone size (DMPlexGetConeSize) is 0 for the last point.
>
>     Did I forget a step needed to use Neumann BC?
>
>     Thanks
>     Olivier Bonnefon
>
>     -- 
>     Olivier Bonnefon
>     INRA PACA-Avignon, Unité BioSP
>     Tel: +33 (0)4 32 72 21 58
>
>
>
>
> -- 
> What most experimenters take for granted before they begin their 
> experiments is infinitely more interesting than any results to which 
> their experiments lead.
> -- Norbert Wiener


-- 
Olivier Bonnefon
INRA PACA-Avignon, Unité BioSP
Tel: +33 (0)4 32 72 21 58
