[petsc-users] Problems running ex12.c

Matthew Knepley knepley at gmail.com
Fri Apr 4 10:37:47 CDT 2014


On Mon, Mar 31, 2014 at 7:08 PM, Matthew Knepley <knepley at gmail.com> wrote:

> On Mon, Mar 31, 2014 at 6:37 PM, Miguel Angel Salazar de Troya <
> salazardetroya at gmail.com> wrote:
>
>> Thanks for your response. Now I am trying to modify this example to
>> include Dirichlet and Neumann conditions at the same time.
>>
>> I can see that inside DMPlexCreateSquareBoundary there is an option
>> ("-dm_plex_separate_marker") to mark only the top boundary with 1. I
>> understand that only this side would then have the Dirichlet conditions
>> described by the function bcFuncs in user.fem (the exact solution in this
>> example). However, when we run with Neumann conditions, we apply them to the
>> whole boundary with the function DMPlexAddBoundary, is this right?
>>
>
> Right about the shortcoming, but wrong about the source.
> DMPlexAddBoundary() takes an argument that is the marker value for the
> given label, so you can select boundaries.
> However, DMPlexComputeResidualFEM() currently hardcodes the boundary name
> ("boundary")
> and the marker value (1). I wrote this when we had no boundary
> representation in PETSc. Now that we have DMAddBoundary(), we can
> loop over the Neumann boundaries. I have put this on my todo list. If you
> are motivated, you can do it first and I will help.
>

I pushed this change today, so now you can have multiple Neumann boundaries
with whatever labels
you want. I have not written a test for multiple boundaries, but the old
single boundary tests pass.
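
For example, registering Neumann conditions on two differently marked pieces of
the boundary would look roughly like this (a sketch only, not a tested example;
it follows the DMPlexAddBoundary() argument pattern from the SetupSection code
quoted further down, and the label name "boundary" and the marker ids are
placeholders for whatever your mesh defines):

#include <petscdmplex.h>

/* Sketch: add Neumann conditions on two boundary pieces selected by marker value.
   AppCtx and user->exactFuncs are the ones from ex12.c. */
static PetscErrorCode AddNeumannBoundaries(DM dm, AppCtx *user)
{
  const PetscInt topId = 1;   /* placeholder marker value for the first piece  */
  const PetscInt botId = 2;   /* placeholder marker value for the second piece */
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  /* PETSC_FALSE: Neumann conditions are not essential */
  ierr = DMPlexAddBoundary(dm, PETSC_FALSE, "boundary", 0, user->exactFuncs[0], 1, &topId, user);CHKERRQ(ierr);
  ierr = DMPlexAddBoundary(dm, PETSC_FALSE, "boundary", 0, user->exactFuncs[0], 1, &botId, user);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}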

  Thanks,

      Matt


>   Thanks,
>
>      Matt
>
>
>> Could there be a way to apply the Neumann condition to only a certain
>> boundary in this example? Would it be easier with an external library such
>> as Exodus II?
>>
>>
>> On Sun, Mar 30, 2014 at 7:51 PM, Matthew Knepley <knepley at gmail.com>wrote:
>>
>>> On Sun, Mar 30, 2014 at 7:07 PM, Miguel Angel Salazar de Troya <
>>> salazardetroya at gmail.com> wrote:
>>>
>>>> Thanks for your response. Your help is really useful to me.
>>>>
>>>> The difference between the analytic and the field options is that for
>>>> the field option the function is projected onto the function space defined
>>>> for feAux, right? What is the advantage of doing this?
>>>>
>>>
>>> If the coefficient is not purely a function of the coordinates, or you do
>>> not know that function analytically, the field option is the only one left.
>>>
>>>
>>>> Also, for this field case I see that the function always has to be a
>>>> vector. What if we wanted to implement a heterogeneous material in linear
>>>> elasticity? Would we implement the constitutive tensor as a vector? It
>>>> would not be very difficult, I think; I just want to make sure it would be
>>>> done this way.
>>>>
>>>
>>> It's not a vector, which would indicate a particular behavior under
>>> coordinate transformations, but an array which can hold any data you want.
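>>>
>>> As a sketch (illustrative only; the callback signature here is not the
>>> exact one ex12 uses for -variable_coefficient field), a heterogeneous
>>> linear elasticity coefficient could simply pack its entries into
>>> consecutive slots of that array:
>>>
>>>   /* Fill the auxiliary array with material data at the point x.
>>>      Here two Lame parameters; a full constitutive tensor would just
>>>      use more slots. */
>>>   void heterogeneous_material(const PetscReal x[], PetscScalar a[])
>>>   {
>>>     a[0] = 1.0 + x[0]*x[0];  /* lambda(x) */
>>>     a[1] = 0.5 + x[1]*x[1];  /* mu(x)     */
>>>   }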
>>>
>>>    Matt
>>>
>>>
>>>> Thanks in advance
>>>> Miguel
>>>>
>>>>
>>>> On Sun, Mar 30, 2014 at 2:01 PM, Matthew Knepley <knepley at gmail.com>wrote:
>>>>
>>>>> On Sun, Mar 30, 2014 at 1:57 PM, Miguel Angel Salazar de Troya <
>>>>> salazardetroya at gmail.com> wrote:
>>>>>
>>>>>> Hello everybody
>>>>>>
>>>>>> I had a question about this example. In the petsc-dev "next" version,
>>>>>> why don't we create a PetscSection in the function SetupSection, while we
>>>>>> do create one in SetupMaterialSection and in the SetupSection of the
>>>>>> current PETSc release?
>>>>>>
>>>>>
>>>>> 1) I wanted to try and make things more automatic for the user
>>>>>
>>>>> 2) I needed a way to automatically layout data for coarser/finer grids
>>>>> in unstructured MG
>>>>>
>>>>> Thus, now when you set a PetscFE into the DM using DMSetField(), it will
>>>>> automatically create the section on the first call to DMGetDefaultSection().
>>>>>
>>>>> I do not have a similar provision now for materials, so you create
>>>>> your own section. I think this is
>>>>> alright until we have some idea of a nicer interface.
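>>>>>
>>>>> By hand, that layout is only a few calls. A rough sketch (not the exact
>>>>> SetupMaterialSection code; dmAux stands for the auxiliary DM, and it puts
>>>>> one coefficient dof on every cell):
>>>>>
>>>>>   PetscSection   s;
>>>>>   PetscInt       cStart, cEnd, c;
>>>>>   PetscErrorCode ierr;
>>>>>
>>>>>   ierr = PetscSectionCreate(PetscObjectComm((PetscObject) dmAux), &s);CHKERRQ(ierr);
>>>>>   ierr = DMPlexGetHeightStratum(dmAux, 0, &cStart, &cEnd);CHKERRQ(ierr);  /* cells */
>>>>>   ierr = PetscSectionSetChart(s, cStart, cEnd);CHKERRQ(ierr);
>>>>>   for (c = cStart; c < cEnd; ++c) {
>>>>>     ierr = PetscSectionSetDof(s, c, 1);CHKERRQ(ierr);  /* one coefficient per cell */
>>>>>   }
>>>>>   ierr = PetscSectionSetUp(s);CHKERRQ(ierr);
>>>>>   ierr = DMSetDefaultSection(dmAux, s);CHKERRQ(ierr);
>>>>>   ierr = PetscSectionDestroy(&s);CHKERRQ(ierr);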
>>>>>
>>>>>   Thanks,
>>>>>
>>>>>      Matt
>>>>>
>>>>>
>>>>>> petsc-dev:
>>>>>>
>>>>>> #undef __FUNCT__
>>>>>> #define __FUNCT__ "SetupSection"
>>>>>> PetscErrorCode SetupSection(DM dm, AppCtx *user)
>>>>>> {
>>>>>>   DM             cdm = dm;
>>>>>>   const PetscInt id  = 1;
>>>>>>   PetscErrorCode ierr;
>>>>>>
>>>>>>   PetscFunctionBeginUser;
>>>>>>   ierr = PetscObjectSetName((PetscObject) user->fe[0], "potential");CHKERRQ(ierr);
>>>>>>   while (cdm) {
>>>>>>     ierr = DMSetNumFields(cdm, 1);CHKERRQ(ierr);
>>>>>>     ierr = DMSetField(cdm, 0, (PetscObject) user->fe[0]);CHKERRQ(ierr);
>>>>>>     ierr = DMPlexAddBoundary(cdm, user->bcType == DIRICHLET, user->bcType == NEUMANN ? "boundary" : "marker", 0, user->exactFuncs[0], 1, &id, user);CHKERRQ(ierr);
>>>>>>     ierr = DMPlexGetCoarseDM(cdm, &cdm);CHKERRQ(ierr);
>>>>>>   }
>>>>>>   PetscFunctionReturn(0);
>>>>>> }
>>>>>>
>>>>>>
>>>>>> It seems that it adds the number of fields directly to the DM, and
>>>>>> takes the number of components that were specified in SetupElementCommon,
>>>>>> but what about the number of degrees of freedom? Why do we add it for the
>>>>>> MaterialSection but not for the regular Section?
>>>>>>
>>>>>> Thanks in advance
>>>>>> Miguel
>>>>>>
>>>>>>
>>>>>> On Sat, Mar 15, 2014 at 4:16 PM, Miguel Angel Salazar de Troya <
>>>>>> salazardetroya at gmail.com> wrote:
>>>>>>
>>>>>>> Thanks a lot.
>>>>>>>
>>>>>>>
>>>>>>> On Sat, Mar 15, 2014 at 3:36 PM, Matthew Knepley <knepley at gmail.com>wrote:
>>>>>>>
>>>>>>>> On Sat, Mar 15, 2014 at 3:31 PM, Miguel Angel Salazar de Troya <
>>>>>>>> salazardetroya at gmail.com> wrote:
>>>>>>>>
>>>>>>>>> Hello everybody
>>>>>>>>>
>>>>>>>>> I keep trying to understand this example. I don't have any
>>>>>>>>> problems with this example when I run it like this:
>>>>>>>>>
>>>>>>>>> [salaza11 at maya PETSC]$ ./ex12 -bc_type dirichlet -interpolate
>>>>>>>>> -petscspace_order 1 -variable_coefficient nonlinear -dim 2 -run_type full
>>>>>>>>> -show_solution
>>>>>>>>> Number of SNES iterations = 5
>>>>>>>>> L_2 Error: 0.107289
>>>>>>>>> Solution
>>>>>>>>> Vec Object: 1 MPI processes
>>>>>>>>>   type: seq
>>>>>>>>> 0.484618
>>>>>>>>>
>>>>>>>>> However, when I change the boundary conditions to Neumann, I get
>>>>>>>>> this error.
>>>>>>>>>
>>>>>>>>> [salaza11 at maya PETSC]$ ./ex12 -bc_type neumann -interpolate 1
>>>>>>>>> -petscspace_order 2 -variable_coefficient nonlinear -dim 2 -run_type full
>>>>>>>>> -show_solution
>>>>>>>>>
>>>>>>>>
>>>>>>>> Here you set the order of the element used in the bulk, but not on the
>>>>>>>> boundary where your condition is, so it defaults to 0. To become more
>>>>>>>> familiar, take a look at the tests that I run here:
>>>>>>>>
>>>>>>>>
>>>>>>>> https://bitbucket.org/petsc/petsc/src/64715f0f033346c10c77b73cf58216d111db8789/config/builder.py?at=master#cl-216
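>>>>>>>>
>>>>>>>> For the Neumann runs the boundary element needs its own order as well.
>>>>>>>> A sketch of what that looks like (assuming the boundary discretization
>>>>>>>> takes a bd_-prefixed option as in the linked tests; check builder.py for
>>>>>>>> the exact spelling):
>>>>>>>>
>>>>>>>>   ./ex12 -bc_type neumann -interpolate 1 -petscspace_order 2 \
>>>>>>>>     -bd_petscspace_order 2 -variable_coefficient nonlinear -dim 2 \
>>>>>>>>     -run_type full -show_solution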
>>>>>>>>
>>>>>>>>      Matt
>>>>>>>>
>>>>>>>> [0]PETSC ERROR: --------------------- Error Message
>>>>>>>>> --------------------------------------------------------------
>>>>>>>>> [0]PETSC ERROR: Petsc has generated inconsistent data
>>>>>>>>> [0]PETSC ERROR: Number of dual basis vectors 0 not equal to
>>>>>>>>> dimension 1
>>>>>>>>> [0]PETSC ERROR: See
>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble
>>>>>>>>> shooting.
>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision:
>>>>>>>>> v3.4.3-4776-gb18359b  GIT Date: 2014-03-04 10:53:30 -0600
>>>>>>>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by
>>>>>>>>> salaza11 Sat Mar 15 14:28:05 2014
>>>>>>>>>  [0]PETSC ERROR: Configure options --download-mpich
>>>>>>>>> --download-scientificpython --download-triangle --download-ctetgen
>>>>>>>>> --download-chaco --with-c2html=0
>>>>>>>>> [0]PETSC ERROR: #1 PetscDualSpaceSetUp_Lagrange() line 1763 in
>>>>>>>>> /home/salaza11/petsc/src/dm/dt/interface/dtfe.c
>>>>>>>>> [0]PETSC ERROR: #2 PetscDualSpaceSetUp() line 1277 in
>>>>>>>>> /home/salaza11/petsc/src/dm/dt/interface/dtfe.c
>>>>>>>>> [0]PETSC ERROR: #3 SetupElementCommon() line 474 in
>>>>>>>>> /home/salaza11/workspace/PETSC/ex12.c
>>>>>>>>> [0]PETSC ERROR: #4 SetupBdElement() line 559 in
>>>>>>>>> /home/salaza11/workspace/PETSC/ex12.c
>>>>>>>>> [0]PETSC ERROR: #5 main() line 755 in
>>>>>>>>> /home/salaza11/workspace/PETSC/ex12.c
>>>>>>>>> [0]PETSC ERROR: ----------------End of Error Message -------send
>>>>>>>>> entire error message to petsc-maint at mcs.anl.gov----------
>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 77) - process 0
>>>>>>>>> [unset]: aborting job:
>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 77) - process 0
>>>>>>>>>
>>>>>>>>> I honestly do not know much about using dual spaces in a finite
>>>>>>>>> element context. I have been trying to find some material that could help
>>>>>>>>> me without much success. I tried to modify the dual space order with the
>>>>>>>>> option -petscdualspace_order but I kept getting errors. In particular, I
>>>>>>>>> got this when I set it to 1.
>>>>>>>>>
>>>>>>>>> [salaza11 at maya PETSC]$ ./ex12 -bc_type neumann -interpolate 1
>>>>>>>>> -petscspace_order 2 -variable_coefficient nonlinear -dim 2 -run_type full
>>>>>>>>> -show_solution -petscdualspace_order 1
>>>>>>>>> [0]PETSC ERROR: PetscTrFreeDefault() called from
>>>>>>>>> PetscFESetUp_Basic() line 2492 in
>>>>>>>>> /home/salaza11/petsc/src/dm/dt/interface/dtfe.c
>>>>>>>>> [0]PETSC ERROR: Block [id=0(32)] at address 0x1cc32f0 is corrupted
>>>>>>>>> (probably write past end of array)
>>>>>>>>> [0]PETSC ERROR: Block allocated in PetscFESetUp_Basic() line 2483
>>>>>>>>> in /home/salaza11/petsc/src/dm/dt/interface/dtfe.c
>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message
>>>>>>>>> --------------------------------------------------------------
>>>>>>>>> [0]PETSC ERROR: Memory corruption:
>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/installation.html#valgrind
>>>>>>>>> [0]PETSC ERROR: Corrupted memory
>>>>>>>>> [0]PETSC ERROR: See
>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble
>>>>>>>>> shooting.
>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision:
>>>>>>>>> v3.4.3-4776-gb18359b  GIT Date: 2014-03-04 10:53:30 -0600
>>>>>>>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named maya by
>>>>>>>>> salaza11 Sat Mar 15 14:37:34 2014
>>>>>>>>> [0]PETSC ERROR: Configure options --download-mpich
>>>>>>>>> --download-scientificpython --download-triangle --download-ctetgen
>>>>>>>>> --download-chaco --with-c2html=0
>>>>>>>>> [0]PETSC ERROR: #1 PetscTrFreeDefault() line 289 in
>>>>>>>>> /home/salaza11/petsc/src/sys/memory/mtr.c
>>>>>>>>> [0]PETSC ERROR: #2 PetscFESetUp_Basic() line 2492 in
>>>>>>>>> /home/salaza11/petsc/src/dm/dt/interface/dtfe.c
>>>>>>>>> [0]PETSC ERROR: #3 PetscFESetUp() line 2126 in
>>>>>>>>> /home/salaza11/petsc/src/dm/dt/interface/dtfe.c
>>>>>>>>> [0]PETSC ERROR: #4 SetupElementCommon() line 482 in
>>>>>>>>> /home/salaza11/workspace/PETSC/ex12.c
>>>>>>>>> [0]PETSC ERROR: #5 SetupElement() line 506 in
>>>>>>>>> /home/salaza11/workspace/PETSC/ex12.c
>>>>>>>>> [0]PETSC ERROR: #6 main() line 754 in
>>>>>>>>> /home/salaza11/workspace/PETSC/ex12.c
>>>>>>>>> [0]PETSC ERROR: ----------------End of Error Message -------send
>>>>>>>>> entire error message to petsc-maint at mcs.anl.gov----------
>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0
>>>>>>>>> [unset]: aborting job:
>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0
>>>>>>>>> [salaza11 at maya PETSC]$
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> Then again, I do not really know what I am doing, given my ignorance
>>>>>>>>> with respect to dual spaces in FE. I apologize for that. My questions
>>>>>>>>> are:
>>>>>>>>>
>>>>>>>>> - Where could I find more resources in order to understand the
>>>>>>>>> PETSc implementation of dual spaces for FE?
>>>>>>>>> - Why does it run with Dirichlet but not with Neumann?
>>>>>>>>>
>>>>>>>>> Thanks in advance.
>>>>>>>>> Miguel.
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Tue, Mar 4, 2014 at 11:28 PM, Matthew Knepley <
>>>>>>>>> knepley at gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>> On Tue, Mar 4, 2014 at 12:01 PM, Matthew Knepley <
>>>>>>>>>> knepley at gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> On Tue, Mar 4, 2014 at 11:51 AM, Miguel Angel Salazar de Troya <
>>>>>>>>>>> salazardetroya at gmail.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> I can run it now, thanks. However, if I run it with valgrind
>>>>>>>>>>>> 3.5.0 (should I update to the latest version?) I get some memory leaks
>>>>>>>>>>>> related to the function DMPlexCreateBoxMesh.
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> I will check it out.
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> This is now fixed.
>>>>>>>>>>
>>>>>>>>>>   Thanks for finding it
>>>>>>>>>>
>>>>>>>>>>       Matt
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>>   Thanks,
>>>>>>>>>>>
>>>>>>>>>>>      Matt
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>> [salaza11 at maya tutorials]$ valgrind --leak-check=full ./ex12
>>>>>>>>>>>> -run_type test -refinement_limit 0.0    -bc_type dirichlet -interpolate 0
>>>>>>>>>>>> -petscspace_order 1 -show_initial -dm_plex_print_fem 1
>>>>>>>>>>>> ==9625== Memcheck, a memory error detector
>>>>>>>>>>>> ==9625== Copyright (C) 2002-2009, and GNU GPL'd, by Julian
>>>>>>>>>>>> Seward et al.
>>>>>>>>>>>> ==9625== Using Valgrind-3.5.0 and LibVEX; rerun with -h for
>>>>>>>>>>>> copyright info
>>>>>>>>>>>> ==9625== Command: ./ex12 -run_type test -refinement_limit 0.0
>>>>>>>>>>>> -bc_type dirichlet -interpolate 0 -petscspace_order 1 -show_initial
>>>>>>>>>>>> -dm_plex_print_fem 1
>>>>>>>>>>>> ==9625==
>>>>>>>>>>>> Local function:
>>>>>>>>>>>> Vec Object: 1 MPI processes
>>>>>>>>>>>>   type: seq
>>>>>>>>>>>> 0
>>>>>>>>>>>> 0.25
>>>>>>>>>>>> 1
>>>>>>>>>>>> 0.25
>>>>>>>>>>>> 0.5
>>>>>>>>>>>> 1.25
>>>>>>>>>>>> 1
>>>>>>>>>>>> 1.25
>>>>>>>>>>>> 2
>>>>>>>>>>>> Initial guess
>>>>>>>>>>>> Vec Object: 1 MPI processes
>>>>>>>>>>>>   type: seq
>>>>>>>>>>>> 0.5
>>>>>>>>>>>> L_2 Error: 0.111111
>>>>>>>>>>>> Residual:
>>>>>>>>>>>> Vec Object: 1 MPI processes
>>>>>>>>>>>>   type: seq
>>>>>>>>>>>> 0
>>>>>>>>>>>> 0
>>>>>>>>>>>> 0
>>>>>>>>>>>> 0
>>>>>>>>>>>> 0
>>>>>>>>>>>> 0
>>>>>>>>>>>> 0
>>>>>>>>>>>> 0
>>>>>>>>>>>>  0
>>>>>>>>>>>> Initial Residual
>>>>>>>>>>>> Vec Object: 1 MPI processes
>>>>>>>>>>>>   type: seq
>>>>>>>>>>>> 0
>>>>>>>>>>>> L_2 Residual: 0
>>>>>>>>>>>> Jacobian:
>>>>>>>>>>>> Mat Object: 1 MPI processes
>>>>>>>>>>>>   type: seqaij
>>>>>>>>>>>> row 0: (0, 4)
>>>>>>>>>>>> Residual:
>>>>>>>>>>>> Vec Object: 1 MPI processes
>>>>>>>>>>>>   type: seq
>>>>>>>>>>>> 0
>>>>>>>>>>>> 0
>>>>>>>>>>>> 0
>>>>>>>>>>>> 0
>>>>>>>>>>>> -2
>>>>>>>>>>>> 0
>>>>>>>>>>>> 0
>>>>>>>>>>>> 0
>>>>>>>>>>>> 0
>>>>>>>>>>>> Au - b = Au + F(0)
>>>>>>>>>>>>  Vec Object: 1 MPI processes
>>>>>>>>>>>>   type: seq
>>>>>>>>>>>> 0
>>>>>>>>>>>> Linear L_2 Residual: 0
>>>>>>>>>>>> ==9625==
>>>>>>>>>>>> ==9625== HEAP SUMMARY:
>>>>>>>>>>>> ==9625==     in use at exit: 288 bytes in 3 blocks
>>>>>>>>>>>> ==9625==   total heap usage: 2,484 allocs, 2,481 frees,
>>>>>>>>>>>> 1,009,287 bytes allocated
>>>>>>>>>>>> ==9625==
>>>>>>>>>>>> ==9625== 48 bytes in 1 blocks are definitely lost in loss
>>>>>>>>>>>> record 1 of 3
>>>>>>>>>>>> ==9625==    at 0x4A05E46: malloc (vg_replace_malloc.c:195)
>>>>>>>>>>>> ==9625==    by 0x5D8D4E1: writepoly (triangle.c:12012)
>>>>>>>>>>>> ==9625==    by 0x5D8FAAC: triangulate (triangle.c:13167)
>>>>>>>>>>>> ==9625==    by 0x56B0884: DMPlexGenerate_Triangle (plex.c:3749)
>>>>>>>>>>>> ==9625==    by 0x56B5EE4: DMPlexGenerate (plex.c:4503)
>>>>>>>>>>>> ==9625==    by 0x567F414: DMPlexCreateBoxMesh (plexcreate.c:668)
>>>>>>>>>>>> ==9625==    by 0x4051FA: CreateMesh (ex12.c:341)
>>>>>>>>>>>> ==9625==    by 0x408D3D: main (ex12.c:651)
>>>>>>>>>>>> ==9625==
>>>>>>>>>>>> ==9625== 96 bytes in 1 blocks are definitely lost in loss
>>>>>>>>>>>> record 2 of 3
>>>>>>>>>>>> ==9625==    at 0x4A05E46: malloc (vg_replace_malloc.c:195)
>>>>>>>>>>>> ==9625==    by 0x5D8D485: writepoly (triangle.c:12004)
>>>>>>>>>>>> ==9625==    by 0x5D8FAAC: triangulate (triangle.c:13167)
>>>>>>>>>>>> ==9625==    by 0x56B0884: DMPlexGenerate_Triangle (plex.c:3749)
>>>>>>>>>>>> ==9625==    by 0x56B5EE4: DMPlexGenerate (plex.c:4503)
>>>>>>>>>>>> ==9625==    by 0x567F414: DMPlexCreateBoxMesh (plexcreate.c:668)
>>>>>>>>>>>> ==9625==    by 0x4051FA: CreateMesh (ex12.c:341)
>>>>>>>>>>>> ==9625==    by 0x408D3D: main (ex12.c:651)
>>>>>>>>>>>> ==9625==
>>>>>>>>>>>> ==9625== 144 bytes in 1 blocks are definitely lost in loss
>>>>>>>>>>>> record 3 of 3
>>>>>>>>>>>> ==9625==    at 0x4A05E46: malloc (vg_replace_malloc.c:195)
>>>>>>>>>>>> ==9625==    by 0x5D8CD20: writenodes (triangle.c:11718)
>>>>>>>>>>>> ==9625==    by 0x5D8F9DE: triangulate (triangle.c:13132)
>>>>>>>>>>>> ==9625==    by 0x56B0884: DMPlexGenerate_Triangle (plex.c:3749)
>>>>>>>>>>>> ==9625==    by 0x56B5EE4: DMPlexGenerate (plex.c:4503)
>>>>>>>>>>>> ==9625==    by 0x567F414: DMPlexCreateBoxMesh (plexcreate.c:668)
>>>>>>>>>>>> ==9625==    by 0x4051FA: CreateMesh (ex12.c:341)
>>>>>>>>>>>> ==9625==    by 0x408D3D: main (ex12.c:651)
>>>>>>>>>>>> ==9625==
>>>>>>>>>>>> ==9625== LEAK SUMMARY:
>>>>>>>>>>>> ==9625==    definitely lost: 288 bytes in 3 blocks
>>>>>>>>>>>> ==9625==    indirectly lost: 0 bytes in 0 blocks
>>>>>>>>>>>> ==9625==      possibly lost: 0 bytes in 0 blocks
>>>>>>>>>>>> ==9625==    still reachable: 0 bytes in 0 blocks
>>>>>>>>>>>> ==9625==         suppressed: 0 bytes in 0 blocks
>>>>>>>>>>>> ==9625==
>>>>>>>>>>>> ==9625== For counts of detected and suppressed errors, rerun
>>>>>>>>>>>> with: -v
>>>>>>>>>>>> ==9625== ERROR SUMMARY: 3 errors from 3 contexts (suppressed: 6
>>>>>>>>>>>> from 6)
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> On Mon, Mar 3, 2014 at 7:05 PM, Matthew Knepley <
>>>>>>>>>>>> knepley at gmail.com> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> On Mon, Mar 3, 2014 at 4:59 PM, Miguel Angel Salazar de Troya
>>>>>>>>>>>>> <salazardetroya at gmail.com> wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>>> You are welcome, thanks for your help.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> Okay, I have rebuilt completely clean, and ex12 runs for me.
>>>>>>>>>>>>> Can you try again after pulling?
>>>>>>>>>>>>>
>>>>>>>>>>>>>   Thanks,
>>>>>>>>>>>>>
>>>>>>>>>>>>>      Matt
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>>  On Mon, Mar 3, 2014 at 4:13 PM, Matthew Knepley <
>>>>>>>>>>>>>> knepley at gmail.com> wrote:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> On Mon, Mar 3, 2014 at 1:44 PM, Miguel Angel Salazar de
>>>>>>>>>>>>>>> Troya <salazardetroya at gmail.com> wrote:
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> Thanks. This is what I get.
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Okay, this was broken by a new push to master/next in the
>>>>>>>>>>>>>>> last few days. I have pushed a fix; however, next is currently
>>>>>>>>>>>>>>> broken due to a failure to check in a file. This should be fixed
>>>>>>>>>>>>>>> shortly, and then ex12 will work. I will mail you when it's ready.
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>   Thanks for finding this,
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>       Matt
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> (gdb) cont
>>>>>>>>>>>>>>>> Continuing.
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> Program received signal SIGSEGV, Segmentation fault.
>>>>>>>>>>>>>>>> 0x00007fd6811bea7b in DMPlexComputeJacobianFEM
>>>>>>>>>>>>>>>> (dm=0x159a180, X=0x168b5b0,
>>>>>>>>>>>>>>>>     Jac=0x7fffae6e8a88, JacP=0x7fffae6e8a88,
>>>>>>>>>>>>>>>> str=0x7fffae6e7970,
>>>>>>>>>>>>>>>>     user=0x7fd6811be509)
>>>>>>>>>>>>>>>>     at /home/salaza11/petsc/src/dm/impls/plex/plexfem.c:882
>>>>>>>>>>>>>>>> 882         ierr = PetscFEGetDimension(fe[f],
>>>>>>>>>>>>>>>> &Nb);CHKERRQ(ierr);
>>>>>>>>>>>>>>>> (gdb) where
>>>>>>>>>>>>>>>> #0  0x00007fd6811bea7b in DMPlexComputeJacobianFEM
>>>>>>>>>>>>>>>> (dm=0x159a180, X=0x168b5b0,
>>>>>>>>>>>>>>>>     Jac=0x7fffae6e8a88, JacP=0x7fffae6e8a88,
>>>>>>>>>>>>>>>> str=0x7fffae6e7970,
>>>>>>>>>>>>>>>>     user=0x7fd6811be509)
>>>>>>>>>>>>>>>>     at /home/salaza11/petsc/src/dm/impls/plex/plexfem.c:882
>>>>>>>>>>>>>>>> #1  0x00007fd6814a5bf6 in SNESComputeJacobian_DMLocal
>>>>>>>>>>>>>>>> (snes=0x14e9450,
>>>>>>>>>>>>>>>>     X=0x1622ad0, A=0x7fffae6e8a88, B=0x7fffae6e8a88,
>>>>>>>>>>>>>>>> ctx=0x1652300)
>>>>>>>>>>>>>>>>     at /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c:102
>>>>>>>>>>>>>>>> #2  0x00007fd6814cc609 in SNESComputeJacobian
>>>>>>>>>>>>>>>> (snes=0x14e9450, X=0x1622ad0,
>>>>>>>>>>>>>>>>     A=0x7fffae6e8a88, B=0x7fffae6e8a88)
>>>>>>>>>>>>>>>>     at /home/salaza11/petsc/src/snes/interface/snes.c:2245
>>>>>>>>>>>>>>>> #3  0x000000000040af72 in main (argc=15,
>>>>>>>>>>>>>>>> argv=0x7fffae6e8bc8)
>>>>>>>>>>>>>>>>     at
>>>>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/examples/tutorials/ex12.c:784
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> On Mon, Mar 3, 2014 at 1:40 PM, Matthew Knepley <
>>>>>>>>>>>>>>>> knepley at gmail.com> wrote:
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> On Mon, Mar 3, 2014 at 1:39 PM, Miguel Angel Salazar de
>>>>>>>>>>>>>>>>> Troya <salazardetroya at gmail.com> wrote:
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> This is what I get in gdb when I type 'where'.
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> You have to type 'cont', and then when it fails you type
>>>>>>>>>>>>>>>>> 'where'.
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>    Matt
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> #0  0x000000310e0aa860 in __nanosleep_nocancel () from
>>>>>>>>>>>>>>>>>> /lib64/libc.so.6
>>>>>>>>>>>>>>>>>> #1  0x000000310e0aa70f in sleep () from /lib64/libc.so.6
>>>>>>>>>>>>>>>>>> #2  0x00007fd83a00a8be in PetscSleep (s=10)
>>>>>>>>>>>>>>>>>>     at /home/salaza11/petsc/src/sys/utils/psleep.c:52
>>>>>>>>>>>>>>>>>> #3  0x00007fd83a06f331 in PetscAttachDebugger ()
>>>>>>>>>>>>>>>>>>     at /home/salaza11/petsc/src/sys/error/adebug.c:397
>>>>>>>>>>>>>>>>>> #4  0x00007fd83a0af1d2 in
>>>>>>>>>>>>>>>>>> PetscOptionsCheckInitial_Private ()
>>>>>>>>>>>>>>>>>>     at /home/salaza11/petsc/src/sys/objects/init.c:444
>>>>>>>>>>>>>>>>>> #5  0x00007fd83a0b6448 in PetscInitialize
>>>>>>>>>>>>>>>>>> (argc=0x7fff5cd8df2c,
>>>>>>>>>>>>>>>>>>     args=0x7fff5cd8df20, file=0x0,
>>>>>>>>>>>>>>>>>>     help=0x60ce40 "Poisson Problem in 2d and 3d with
>>>>>>>>>>>>>>>>>> simplicial finite elements.\nWe solve the Poisson problem in a
>>>>>>>>>>>>>>>>>> rectangular\ndomain, using a parallel unstructured mesh (DMPLEX) to
>>>>>>>>>>>>>>>>>> discretize it.\n\n\n")
>>>>>>>>>>>>>>>>>>     at /home/salaza11/petsc/src/sys/objects/pinit.c:876
>>>>>>>>>>>>>>>>>> #6  0x0000000000408f2c in main (argc=15,
>>>>>>>>>>>>>>>>>> argv=0x7fff5cd8f1f8)
>>>>>>>>>>>>>>>>>>     at
>>>>>>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/examples/tutorials/ex12.c:663
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> The rest of the gdb output is attached. I am not very
>>>>>>>>>>>>>>>>>> familiar with gdb; I apologize for that.
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> On Mon, Mar 3, 2014 at 12:48 PM, Matthew Knepley <
>>>>>>>>>>>>>>>>>> knepley at gmail.com> wrote:
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> On Mon, Mar 3, 2014 at 12:39 PM, Miguel Angel Salazar de
>>>>>>>>>>>>>>>>>>> Troya <salazardetroya at gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> Thanks for your response. Sorry, I did not have the
>>>>>>>>>>>>>>>>>>>> "next" version, but the "master" version. I still have an error though. I
>>>>>>>>>>>>>>>>>>>> followed the steps given here (
>>>>>>>>>>>>>>>>>>>> https://bitbucket.org/petsc/petsc/wiki/Home) to obtain
>>>>>>>>>>>>>>>>>>>> the next version, configured petsc as above, and ran ex12 as above as
>>>>>>>>>>>>>>>>>>>> well, getting this error:
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> [salaza11 at maya tutorials]$ ./ex12 -run_type test
>>>>>>>>>>>>>>>>>>>> -refinement_limit 0.0    -bc_type dirichlet -interpolate 0
>>>>>>>>>>>>>>>>>>>> -petscspace_order 1 -show_initial -dm_plex_print_fem 1
>>>>>>>>>>>>>>>>>>>> Local function:
>>>>>>>>>>>>>>>>>>>> Vec Object: 1 MPI processes
>>>>>>>>>>>>>>>>>>>>   type: seq
>>>>>>>>>>>>>>>>>>>> 0
>>>>>>>>>>>>>>>>>>>> 0.25
>>>>>>>>>>>>>>>>>>>> 1
>>>>>>>>>>>>>>>>>>>> 0.25
>>>>>>>>>>>>>>>>>>>> 0.5
>>>>>>>>>>>>>>>>>>>> 1.25
>>>>>>>>>>>>>>>>>>>> 1
>>>>>>>>>>>>>>>>>>>> 1.25
>>>>>>>>>>>>>>>>>>>> 2
>>>>>>>>>>>>>>>>>>>> Initial guess
>>>>>>>>>>>>>>>>>>>> Vec Object: 1 MPI processes
>>>>>>>>>>>>>>>>>>>>   type: seq
>>>>>>>>>>>>>>>>>>>> 0.5
>>>>>>>>>>>>>>>>>>>> L_2 Error: 0.111111
>>>>>>>>>>>>>>>>>>>> Residual:
>>>>>>>>>>>>>>>>>>>> Vec Object: 1 MPI processes
>>>>>>>>>>>>>>>>>>>>   type: seq
>>>>>>>>>>>>>>>>>>>> 0
>>>>>>>>>>>>>>>>>>>> 0
>>>>>>>>>>>>>>>>>>>> 0
>>>>>>>>>>>>>>>>>>>> 0
>>>>>>>>>>>>>>>>>>>> 0
>>>>>>>>>>>>>>>>>>>> 0
>>>>>>>>>>>>>>>>>>>> 0
>>>>>>>>>>>>>>>>>>>> 0
>>>>>>>>>>>>>>>>>>>> 0
>>>>>>>>>>>>>>>>>>>> Initial Residual
>>>>>>>>>>>>>>>>>>>> Vec Object: 1 MPI processes
>>>>>>>>>>>>>>>>>>>>   type: seq
>>>>>>>>>>>>>>>>>>>> 0
>>>>>>>>>>>>>>>>>>>> L_2 Residual: 0
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> Okay, now run with -start_in_debugger, and give me a
>>>>>>>>>>>>>>>>>>> stack trace using 'where'.
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>   Thanks,
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>      Matt
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>  [0]PETSC ERROR:
>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------
>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Caught signal number 11 SEGV:
>>>>>>>>>>>>>>>>>>>> Segmentation Violation, probably memory access out of range
>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or
>>>>>>>>>>>>>>>>>>>> -on_error_attach_debugger
>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: or see
>>>>>>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to
>>>>>>>>>>>>>>>>>>>> find memory corruption errors
>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: likely location of problem given in
>>>>>>>>>>>>>>>>>>>> stack below
>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: ---------------------  Stack Frames
>>>>>>>>>>>>>>>>>>>> ------------------------------------
>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the
>>>>>>>>>>>>>>>>>>>> stack are not available,
>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR:       INSTEAD the line number of the
>>>>>>>>>>>>>>>>>>>> start of the function
>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR:       is given.
>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeJacobianFEM line 871
>>>>>>>>>>>>>>>>>>>> /home/salaza11/petsc/src/dm/impls/plex/plexfem.c
>>>>>>>>>>>>>>>>>>>>  [0]PETSC ERROR: [0] SNESComputeJacobian_DMLocal line
>>>>>>>>>>>>>>>>>>>> 94 /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c
>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNES user Jacobian function line
>>>>>>>>>>>>>>>>>>>> 2244 /home/salaza11/petsc/src/snes/interface/snes.c
>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian line 2203
>>>>>>>>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c
>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message
>>>>>>>>>>>>>>>>>>>> --------------------------------------------------------------
>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Signal received
>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See
>>>>>>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
>>>>>>>>>>>>>>>>>>>>  [0]PETSC ERROR: Petsc Development GIT revision:
>>>>>>>>>>>>>>>>>>>> v3.4.3-4705-gfb6b3bc  GIT Date: 2014-03-03 08:23:43 -0600
>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named
>>>>>>>>>>>>>>>>>>>> maya by salaza11 Mon Mar  3 11:49:15 2014
>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options --download-mpich
>>>>>>>>>>>>>>>>>>>> --download-scientificpython --download-triangle --download-ctetgen
>>>>>>>>>>>>>>>>>>>> --download-chaco --with-c2html=0
>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: #1 User provided function() line 0 in
>>>>>>>>>>>>>>>>>>>>  unknown file
>>>>>>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) -
>>>>>>>>>>>>>>>>>>>> process 0
>>>>>>>>>>>>>>>>>>>> [unset]: aborting job:
>>>>>>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) -
>>>>>>>>>>>>>>>>>>>> process 0
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> On Sun, Mar 2, 2014 at 7:11 PM, Matthew Knepley <
>>>>>>>>>>>>>>>>>>>> knepley at gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> On Sun, Mar 2, 2014 at 6:54 PM, Miguel Angel Salazar
>>>>>>>>>>>>>>>>>>>>> de Troya <salazardetroya at gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> Hi everybody
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> I am trying to run example ex12.c without much
>>>>>>>>>>>>>>>>>>>>>> success. I specifically run it with the command options:
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> We need to start narrowing down the differences, because
>>>>>>>>>>>>>>>>>>>>> it runs for me and in our nightly tests. So first, can
>>>>>>>>>>>>>>>>>>>>> you confirm that you are using the latest 'next'
>>>>>>>>>>>>>>>>>>>>> branch?
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>   Thanks,
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>       Matt
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>  ./ex12 -run_type test -refinement_limit 0.0
>>>>>>>>>>>>>>>>>>>>>>  -bc_type dirichlet -interpolate 0 -petscspace_order 1 -show_initial
>>>>>>>>>>>>>>>>>>>>>> -dm_plex_print_fem 1
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> And I get this output
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> Local function:
>>>>>>>>>>>>>>>>>>>>>> Vec Object: 1 MPI processes
>>>>>>>>>>>>>>>>>>>>>>   type: seq
>>>>>>>>>>>>>>>>>>>>>> 0
>>>>>>>>>>>>>>>>>>>>>> 1
>>>>>>>>>>>>>>>>>>>>>> 1
>>>>>>>>>>>>>>>>>>>>>> 2
>>>>>>>>>>>>>>>>>>>>>> 1
>>>>>>>>>>>>>>>>>>>>>> 2
>>>>>>>>>>>>>>>>>>>>>> 2
>>>>>>>>>>>>>>>>>>>>>> 3
>>>>>>>>>>>>>>>>>>>>>> Initial guess
>>>>>>>>>>>>>>>>>>>>>> Vec Object: 1 MPI processes
>>>>>>>>>>>>>>>>>>>>>>   type: seq
>>>>>>>>>>>>>>>>>>>>>> L_2 Error: 0.625
>>>>>>>>>>>>>>>>>>>>>> Residual:
>>>>>>>>>>>>>>>>>>>>>> Vec Object: 1 MPI processes
>>>>>>>>>>>>>>>>>>>>>>   type: seq
>>>>>>>>>>>>>>>>>>>>>> 0
>>>>>>>>>>>>>>>>>>>>>> 0
>>>>>>>>>>>>>>>>>>>>>> 0
>>>>>>>>>>>>>>>>>>>>>> 0
>>>>>>>>>>>>>>>>>>>>>> 0
>>>>>>>>>>>>>>>>>>>>>> 0
>>>>>>>>>>>>>>>>>>>>>> 0
>>>>>>>>>>>>>>>>>>>>>> 0
>>>>>>>>>>>>>>>>>>>>>> Initial Residual
>>>>>>>>>>>>>>>>>>>>>> Vec Object: 1 MPI processes
>>>>>>>>>>>>>>>>>>>>>>   type: seq
>>>>>>>>>>>>>>>>>>>>>> L_2 Residual: 0
>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR:
>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------
>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Caught signal number 11 SEGV:
>>>>>>>>>>>>>>>>>>>>>> Segmentation Violation, probably memory access out of range
>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger or
>>>>>>>>>>>>>>>>>>>>>> -on_error_attach_debugger
>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: or see
>>>>>>>>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X
>>>>>>>>>>>>>>>>>>>>>> to find memory corruption errors
>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: likely location of problem given in
>>>>>>>>>>>>>>>>>>>>>> stack below
>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: ---------------------  Stack Frames
>>>>>>>>>>>>>>>>>>>>>> ------------------------------------
>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the
>>>>>>>>>>>>>>>>>>>>>> stack are not available,
>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR:       INSTEAD the line number of the
>>>>>>>>>>>>>>>>>>>>>> start of the function
>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR:       is given.
>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeJacobianFEM line 867
>>>>>>>>>>>>>>>>>>>>>> /home/salaza11/petsc/src/dm/impls/plex/plexfem.c
>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian_DMLocal line
>>>>>>>>>>>>>>>>>>>>>> 94 /home/salaza11/petsc/src/snes/utils/dmlocalsnes.c
>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNES user Jacobian function line
>>>>>>>>>>>>>>>>>>>>>> 2244 /home/salaza11/petsc/src/snes/interface/snes.c
>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeJacobian line 2203
>>>>>>>>>>>>>>>>>>>>>> /home/salaza11/petsc/src/snes/interface/snes.c
>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error Message
>>>>>>>>>>>>>>>>>>>>>> ------------------------------------
>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Signal received!
>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR:
>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------
>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision:
>>>>>>>>>>>>>>>>>>>>>> v3.4.3-3453-g0a94005  GIT Date: 2014-03-02 13:12:04 -0600
>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for
>>>>>>>>>>>>>>>>>>>>>> recent updates.
>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints about
>>>>>>>>>>>>>>>>>>>>>> trouble shooting.
>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual pages.
>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR:
>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------
>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a linux-gnu-c-debug named
>>>>>>>>>>>>>>>>>>>>>> maya by salaza11 Sun Mar  2 17:00:09 2014
>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from
>>>>>>>>>>>>>>>>>>>>>> /home/salaza11/petsc/linux-gnu-c-debug/lib
>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Sun Mar  2 16:46:51
>>>>>>>>>>>>>>>>>>>>>> 2014
>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options --download-mpich
>>>>>>>>>>>>>>>>>>>>>> --download-scientificpython --download-triangle --download-ctetgen
>>>>>>>>>>>>>>>>>>>>>> --download-chaco --with-c2html=0
>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR:
>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------
>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: User provided function() line 0 in
>>>>>>>>>>>>>>>>>>>>>>  unknown file
>>>>>>>>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) -
>>>>>>>>>>>>>>>>>>>>>> process 0
>>>>>>>>>>>>>>>>>>>>>> [unset]: aborting job:
>>>>>>>>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD, 59) -
>>>>>>>>>>>>>>>>>>>>>> process 0
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> My problem is probably in my configuration. I
>>>>>>>>>>>>>>>>>>>>>> attach the configure.log. I ran ./configure like this:
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> ./configure --download-mpich
>>>>>>>>>>>>>>>>>>>>>> --download-scientificpython --download-triangle --download-ctetgen
>>>>>>>>>>>>>>>>>>>>>> --download-chaco --with-c2html=0
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> Thanks a lot in advance.
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> On Tue, Jan 28, 2014 at 10:37 AM, Matthew Knepley <
>>>>>>>>>>>>>>>>>>>>>> knepley at gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> On Tue, Jan 28, 2014 at 10:31 AM, Yaakoub El Khamra
>>>>>>>>>>>>>>>>>>>>>>> <yelkhamra at gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> If
>>>>>>>>>>>>>>>>>>>>>>>>  ./ex12 -run_type test -dim 3 -refinement_limit
>>>>>>>>>>>>>>>>>>>>>>>> 0.0125 -variable_coefficient field    -interpolate 1 -petscspace_order 2
>>>>>>>>>>>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> is for serial, any chance we can get the options to
>>>>>>>>>>>>>>>>>>>>>>>> run in parallel?
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> Just use mpiexec -n <procs>
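>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> For example (the same options as in your serial command, just launched
>>>>>>>>>>>>>>>>>>>>>>> under MPI; the process count is arbitrary):
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>   mpiexec -n 4 ./ex12 -run_type test -dim 3 -refinement_limit 0.0125 \
>>>>>>>>>>>>>>>>>>>>>>>     -variable_coefficient field -interpolate 1 -petscspace_order 2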
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>    Matt
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> Regards
>>>>>>>>>>>>>>>>>>>>>>>> Yaakoub El Khamra
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> On Fri, Jan 17, 2014 at 11:29 AM, Matthew Knepley <
>>>>>>>>>>>>>>>>>>>>>>>> knepley at gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> On Fri, Jan 17, 2014 at 11:06 AM, Jones,Martin
>>>>>>>>>>>>>>>>>>>>>>>>> Alexander <MAJones2 at mdanderson.org> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>  ------------------------------
>>>>>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com]
>>>>>>>>>>>>>>>>>>>>>>>>>> *Sent:* Friday, January 17, 2014 11:04 AM
>>>>>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander
>>>>>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov
>>>>>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running
>>>>>>>>>>>>>>>>>>>>>>>>>> ex12.c
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>    On Fri, Jan 17, 2014 at 11:00 AM,
>>>>>>>>>>>>>>>>>>>>>>>>>> Jones,Martin Alexander <MAJones2 at mdanderson.org>wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>  These examples all seem to run excepting the
>>>>>>>>>>>>>>>>>>>>>>>>>>> following command,
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> ex12 -run_type test -dim 3 -refinement_limit
>>>>>>>>>>>>>>>>>>>>>>>>>>> 0.0125 -variable_coefficient field    -interpolate 1 -petscspace_order 2
>>>>>>>>>>>>>>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> I get the following ouput:
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> ./ex12 -run_type test -dim 3 -refinement_limit
>>>>>>>>>>>>>>>>>>>>>>>>>>> 0.0125 -variable_coefficient field    -interpolate 1 -petscspace_order 2
>>>>>>>>>>>>>>>>>>>>>>>>>>> -show_initial -dm_plex_print_fem
>>>>>>>>>>>>>>>>>>>>>>>>>>> Local function:
>>>>>>>>>>>>>>>>>>>>>>>>>>> ./ex12: symbol lookup error:
>>>>>>>>>>>>>>>>>>>>>>>>>>> /opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so: undefined
>>>>>>>>>>>>>>>>>>>>>>>>>>> symbol: omp_get_num_procs
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>  This is a build problem, but it should affect
>>>>>>>>>>>>>>>>>>>>>>>>>> all the runs. Is this reproducible? Can you send configure.log? MKL is the
>>>>>>>>>>>>>>>>>>>>>>>>>> worst. If this
>>>>>>>>>>>>>>>>>>>>>>>>>> persists, I would just switch to
>>>>>>>>>>>>>>>>>>>>>>>>>> --download-f-blas-lapack.
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> Thanks. I have some advice on options
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>   --with-precision=single # I would not use this
>>>>>>>>>>>>>>>>>>>>>>>>> unless you are doing something special, like CUDA
>>>>>>>>>>>>>>>>>>>>>>>>>   --with-clanguage=C++  # I would recommend
>>>>>>>>>>>>>>>>>>>>>>>>> switching to C, the build is much faster
>>>>>>>>>>>>>>>>>>>>>>>>>   --with-mpi-dir=/usr --with-mpi4py=0
>>>>>>>>>>>>>>>>>>>>>>>>>   --with-shared-libraries --CFLAGS=-O0
>>>>>>>>>>>>>>>>>>>>>>>>> --CXXFLAGS=-O0 --with-fc=0
>>>>>>>>>>>>>>>>>>>>>>>>>   --with-etags=1                # This is
>>>>>>>>>>>>>>>>>>>>>>>>> unnecessary
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]"
>>>>>>>>>>>>>>>>>>>>>>>>> # Here is the problem, see below
>>>>>>>>>>>>>>>>>>>>>>>>>   --download-metis
>>>>>>>>>>>>>>>>>>>>>>>>>   --download-fiat=yes --download-generator
>>>>>>>>>>>>>>>>>>>>>>>>> --download-scientificpython # Get rid of these, they are obsolete
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> Your MKL needs another library for the OpenMP
>>>>>>>>>>>>>>>>>>>>>>>>> symbols. I would recommend switching to --download-f2cblaslapack,
>>>>>>>>>>>>>>>>>>>>>>>>> or you can try and find that library.
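>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> Putting that advice together, a reconfigure along these lines is what I
>>>>>>>>>>>>>>>>>>>>>>>>> have in mind (a sketch only; adapt the prefix and packages to your setup):
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>   ./configure --prefix=/home/mjonesa/local --with-clanguage=c \
>>>>>>>>>>>>>>>>>>>>>>>>>     --with-c2html=0 --download-f2cblaslapack --download-triangle --download-metis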
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>   Thanks,
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>       Matt
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>    Thanks,
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>       Matt
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>   ------------------------------
>>>>>>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com]
>>>>>>>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 6:35 PM
>>>>>>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander
>>>>>>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov
>>>>>>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running
>>>>>>>>>>>>>>>>>>>>>>>>>>> ex12.c
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>    On Thu, Jan 16, 2014 at 5:43 PM,
>>>>>>>>>>>>>>>>>>>>>>>>>>> Jones,Martin Alexander <MAJones2 at mdanderson.org>wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>  Hi, This is the next error message after
>>>>>>>>>>>>>>>>>>>>>>>>>>>> configuring and building with the triangle package when trying to run ex12
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>  This is my fault for bad defaults. I will fix.
>>>>>>>>>>>>>>>>>>>>>>>>>>> Try running
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>    ./ex12 -run_type test -refinement_limit 0.0
>>>>>>>>>>>>>>>>>>>>>>>>>>>    -bc_type dirichlet -interpolate 0 -petscspace_order 1 -show_initial
>>>>>>>>>>>>>>>>>>>>>>>>>>> -dm_plex_print_fem 1
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>  for a representative run. Then you could try 3D
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>     ex12 -run_type test -dim 3
>>>>>>>>>>>>>>>>>>>>>>>>>>> -refinement_limit 0.0125 -variable_coefficient field    -interpolate 1
>>>>>>>>>>>>>>>>>>>>>>>>>>> -petscspace_order 2 -show_initial -dm_plex_print_fem
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>  or a full run
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>    ex12 -refinement_limit 0.01 -bc_type
>>>>>>>>>>>>>>>>>>>>>>>>>>> dirichlet -interpolate -petscspace_order 1
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>    ex12 -refinement_limit 0.01 -bc_type
>>>>>>>>>>>>>>>>>>>>>>>>>>> dirichlet -interpolate -petscspace_order 2
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>  Let me know if those work.
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>    Thanks,
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>       Matt
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>   ./ex12
>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR:
>>>>>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------
>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Caught signal number 8 FPE:
>>>>>>>>>>>>>>>>>>>>>>>>>>>> Floating Point Exception,probably divide by zero
>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Try option -start_in_debugger
>>>>>>>>>>>>>>>>>>>>>>>>>>>> or -on_error_attach_debugger
>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: or see
>>>>>>>>>>>>>>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac
>>>>>>>>>>>>>>>>>>>>>>>>>>>> OS X to find memory corruption errors
>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: likely location of problem
>>>>>>>>>>>>>>>>>>>>>>>>>>>> given in stack below
>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: ---------------------  Stack
>>>>>>>>>>>>>>>>>>>>>>>>>>>> Frames ------------------------------------
>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in
>>>>>>>>>>>>>>>>>>>>>>>>>>>> the stack are not available,
>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR:       INSTEAD the line number
>>>>>>>>>>>>>>>>>>>>>>>>>>>> of the start of the function
>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR:       is given.
>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] DMPlexComputeResidualFEM
>>>>>>>>>>>>>>>>>>>>>>>>>>>> line 531 /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexfem.c
>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeFunction_DMLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>> line 63 /home/mjonesa/PETSc/petsc/src/snes/utils/dmlocalsnes.c
>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNES user function line
>>>>>>>>>>>>>>>>>>>>>>>>>>>> 2088 /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c
>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESComputeFunction line
>>>>>>>>>>>>>>>>>>>>>>>>>>>> 2076 /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c
>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESSolve_NEWTONLS line 144
>>>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/impls/ls/ls.c
>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: [0] SNESSolve line 3765
>>>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/interface/snes.c
>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: --------------------- Error
>>>>>>>>>>>>>>>>>>>>>>>>>>>> Message ------------------------------------
>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Signal received!
>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR:
>>>>>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------
>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT revision:
>>>>>>>>>>>>>>>>>>>>>>>>>>>> v3.4.3-2317-gcd0e7f7  GIT Date: 2014-01-15 20:33:42 -0600
>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html for
>>>>>>>>>>>>>>>>>>>>>>>>>>>> recent updates.
>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints
>>>>>>>>>>>>>>>>>>>>>>>>>>>> about trouble shooting.
>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual
>>>>>>>>>>>>>>>>>>>>>>>>>>>> pages.
>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR:
>>>>>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------
>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a
>>>>>>>>>>>>>>>>>>>>>>>>>>>> arch-linux2-cxx-debug named maeda by mjonesa Thu Jan 16 17:41:23 2014
>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from
>>>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/local/lib
>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16
>>>>>>>>>>>>>>>>>>>>>>>>>>>> 17:38:33 2014
>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options
>>>>>>>>>>>>>>>>>>>>>>>>>>>> --prefix=/home/mjonesa/local
>>>>>>>>>>>>>>>>>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]"
>>>>>>>>>>>>>>>>>>>>>>>>>>>> --with-c2html=0 --with-clanguage=c++ PETSC_ARCH=arch-linux2-cxx-debug
>>>>>>>>>>>>>>>>>>>>>>>>>>>> --download-triangle
>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR:
>>>>>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------
>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: User provided function() line 0
>>>>>>>>>>>>>>>>>>>>>>>>>>>> in  unknown file
>>>>>>>>>>>>>>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD,
>>>>>>>>>>>>>>>>>>>>>>>>>>>> 59) - process 0
>>>>>>>>>>>>>>>>>>>>>>>>>>>>  ------------------------------
>>>>>>>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com]
>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 4:37 PM
>>>>>>>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander
>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov
>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running
>>>>>>>>>>>>>>>>>>>>>>>>>>>> ex12.c
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>    On Thu, Jan 16, 2014 at 4:33 PM,
>>>>>>>>>>>>>>>>>>>>>>>>>>>> Jones,Martin Alexander <MAJones2 at mdanderson.org
>>>>>>>>>>>>>>>>>>>>>>>>>>>> > wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>  Hi, I have downloaded and built the dev
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> version you suggested. I think I need the triangle package to run this
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> particular case. Is there anything else that appears wrong in what I have
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> done, judging from the error messages below:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>  Great! It's running. You can reconfigure like
>>>>>>>>>>>>>>>>>>>>>>>>>>>> this:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> $PETSC_DIR/$PETSC_ARCH/conf/reconfigure-$PETSC_ARCH.py --download-triangle
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>  and then rebuild
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>     make
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>  and then rerun. You can load meshes, but it's
>>>>>>>>>>>>>>>>>>>>>>>>>>>> much easier to have triangle create them.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>    Thanks for being patient,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>       Matt
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>  [0]PETSC ERROR: --------------------- Error
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Message ------------------------------------
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: No support for this operation
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> for this object type!
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Mesh generation needs external
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> package support.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Please reconfigure with --download-triangle.!
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Petsc Development GIT
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> revision: v3.4.3-2317-gcd0e7f7  GIT Date: 2014-01-15 20:33:42 -0600
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/changes/index.html
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> for recent updates.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/faq.html for hints
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> about trouble shooting.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: See docs/index.html for manual
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> pages.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: ./ex12 on a
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> arch-linux2-cxx-debug named maeda by mjonesa Thu Jan 16 16:28:20 2014
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Libraries linked from
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/local/lib
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure run at Thu Jan 16
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 16:25:53 2014
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: Configure options
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> --prefix=/home/mjonesa/local --with-clanguage=c++ --with-c2html=0
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> --with-blas-lapack-lib="[/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_rt.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_intel_thread.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libmkl_core.so,/opt/apps/EPD/epd-7.3-1-rh5-x86_64/lib/libiomp5.so]"
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> ------------------------------------------------------------------------
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: DMPlexGenerate() line 4332 in
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plex.c
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: DMPlexCreateBoxMesh() line 600
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> in /home/mjonesa/PETSc/petsc/src/dm/impls/plex/plexcreate.c
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: CreateMesh() line 295 in
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [0]PETSC ERROR: main() line 659 in
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc/src/snes/examples/tutorials/ex12.c
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> application called MPI_Abort(MPI_COMM_WORLD,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 56) - process 0
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>  ------------------------------
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 4:06 PM
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems running
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> ex12.c
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>    On Thu, Jan 16, 2014 at 4:05 PM,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Jones,Martin Alexander <
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> MAJones2 at mdanderson.org> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>  Hi. I changed the ENV variable to the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> correct entry. when I type make ex12 I get this:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> make ex12
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> g++ -o ex12.o -c -Wall -Wwrite-strings
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> -Wno-strict-aliasing -Wno-unknown-pragmas -g   -fPIC
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/arch-linux2-cxx-debug/include
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> -I/home/mjonesa/PETSc/petsc-3.4.3/include/mpiuni
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> -D__INSDIR__=src/snes/examples/tutorials/ ex12.c
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> ex12.c:14:18: fatal error: ex12.h: No such
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> file or directory
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> compilation terminated.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> make: *** [ex12.o] Error 1
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Any help of yours is very much appreciated.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>  Yes, this relates to my 3). This is not
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> going to work for you with the release. Please see the link I sent.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>     Matt
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   ------------------------------
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 3:58 PM
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> running ex12.c
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>    On Thu, Jan 16, 2014 at 3:55 PM,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Jones,Martin Alexander <
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> MAJones2 at mdanderson.org> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>  Thanks!
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>  You built with
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> PETSC_ARCH=arch-linux2-cxx-debug
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>     Matt
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   ------------------------------
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [knepley at gmail.com]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 3:31 PM
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> running ex12.c
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>    On Thu, Jan 16, 2014 at 3:11 PM,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Jones,Martin Alexander <
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> MAJones2 at mdanderson.org> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>  Now I went to the directory where ex12.c
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> sits and just did a 'make ex12.c', which gave the following error, if this helps:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> mjonesa at maeda:~/PETSc/petsc-3.4.3/src/snes/examples/tutorials$
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> make ex12.c
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/variables:108:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscvariables: No
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> such file or directory
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/conf/rules:962:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules: No
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> such file or directory
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> make: *** No rule to make target
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> `/home/mjonesa/PETSc/petsc-3.4.3/linux-gnu-cxx-debug/conf/petscrules'.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Stop.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>  1) You would type 'make ex12'
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>  2) Either your PETSC_DIR (
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /home/mjonesa/PETSc/petsc-3.4.3) or
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> PETSC_ARCH (linux-gnu-cxx-debug)
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> environment variables
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>     do not match what you built. Please send
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> configure.log and make.log
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>  3) Since it was only recently added, if
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> you want to use the FEM functionality, you must use the development version:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> http://www.mcs.anl.gov/petsc/developers/index.html
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>    Thanks,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>        Matt
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
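A minimal shell sketch of what points 1) and 2) amount to, assuming arch-linux2-cxx-debug (the arch appearing in the compile log elsewhere in this thread) is the one that was actually built:

  export PETSC_DIR=/home/mjonesa/PETSc/petsc-3.4.3
  export PETSC_ARCH=arch-linux2-cxx-debug   # must match an existing $PETSC_DIR/<arch> directory
  cd $PETSC_DIR/src/snes/examples/tutorials
  make ex12   # the make target is 'ex12', not 'ex12.c'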
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *From:* Matthew Knepley [mailto:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> knepley at gmail.com]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Sent:* Thursday, January 16, 2014 2:48 PM
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *To:* Jones,Martin Alexander
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Cc:* petsc-users at mcs.anl.gov
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Subject:* Re: [petsc-users] Problems
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> running ex12.c
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Jan 16, 2014 at 2:35 PM,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Jones,Martin Alexander <
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> MAJones2 at mdanderson.org> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Hello, to whom it may concern,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> I am trying to run the tutorial ex12.c by
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> running 'bin/pythonscripts/PetscGenerateFEMQuadrature.py
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> dim order dim 1 laplacian dim order dim 1 boundary
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> src/snes/examples/tutorials/ex12.h'
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> but getting the following error:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> bin/pythonscripts/PetscGenerateFEMQuadrature.py dim order dim 1 laplacian
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> dim order dim 1 boundary src/snes/examples/tutorials/ex12.h
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Traceback (most recent call last):
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "bin/pythonscripts/PetscGenerateFEMQuadrature.py", line 15, in <module>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>     from FIAT.reference_element import
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> default_simplex
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> ImportError: No module named
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> FIAT.reference_element
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> I have removed the requirement of
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> generating the header file (it's now all handled in C). I thought
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> I changed the documentation everywhere
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> (including the latest tutorial slides). Can you try running
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> with 'master' (or 'next'), and point me
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> toward the old docs?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   Thanks,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>     Matt
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> What most experimenters take for granted
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> before they begin their experiments is infinitely more interesting than any
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> results to which their experiments lead.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> -- Norbert Wiener
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
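For contrast with the old PetscGenerateFEMQuadrature.py step, a sketch of how ex12 is driven once the development version is built: the discretization is configured at run time from options, with no generated header. The particular options below are illustrative and should be checked against './ex12 -help':

  cd $PETSC_DIR/src/snes/examples/tutorials
  make ex12
  ./ex12 -run_type test -bc_type dirichlet -petscspace_order 1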
>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>> *Miguel Angel Salazar de Troya*
>>>>>>>>>>>>>>>>>>>>>> Graduate Research Assistant
>>>>>>>>>>>>>>>>>>>>>> Department of Mechanical Science and Engineering
>>>>>>>>>>>>>>>>>>>>>> University of Illinois at Urbana-Champaign
>>>>>>>>>>>>>>>>>>>>>> (217) 550-2360
>>>>>>>>>>>>>>>>>>>>>> salaza11 at illinois.edu



-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener