[petsc-users] Parallel DM example for FEM

Anthony Vergottis a.vergottis at ucl.ac.uk
Mon Nov 11 06:47:38 CST 2013


I would also like to ask about something I find confusing. Are the DM objects
capable of handling all aspects of the parallel computation within a given
problem?

Let me elaborate: suppose, for example, that I pass a mesh structure
(triangular 3-node elements) of the following form:

Elem 0: N1 N5 N6
etc.
where N# is a node tag.

into a DM object and then perform a decomposition for as many processes as I
require. Once the mesh partitions have been distributed to their processes, can
I then use DM objects (or something else built into PETSc) to carry out
specific calculations on each process without having to hardcode any MPI?
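For context, this is what DMPlex is designed for: DMPlexDistribute hands the cell list to a partitioner (e.g. ParMETIS or Chaco) and migrates each piece to its process, after which local assembly needs no hand-written MPI. As a toy illustration only (pure Python, no PETSc; the function name and the block-partitioning strategy are my own, not PETSc's), here is a sketch of the kind of cell-to-rank assignment such a distribution computes:

```python
# Toy sketch (no PETSc): partition a triangle cell list across "ranks",
# the way a mesh distribution assigns cells to processes.
# Cell connectivity: cell -> 3 node tags, as in "Elem 0: N1 N5 N6".

def partition_cells(cells, nranks):
    """Assign each cell to a rank in contiguous blocks; return, per rank,
    its local cell list and the node tags it holds a copy of."""
    ncells = len(cells)
    local = []
    for r in range(nranks):
        # Contiguous block boundaries; a real graph partitioner such as
        # ParMETIS or Chaco would instead minimize the edge cut.
        lo = r * ncells // nranks
        hi = (r + 1) * ncells // nranks
        my_cells = cells[lo:hi]
        # Nodes on the inter-partition boundary appear on several ranks.
        my_nodes = sorted({n for cell in my_cells for n in cell})
        local.append((my_cells, my_nodes))
    return local

if __name__ == "__main__":
    # Four triangles sharing nodes 1..6
    cells = [(1, 5, 6), (1, 2, 5), (2, 4, 5), (2, 3, 4)]
    for rank, (my_cells, my_nodes) in enumerate(partition_cells(cells, 2)):
        print(f"rank {rank}: cells={my_cells} nodes={my_nodes}")
```

The point of the sketch is only the division of labor: once each rank holds its own cells (plus shared boundary nodes), element-level FEM assembly is purely local, and PETSc's DM/Vec/Mat machinery handles the communication for ghost values and the global solve.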

I hope I have explained this well.

Thanks again.
Adoni


On 11 November 2013 12:33, Anthony Vergottis <a.vergottis at ucl.ac.uk> wrote:

> Thanks mate that worked!
>
> Thanks.
> Adoni
>
>
>
> On 11 November 2013 12:29, Matthew Knepley <knepley at gmail.com> wrote:
>
>> On Mon, Nov 11, 2013 at 6:25 AM, Anthony Vergottis <a.vergottis at ucl.ac.uk
>> > wrote:
>>
>>> I compiled ex12, but when I run the program I get this in the terminal.
>>>
>>
>> I guess I should make better defaults. Here are the runs I use for ex12:
>>
>>
>> https://bitbucket.org/petsc/petsc/src/a965ca046084fa53248a41da989a0a62cb6266ea/config/builder.py?at=master#cl-180
>>
>> You can see that the problem here is that -petscspace_order has not been
>> set. So to test P_1 I use
>>
>>   -run_type test -refinement_limit 0.0 -bc_type dirichlet -interpolate 0
>> -petscspace_order 1 -show_initial -dm_plex_print_fem 1
>>
>> and then take away -run_type (or use -run_type full) to solve the problem.
>> SNES ex62 is the Stokes problem; its run parameters are in the same place.
>>
>>   Thanks,
>>
>>      Matt
>>
>>
>>> adoni at Adoni:~/Desktop/petsc-lat/src/snes/examples/tutorials$ make ex12
>>> /home/adoni/Desktop/petsc-lat/arch-linux2-c-debug/bin/mpicxx -o ex12.o
>>> -c -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -g
>>>  -fPIC    -I/home/adoni/Desktop/petsc-lat/include
>>> -I/home/adoni/Desktop/petsc-lat/arch-linux2-c-debug/include
>>>  /home/adoni/Desktop/petsc-lat/src/snes/examples/tutorials/ex12.c
>>> /home/adoni/Desktop/petsc-lat/arch-linux2-c-debug/bin/mpicxx -Wall
>>> -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -g  -o ex12
>>> ex12.o -Wl,-rpath,/home/adoni/Desktop/petsc-lat/arch-linux2-c-debug/lib
>>> -L/home/adoni/Desktop/petsc-lat/arch-linux2-c-debug/lib  -lpetsc
>>> -Wl,-rpath,/home/adoni/Desktop/petsc-lat/arch-linux2-c-debug/lib -lflapack
>>> -lfblas -ltriangle -lX11 -lparmetis -lmetis -lpthread -lchaco -lctetgen -lm
>>> -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/4.6
>>> -L/usr/lib/gcc/x86_64-linux-gnu/4.6 -Wl,-rpath,/usr/lib/x86_64-linux-gnu
>>> -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu
>>> -L/lib/x86_64-linux-gnu -lmpichf90 -lgfortran -lm -lgfortran -lm -lquadmath
>>> -lm -lmpichcxx -lstdc++ -ldl -lmpich -lopa -lmpl -lrt -lpthread -lgcc_s
>>> -ldl
>>> /bin/rm -f ex12.o
>>> adoni at Adoni:~/Desktop/petsc-lat/src/snes/examples/tutorials$ mpiexec
>>> -np 2 ./ex12
>>> [0]PETSC ERROR: --------------------- Error Message
>>> ------------------------------------
>>> [0]PETSC ERROR: Invalid argument!
>>> [0]PETSC ERROR: The section cell closure size 0 != dual space dimension
>>> 1!
>>> [0]PETSC ERROR:
>>> ------------------------------------------------------------------------
>>> [0]PETSC ERROR: Petsc Development GIT revision: unknown  GIT Date:
>>> unknown
>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates.
>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
>>> [0]PETSC ERROR: See docs/index.html for manual pages.
>>> [0]PETSC ERROR:
>>> ------------------------------------------------------------------------
>>> [0]PETSC ERROR: ./ex12 on a arch-linux2-c-debug named Adoni by adoni Mon
>>> Nov 11 12:21:49 2013
>>> [0]PETSC ERROR: Libraries linked from
>>> /home/adoni/Desktop/petsc-lat/arch-linux2-c-debug/lib
>>> [0]PETSC ERROR: Configure run at Fri Nov  8 20:03:26 2013
>>> [0]PETSC ERROR: Configure options --with-clanguage=c++
>>> --with-shared-libraries=1 --download-f-blas-lapack --download-mpich
>>> --download-boost --download-triangle --download-ctetgen --download-chaco
>>> --download-metis --download-parmetis
>>> [0]PETSC ERROR:
>>> ------------------------------------------------------------------------
>>> [0]PETSC ERROR: DMPlexProjectFunctionLocal() line 207 in
>>> src/dm/impls/plex/plexfem.c
>>> [0]PETSC ERROR: DMPlexProjectFunction() line 258 in
>>> src/dm/impls/plex/plexfem.c
>>> [0]PETSC ERROR: main() line 686 in
>>> /home/adoni/Desktop/petsc-lat/src/snes/examples/tutorials/ex12.c
>>> [0]PETSC ERROR: --------------------- Error Message
>>> ------------------------------------
>>> [0]PETSC ERROR: Invalid argument!
>>> [0]PETSC ERROR: The section cell closure size 0 != dual space dimension
>>> 1!
>>> [0]PETSC ERROR:
>>> ------------------------------------------------------------------------
>>> [0]PETSC ERROR: Petsc Development GIT revision: unknown  GIT Date:
>>> unknown
>>> [0]PETSC ERROR: See docs/changes/index.html for recent updates.
>>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
>>> [0]PETSC ERROR: See docs/index.html for manual pages.
>>> [0]PETSC ERROR:
>>> ------------------------------------------------------------------------
>>> [0]PETSC ERROR: ./ex12 on a arch-linux2-c-debug named Adoni by adoni Mon
>>> Nov 11 12:21:49 2013
>>> [0]PETSC ERROR: Libraries linked from
>>> /home/adoni/Desktop/petsc-lat/arch-linux2-c-debug/lib
>>> [0]PETSC ERROR: Configure run at Fri Nov  8 20:03:26 2013
>>> [0]PETSC ERROR: Configure options --with-clanguage=c++
>>> --with-shared-libraries=1 --download-f-blas-lapack --download-mpich
>>> --download-boost --download-triangle --download-ctetgen --download-chaco
>>> --download-metis --download-parmetis
>>> [0]PETSC ERROR:
>>> ------------------------------------------------------------------------
>>> [0]PETSC ERROR: DMPlexProjectFunctionLocal() line 207 in
>>> src/dm/impls/plex/plexfem.c
>>> [0]PETSC ERROR: DMPlexProjectFunction() line 258 in
>>> src/dm/impls/plex/plexfem.c
>>> [0]PETSC ERROR: main() line 686 in
>>> /home/adoni/Desktop/petsc-lat/src/snes/examples/tutorials/ex12.c
>>> application called MPI_Abort(MPI_COMM_WORLD, 62) - process 0
>>> [unset]: aborting job:
>>> application called MPI_Abort(MPI_COMM_WORLD, 62) - process 0
>>> application called MPI_Abort(MPI_COMM_WORLD, 62) - process 0
>>> [unset]: aborting job:
>>> application called MPI_Abort(MPI_COMM_WORLD, 62) - process 0
>>>
>>> This occurs for all process counts.
>>>
>>> Thanks.
>>>
>>> Adoni
>>>
>>>
>>> On 11 November 2013 12:19, Matthew Knepley <knepley at gmail.com> wrote:
>>>
>>>> On Mon, Nov 11, 2013 at 6:10 AM, Anthony Vergottis <
>>>> a.vergottis at ucl.ac.uk> wrote:
>>>>
>>>>> Dear All,
>>>>>
>>>>> I am trying to find a good example within PETSc that illustrates in some
>>>>> detail the use of DM objects within a parallel FEM framework. I have
>>>>> looked at most of the examples and have not found anything like this. Is
>>>>> there any other material available on the subject from another source,
>>>>> or have I perhaps missed something within the PETSc download?
>>>>>
>>>>
>>>> Did you look at SNES ex12? Is it not understandable?
>>>>
>>>>   Thanks,
>>>>
>>>>      Matt
>>>>
>>>>
>>>>> Thanks in advance.
>>>>>
>>>>> Regards,
>>>>> Adonis
>>>>>
>>>>
>>>>
>>>>
>>>> --
>>>> What most experimenters take for granted before they begin their
>>>> experiments is infinitely more interesting than any results to which their
>>>> experiments lead.
>>>> -- Norbert Wiener
>>>>
>>>
>>>
>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>
>

