[petsc-users] A question about DMPlexDistribute
leejearl
leejearl at 126.com
Mon Aug 15 21:28:11 CDT 2016
Thank you for all your help. I have tried reinstalling PETSc many
times, but the error still exists, and I cannot find the reason.
I give some more details in this letter.
1> The source code was downloaded from the PETSc website; the
version is 3.7.2.
2> Configure:
>export PETSC_DIR=./
>export PETSC_ARCH=arch
>./configure --prefix=$HOME/Install/petsc
--with-mpi-dir=/home/leejearl/Install/mpich_3.1.4/gnu
--download-exodusii=../externalpackages/exodus-5.24.tar.bz2
--download-netcdf=../externalpackages/netcdf-4.3.2.tar.gz
--download-hdf5=../externalpackages/hdf5-1.8.12.tar.gz
--download-metis=../externalpackages/git.metis.tar.gz
--download-parmetis=yes
3> The installation process completed without errors.
4> After the installation, I added the following statements to the file
~/.bashrc:
export PETSC_ARCH=""
export PETSC_DIR=$HOME/Install/petsc/
I would like some help with the following:
1> Are there any problems with my installation?
2> Can anyone show me a simple code in which the overlap value passed
to the DMPlexDistribute function is greater than 1? (A rough sketch of
what I mean is shown after the attachment list below.)
3> I attach the code, makefile, grid, and error messages again; I hope
someone can help me figure out the problem.
3.1> code: cavity.c
3.2> makefile: makefile
3.3> grid: cavity.exo
3.4> error messages: error.dat
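
For reference, the core of cavity.c looks roughly like the sketch below
(simplified from the attached file, which is the authoritative version;
it reads the attached cavity.exo and asks DMPlexDistribute for an
overlap of 2, with error checking abbreviated):

#include <petscdmplex.h>

int main(int argc, char **argv)
{
  DM             dm, dmDist = NULL;
  PetscInt       overlap = 2;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
  /* Read the mesh from the Exodus file; PETSC_TRUE interpolates the
     mesh, i.e. creates the intermediate faces and edges. */
  ierr = DMPlexCreateFromFile(PETSC_COMM_WORLD, "cavity.exo", PETSC_TRUE, &dm);CHKERRQ(ierr);
  ierr = PetscPrintf(PETSC_COMM_SELF, "overlap = %d\n", (int) overlap);CHKERRQ(ierr);
  /* Distribute the mesh with two layers of overlapping (ghost) cells. */
  ierr = DMPlexDistribute(dm, overlap, NULL, &dmDist);CHKERRQ(ierr);
  if (dmDist) {ierr = DMDestroy(&dm);CHKERRQ(ierr); dm = dmDist;}
  ierr = DMView(dm, PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);
  ierr = DMDestroy(&dm);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}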
It is very strange that there is no error message when I run it
using "mpirun -n 3 ./cavity", but when I run it using "mpirun -n 2
./cavity", the errors happen.
The error messages are shown in the file error.dat.
Any help is appreciated.
On 2016-08-13 09:04, leejearl wrote:
>
> Thank you for your reply. The source code I have used is from the
> PETSc website, not from the git repository.
>
> I will test with the code from the git repository.
>
>
> leejearl
>
>
> On 2016-08-13 08:49, Oxberry, Geoffrey Malcolm wrote:
>>
>>> On Aug 12, 2016, at 5:41 PM, leejearl <leejearl at 126.com> wrote:
>>>
>>> Hi, Matt:
>>>
>>>
>>>
>>> > Can you verify that you are running the master branch?
>>
>> cd ${PETSC_DIR}
>> git branch
>>
>> The last command returns a list of branch names; the branch with an
>> asterisk to the left of it is the branch you are currently on.
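>>
>> For example, the output might look like this (hypothetical branch
>> names; yours will differ):
>>
>>   maint
>> * master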
>>
>> Geoff
>>
>>> I am not sure; how can I verify this?
>>> I configured PETSc with this command:
>>> "./configure --prefix=$HOME/Install/petsc-openmpi
>>> --with-mpi=/home/leejearl/Install/openmpi/gnu/1.8.4
>>> --download-exodusii=yes --download-netcdf
>>> --with-hdf5-dir=/home/leejearl/Install/hdf5-1.8.14
>>> --download-metis=yes".
>>> Is there some problem? Can you show me your command for configuring
>>> PETSc?
>>>
>>>
>>> Thanks
>>>
>>> leejearl
>>>
>>>
>>>
>>>
>>>
>>> On 2016-08-13 01:10, Matthew Knepley wrote:
>>>> On Thu, Aug 11, 2016 at 8:00 PM, leejearl <leejearl at 126.com> wrote:
>>>>
>>>> Thank you for your reply. I have attached the code, grid, and
>>>> error message.
>>>>
>>>> cavity.c is the code file, cavity.exo is the grid, and
>>>> error.dat is the error message.
>>>>
>>>> The command is "mpirun -n 2 ./cavity".
>>>>
>>>>
>>>> Can you verify that you are running the master branch? I just ran
>>>> this and got
>>>>
>>>> DM Object: 2 MPI processes
>>>> type: plex
>>>> DM_0x84000004_0 in 2 dimensions:
>>>> 0-cells: 5253 5252
>>>> 1-cells: 10352 10350
>>>> 2-cells: 5298 (198) 5297 (198)
>>>> Labels:
>>>> ghost: 2 strata of sizes (199, 400)
>>>> vtk: 1 strata of sizes (4901)
>>>> Cell Sets: 1 strata of sizes (5100)
>>>> Face Sets: 3 strata of sizes (53, 99, 50)
>>>> depth: 3 strata of sizes (5253, 10352, 5298)
>>>>
>>>> Thanks,
>>>>
>>>> Matt
>>>>
>>>>> On 2016-08-11 23:29, Matthew Knepley wrote:
>>>>> On Thu, Aug 11, 2016 at 3:14 AM, leejearl <leejearl at 126.com> wrote:
>>>>>
>>>>> Hi,
>>>>> Thank you for your reply. It helps me very much.
>>>>> But for
>>>>> "/petsc-3.7.2/src/ts/examples/tutorials/ex11.c", when I
>>>>> set the overlap to 2 levels with the command
>>>>> "mpirun -n 3 ./ex11 -f annulus-20.exo -ufv_mesh_overlap 2
>>>>> -physics sw", I get an error.
>>>>> It seems to me that setting the overlap to 2 is very
>>>>> common. Are there issues that I have not taken into
>>>>> consideration?
>>>>> Any help is appreciated.
>>>>>
>>>>> I will check this out. I have not tested an overlap of 2 here
>>>>> since I generally use nearest neighbor FV methods for
>>>>> unstructured stuff. I have test examples that run fine for
>>>>> overlap > 1. Can you send the entire error message?
>>>>>
>>>>> If the error is not in the distribution, but rather in the
>>>>> analytics, that is understandable because this example is only
>>>>> intended to be run using a nearest neighbor FV method, and
>>>>> thus might be confused if we give it two layers of ghost
>>>>> cells.
>>>>>
>>>>> Matt
>>>>>
>>>>>
>>>>> leejearl
>>>>>
>>>>>
>>>>> On 2016-08-11 14:57, Julian Andrej wrote:
>>>>>> Hi,
>>>>>>
>>>>>> take a look at slide 10 of [1]; it visually explains
>>>>>> what the overlap between partitions is.
>>>>>>
>>>>>> [1]
>>>>>> https://www.archer.ac.uk/training/virtual/files/2015/06-PETSc/slides.pdf
>>>>>>
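>>>>>> As a rough 1D illustration of the idea (my own sketch, not from
>>>>>> the slides): if cells c0..c7 are split between two ranks, then
>>>>>>
>>>>>>   overlap 0: rank 0 has c0-c3, rank 1 has c4-c7 (no shared cells)
>>>>>>   overlap 1: rank 0 also gets c4 as a ghost cell, rank 1 also gets c3
>>>>>>   overlap 2: rank 0 also gets c4 and c5, rank 1 also gets c2 and c3
>>>>>>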
>>>>>> On Thu, Aug 11, 2016 at 8:48 AM, leejearl
>>>>>> <leejearl at 126.com> wrote:
>>>>>>
>>>>>> Hi, all:
>>>>>> I want to use PETSc to build my FVM code. Now I have a
>>>>>> question about the function DMPlexDistribute(DM dm,
>>>>>> PetscInt overlap, PetscSF *sf, DM *dmOverlap).
>>>>>>
>>>>>> In the example
>>>>>> "/petsc-3.7.2/src/ts/examples/tutorials/ex11.c", when I set
>>>>>> the overlap to 0 or 1, it works well. But if I set the
>>>>>> overlap to 2, it runs into a problem.
>>>>>> I am confused about the value of overlap. Can it be set
>>>>>> to 2? What is the meaning of the parameter overlap?
>>>>>> Any help is appreciated!
>>>>>>
>>>>>> leejearl
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>> --
>>>>> What most experimenters take for granted before they begin
>>>>> their experiments is infinitely more interesting than any
>>>>> results to which their experiments lead.
>>>>> -- Norbert Wiener
>>>>
>>>>
>>>>
>>>>
>>>> --
>>>> What most experimenters take for granted before they begin their
>>>> experiments is infinitely more interesting than any results to
>>>> which their experiments lead.
>>>> -- Norbert Wiener
>>>
>>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: cavity.c
Type: text/x-csrc
Size: 1918 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20160816/5ec139d7/attachment-0001.c>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: cavity.exo
Type: application/octet-stream
Size: 344931 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20160816/5ec139d7/attachment-0001.obj>
-------------- next part --------------
ALL: cavity
CFLAGS =
FFLAGS =
CPPFLAGS =
FPPFLAGS =
CLEANFILES = cavity
include ${PETSC_DIR}/lib/petsc/conf/variables
include ${PETSC_DIR}/lib/petsc/conf/rules
cavity: cavity.o chkopts
	${CLINKER} -o cavity cavity.o ${PETSC_LIB}
	${RM} cavity.o
-------------- next part --------------
$ make
$ mpirun -n 3 ./cavity
overlap = 2
overlap = 2
overlap = 2
$ mpirun -n 2 ./cavity
overlap = 2
overlap = 2
[1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[1]PETSC ERROR: Argument out of range
[1]PETSC ERROR: key <= 0
[1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[1]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016
[1]PETSC ERROR: ./cavity on a arch named leejearl by leejearl Tue Aug 16 10:19:57 2016
[1]PETSC ERROR: Configure options --prefix=/home/leejearl/Install/petsc --with-mpi-dir=/home/leejearl/Install/mpich_3.1.4/gnu --download-exodusii=../externalpackages/exodus-5.24.tar.bz2 --download-netcdf=../externalpackages/netcdf-4.3.2.tar.gz --download-hdf5=../externalpackages/hdf5-1.8.12.tar.gz --download-metis=../externalpackages/git.metis.tar.gz --download-parmetis=yes
[1]PETSC ERROR: #1 PetscTableAdd() line 45 in ./include/petscctable.h
[1]PETSC ERROR: #2 PetscSFSetGraph() line 347 in /home/leejearl/Software/petsc/petsc-3.7.2/src/vec/is/sf/interface/sf.c
[1]PETSC ERROR: #3 DMLabelGather() line 1092 in /home/leejearl/Software/petsc/petsc-3.7.2/src/dm/label/dmlabel.c
[1]PETSC ERROR: #4 DMPlexPartitionLabelPropagate() line 1633 in /home/leejearl/Software/petsc/petsc-3.7.2/src/dm/impls/plex/plexpartition.c
[1]PETSC ERROR: #5 DMPlexCreateOverlap() line 615 in /home/leejearl/Software/petsc/petsc-3.7.2/src/dm/impls/plex/plexdistribute.c
[1]PETSC ERROR: #6 DMPlexDistributeOverlap() line 1729 in /home/leejearl/Software/petsc/petsc-3.7.2/src/dm/impls/plex/plexdistribute.c
[1]PETSC ERROR: #7 DMPlexDistribute() line 1635 in /home/leejearl/Software/petsc/petsc-3.7.2/src/dm/impls/plex/plexdistribute.c
[1]PETSC ERROR: #8 main() line 60 in /home/leejearl/Desktop/PETSc/gks_cavity/cavity.c
[1]PETSC ERROR: No PETSc Option Table entries
[1]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov----------
application called MPI_Abort(MPI_COMM_WORLD, 63) - process 1
===================================================================================
= BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
= PID 24134 RUNNING AT leejearl
= EXIT CODE: 63
= CLEANING UP REMAINING PROCESSES
= YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
===================================================================================