[petsc-users] A question about DMPlexDistribute
leejearl
leejearl at 126.com
Fri Aug 12 19:41:31 CDT 2016
Hi, Matt:
> Can you verify that you are running the master branch?
I am not sure. How can I verify this?
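(My guess is that, if PETSc was obtained as a git clone, running "git branch" or
"git describe" inside PETSC_DIR would show whether it is the master branch, but
I am not sure that applies to my installation.)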
I configured PETSc with the following command:
"./configure --prefix=$HOME/Install/petsc-openmpi
--with-mpi=/home/leejearl/Install/openmpi/gnu/1.8.4
--download-exodusii=yes --download-netcdf
--with-hdf5-dir=/home/leejearl/Install/hdf5-1.8.14 --download-metis=yes".
Is there a problem with this configuration? Could you show me the command you use to configure PETSc?
Thanks
leejearl
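
P.S. To make sure we are talking about the same call: below is a minimal sketch
of how I understand DMPlexDistribute is meant to be used with a nonzero overlap
(this is only my reading, not the attached cavity.c; the mesh-reading call and
error handling are simplified).

#include <petscdmplex.h>

int main(int argc, char **argv)
{
  DM             dm, dmDist = NULL;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
  /* Read the Exodus mesh; interpolate so faces exist for the FV discretization */
  ierr = DMPlexCreateFromFile(PETSC_COMM_WORLD, "cavity.exo", PETSC_TRUE, &dm);CHKERRQ(ierr);
  /* overlap = number of layers of cells shared between neighboring ranks:
     0 = none, 1 = enough for a nearest-neighbor FV scheme, 2 = two layers */
  ierr = DMPlexDistribute(dm, 1, NULL, &dmDist);CHKERRQ(ierr);
  if (dmDist) {                  /* on more than one rank a new DM is returned */
    ierr = DMDestroy(&dm);CHKERRQ(ierr);
    dm   = dmDist;
  }
  ierr = DMView(dm, PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);
  ierr = DMDestroy(&dm);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

My understanding is that overlap 1 already covers a nearest-neighbor (face
neighbor) FV stencil, and 2 would only be needed for stencils that reach one
cell further, which is what I want to try.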
On Aug 13, 2016 at 01:10, Matthew Knepley wrote:
> On Thu, Aug 11, 2016 at 8:00 PM, leejearl <leejearl at 126.com> wrote:
>
> Thank you for your reply. I have attached the code, the grid, and the
> error message.
>
> cavity.c is the code file, cavity.exo is the grid, and error.dat
> is the error message.
>
> The command is "mpirun -n 2 ./cavity".
>
>
> Can you verify that you are running the master branch? I just ran this
> and got
>
> DM Object: 2 MPI processes
> type: plex
> DM_0x84000004_0 in 2 dimensions:
> 0-cells: 5253 5252
> 1-cells: 10352 10350
> 2-cells: 5298 (198) 5297 (198)
> Labels:
> ghost: 2 strata of sizes (199, 400)
> vtk: 1 strata of sizes (4901)
> Cell Sets: 1 strata of sizes (5100)
> Face Sets: 3 strata of sizes (53, 99, 50)
> depth: 3 strata of sizes (5253, 10352, 5298)
>
> Thanks,
>
> Matt
>
> On Aug 11, 2016 at 23:29, Matthew Knepley wrote:
>> On Thu, Aug 11, 2016 at 3:14 AM, leejearl <leejearl at 126.com> wrote:
>>
>> Hi,
>> Thank you for your reply. It helps me very much.
>> But for "/petsc-3.7.2/src/ts/examples/tutorials/ex11.c",
>> when I set the overlap to 2 levels with the command
>> "mpirun -n 3 ./ex11 -f annulus-20.exo -ufv_mesh_overlap 2
>> -physics sw", it hits an error.
>> It seems to me that setting the overlap to 2 is quite common.
>> Are there issues that I have not taken into consideration?
>> Any help is appreciated.
>>
>> I will check this out. I have not tested an overlap of 2 here
>> since I generally use nearest neighbor FV methods for
>> unstructured stuff. I have test examples that run fine for
>> overlap > 1. Can you send the entire error message?
>>
>> If the error is not in the distribution, but rather in the
>> analytics, that is understandable because this example is only
>> intended to be run using a nearest neighbor FV method, and thus
>> might be confused if we give it two layers of ghost
>> cells.
>>
>> Matt
>>
>>
>> leejearl
>>
>>
>> On Aug 11, 2016 at 14:57, Julian Andrej wrote:
>>> Hi,
>>>
>>> take a look at slide 10 of [1]; it visually explains
>>> what the overlap between partitions is.
>>>
>>> [1]
>>> https://www.archer.ac.uk/training/virtual/files/2015/06-PETSc/slides.pdf
>>>
>>> On Thu, Aug 11, 2016 at 8:48 AM, leejearl <leejearl at 126.com> wrote:
>>>
>>> Hi, all:
>>> I want to use PETSc to build my FVM code. Now I
>>> have a question about
>>> the function DMPlexDistribute(DM dm, PetscInt overlap,
>>> PetscSF *sf, DM *dmOverlap).
>>>
>>> In the example
>>> "/petsc-3.7.2/src/ts/examples/tutorials/ex11.c", when I
>>> set the overlap
>>> to 0 or 1, it works well. But if I set the overlap to
>>> 2, it runs into a problem.
>>> I am confused about the value of overlap. Can it be
>>> set to 2? What is the meaning of
>>> the parameter overlap?
>>> Any help is appreciated!
>>>
>>> leejearl
>>>
>>>
>>>
>>>
>>
>>
>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to
>> which their experiments lead.
>> -- Norbert Wiener
>
>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which
> their experiments lead.
> -- Norbert Wiener