[petsc-users] A question about DMPlexDistribute

Oxberry, Geoffrey Malcolm oxberry1 at llnl.gov
Fri Aug 12 19:49:07 CDT 2016


On Aug 12, 2016, at 5:41 PM, leejearl <leejearl at 126.com> wrote:


Hi, Matt:



> Can you verify that you are running the master branch?

cd ${PETSC_DIR}
git branch

The last command prints a list of branch names; the branch marked with an asterisk to its left is the one you are currently on.
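For example (branch names and order will vary), the output might look like:

  maint
* master

The asterisk next to "master" indicates that you are on the master branch.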

Geoff

I am not sure; how can I verify this?
I configured PETSc with this command:
"./configure --prefix=$HOME/Install/petsc-openmpi --with-mpi=/home/leejearl/Install/openmpi/gnu/1.8.4 --download-exodusii=yes --download-netcdf --with-hdf5-dir=/home/leejearl/Install/hdf5-1.8.14 --download-metis=yes".
Is there a problem with it? Can you show me your command for configuring PETSc?


Thanks

leejearl





On Aug 13, 2016, at 01:10, Matthew Knepley wrote:
On Thu, Aug 11, 2016 at 8:00 PM, leejearl <leejearl at 126.com> wrote:

Thank you for your reply. I have attached the code, grid and the error message.

cavity.c is the code file, cavity.exo is the grid, and error.dat is the error message.

The command is "mpirun -n 2 ./cavity".

Can you verify that you are running the master branch? I just ran this and got:

DM Object: 2 MPI processes
  type: plex
DM_0x84000004_0 in 2 dimensions:
  0-cells: 5253 5252
  1-cells: 10352 10350
  2-cells: 5298 (198) 5297 (198)
Labels:
  ghost: 2 strata of sizes (199, 400)
  vtk: 1 strata of sizes (4901)
  Cell Sets: 1 strata of sizes (5100)
  Face Sets: 3 strata of sizes (53, 99, 50)
  depth: 3 strata of sizes (5253, 10352, 5298)

  Thanks,

     Matt

On Aug 11, 2016, at 23:29, Matthew Knepley wrote:
On Thu, Aug 11, 2016 at 3:14 AM, leejearl <leejearl at 126.com> wrote:

Hi,
    Thank you for your reply. It helps me very much.
    But for "/petsc-3.7.2/src/ts/examples/tutorials/ex11.c", when I set the overlap to 2 levels with the command
"mpirun -n 3 ./ex11 -f annulus-20.exo -ufv_mesh_overlap 2 -physics sw", it fails with an error.
    It seems to me that setting the overlap to 2 is very common. Are there issues that I have not taken into consideration?
    Any help is appreciated.

I will check this out. I have not tested an overlap of 2 here since I generally use nearest neighbor FV methods for
unstructured stuff. I have test examples that run fine for overlap > 1. Can you send the entire error message?

If the error is not in the distribution, but rather in the analytics, that is understandable because this example is only
intended to be run using a nearest neighbor FV method, and thus might be confused if we give it two layers of ghost
cells.
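For reference, here is a minimal sketch (not the ex11.c source) of the DMPlexDistribute call under discussion; the mesh file name and the overlap value are simply the ones mentioned in this thread:

#include <petscdmplex.h>

int main(int argc, char **argv)
{
  DM             dm, dmDist = NULL;
  PetscInt       overlap = 1;   /* one layer of ghost cells; the question here is about 2 */
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
  /* Read the ExodusII mesh from this thread; PETSC_TRUE requests an interpolated mesh */
  ierr = DMPlexCreateFromFile(PETSC_COMM_WORLD, "annulus-20.exo", PETSC_TRUE, &dm);CHKERRQ(ierr);
  /* Distribute with the requested overlap; dmDist stays NULL when run on a single process */
  ierr = DMPlexDistribute(dm, overlap, NULL, &dmDist);CHKERRQ(ierr);
  if (dmDist) {ierr = DMDestroy(&dm);CHKERRQ(ierr); dm = dmDist;}
  ierr = DMViewFromOptions(dm, NULL, "-dm_view");CHKERRQ(ierr);
  ierr = DMDestroy(&dm);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

Running this with something like "mpirun -n 2 ./sketch -dm_view" should print a DM summary similar to the one shown above.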

   Matt


leejearl

On Aug 11, 2016, at 14:57, Julian Andrej wrote:
Hi,

take a look at slide 10 of [1]; it visually explains what the overlap between partitions is (i.e., how many layers of cells each process shares with its neighboring partitions).

[1] https://www.archer.ac.uk/training/virtual/files/2015/06-PETSc/slides.pdf

On Thu, Aug 11, 2016 at 8:48 AM, leejearl <leejearl at 126.com> wrote:
Hi, all:
    I want to use PETSc to build my FVM code. Now, I have a question about
the function DMPlexDistribute(DM dm, PetscInt overlap, PetscSF *sf, DM *dmOverlap).

    In the example "/petsc-3.7.2/src/ts/examples/tutorials/ex11.c", when I set the overlap
to 0 or 1, it works well. But if I set the overlap to 2, it fails.
    I am confused about the value of overlap. Can it be set to 2? What is the meaning of
the parameter overlap?
    Any help is appreciated!

leejearl








--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener




--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

