<div dir="ltr"><div dir="ltr">On Wed, Apr 8, 2020 at 4:26 PM Danyang Su <<a href="mailto:danyang.su@gmail.com">danyang.su@gmail.com</a>> wrote:<br></div><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div lang="EN-CA"><div class="gmail-m_1766518872680001549WordSection1"><p class="MsoNormal"><u></u> <u></u></p><p class="MsoNormal"><u></u> <u></u></p><div style="border-right:none;border-bottom:none;border-left:none;border-top:1pt solid rgb(181,196,223);padding:3pt 0cm 0cm"><p class="MsoNormal"><b><span style="font-size:12pt;color:black">From: </span></b><span style="font-size:12pt;color:black">Matthew Knepley <<a href="mailto:knepley@gmail.com" target="_blank">knepley@gmail.com</a>><br><b>Date: </b>Wednesday, April 8, 2020 at 12:50 PM<br><b>To: </b>Danyang Su <<a href="mailto:danyang.su@gmail.com" target="_blank">danyang.su@gmail.com</a>><br><b>Cc: </b>PETSc <<a href="mailto:petsc-users@mcs.anl.gov" target="_blank">petsc-users@mcs.anl.gov</a>><br><b>Subject: </b>Re: [petsc-users] DMPlex partition problem<u></u><u></u></span></p></div><div><p class="MsoNormal"><u></u> <u></u></p></div><div><div><p class="MsoNormal">On Wed, Apr 8, 2020 at 3:22 PM Danyang Su <<a href="mailto:danyang.su@gmail.com" target="_blank">danyang.su@gmail.com</a>> wrote:<u></u><u></u></p></div><div><blockquote style="border-top:none;border-right:none;border-bottom:none;border-left:1pt solid rgb(204,204,204);padding:0cm 0cm 0cm 6pt;margin-left:4.8pt;margin-right:0cm"><div><div><p class="MsoNormal">Hi Matt,<u></u><u></u></p><p class="MsoNormal"> <u></u><u></u></p><p class="MsoNormal">Here is something pretty interesting. I modified ex1.c file with output of number of nodes and cells (as shown below) . 
And I also changed the stencil size to 1.<u></u><u></u></p><p class="MsoNormal"> <u></u><u></u></p><p class="MsoNormal"> /* get coordinates and section */<u></u><u></u></p><p class="MsoNormal"> ierr = DMGetCoordinatesLocal(*dm,&gc);CHKERRQ(ierr);<u></u><u></u></p><p class="MsoNormal"> ierr = DMGetCoordinateDM(*dm,&cda);CHKERRQ(ierr);<u></u><u></u></p><p class="MsoNormal"> ierr = DMGetSection(cda,&cs);CHKERRQ(ierr);<u></u><u></u></p><p class="MsoNormal"> ierr = PetscSectionGetChart(cs,&istart,&iend);CHKERRQ(ierr);<u></u><u></u></p><p class="MsoNormal"> <u></u><u></u></p><p class="MsoNormal"> num_nodes = iend-istart;<u></u><u></u></p><p class="MsoNormal"> num_cells = istart;<u></u><u></u></p><p class="MsoNormal"> <u></u><u></u></p><p class="MsoNormal"> /* Output rank and processor information */<u></u><u></u></p><p class="MsoNormal" style="text-indent:9.75pt">printf("rank %d: of nprcs: %d, num_nodes %d, num_cess %d\n", rank, size, num_nodes, num_cells); <u></u><u></u></p><p class="MsoNormal"> <u></u><u></u></p><p class="MsoNormal"> <u></u><u></u></p><p class="MsoNormal">If I compile the code using ‘make ex1’ and then run the test using ‘<span class="gmail-m_1766518872680001549gmail-m-6754249856326310125s3">mpiexec -n 2 ./ex1 -filename basin2layer.exo</span>’, I get the same problem as the modified ex1f90 code I sent.<u></u><u></u></p><p class="gmail-m_1766518872680001549gmail-m-6754249856326310125p1"><span class="gmail-m_1766518872680001549gmail-m-6754249856326310125s1"><b><span style="font-family:"Segoe UI Symbol",sans-serif">➜</span></b></span><span class="gmail-m_1766518872680001549gmail-m-6754249856326310125apple-converted-space"><b><span style="color:rgb(57,192,38)"> </span></b></span><span class="gmail-m_1766518872680001549gmail-m-6754249856326310125s2"><b>tests</b></span><span class="gmail-m_1766518872680001549gmail-m-6754249856326310125s3"> mpiexec -n 2 ./ex1 -filename basin2layer.exo</span><span 
class="gmail-m_1766518872680001549gmail-m-6754249856326310125apple-converted-space"> </span><u></u><u></u></p><p class="gmail-m_1766518872680001549gmail-m-6754249856326310125p1"><span class="gmail-m_1766518872680001549gmail-m-6754249856326310125s3">rank 1: of nprcs: 2, num_nodes 699, num_cess 824</span><u></u><u></u></p><p class="gmail-m_1766518872680001549gmail-m-6754249856326310125p1"><span class="gmail-m_1766518872680001549gmail-m-6754249856326310125s3">rank 0: of nprcs: 2, num_nodes 699, num_cess 824</span><u></u><u></u></p></div></div></blockquote><div><p class="MsoNormal">Ah, I was not looking closely. You are asking for a cell overlap of 1 in the partition. That is why these numbers sum to more than<u></u><u></u></p></div><div><p class="MsoNormal">the total in the mesh. Do you want a cell overlap of 1?<u></u><u></u></p><p class="MsoNormal"><u></u> <u></u></p><p class="MsoNormal">Yes, I need cell overlap of 1 in some circumstances. The mesh has two layers of cells with 412 cells per layer and three layers of nodes with 233 nodes per layer. The number of cells looks good to me. I am confused why the same code generates pretty different partitions. If I set the stencil to 0, I get the following results. 
The first method looks good and the second one is not a good choice, with many more ghost nodes.<u></u><u></u></p><p class="gmail-m_1766518872680001549gmail-m-6754249856326310125p1"><span class="gmail-m_1766518872680001549gmail-m-6754249856326310125s1"><b><span style="font-family:"Segoe UI Symbol",sans-serif">➜</span></b></span><span class="gmail-m_1766518872680001549gmail-m-6754249856326310125apple-converted-space"><b><span style="color:rgb(57,192,38)"> </span></b></span><span class="gmail-m_1766518872680001549gmail-m-6754249856326310125s2"><b>petsc-3.13.0</b></span><span class="gmail-m_1766518872680001549gmail-m-6754249856326310125s3"> make -f ./gmakefile test globsearch="dm_impls_plex_tests-ex1_cylinder" EXTRA_OPTIONS="-filename ./basin2layer.exo -dm_view hdf5:$PWD/mesh.h5 -dm_partition_view" NP=2</span><u></u><u></u></p><p class="MsoNormal"># > rank 1: of nprcs: 2, num_nodes 354, num_cess 392<u></u><u></u></p><p class="MsoNormal"># > rank 0: of nprcs: 2, num_nodes 384, num_cess 432<u></u><u></u></p><p class="MsoNormal"><u></u> <u></u></p><p class="MsoNormal"><span style="font-family:"Segoe UI Symbol",sans-serif">➜</span> tests mpiexec -n 2 ./ex1 -filename basin2layer.exo<u></u><u></u></p><p class="MsoNormal">rank 0: of nprcs: 2, num_nodes 466, num_cess 412<u></u><u></u></p><p class="MsoNormal">rank 1: of nprcs: 2, num_nodes 466, num_cess 412</p></div></div></div></div></div></blockquote><div><br></div><div>I think this might just be a confusion over interpretation. Here is how partitioning works:</div><div><br></div><div> 1) We partition the mesh cells using ParMetis, Chaco, etc.</div><div><br></div><div> 2) We move those cells (and closures) to the correct processes</div><div><br></div><div> 3) If you ask for overlap, we mark a layer of adjacent cells on remote processes and move them to each process</div><div><br></div><div>The original partitions are the same. Then we add extra cells, and their closures, to each partition. 
This is what you are asking for.</div><div>You would get the same answer with GMsh if it gave you an overlap region.</div><div><br></div><div> Thanks,</div><div><br></div><div> Matt</div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div lang="EN-CA"><div class="gmail-m_1766518872680001549WordSection1"><div><div><div><p class="MsoNormal">Thanks,<u></u><u></u></p><p class="MsoNormal">Danyang<u></u><u></u></p><p class="MsoNormal"><u></u> <u></u></p></div><div><p class="MsoNormal"><u></u> <u></u></p></div><div><p class="MsoNormal"> Thanks,<u></u><u></u></p></div><div><p class="MsoNormal"><u></u> <u></u></p></div><div><p class="MsoNormal"> Matt<u></u><u></u></p></div><blockquote style="border-top:none;border-right:none;border-bottom:none;border-left:1pt solid rgb(204,204,204);padding:0cm 0cm 0cm 6pt;margin-left:4.8pt;margin-right:0cm"><div><div><p class="gmail-m_1766518872680001549gmail-m-6754249856326310125p1"><span class="gmail-m_1766518872680001549gmail-m-6754249856326310125s1"><b><span style="font-family:"Segoe UI Symbol",sans-serif">➜</span></b></span><span class="gmail-m_1766518872680001549gmail-m-6754249856326310125apple-converted-space"><b><span style="color:rgb(57,192,38)"> </span></b></span><span class="gmail-m_1766518872680001549gmail-m-6754249856326310125s2"><b>tests</b></span><span class="gmail-m_1766518872680001549gmail-m-6754249856326310125s3"> mpiexec -n 4 ./ex1 -filename basin2layer.exo</span><u></u><u></u></p><p class="gmail-m_1766518872680001549gmail-m-6754249856326310125p1"><span class="gmail-m_1766518872680001549gmail-m-6754249856326310125s3">rank 1: of nprcs: 4, num_nodes 432, num_cess 486</span><u></u><u></u></p><p class="gmail-m_1766518872680001549gmail-m-6754249856326310125p1"><span class="gmail-m_1766518872680001549gmail-m-6754249856326310125s3">rank 0: of nprcs: 4, num_nodes 405, num_cess 448</span><u></u><u></u></p><p 
class="gmail-m_1766518872680001549gmail-m-6754249856326310125p1"><span class="gmail-m_1766518872680001549gmail-m-6754249856326310125s3">rank 2: of nprcs: 4, num_nodes 411, num_cess 464</span><u></u><u></u></p><p class="gmail-m_1766518872680001549gmail-m-6754249856326310125p1"><span class="gmail-m_1766518872680001549gmail-m-6754249856326310125s3">rank 3: of nprcs: 4, num_nodes 420, num_cess 466</span><u></u><u></u></p><p class="MsoNormal"> <u></u><u></u></p><p class="MsoNormal">However, if I compile and run the code using the script you shared, I get reasonable results.<u></u><u></u></p><p class="gmail-m_1766518872680001549gmail-m-6754249856326310125p1"><span class="gmail-m_1766518872680001549gmail-m-6754249856326310125s1"><b><span style="font-family:"Segoe UI Symbol",sans-serif">➜</span></b></span><span class="gmail-m_1766518872680001549gmail-m-6754249856326310125apple-converted-space"><b><span style="color:rgb(57,192,38)"> </span></b></span><span class="gmail-m_1766518872680001549gmail-m-6754249856326310125s2"><b>petsc-3.13.0</b></span><span class="gmail-m_1766518872680001549gmail-m-6754249856326310125s3"> make -f ./gmakefile test globsearch="dm_impls_plex_tests-ex1_cylinder" EXTRA_OPTIONS="-filename ./basin2layer.exo -dm_view hdf5:$PWD/mesh.h5 -dm_partition_view" NP=2</span><u></u><u></u></p><p class="MsoNormal"><span style="font-size:8.5pt;font-family:Menlo;color:black"># > rank 0: of nprcs: 2, num_nodes 429, num_cess 484</span><u></u><u></u></p><p class="MsoNormal"><span style="font-size:8.5pt;font-family:Menlo;color:black"># > rank 1: of nprcs: 2, num_nodes 402, num_cess 446</span><u></u><u></u></p><p class="MsoNormal"> <u></u><u></u></p><p class="gmail-m_1766518872680001549gmail-m-6754249856326310125p1"><span class="gmail-m_1766518872680001549gmail-m-6754249856326310125s1"><b><span style="font-family:"Segoe UI Symbol",sans-serif">➜</span></b></span><span class="gmail-m_1766518872680001549gmail-m-6754249856326310125apple-converted-space"><b><span 
style="color:rgb(57,192,38)"> </span></b></span><span class="gmail-m_1766518872680001549gmail-m-6754249856326310125s2"><b>petsc-3.13.0</b></span><span class="gmail-m_1766518872680001549gmail-m-6754249856326310125s3"> make -f ./gmakefile test globsearch="dm_impls_plex_tests-ex1_cylinder" EXTRA_OPTIONS="-filename ./basin2layer.exo -dm_view hdf5:$PWD/mesh.h5 -dm_partition_view" NP=4</span><u></u><u></u></p><p class="MsoNormal"><span style="font-size:8.5pt;font-family:Menlo;color:black"># > rank 1: of nprcs: 4, num_nodes 246, num_cess 260</span><u></u><u></u></p><p class="MsoNormal"><span style="font-size:8.5pt;font-family:Menlo;color:black"># > rank 2: of nprcs: 4, num_nodes 264, num_cess 274</span><u></u><u></u></p><p class="MsoNormal"><span style="font-size:8.5pt;font-family:Menlo;color:black"># > rank 3: of nprcs: 4, num_nodes 264, num_cess 280</span><u></u><u></u></p><p class="MsoNormal"><span style="font-size:8.5pt;font-family:Menlo;color:black"># > rank 0: of nprcs: 4, num_nodes 273, num_cess 284</span><u></u><u></u></p><p class="MsoNormal" style="text-indent:9.75pt"> <u></u><u></u></p><p class="MsoNormal">Is there some difference in compiling or runtime options that cause the difference? 
Would you please check if you can reproduce the same problem using the modified ex1.c?<u></u><u></u></p><p class="MsoNormal"> <u></u><u></u></p><p class="MsoNormal">Thanks,<u></u><u></u></p><p class="MsoNormal"> <u></u><u></u></p><p class="MsoNormal">Danyang<u></u><u></u></p><p class="MsoNormal"> <u></u><u></u></p><div style="border-right:none;border-bottom:none;border-left:none;border-top:1pt solid rgb(181,196,223);padding:3pt 0cm 0cm"><p class="MsoNormal"><b><span style="font-size:12pt;color:black">From: </span></b><span style="font-size:12pt;color:black">Danyang Su <<a href="mailto:danyang.su@gmail.com" target="_blank">danyang.su@gmail.com</a>><br><b>Date: </b>Wednesday, April 8, 2020 at 9:37 AM<br><b>To: </b>Matthew Knepley <<a href="mailto:knepley@gmail.com" target="_blank">knepley@gmail.com</a>><br><b>Cc: </b>PETSc <<a href="mailto:petsc-users@mcs.anl.gov" target="_blank">petsc-users@mcs.anl.gov</a>><br><b>Subject: </b>Re: [petsc-users] DMPlex partition problem</span><u></u><u></u></p></div><div><p class="MsoNormal"> <u></u><u></u></p></div><p class="MsoNormal"> <u></u><u></u></p><p class="MsoNormal"> <u></u><u></u></p><div style="border-right:none;border-bottom:none;border-left:none;border-top:1pt solid rgb(181,196,223);padding:3pt 0cm 0cm"><p class="MsoNormal"><b><span style="font-size:12pt;color:black">From: </span></b><span style="font-size:12pt;color:black">Matthew Knepley <<a href="mailto:knepley@gmail.com" target="_blank">knepley@gmail.com</a>><br><b>Date: </b>Wednesday, April 8, 2020 at 9:20 AM<br><b>To: </b>Danyang Su <<a href="mailto:danyang.su@gmail.com" target="_blank">danyang.su@gmail.com</a>><br><b>Cc: </b>PETSc <<a href="mailto:petsc-users@mcs.anl.gov" target="_blank">petsc-users@mcs.anl.gov</a>><br><b>Subject: </b>Re: [petsc-users] DMPlex partition problem</span><u></u><u></u></p></div><div><p class="MsoNormal"> <u></u><u></u></p></div><div><div><p class="MsoNormal">On Wed, Apr 8, 2020 at 12:13 PM Danyang Su <<a 
href="mailto:danyang.su@gmail.com" target="_blank">danyang.su@gmail.com</a>> wrote:<u></u><u></u></p></div><div><blockquote style="border-top:none;border-right:none;border-bottom:none;border-left:1pt solid rgb(204,204,204);padding:0cm 0cm 0cm 6pt;margin:5pt 0cm 5pt 4.8pt"><div><div><div style="border-right:none;border-bottom:none;border-left:none;border-top:1pt solid rgb(181,196,223);padding:3pt 0cm 0cm"><p class="MsoNormal"><b><span style="font-size:12pt;color:black">From: </span></b><span style="font-size:12pt;color:black">Matthew Knepley <<a href="mailto:knepley@gmail.com" target="_blank">knepley@gmail.com</a>><br><b>Date: </b>Wednesday, April 8, 2020 at 6:45 AM<br><b>To: </b>Danyang Su <<a href="mailto:danyang.su@gmail.com" target="_blank">danyang.su@gmail.com</a>><br><b>Cc: </b>PETSc <<a href="mailto:petsc-users@mcs.anl.gov" target="_blank">petsc-users@mcs.anl.gov</a>><br><b>Subject: </b>Re: [petsc-users] DMPlex partition problem</span><u></u><u></u></p></div><div><p class="MsoNormal"> <u></u><u></u></p></div><div><div><p class="MsoNormal">On Wed, Apr 8, 2020 at 7:25 AM Matthew Knepley <<a href="mailto:knepley@gmail.com" target="_blank">knepley@gmail.com</a>> wrote:<u></u><u></u></p></div><div><blockquote style="border-top:none;border-right:none;border-bottom:none;border-left:1pt solid rgb(204,204,204);padding:0cm 0cm 0cm 6pt;margin:5pt 0cm 5pt 4.8pt"><div><div><p class="MsoNormal">On Wed, Apr 8, 2020 at 12:48 AM Danyang Su <<a href="mailto:danyang.su@gmail.com" target="_blank">danyang.su@gmail.com</a>> wrote:<u></u><u></u></p></div><div><blockquote style="border-top:none;border-right:none;border-bottom:none;border-left:1pt solid rgb(204,204,204);padding:0cm 0cm 0cm 6pt;margin:5pt 0cm 5pt 4.8pt"><div><div><p class="MsoNormal"><span lang="EN-US">Dear All,</span><u></u><u></u></p><p class="MsoNormal"><span lang="EN-US"> </span><u></u><u></u></p><p class="MsoNormal"><span lang="EN-US">Hope you are safe and healthy.</span><u></u><u></u></p><p 
class="MsoNormal"><span lang="EN-US"> </span><u></u><u></u></p><p class="MsoNormal"><span lang="EN-US">I have a question regarding pretty different partition results of prism mesh. The partition in PETSc generates much more ghost nodes/cells than the partition in Gmsh, even though both use metis as partitioner. Attached please find the prism mesh in both vtk and exo format, the test code modified based on ex1f90 example. Similar problem are observed for larger dataset with more layers.</span><u></u><u></u></p></div></div></blockquote><div><p class="MsoNormal"> <u></u><u></u></p></div><div><p class="MsoNormal">I will figure this out by next week.<u></u><u></u></p></div></div></div></blockquote><div><p class="MsoNormal"> <u></u><u></u></p></div><div><p class="MsoNormal">I have run your mesh and do not get those weird partitions. I am running in master. What are you using? Also, here is an easy way<u></u><u></u></p></div><div><p class="MsoNormal">to do this using a PETSc test:<u></u><u></u></p></div><div><p class="MsoNormal"> <u></u><u></u></p></div><div><p class="MsoNormal">cd $PETSC_DIR<u></u><u></u></p></div><div><p class="MsoNormal">make -f ./gmakefile test globsearch="dm_impls_plex_tests-ex1_cylinder" EXTRA_OPTIONS="-filename ${HOME}/Downloads/basin2layer.exo -dm_view hdf5:$PWD/mesh.h5 -dm_partition_view" NP=5 <u></u><u></u></p></div><div><p class="MsoNormal">./lib/petsc/bin/petsc_gen_xdmf.py mesh.h5<u></u><u></u></p></div><div><p class="MsoNormal"> <u></u><u></u></p></div><div><p class="MsoNormal">and then load mesh.xmf into Paraview. Here is what I see (attached). Is it possible for you to try the master branch?<u></u><u></u></p><p class="MsoNormal"> <u></u><u></u></p><p class="MsoNormal">Hi Matt,<u></u><u></u></p><p class="MsoNormal"> <u></u><u></u></p><p class="MsoNormal">Thanks for your quick response. If I use your script, the partition looks good, as shown in the attached figure. I am working on PETSc 3.13.0 release version on Mac OS. 
<u></u><u></u></p><p class="MsoNormal"> <u></u><u></u></p><p class="MsoNormal">Does the above script use code /petsc/src/dm/label/tutorials/ex1c.c?<u></u><u></u></p></div><div><p class="MsoNormal"> <u></u><u></u></p></div></div></div></div></div></blockquote><div><p class="MsoNormal"> <u></u><u></u></p></div><div><p class="MsoNormal">It uses $PETSC_DIR/src/dm/impls/plex/tests/ex1.c<u></u><u></u></p></div><div><p class="MsoNormal"> <u></u><u></u></p></div><div><p class="MsoNormal">I looked at your code and cannot see any difference. Also, no changes are in master that are not in 3.13. This is very strange.<u></u><u></u></p></div><div><p class="MsoNormal">I guess we will have to go one step at a time between the example and your code.<u></u><u></u></p><p class="MsoNormal"> <u></u><u></u></p><p class="MsoNormal">I will add mesh output to the ex1f90 example and check if the cell/vertex rank is exactly the same. I wrote the mesh output myself based on the partition but there should be no problem in that part. The number of ghost nodes and cells is pretty easy to check. Not sure if there is any difference between the C code and Fortran code that causes the problem. 
Anyway, I will keep you updated.<u></u><u></u></p></div><div><p class="MsoNormal"> <u></u><u></u></p></div><div><p class="MsoNormal"> Thanks,<u></u><u></u></p></div><div><p class="MsoNormal"> <u></u><u></u></p></div><div><p class="MsoNormal"> Matt<u></u><u></u></p></div><div><p class="MsoNormal"> <u></u><u></u></p></div><blockquote style="border-top:none;border-right:none;border-bottom:none;border-left:1pt solid rgb(204,204,204);padding:0cm 0cm 0cm 6pt;margin:5pt 0cm 5pt 4.8pt"><div><div><div><div><div><p class="MsoNormal"> Thanks,<u></u><u></u></p></div><div><p class="MsoNormal"> <u></u><u></u></p></div><div><p class="MsoNormal"> Matt<u></u><u></u></p></div><div><p class="MsoNormal"> <u></u><u></u></p></div><blockquote style="border-top:none;border-right:none;border-bottom:none;border-left:1pt solid rgb(204,204,204);padding:0cm 0cm 0cm 6pt;margin:5pt 0cm 5pt 4.8pt"><div><div><div><p class="MsoNormal"> Thanks,<u></u><u></u></p></div><div><p class="MsoNormal"> <u></u><u></u></p></div><div><p class="MsoNormal"> Matt<u></u><u></u></p></div><div><p class="MsoNormal"> <u></u><u></u></p></div><blockquote style="border-top:none;border-right:none;border-bottom:none;border-left:1pt solid rgb(204,204,204);padding:0cm 0cm 0cm 6pt;margin:5pt 0cm 5pt 4.8pt"><div><div><p class="MsoNormal"><span lang="EN-US">For example, in Gmsh, I get partition results using two processors and four processors as shown below, which are pretty reasonable.</span><u></u><u></u></p><p class="MsoNormal"><span lang="EN-US"><img border="0" width="1243" height="629" style="width: 12.9479in; height: 6.552in;" id="gmail-m_1766518872680001549gmail-m_-6754249856326310125gmail-m_-5010994516531062944gmail-m_-3573090367149144785gmail-m_-3580117583950348177Picture_x0020_5" src="cid:1715b7aee284cff311"></span><u></u><u></u></p><p class="MsoNormal"><span lang="EN-US"> </span><u></u><u></u></p><p class="MsoNormal"><span lang="EN-US"><img border="0" width="1243" height="629" style="width: 12.9479in; height: 
6.552in;" id="gmail-m_1766518872680001549gmail-m_-6754249856326310125gmail-m_-5010994516531062944gmail-m_-3573090367149144785gmail-m_-3580117583950348177Picture_x0020_6" src="cid:1715b7aee285b16b22"></span><u></u><u></u></p><p class="MsoNormal"><span lang="EN-US"> </span><u></u><u></u></p><p class="MsoNormal"><span lang="EN-US">However, in PETSc, the partition looks a bit weird. It looks like it partitions across the layers first and then within each layer. If the number of nodes per layer is very large, this kind of partitioning results in many more ghost nodes/cells. </span><u></u><u></u></p><p class="MsoNormal"><span lang="EN-US"> </span><u></u><u></u></p><p class="MsoNormal"><span lang="EN-US">Does anybody know how to improve the partitioning in PETSc? I have tried ParMetis and Chaco; there is no big difference between them. </span><u></u><u></u></p><p class="MsoNormal"><span lang="EN-US"> </span><u></u><u></u></p><p class="MsoNormal"><span lang="EN-US"><img border="0" width="642" height="802" style="width: 6.6875in; height: 8.3541in;" id="gmail-m_1766518872680001549gmail-m_-6754249856326310125gmail-m_-5010994516531062944gmail-m_-3573090367149144785gmail-m_-3580117583950348177Picture_x0020_8" src="cid:1715b7aee28692e333"></span><u></u><u></u></p><p class="MsoNormal"><span lang="EN-US"> </span><u></u><u></u></p><p class="MsoNormal"><span lang="EN-US"> </span><u></u><u></u></p><p class="MsoNormal"><span lang="EN-US">Thanks,</span><u></u><u></u></p><p class="MsoNormal"><span lang="EN-US"> </span><u></u><u></u></p><p class="MsoNormal"><span lang="EN-US">Danyang</span><u></u><u></u></p><p class="MsoNormal"><span lang="EN-US"> </span><u></u><u></u></p></div></div></blockquote></div><p class="MsoNormal"><br clear="all"><u></u><u></u></p><div><p class="MsoNormal"> <u></u><u></u></p></div><p class="MsoNormal">-- <u></u><u></u></p><div><div><div><div><div><div><div><p class="MsoNormal">What most experimenters take for granted before they begin their experiments is infinitely more 
interesting than any results to which their experiments lead.<br>-- Norbert Wiener<u></u><u></u></p></div><div><p class="MsoNormal"> <u></u><u></u></p></div><div><p class="MsoNormal"><a href="http://www.cse.buffalo.edu/~knepley/" target="_blank">https://www.cse.buffalo.edu/~knepley/</a><u></u><u></u></p></div></div></div></div></div></div></div></div></blockquote></div><p class="MsoNormal"><br clear="all"><u></u><u></u></p><div><p class="MsoNormal"> <u></u><u></u></p></div><p class="MsoNormal">-- <u></u><u></u></p><div><div><div><div><div><div><div><p class="MsoNormal">What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>-- Norbert Wiener<u></u><u></u></p></div><div><p class="MsoNormal"> <u></u><u></u></p></div><div><p class="MsoNormal"><a href="http://www.cse.buffalo.edu/~knepley/" target="_blank">https://www.cse.buffalo.edu/~knepley/</a><u></u><u></u></p></div></div></div></div></div></div></div></div></div></div></blockquote></div><p class="MsoNormal"><br clear="all"><u></u><u></u></p><div><p class="MsoNormal"> <u></u><u></u></p></div><p class="MsoNormal">-- <u></u><u></u></p><div><div><div><div><div><div><div><p class="MsoNormal">What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>-- Norbert Wiener<u></u><u></u></p></div><div><p class="MsoNormal"> <u></u><u></u></p></div><div><p class="MsoNormal"><a href="http://www.cse.buffalo.edu/~knepley/" target="_blank">https://www.cse.buffalo.edu/~knepley/</a><u></u><u></u></p></div></div></div></div></div></div></div></div></div></div></blockquote></div><p class="MsoNormal"><br clear="all"><u></u><u></u></p><div><p class="MsoNormal"><u></u> <u></u></p></div><p class="MsoNormal">-- <u></u><u></u></p><div><div><div><div><div><div><div><p class="MsoNormal">What most experimenters take for granted before they 
begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>-- Norbert Wiener<u></u><u></u></p></div><div><p class="MsoNormal"><u></u> <u></u></p></div><div><p class="MsoNormal"><a href="http://www.cse.buffalo.edu/~knepley/" target="_blank">https://www.cse.buffalo.edu/~knepley/</a><u></u><u></u></p></div></div></div></div></div></div></div></div></div></div>
</blockquote></div><br clear="all"><div><br></div>-- <br><div dir="ltr" class="gmail_signature"><div dir="ltr"><div><div dir="ltr"><div><div dir="ltr"><div>What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>-- Norbert Wiener</div><div><br></div><div><a href="http://www.cse.buffalo.edu/~knepley/" target="_blank">https://www.cse.buffalo.edu/~knepley/</a><br></div></div></div></div></div></div></div></div>