<div dir="ltr"><div dir="ltr">Dear Matt,<br><br>Thanks for the reply. <br><br>1. I realized that, for some reason, all command line options don't seem to work for me. For the simple code that you mentioned (also shown below), the only command line options that I am successful are<br><br>/home/subrambm/petsc/arch-linux-c-debug/bin/mpiexec -n 2 ./cavity_flow.out -dm_plex_shape sphere -dm_plex_dim 2 -dm_plex_simplex 1 -dm_plex_sphere_radius 1.0 -dm_refine_pre 3 -dm_view hdf5:mesh.h5 -dm_distribute 1<br><br>#include <petsc.h><br>int main(int argc, char **argv)<br>{<br> DM dm;<br> PetscErrorCode ierr;<br><br> ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;<br> ierr = DMCreate(PETSC_COMM_WORLD, &dm);CHKERRQ(ierr);<br> ierr = DMSetType(dm, DMPLEX);CHKERRQ(ierr);<br> ierr = DMSetFromOptions(dm);CHKERRQ(ierr);<br> ierr = DMViewFromOptions(dm, NULL, "-dm_view");CHKERRQ(ierr);<br> ierr = DMDestroy(&dm);CHKERRQ(ierr);<br> ierr = PetscFinalize();<br> return ierr;<br>}<br><br>2. Command line options after distributing the mesh, like dm_refine gives error (without any extrusion code or commands)<br><br>3. Even with a single processor, although extrude does not give any errors, it gives a bad mesh data. I tried with .h5 file format as well. <br>-dm_plex_shape sphere -dm_plex_dim 2 -dm_plex_simplex 1 -dm_plex_sphere_radius 1.0 -dm_refine_pre 3 -dm_view hdf5:mesh.h5 -dm_extrude_layers 4 -dm_extrude_thickness 0.1 -dm_extrude_column_first 0<br><br>-----------------------------------------------------------<br>Although this is not what I eventually want, only the following code with command line options (-dm_plex_shape sphere -dm_plex_dim 2 -dm_plex_simplex 1 -dm_plex_sphere_radius 1.0 -dm_refine_pre 3 -dm_view hdf5:mesh.h5) is giving the prismatic mesh in parallel.<br><br>ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;<br>ierr = DMCreate(PETSC_COMM_WORLD, &dm);CHKERRQ(ierr);<br>ierr = DMSetType(dm, DMPLEX);CHKERRQ(ierr);<br>ierr = DMSetFromOptions(dm);CHKERRQ(ierr);<br>{<br>DM dm3D;<br>ierr = DMPlexExtrude(dm, 4, 0.1, PETSC_FALSE, NULL, PETSC_TRUE, &dm3D); CHKERRQ(ierr);<br>if (dm3D) {<br>DMDestroy(&dm);<br>dm = dm3D;<br>}<br>}<br><br>{<br>DM dmDist;<br>ierr = DMPlexDistribute(dm, 0, NULL, &dmDist);CHKERRQ(ierr);<br>if (dmDist) {<br>DMDestroy(&dm);<br>dm = dmDist;<br>}<br>}<br>ierr = DMViewFromOptions(dm, NULL, "-dm_view");CHKERRQ(ierr);<br><br>-----------------------------------------------------------<br></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Thu, Nov 11, 2021 at 9:41 PM Matthew Knepley <<a href="mailto:knepley@gmail.com">knepley@gmail.com</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div dir="ltr">On Thu, Nov 11, 2021 at 1:17 PM Bhargav Subramanya <<a href="mailto:bhargav.subramanya@kaust.edu.sa" target="_blank">bhargav.subramanya@kaust.edu.sa</a>> wrote:<br></div><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr">Dear Matt,<div><br></div><div>I just realized that I used PETSC_COMM_SELF instead of PETSC_COMM_WORLD while performing the above test. I am sorry for that. After fixing it, I have the following 3 cases:</div><div><br></div><div>1. With the below command line options, where I simply create the spherical surface mesh, everything is fine. 
The mesh gets distributed in parallel with the check DMPlexIsDistributed.</div><div>/home/subrambm/petsc/arch-linux-c-debug/bin/mpiexec -n 2 ./cavity_flow.out -dm_plex_dim 2 -dm_plex_simplex 1 -dm_plex_shape sphere -dm_plex_sphere_radius 1.0 -dm_refine_pre 2 -dm_view vtk:mesh.vtk -dm_distribute 1<br></div><div><br></div><div>-------------------------------------------------------------------<br></div><div>2. This case also seems to work. However, the mesh file data is bad. Do you think it's because of the vtk file format?</div><div>-dm_plex_shape sphere -dm_plex_dim 2 -dm_plex_simplex 1 -dm_plex_sphere_radius 1.0 -dm_refine_pre 3 -dm_extrude_layers 4 -dm_extrude_thickness 0.1 -dm_extrude_column_first 0 -dm_distribute 1 -dm_view vtk:mesh.vtk</div></div></blockquote><div><br></div><div>It should work, but that would be the first place I would check for a bug. I would use</div><div><br></div><div> -dm_view hdf5:mesh.h5</div><div><br></div><div>and then</div><div><br></div><div> $PETSC_DIR/lib/petsc/bin/petsc_gen_xdmf.py mesh.h5</div><div><br></div><div>to make mesh.xmf, which can be loaded by ParaView.</div><div> -- this is very helpful.</div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div>-------------------------------------------------------------------</div><div><br></div><div>3. This is the case which I want: To distribute the mesh, refine it in parallel and extrude it. </div><div>However, after distribution, the mesh does not seem to be refined as I get the following error:</div><div>-dm_plex_shape sphere -dm_plex_dim 2 -dm_plex_simplex 1 -dm_plex_sphere_radius 1.0 -dm_refine_pre 3 -dm_distribute 1 -dm_refine 1</div></div></blockquote><div><br></div><div>The problem here is that you are refining the extruded mesh, not refining the surface mesh. There must be something switched around in your file.</div></div></div></blockquote><div><br></div><div> --I was probably not clear here. I did not have anything related to extrusion in the code. 
I only had the command line options:</div>-dm_plex_shape sphere -dm_plex_dim 2 -dm_plex_simplex 1 -dm_plex_sphere_radius 1.0 -dm_refine_pre 3 -dm_distribute 1 -dm_refine 1 <br>As I mentioned above any refinement or extrusion after the mesh distribution fails for me.</div><div class="gmail_quote"><br><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div class="gmail_quote"><div> Thanks,</div><div><br></div><div> Matt</div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div>Ignoring PCI device with non-16bit domain.<br>Pass --enable-32bits-pci-domain to configure to support such devices<br>(warning: it would break the library ABI, don't enable unless really needed).<br>Ignoring PCI device with non-16bit domain.<br>Pass --enable-32bits-pci-domain to configure to support such devices<br>(warning: it would break the library ABI, don't enable unless really needed).<br>[1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------<br>[1]PETSC ERROR: Nonconforming object sizes<br>[1]PETSC ERROR: The output section point (0) closure size 9 != dual space dimension 12 at height 0 in [0, 0]<br>[1]PETSC ERROR: See <a href="https://urldefense.com/v3/__https://petsc.org/release/faq/__;!!Nmw4Hv0!iY9qn1rPwcxbUsXYP7Sk8GWzw9AJdssBhLDhizIQ_Ar7VU4GcS0FNde9uSv4-MfkiX81RJYr8ho$" target="_blank">https://petsc.org/release/faq/</a> for trouble shooting.<br>[1]PETSC ERROR: Petsc Release Version 3.16.1, unknown<br>[1]PETSC ERROR: ./cavity_flow.out on a arch-linux-c-debug named kw60970 by subrambm Thu Nov 11 21:13:27 2021<br>[1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-mpich --download-fblaslapack --download-superlu_dist --download-hypre --download-fiat --download-generator --download-triangle --download-tetgen --download-chaco --download-make -download-boost --download-cmake --download-ml --download-mumps --download-scalapack --download-parmetis --download-metis --download-ptscotch --download-hdf5<br>[1]PETSC ERROR: #1 DMProjectLocal_Generic_Plex() at /home/subrambm/petsc/src/dm/impls/plex/plexproject.c:762<br>[1]PETSC ERROR: #2 DMProjectFieldLocal_Plex() at /home/subrambm/petsc/src/dm/impls/plex/plexproject.c:933<br>[1]PETSC ERROR: #3 DMProjectFieldLocal() at /home/subrambm/petsc/src/dm/interface/dm.c:8863<br>[1]PETSC ERROR: #4 DMPlexRemapGeometry() at /home/subrambm/petsc/src/dm/impls/plex/plexgeometry.c:3270<br>[1]PETSC ERROR: #5 DMSetFromOptions_Plex() at /home/subrambm/petsc/src/dm/impls/plex/plexcreate.c:3064<br>[1]PETSC ERROR: #6 DMSetFromOptions() at /home/subrambm/petsc/src/dm/interface/dm.c:902<br>[1]PETSC ERROR: #7 main() at meshtest.c:12<br>[1]PETSC ERROR: PETSc Option Table entries:<br>[1]PETSC ERROR: -dm_distribute 1<br>[1]PETSC ERROR: -dm_plex_dim 2<br>[1]PETSC ERROR: -dm_plex_shape sphere<br>[1]PETSC ERROR: -dm_plex_simplex 1<br>[1]PETSC ERROR: -dm_plex_sphere_radius 1.0<br>[1]PETSC ERROR: -dm_refine 1<br>[1]PETSC ERROR: -dm_refine_pre 3<br>[1]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint@mcs.anl.gov----------<br>application called MPI_Abort(MPI_COMM_WORLD, 60) - process 1<br></div><div><br></div><div>Thanks,</div><div>Bhargav</div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Wed, Nov 10, 2021 at 4:33 PM Matthew Knepley <<a 
href="mailto:knepley@gmail.com" target="_blank">knepley@gmail.com</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div dir="ltr">On Wed, Nov 10, 2021 at 8:26 AM Bhargav Subramanya <<a href="mailto:bhargav.subramanya@kaust.edu.sa" target="_blank">bhargav.subramanya@kaust.edu.sa</a>> wrote:<br></div><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr">Dear Matt,<div><br><div>Thanks a lot for the reply. I am now able to generate the prismatic mesh properly. </div></div></div></blockquote><div><br></div><div>Cool.</div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div><div>In the case of mpiexec -n 2 ./meshtest -dm_plex_shape sphere -dm_refine_pre 3 -dm_distribute -dm_refine 2 and DMExtrude(), where I am ordering the extruded cells on the layers first; Is the mesh extruded in parallel for this case? </div></div></div></blockquote><div><br></div><div>Yes.</div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div><div>The first time I distributed the mesh, it was only the spherical surface mesh. After dmplex extrude, the mesh changes. Do I need to redistribute the mesh again?</div></div></div></blockquote><div><br></div><div>Probably not. The balance factor will not change, meaning the relative sizes of the partitions. However, if you have a lot of layers, maybe you want to. It is easy,</div><div>just call DMDistribute() again.</div><div><br></div><div> Thanks,</div><div><br></div><div> Matt</div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div><div>Thanks,</div><div>Bhargav</div></div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Wed, Nov 10, 2021 at 12:56 AM Matthew Knepley <<a href="mailto:knepley@gmail.com" target="_blank">knepley@gmail.com</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div dir="ltr">On Tue, Nov 9, 2021 at 9:54 AM Bhargav Subramanya <<a href="mailto:bhargav.subramanya@kaust.edu.sa" target="_blank">bhargav.subramanya@kaust.edu.sa</a>> wrote:<br></div><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr">Dear All,<div><br></div><div>I want to generate a prismatic mesh from a base spherical surface mesh. I first refine the base icosahedron sequential mesh to some extent, distribute the mesh, continue the refinement in parallel, project the mesh to a unit sphere, and then finally extrude the mesh. The following is the code I am using. As the extrusion is performed in parallel, the extruded mesh seems to be broken as shown in the attached figure. May I know how to get an intact extruded mesh in parallel? Also, is it possible to make mesh refinement respect the spherical surface geometry, without having to project it using a function as shown below?</div></div></blockquote><div><br></div><div>I think you can do most of what you want from the command line. 
Here is a simple code:</div><div><br></div><div>#include <petsc.h><br><br>int main(int argc, char **argv)<br>{<br> DM dm;<br> PetscErrorCode ierr;<br><br> ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;<br> ierr = DMCreate(PETSC_COMM_WORLD, &dm);CHKERRQ(ierr);<br> ierr = DMSetType(dm, DMPLEX);CHKERRQ(ierr);<br> ierr = DMSetFromOptions(dm);CHKERRQ(ierr);<br> ierr = DMViewFromOptions(dm, NULL, "-dm_view");CHKERRQ(ierr);<br> ierr = DMDestroy(&dm);CHKERRQ(ierr);<br> ierr = PetscFinalize();<br> return ierr;<br>}<br></div><div><br></div><div>that I run using</div><div><br></div><div> mpiexec -n 2 ./meshtest -dm_plex_shape sphere -dm_refine_pre 3 -dm_extrude 5 -dm_plex_transform_extrude_thickness 0.1 -dm_distribute -dm_view hdf5:mesh.h5</div><div><br></div><div>and get the attached picture. I am refining before distribution only here, since the after distribution refinement would refine the extrude thing, which is not what you want.</div><div>If you want refinement after distribution, you could use this code to do</div><div><br></div><div> mpiexec -n 2 ./meshtest -dm_plex_shape sphere -dm_refine_pre 3 -dm_distribute -dm_refine 2</div><div><br></div><div>and then call DMExtrude() by hand after that. Alternatively, I could make a pre-extrusion refinement option.</div><div><br></div><div> Thanks,</div><div><br></div><div> Matt</div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div><div><div>Thanks,</div><div>Bhargav</div><div><br></div><div>/* refine the sequential mesh first */<br> for (int r = 0; r < numRefinePre; ++r) {<br> DM rdm = NULL;<br> DMRefine(dm, MPI_COMM_WORLD, &rdm);<br> DMDestroy(&dm);<br> dm = rdm;<br> }<br><br> /* project to a unit sphere */<br> ierr = ProjectToUnitSphere(dm); CHKERRQ(ierr);<br><br> /* create and distribute DM */<br> ierr = DMPlexDistribute(dm, 0, NULL, &dmDist);CHKERRQ(ierr);<br> if (dmDist) {<br> DMDestroy(&dm);<br> dm = dmDist;<br> }<br><br> /* refine the mesh in parallel */<br> for (int r = 0; r < numRefine; ++r) {<br> DM rdm = NULL;<br> DMRefine(dm, MPI_COMM_WORLD, &rdm);<br> DMDestroy(&dm);<br> dm = rdm;<br> }<br><br> /* project to a unit sphere */<br> ierr = ProjectToUnitSphere(dm); CHKERRQ(ierr);<br><br> ierr = DMPlexExtrude(dm, numLayers-1, radialThickness, PETSC_FALSE, NULL, PETSC_TRUE, &dm3D);CHKERRQ(ierr);<br><br> if (dm3D) {<br> DMDestroy(&dm);<br> dm = dm3D;<br> }<br></div><div><br></div></div></div></div>
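<div dir="ltr"><div>The ProjectToUnitSphere() routine called in the code above is not shown anywhere in this thread. A minimal sketch of what such a helper might look like is given below; it assumes the mesh coordinates are stored as plain (x, y, z) tuples per vertex (the default linear coordinate layout) and is only an illustration, not the routine that was actually used.</div><div><br></div><div>/* Illustrative sketch: scale every local vertex coordinate so it lies on the unit sphere. */<br>static PetscErrorCode ProjectToUnitSphere(DM dm)<br>{<br>  Vec            coords;<br>  PetscScalar   *x;<br>  PetscInt       n, dim, i, d;<br>  PetscErrorCode ierr;<br><br>  ierr = DMGetCoordinateDim(dm, &dim);CHKERRQ(ierr);<br>  ierr = DMGetCoordinatesLocal(dm, &coords);CHKERRQ(ierr);<br>  ierr = VecGetLocalSize(coords, &n);CHKERRQ(ierr);<br>  ierr = VecGetArray(coords, &x);CHKERRQ(ierr);<br>  for (i = 0; i < n; i += dim) {<br>    PetscReal r = 0.0;<br>    for (d = 0; d < dim; ++d) r += PetscRealPart(x[i+d])*PetscRealPart(x[i+d]);<br>    r = PetscSqrtReal(r);<br>    for (d = 0; d < dim; ++d) x[i+d] /= r;   /* vertices of a sphere mesh are never at the origin */<br>  }<br>  ierr = VecRestoreArray(coords, &x);CHKERRQ(ierr);<br>  /* push the modified coordinates back so dependent data is refreshed */<br>  ierr = DMSetCoordinatesLocal(dm, coords);CHKERRQ(ierr);<br>  return 0;<br>}</div><div><br></div><div>Since only the local coordinate vector is touched, each rank can do this independently and no parallel communication is needed.</div></div>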
<br>
<div><hr></div><font face="Arial" size="1">This message and its contents, including attachments are intended solely for the original recipient. If you are not the intended recipient or have received this message in error, please notify me immediately and delete this message from your computer system. Any unauthorized use or distribution is prohibited. Please consider the environment before printing this email.</font></blockquote></div><br clear="all"><div><br></div>-- <br><div dir="ltr"><div dir="ltr"><div><div dir="ltr"><div><div dir="ltr"><div>What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>-- Norbert Wiener</div><div><br></div><div><a href="https://urldefense.com/v3/__http://www.cse.buffalo.edu/*knepley/__;fg!!Nmw4Hv0!lx1-mURFRmaF1Jo4rDYcZmLsnPsG8EO1HFWoGa2ukYZRt1fF4JLOpS3gHpaUHzL6_KersFjJlmA$" target="_blank">https://www.cse.buffalo.edu/~knepley/</a><br></div></div></div></div></div></div></div></div>
</blockquote></div>
<br>
<div><hr></div><font face="Arial" size="1">This message and its contents, including attachments are intended solely for the original recipient. If you are not the intended recipient or have received this message in error, please notify me immediately and delete this message from your computer system. Any unauthorized use or distribution is prohibited. Please consider the environment before printing this email.</font></blockquote></div><br clear="all"><div><br></div>-- <br><div dir="ltr"><div dir="ltr"><div><div dir="ltr"><div><div dir="ltr"><div>What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>-- Norbert Wiener</div><div><br></div><div><a href="https://urldefense.com/v3/__http://www.cse.buffalo.edu/*knepley/__;fg!!Nmw4Hv0!i2zXbyfKqyg5D1MzU0Kly__jzbsdtBbBFSsjjn7YY22imNeNSXw_Y4FZN5MzpIX6apvvV7rDhmg$" target="_blank">https://www.cse.buffalo.edu/~knepley/</a><br></div></div></div></div></div></div></div></div>
</blockquote></div>
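<div dir="ltr"><div>Matt's remark in the quoted mail above that redistributing after the extrusion "is easy, just call DMDistribute() again" could look roughly like the sketch below, reusing the DMPlexDistribute() call pattern that already appears in this thread (the overlap of 0 and the NULL SF argument are simply the values used elsewhere in these emails, not a recommendation):</div><div><br></div><div>/* dm already holds the extruded 3D prismatic mesh; ierr is assumed declared as in the snippets above */<br>{<br>  DM dmDist = NULL;<br><br>  ierr = DMPlexDistribute(dm, 0, NULL, &dmDist);CHKERRQ(ierr);<br>  if (dmDist) {   /* NULL when no redistribution happened, e.g. on a single rank */<br>    ierr = DMDestroy(&dm);CHKERRQ(ierr);<br>    dm   = dmDist;<br>  }<br>}</div></div>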
<br>
<div><hr></div><font face="Arial" size="1">This message and its contents, including attachments are intended solely for the original recipient. If you are not the intended recipient or have received this message in error, please notify me immediately and delete this message from your computer system. Any unauthorized use or distribution is prohibited. Please consider the environment before printing this email.</font></blockquote></div><br clear="all"><div><br></div>-- <br><div dir="ltr"><div dir="ltr"><div><div dir="ltr"><div><div dir="ltr"><div>What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>-- Norbert Wiener</div><div><br></div><div><a href="https://urldefense.com/v3/__http://www.cse.buffalo.edu/*knepley/__;fg!!Nmw4Hv0!iY9qn1rPwcxbUsXYP7Sk8GWzw9AJdssBhLDhizIQ_Ar7VU4GcS0FNde9uSv4-MfkiX81QwA9OCs$" target="_blank">https://www.cse.buffalo.edu/~knepley/</a><br></div></div></div></div></div></div></div></div>
</blockquote></div></div>
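<div dir="ltr"><div>For reference, the DMPlexIsDistributed() check mentioned in case 1 of the quoted mail above can be done in a couple of lines; this is just a sketch of the obvious usage, not code taken from the thread:</div><div><br></div><div>PetscBool distributed;<br><br>ierr = DMPlexIsDistributed(dm, &distributed);CHKERRQ(ierr);<br>ierr = PetscPrintf(PETSC_COMM_WORLD, "mesh distributed? %s\n", distributed ? "yes" : "no");CHKERRQ(ierr);</div></div>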
<br>
<div><hr></div><font face="Arial" size="1">This message and its contents, including attachments are intended solely for the original recipient. If you are not the intended recipient or have received this message in error, please notify me immediately and delete this message from your computer system. Any unauthorized use or distribution is prohibited. Please consider the environment before printing this email.</font>