[petsc-users] Question about DMPlexDistribute & distribute mesh over processes

Sami BEN ELHAJ SALAH sami.ben-elhaj-salah at ensma.fr
Tue May 31 13:18:49 CDT 2022


Thank you very much!

I have succeeded in distributing the mesh using this routine:

DM               dmdist = NULL;
PetscPartitioner part;

/* Configure the mesh partitioner before distributing */
DMPlexGetPartitioner(dm_mesh, &part);
PetscPartitionerSetType(part, PETSCPARTITIONERPARMETIS);
PetscPartitionerSetFromOptions(part);

/* Distribute with 0 levels of overlap; dmdist is NULL when nothing was moved */
DMPlexDistribute(dm_mesh, 0, NULL, &dmdist);
if (dmdist) {
  DMDestroy(&dm_mesh);
  dm_mesh = dmdist;
}
DMView(dm_mesh, PETSC_VIEWER_STDOUT_WORLD);

The result is shown below:
DM Object: Parallel Mesh 4 MPI processes
  type: plex
Parallel Mesh in 3 dimensions:
  0-cells: 12 12 12 12
  3-cells: 2 2 2 2
Labels:
  depth: 2 strata with value/size (0 (12), 1 (2))
  material-id: 1 strata with value/size (68 (2))
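
As a quick cross-check against the earlier M=81 / m=0 symptom, here is a minimal sketch that prints the per-rank sizes of the Jacobian built on the distributed DM (the variable J is illustrative, not from my actual code):

  Mat      J;
  PetscInt M, m;
  DMCreateMatrix(dm_mesh, &J);   /* sizes come from the DM's data layout */
  MatGetSize(J, &M, NULL);       /* global number of rows */
  MatGetLocalSize(J, &m, NULL);  /* rows owned by this rank */
  PetscSynchronizedPrintf(PETSC_COMM_WORLD, "M=%" PetscInt_FMT " m=%" PetscInt_FMT "\n", M, m);
  PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT);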

Thank you very much for your help, and have a good day,

Sami

--
Dr. Sami BEN ELHAJ SALAH
Ingénieur de Recherche (CNRS)
Institut Pprime - ISAE - ENSMA
Mobile: 06.62.51.26.74
Email: sami.ben-elhaj-salah at ensma.fr
www.samibenelhajsalah.com <https://samiben91.github.io/samibenelhajsalah/index.html>



> On 31 May 2022 at 18:20, Matthew Knepley <knepley at gmail.com> wrote:
> 
> On Tue, May 31, 2022 at 12:08 PM Sami BEN ELHAJ SALAH <sami.ben-elhaj-salah at ensma.fr> wrote:
> Hi Matthew,
> 
> Two questions in this mail:
> 
> I tested my code with 2 processes. I use a Gmsh mesh (for example: 8 hexahedral elements, 27 nodes, and a global Jacobian matrix size of 81). Then I used DMView(dm_mesh, PETSC_VIEWER_STDOUT_WORLD) to visualize my DM and check whether it is correctly distributed over the processes, and I got this:
> 
> (***)
> DM Object: DM_0x3_0 2 MPI processes
>   type: plex
> DM_0x3_0 in 3 dimensions:
>   0-cells: 27 0
>   3-cells: 8 0
> 
> Notice here that your whole mesh is on process 0. You would probably call DMPlexDistribute() to rebalance it.
>  
> Labels:
>   material-id: 3 strata with value/size (66 (4), 67 (2), 68 (2))
>   depth: 2 strata with value/size (0 (27), 1 (8))
>   celltype: 2 strata with value/size (7 (8), 0 (27))
> 
> As you can see, I created some material-id labels, and I found them on my 8 cells when using DMView, so that seems correct to me. The SNES viewer output is shown below:
> 
> SNES Object: 2 MPI processes
>   type: ksponly
>   SNES has not been set up so information may be incomplete
>   maximum iterations=50, maximum function evaluations=10000
>   tolerances: relative=1e-08, absolute=1e-50, solution=1e-08
>   total number of linear solver iterations=0
>   total number of function evaluations=0
>   norm schedule ALWAYS
> SNES Object: 2 MPI processes
>   type: ksponly
>   SNES has not been set up so information may be incomplete
>   maximum iterations=50, maximum function evaluations=10000
>   tolerances: relative=1e-08, absolute=1e-50, solution=1e-08
>   total number of linear solver iterations=0
>   total number of function evaluations=0
>   norm schedule ALWAYS
>   SNESLineSearch Object: 2 MPI processes
>     type: bt
>       interpolation: cubic
>       alpha=1.000000e-04
>   SNESLineSearch Object: 2 MPI processes
>     type: bt
>       interpolation: cubic
>       alpha=1.000000e-04
>     maxstep=1.000000e+08, minlambda=1.000000e-12
>     tolerances: relative=1.000000e-08, absolute=1.000000e-15, lambda=1.000000e-08
>     maximum iterations=40
>     maxstep=1.000000e+08, minlambda=1.000000e-12
>     tolerances: relative=1.000000e-08, absolute=1.000000e-15, lambda=1.000000e-08
>     maximum iterations=40
>   KSP Object: 2 MPI processes
>     type: gmres
>       restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
>   KSP Object: 2 MPI processes
>     type: gmres
>       restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
>       happy breakdown tolerance 1e-30
>     maximum iterations=10000, initial guess is zero
>     tolerances:  relative=1e-05, absolute=1e-50, divergence=10000.
>     left preconditioning
>     using DEFAULT norm type for convergence test
>   PC Object: 2 MPI processes
>       happy breakdown tolerance 1e-30
>     maximum iterations=10000, initial guess is zero
>     tolerances:  relative=1e-05, absolute=1e-50, divergence=10000.
>     left preconditioning
>     using DEFAULT norm type for convergence test
>   PC Object: 2 MPI processes
>     type: bjacobi
>     PC has not been set up so information may be incomplete
>       number of blocks = -1
>       Local solve is same for all blocks, in the following KSP and PC objects:
>     linear system matrix = precond matrix:
>     type: bjacobi
>     PC has not been set up so information may be incomplete
>       number of blocks = -1
>       Local solve is same for all blocks, in the following KSP and PC objects:
>     linear system matrix = precond matrix:
>     Mat Object: 2 MPI processes
>       type: mpiaij
>     Mat Object: 2 MPI processes
>       type: mpiaij
>       rows=81, cols=81, bs=3
>       rows=81, cols=81, bs=3
>       total: nonzeros=3087, allocated nonzeros=3087
>       total number of mallocs used during MatSetValues calls=0
>       total: nonzeros=3087, allocated nonzeros=3087
>       total number of mallocs used during MatSetValues calls=0
>         using I-node (on process 0) routines: found 27 nodes, limit used is 5
>         not using I-node (on process 0) routines
> 
> Question 1:
> Based on the result given by DMView (see (***)), I did not understand whether my mesh is correctly distributed or whether my code is missing something, because when I look at the local and global sizes of my Jacobian matrix, I find:
> 
> PETSc::NonLinearSolver::INIT Size from Jac Matrix: M=81 m=0   // (M: global size, m: local size) this result is given by process 1
> and 
> PETSc::NonLinearSolver::INIT Size from Jac Matrix: M=81 m=81 // this result is given by process 2
> 
> Yes, your mesh is only on process 0.
>  
> Let me give some other information:
> I create my Jacobian matrix using:
> PetscErrorCode err = DMCreateMatrix(dm_mesh, &_matrix);
> and I use a PetscSection to tell the DM to use this data layout.
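> 
> A minimal sketch of attaching such a layout (assuming, for illustration, 3 displacement dofs per vertex; the field sizes are not from my actual code):
> 
>   PetscSection s;
>   PetscInt     pStart, pEnd, vStart, vEnd, v;
> 
>   PetscSectionCreate(PetscObjectComm((PetscObject)dm_mesh), &s);
>   DMPlexGetChart(dm_mesh, &pStart, &pEnd);            /* all mesh points */
>   PetscSectionSetChart(s, pStart, pEnd);
>   DMPlexGetDepthStratum(dm_mesh, 0, &vStart, &vEnd);  /* depth 0 = vertices */
>   for (v = vStart; v < vEnd; ++v) PetscSectionSetDof(s, v, 3);
>   PetscSectionSetUp(s);
>   DMSetLocalSection(dm_mesh, s);
>   PetscSectionDestroy(&s);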
> 
> In my code I wrote this line: 
> DMSetFromOptions(dm_mesh);
> Then, to run my code I use
> mpirun -np 2 /home/benelhasa/fox_petsc/build_test/bin/Debug/FoXtroT -snes_test_jacobian_view -snes_converged_reason -snes_monitor -ksp_monitor -ksp_xmonitor -dm_view -petscpartitioner_type parmetis -dm_distribute -dm_refine 0 cub_8C3D8.fxt
> 
> Something else is wrong in how this code is creating the mesh, because it is not distributing the mesh. Can you just use
> 
>   PetscCall(DMCreate(comm, dm));
>   PetscCall(DMSetType(*dm, DMPLEX));
>   PetscCall(DMSetFromOptions(*dm));
> 
> as we do in the examples and then
> 
>   -dm_plex_filename <your gmsh file>
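> 
> For example (assuming an executable named ./app and a hypothetical mesh file cube.msh):
> 
>   mpirun -np 2 ./app -dm_plex_filename cube.msh -dm_distribute -petscpartitioner_type parmetis -dm_view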
>  
> Question 2:
> I have another question, just for comprehension. I understand that DMSetFromOptions(dm_mesh) allows me to use the -dm_distribute option, but I did not understand how I can use the -petscpartitioner_type parmetis argument (see the example I used for guidance: https://gitlab.com/petsc/petsc/-/blob/main/src/dm/impls/plex/tutorials/ex14.c). Maybe when the DM uses a data layout, either via PetscSection or PetscDS, I can then automatically use -petscpartitioner_type parmetis/simple/scotch? (Can you tell me more about this topic, please?)
> 
> I do not yet understand the question. The partitioner only applies to the topology, not to functions on the mesh.
> 
>   Thanks,
> 
>      Matt
>  
> PS: My solution converges when I use one process, and I do not have a problem.
> 
> --
> Dr. Sami BEN ELHAJ SALAH
> Ingénieur de Recherche (CNRS)
> Institut Pprime - ISAE - ENSMA
> Mobile: 06.62.51.26.74
> Email: sami.ben-elhaj-salah at ensma.fr
> www.samibenelhajsalah.com <https://samiben91.github.io/samibenelhajsalah/index.html>
> 
> 
> 
>> On 29 May 2022 at 18:02, Sami BEN ELHAJ SALAH <sami.ben-elhaj-salah at ensma.fr> wrote:
>> 
>> Hi Matthew,
>> Thank you for this example. It seems to be exactly what I am looking for.
>> Thank you again for your help and have a good day.
>> Sami,
>> --
>> Dr. Sami BEN ELHAJ SALAH
>> Ingénieur de Recherche (CNRS)
>> Institut Pprime - ISAE - ENSMA
>> Mobile: 06.62.51.26.74
>> Email: sami.ben-elhaj-salah at ensma.fr
>> www.samibenelhajsalah.com <https://samiben91.github.io/samibenelhajsalah/index.html>
>> 
>> 
>> 
>>> On 28 May 2022 at 20:20, Matthew Knepley <knepley at gmail.com> wrote:
>>> 
>>> On Sat, May 28, 2022 at 2:19 PM Matthew Knepley <knepley at gmail.com> wrote:
>>> On Sat, May 28, 2022 at 1:35 PM Sami BEN ELHAJ SALAH <sami.ben-elhaj-salah at ensma.fr> wrote:
>>> Hi Matthew,
>>> 
>>> Thank you for your response.
>>> 
>>> I don't have that; my DM object is not linked to a PetscSection yet. I'll try that.
>>> Is there any example that handles this case (DMPlexDistribute & PetscSection)? Any guideline would be helpful.
>>> 
>>> Here is an example where we create a section without DS.
>>> 
>>> Forgot the link: https://gitlab.com/petsc/petsc/-/blob/main/src/dm/impls/plex/tutorials/ex14.c
>>>  
>>>   Thanks,
>>> 
>>>     Matt
>>>  
>>> Thanks in advance,
>>> 
>>> Sami,
>>> 
>>> --
>>> Dr. Sami BEN ELHAJ SALAH
>>> Ingénieur de Recherche (CNRS)
>>> Institut Pprime - ISAE - ENSMA
>>> Mobile: 06.62.51.26.74
>>> Email: sami.ben-elhaj-salah at ensma.fr
>>> www.samibenelhajsalah.com <https://samiben91.github.io/samibenelhajsalah/index.html>
>>> 
>>> 
>>> 
>>>> On 27 May 2022 at 20:45, Matthew Knepley <knepley at gmail.com> wrote:
>>>> 
>>>> On Fri, May 27, 2022 at 9:42 AM Sami BEN ELHAJ SALAH <sami.ben-elhaj-salah at ensma.fr> wrote:
>>>> Hello Isaac,
>>>> 
>>>> Thank you for your reply!
>>>> 
>>>> Let me confirm that when I use DMCreateMatrix() with the orig_dm, I get my jacobian_matrix. Also, I have succeeded in solving my system, and my solution converged using one process.
>>>> Let me give you some other information about my code. Currently, I am using my own discretization system, not the PetscDS object. For the nonlinear solver SNES, I use the following routines, like the basic SNES usage (a compact sketch follows the list):
>>>> - SNESSetFunction(snes,residual_vector,compute_residual ,(void*) _finite_element_formulation)
>>>> - SNESSetJacobian(snes, jacobian.matrix(), _jacobian_precond_matrix, compute_jacobian, (void*) _finite_element_formulation) 
>>>> - SNESSolve(snes,NULL,x)
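>>>> 
>>>> A minimal sketch of how these calls typically fit together (variable names as in the list above; SNESCreate, SNESSetDM, and SNESSetFromOptions are assumed surrounding code, not quoted from my program):
>>>> 
>>>>   SNES snes;
>>>>   SNESCreate(PETSC_COMM_WORLD, &snes);
>>>>   SNESSetDM(snes, orig_dm);   /* lets SNES see the DM's data layout */
>>>>   SNESSetFunction(snes, residual_vector, compute_residual, (void *)_finite_element_formulation);
>>>>   SNESSetJacobian(snes, jacobian.matrix(), _jacobian_precond_matrix, compute_jacobian, (void *)_finite_element_formulation);
>>>>   SNESSetFromOptions(snes);
>>>>   SNESSolve(snes, NULL, x);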
>>>> 
>>>> Regarding your last answer, I will try to reformulate my question as follows:
>>>> using a distributed dm instead of the original (non-distributed) dm and my own discretization system (not the PetscDS object),
>>>> 
>>>> You do not have to use PetscDS, but the DM does need a PetscSection in order to compute sizes and sparsity patterns. Do you have that?
>>>> 
>>>>   Thanks,
>>>> 
>>>>      Matt
>>>>  
>>>> should I add something specific to get a jacobian_matrix distributed over the processes?
>>>> To be precise, I just replaced the orig_dm by a distributed mesh with the routine that I wrote in my first mail. Is that enough?
>>>>  
>>>> Thank you and have a good day,
>>>> Sami,
>>>> 
>>>> 
>>>> --
>>>> Dr. Sami BEN ELHAJ SALAH
>>>> Ingénieur de Recherche (CNRS)
>>>> Institut Pprime - ISAE - ENSMA
>>>> Mobile: 06.62.51.26.74
>>>> Email: sami.ben-elhaj-salah at ensma.fr
>>>> www.samibenelhajsalah.com <https://samiben91.github.io/samibenelhajsalah/index.html>
>>>> 
>>>> 
>>>> 
>>>>> On 25 May 2022 at 19:41, Toby Isaac <toby.isaac at gmail.com> wrote:
>>>>> 
>>>>> Hi Sami,
>>>>> 
>>>>> Just to verify: if you call DMCreateMatrix() on orig_dm, do you get a
>>>>> Jacobian matrix?
>>>>> 
>>>>> The DMPlex must be told what kind of discretized fields you want a
>>>>> matrix for and what equations you are discretizing.  This is handled
>>>>> by the PetscDS object.  In snes/tutorials/ex59.c, see the code after
>>>>> DMGetDS() for an example.
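>>>>> 
>>>>> A minimal sketch of that route (assuming a single 3-component field on a
>>>>> 3D hexahedral mesh; the discretization parameters here are illustrative):
>>>>> 
>>>>>   PetscFE fe;
>>>>>   PetscFECreateDefault(PETSC_COMM_WORLD, 3, 3, PETSC_FALSE, NULL, -1, &fe);
>>>>>   DMSetField(dm, 0, NULL, (PetscObject)fe);   /* field 0 on the whole mesh */
>>>>>   DMCreateDS(dm);                             /* builds the PetscDS */
>>>>>   PetscFEDestroy(&fe);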
>>>>> 
>>>>> - Toby
>>>>> 
>>>>> On Wed, May 25, 2022 at 1:17 PM Sami BEN ELHAJ SALAH
>>>>> <sami.ben-elhaj-salah at ensma.fr> wrote:
>>>>>> 
>>>>>> Dear PETSc developer team,
>>>>>> 
>>>>>> I am trying to create a jacobian_matrix from a distributed DM. I have followed the two examples (snes/tests/ex2.c and ex56.c), so I wrote this routine:
>>>>>> 
>>>>>> DM orig_dm;
>>>>>> DM dist_dm = NULL;
>>>>>> PetscPartitioner part;
>>>>>> 
>>>>>> /* Partition and distribute the original mesh */
>>>>>> DMPlexGetPartitioner(orig_dm, &part);
>>>>>> PetscPartitionerSetType(part, PETSCPARTITIONERPARMETIS);
>>>>>> DMPlexDistribute(orig_dm, 0, NULL, &dist_dm);
>>>>>> 
>>>>>> /* Create the Jacobian from the distributed DM and query its sizes */
>>>>>> PetscErrorCode err = DMCreateMatrix(dist_dm, &jacobian_matrix);
>>>>>> PetscInt M, N, m, n;
>>>>>> MatGetSize(jacobian_matrix, &M, &N);
>>>>>> MatGetLocalSize(jacobian_matrix, &m, &n);
>>>>>> 
>>>>>> Then I ran my code with 2 processes and obtained this result:
>>>>>> Size from jacobian_matrix: M=0 m=0 (the same result on all processes).
>>>>>> 
>>>>>> I did not understand whether I forgot something in my code to obtain the correct local and global sizes for the jacobian_matrix (I mean, whether my code is missing something to obtain a mesh distributed over the processes).
>>>>>> 
>>>>>> Thank you in advance for any help!
>>>>>> Sami,
>>>>>> 
>>>>>> 
>>>>>> --
>>>>>> Dr. Sami BEN ELHAJ SALAH
>>>>>> Ingénieur de Recherche (CNRS)
>>>>>> Institut Pprime - ISAE - ENSMA
>>>>>> Mobile: 06.62.51.26.74
>>>>>> Email: sami.ben-elhaj-salah at ensma.fr
>>>>>> www.samibenelhajsalah.com
>>>>>> 
>>>>>> 
>>>>>> 
>>>> 
>>>> 
>>>> 
>>>> -- 
>>>> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
>>>> -- Norbert Wiener
>>>> 
>>>> https://www.cse.buffalo.edu/~knepley/
>>> 
>>> 
>>> 
>>> -- 
>>> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
>>> -- Norbert Wiener
>>> 
>>> https://www.cse.buffalo.edu/~knepley/
>>> 
> 
> 
> 
> -- 
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener
> 
> https://www.cse.buffalo.edu/~knepley/