[petsc-users] Question about DMPlexDistribute & distribute mesh over processes

Sami BEN ELHAJ SALAH sami.ben-elhaj-salah at ensma.fr
Sun May 29 11:02:07 CDT 2022


Hi Matthew,
Thank you for this example. It seems to be exactly what I am looking for.
Thank you again for your help and have a good day.
Sami,
--
Dr. Sami BEN ELHAJ SALAH
Ingénieur de Recherche (CNRS)
Institut Pprime - ISAE - ENSMA
Mobile: 06.62.51.26.74
Email: sami.ben-elhaj-salah at ensma.fr
www.samibenelhajsalah.com <https://samiben91.github.io/samibenelhajsalah/index.html>



> On 28 May 2022, at 20:20, Matthew Knepley <knepley at gmail.com> wrote:
> 
> On Sat, May 28, 2022 at 2:19 PM Matthew Knepley <knepley at gmail.com> wrote:
> On Sat, May 28, 2022 at 1:35 PM Sami BEN ELHAJ SALAH <sami.ben-elhaj-salah at ensma.fr> wrote:
> Hi Matthew,
> 
> Thank you for your response.
> 
> I don't have that. My DM object is not linked to a PetscSection yet. I'll try that.
> Is there an example that covers this case (DMPlexDistribute & PetscSection)? Any guideline would be helpful.
> 
> Here is an example where we create a section without DS.
> 
> Forgot the link: https://gitlab.com/petsc/petsc/-/blob/main/src/dm/impls/plex/tutorials/ex14.c
>  
>   Thanks,
> 
>     Matt
>  
> Thanks in advance,
> 
> Sami,
> 
> 
> 
> 
>> On 27 May 2022, at 20:45, Matthew Knepley <knepley at gmail.com> wrote:
>> 
>> On Fri, May 27, 2022 at 9:42 AM Sami BEN ELHAJ SALAH <sami.ben-elhaj-salah at ensma.fr> wrote:
>> Hello Isaac,
>> 
>> Thank you for your reply!
>> 
>> To confirm: when I use DMCreateMatrix() with the orig_dm, I do get my jacobian_matrix. I have also succeeded in solving my system, and the solution converged when using one process.
>> Let me give you some more information about my code. Currently, I am using my own discretization system, not the PetscDS object. For the nonlinear solver SNES, I use the following routines (the basic SNES usage):
>> - SNESSetFunction(snes, residual_vector, compute_residual, (void*) _finite_element_formulation)
>> - SNESSetJacobian(snes, jacobian.matrix(), _jacobian_precond_matrix, compute_jacobian, (void*) _finite_element_formulation)
>> - SNESSolve(snes, NULL, x)
>> 
>> Regarding your last answer, let me reformulate my question as follows:
>> using a distributed dm instead of the original (non-distributed) dm, together with my own discretization system (not the PetscDS object),
>> 
>> You do not have to use PetscDS, but the DM does need a PetscSection in order to compute sizes and sparsity patterns. Do you have that?
>> 
>>   Thanks,
>> 
>>      Matt
>>  
>> should I add something specific to get a jacobian_matrix that is distributed over the processes?
>> Note that I simply replaced the orig_dm by a distributed mesh using the routine from my first mail. Is that enough?
>>  
>> Thank you and have a good day,
>> Sami,
>> 
>> 
>> 
>> 
>> 
>>> On 25 May 2022, at 19:41, Toby Isaac <toby.isaac at gmail.com> wrote:
>>> 
>>> Hi Sami,
>>> 
>>> Just to verify: if you call DMCreateMatrix() on orig_dm, do you get a
>>> Jacobian matrix?
>>> 
>>> The DMPlex must be told what kind of discretized fields you want a
>>> matrix for and what equations you are discretizing.  This is handled
>>> by the PetscDS object.  In snes/tutorials/ex59.c, see the code after
>>> DMGetDS() for an example.
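[Editorial note: for contrast with the manual-PetscSection route, the PetscDS setup mentioned above looks roughly like the fragment below. This is a hedged sketch modeled on the SNES tutorials: `dm` and `dim` are assumed to exist, the P1 Lagrange field is an illustrative choice, and the pointwise residual callbacks are left as a comment.]

```c
PetscFE fe;
PetscDS ds;

/* One P1 Lagrange field on a simplicial mesh; choices are illustrative */
PetscCall(PetscFECreateDefault(PETSC_COMM_WORLD, dim, 1, PETSC_TRUE, NULL, -1, &fe));
PetscCall(DMSetField(dm, 0, NULL, (PetscObject)fe));
PetscCall(PetscFEDestroy(&fe));
PetscCall(DMCreateDS(dm));   /* builds the PetscDS and a matching PetscSection */
PetscCall(DMGetDS(dm, &ds));
/* PetscDSSetResidual(ds, 0, f0, f1); etc. -- the discretized equations go here */
```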
>>> 
>>> - Toby
>>> 
>>> On Wed, May 25, 2022 at 1:17 PM Sami BEN ELHAJ SALAH
>>> <sami.ben-elhaj-salah at ensma.fr> wrote:
>>>> 
>>>> Dear PETSc developer team,
>>>> 
>>>> I am trying to create a jacobian_matrix from a distributed DM. I have followed two examples (snes/tests/ex2.c and ex56.c) and wrote this routine:
>>>> 
>>>> DM orig_dm;            /* created earlier */
>>>> DM dist_dm = NULL;
>>>> PetscPartitioner part;
>>>> PetscCall(DMPlexGetPartitioner(orig_dm, &part));
>>>> PetscCall(PetscPartitionerSetType(part, PETSCPARTITIONERPARMETIS));
>>>> PetscCall(DMPlexDistribute(orig_dm, 0, NULL, &dist_dm)); /* dist_dm stays NULL on a single rank */
>>>> 
>>>> Mat jacobian_matrix;
>>>> PetscCall(DMCreateMatrix(dist_dm, &jacobian_matrix));
>>>> PetscInt M, N, m, n;
>>>> PetscCall(MatGetSize(jacobian_matrix, &M, &N));
>>>> PetscCall(MatGetLocalSize(jacobian_matrix, &m, &n));
>>>> 
>>>> Then I ran my code with 2 processes and obtained this result:
>>>> Size of jacobian_matrix: M = 0, m = 0 (the same on every process).
>>>> 
>>>> Did I forget something in my code, so that I am not obtaining the correct local and global sizes for the jacobian_matrix? (In other words, is my code missing something needed to distribute the mesh over the processes?)
>>>> 
>>>> Thank you in advance for any help!
>>>> Sami,
>>>> 
>>>> 
>>>> 
>>>> 
>>>> 
>> 
>> 
>> 
>> -- 
>> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
>> -- Norbert Wiener
>> 
>> https://www.cse.buffalo.edu/~knepley/
> 