[petsc-users] Configure nested PCFIELDSPLIT with general index sets
Natacha BEREUX
natacha.bereux at gmail.com
Wed Mar 22 13:45:48 CDT 2017
Hello Matt,
Thanks a lot for your answers.
Since I am working on a large FEM Fortran code, I have to stick to Fortran.
Do you know if someone plans to add this Fortran interface? Or maybe I
could do it myself? Is this particular interface very hard to add?
Perhaps I could mimic some other interface?
What would you advise?
Best regards,
Natacha
On Wed, Mar 22, 2017 at 12:33 PM, Matthew Knepley <knepley at gmail.com> wrote:
> On Wed, Mar 22, 2017 at 10:03 AM, Natacha BEREUX <natacha.bereux at gmail.com> wrote:
>
>> Hello,
>> if my understanding is correct, the approach proposed by Matt and
>> Lawrence is the following (a C sketch of these steps is included below):
>> - create a DMShell (DMShellCreate)
>> - define my own CreateFieldDecomposition to return the index sets I need
>> (for displacement, pressure and temperature degrees of freedom):
>> myCreateFieldDecomposition(...)
>> - set it in the DMShell (DMShellSetCreateFieldDecomposition)
>> - then set the DM in the KSP context (KSPSetDM)
>>
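For concreteness, a rough C sketch of those four steps (error checking omitted; AttachFieldSplitDM and MyCreateFieldDecomposition are hypothetical user routine names, not PETSc functions; the missing Fortran binding discussed below is for DMShellSetCreateFieldDecomposition):

    #include <petscksp.h>

    /* user callback returning the displacement/pressure/temperature index
       sets; a possible body is sketched further down in this thread */
    extern PetscErrorCode MyCreateFieldDecomposition(DM,PetscInt*,char***,IS**,DM**);

    /* Attach a DMShell carrying the field decomposition to a KSP.
       A is the assembled operator, x a global Vec with the right layout. */
    PetscErrorCode AttachFieldSplitDM(KSP ksp, Mat A, Vec x)
    {
      DM shell;

      PetscFunctionBeginUser;
      DMShellCreate(PetscObjectComm((PetscObject)ksp), &shell);
      DMShellSetCreateFieldDecomposition(shell, MyCreateFieldDecomposition);
      DMShellSetGlobalVector(shell, x);   /* template for DMCreateGlobalVector */

      KSPSetOperators(ksp, A, A);
      KSPSetDM(ksp, shell);
      KSPSetDMActive(ksp, PETSC_FALSE);   /* the DM only describes the splits;
                                             it does not build the operator */
      DMDestroy(&shell);                  /* the KSP keeps its own reference */
      PetscFunctionReturn(0);
    }

With -pc_type fieldsplit on the command line, PCFieldSplit should then pick up the three splits from the callback.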
>> I have some more questions:
>> - I did not succeed in setting my own CreateFieldDecomposition in the
>> DMShell: the link fails with "unknown reference to
>> « dmshellsetcreatefielddecomposition_ »". Could it be a Fortran problem (I
>> am using Fortran)? Is this routine available in the PETSc Fortran interface?
>>
>
> Yes, exactly. The Fortran interface for passing function pointers is
> complex, and no one has added this function yet.
>
>
>> - CreateFieldDecomposition is supposed to return an array of DMs (to
>> define the fields). I am not able to return such data. Do I return
>> PETSC_NULL_OBJECT instead?
>>
>
> Yes.
>
>
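For illustration, one possible C body for that callback. AppCtx is a hypothetical user struct holding the three index sets, stored earlier with DMSetApplicationContext; the NULL dmlist is the C counterpart of returning PETSC_NULL_OBJECT from Fortran:

    typedef struct { IS isU, isP, isT; } AppCtx;   /* hypothetical user context */

    PetscErrorCode MyCreateFieldDecomposition(DM dm, PetscInt *nfields,
                                              char ***names, IS **islist, DM **dmlist)
    {
      AppCtx *user;

      PetscFunctionBeginUser;
      DMGetApplicationContext(dm, &user);   /* set with DMSetApplicationContext */
      *nfields = 3;
      PetscMalloc1(3, names);
      PetscMalloc1(3, islist);
      PetscStrallocpy("displacement", &(*names)[0]);
      PetscStrallocpy("pressure",     &(*names)[1]);
      PetscStrallocpy("temperature",  &(*names)[2]);
      PetscObjectReference((PetscObject)user->isU);  /* the caller destroys the */
      PetscObjectReference((PetscObject)user->isP);  /* returned index sets     */
      PetscObjectReference((PetscObject)user->isT);
      (*islist)[0] = user->isU;
      (*islist)[1] = user->isP;
      (*islist)[2] = user->isT;
      *dmlist = NULL;   /* no sub-DMs: PCFieldSplit works from the ISes alone */
      PetscFunctionReturn(0);
    }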
>> - do I have to provide anything else to define the DMShell?
>>
>
> I think you will have to return local and global vectors, but this just
> means creating a vector of the correct size and distribution.
>
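As a hedged sketch of that last point, the shell can either be handed template vectors or be given creation routines. Here shell is the DMShell from the sketch above, and xGlobal/xLocal are application vectors with the global and local (ghosted) layouts:

    /* either provide template vectors ... */
    DMShellSetGlobalVector(shell, xGlobal);
    DMShellSetLocalVector(shell, xLocal);

    /* ... or register routines that build them on demand
       (MyCreateGlobalVec/MyCreateLocalVec are user functions
        with signature PetscErrorCode (*)(DM, Vec*)) */
    DMShellSetCreateGlobalVector(shell, MyCreateGlobalVec);
    DMShellSetCreateLocalVector(shell, MyCreateLocalVec);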
> Thanks,
>
> Matt
>
>
>> Thanks a lot for your help
>> Natacha
>>
>> On Tue, Mar 21, 2017 at 2:44 PM, Natacha BEREUX <natacha.bereux at gmail.com> wrote:
>>
>>> Thanks for your quick answers. To be honest, I am not familiar at all
>>> with DMShells and DMPlexes. But since this is what I need, I am going to
>>> try it.
>>> Thanks again for your advice,
>>> Natacha
>>>
>>> On Tue, Mar 21, 2017 at 2:27 PM, Lawrence Mitchell <lawrence.mitchell at imperial.ac.uk> wrote:
>>>
>>>>
>>>> > On 21 Mar 2017, at 13:24, Matthew Knepley <knepley at gmail.com> wrote:
>>>> >
>>>> > I think the remedy is as easy as specifying a DMShell that has a
>>>> PetscSection (DMSetDefaultSection) with your ordering, and
>>>> > I think this is how Firedrake (http://www.firedrakeproject.org/)
>>>> does it.
>>>>
>>>> We actually don't use a section, but we do provide
>>>> DMCreateFieldDecomposition_Shell.
>>>>
>>>> If you have a section that describes all the fields, then I think that,
>>>> if the DMShell knows about it, you effectively get the same behaviour as
>>>> DMPlex (which does the decomposition in the same manner?).
>>>>
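A rough sketch of that section-based variant, using placeholder dof counts (3 displacement + 1 pressure + 1 temperature dofs per mesh point) and the function name as it was in the PETSc releases of this thread (DMSetDefaultSection; later renamed DMSetLocalSection). pStart/pEnd are the application's mesh point range:

    PetscSection s;
    PetscInt     p, pStart, pEnd;   /* known to the application */

    PetscSectionCreate(PETSC_COMM_WORLD, &s);
    PetscSectionSetNumFields(s, 3);
    PetscSectionSetFieldName(s, 0, "displacement");
    PetscSectionSetFieldName(s, 1, "pressure");
    PetscSectionSetFieldName(s, 2, "temperature");
    PetscSectionSetChart(s, pStart, pEnd);
    for (p = pStart; p < pEnd; ++p) {
      PetscSectionSetDof(s, p, 5);           /* total dofs at point p   */
      PetscSectionSetFieldDof(s, p, 0, 3);   /* displacement components */
      PetscSectionSetFieldDof(s, p, 1, 1);   /* pressure                */
      PetscSectionSetFieldDof(s, p, 2, 1);   /* temperature             */
    }
    PetscSectionSetUp(s);
    DMSetDefaultSection(dm, s);              /* dm: the DMShell          */
    PetscSectionDestroy(&s);                 /* the DM keeps a reference */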
>>>> > However, I usually use a DMPlex which knows about my
>>>> > mesh, so I am not sure if this strategy has any holes.
>>>>
>>>> I haven't noticed anything yet.
>>>>
>>>> Lawrence
>>>
>>>
>>>
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>