[petsc-users] Creating dmplex in every rank and redistributing

Prateek Gupta prateekgupta1709 at gmail.com
Mon Jul 11 12:05:02 CDT 2022


Thank you. In the meantime, could you let me know what these are? They
look like hash maps to me, but I am not sure what functions like
PetscHSetIGetElems are for. It would be really helpful if you could
pass along the old documentation.


Thank you.
Sincerely,
Prateek Gupta, PhD


On Mon, Jul 11, 2022 at 5:19 PM Matthew Knepley <knepley at gmail.com> wrote:

> On Mon, Jul 11, 2022 at 5:54 AM Prateek Gupta <prateekgupta1709 at gmail.com>
> wrote:
>
>> Thanks Matt.
>> While looking into the function, I noticed PetscHSetI and related functions
>> but couldn't find their docs (the links give a 404 error). Have these
>> functions and data types been deprecated?
>>
>
> No. These are a bit complicated because they are generated by the
> preprocessor from a third-party package.
>
> We broke the links when we moved all the documentation to Sphinx. I will
> see if I can fix them.
>
>   Thanks,
>
>       Matt
>
>
>> Thank you.
>> Sincerely,
>> Prateek Gupta
>>
>>
>> On Mon, Jul 4, 2022 at 6:09 PM Matthew Knepley <knepley at gmail.com> wrote:
>>
>>> On Mon, Jul 4, 2022 at 2:29 AM Prateek Gupta <prateekgupta1709 at gmail.com>
>>> wrote:
>>>
>>>> Hi,
>>>>
>>>> Using dmplex, I am trying to create an example where I can start with a
>>>> poor distribution of an unstructured mesh (reading from a file in parallel)
>>>> and then use redistribution to optimize it.
>>>>
>>>
>>> You can call DMPlexDistribute() on any mesh, including a parallel one.
>>>
>>>
>>>> I know that I can call DMPlexRebalanceSharedPoints on a distribution
>>>> already created by DMPlexDistribute. But is it possible to initialize a
>>>> DMPlex on each rank (each rank initializing its own chunk of the mesh) and
>>>> then call this function?
>>>>
>>>
>>> Yes, this is done in
>>>
>>>
>>> https://petsc.org/main/docs/manualpages/DMPLEX/DMPlexCreateFromCellListParallelPetsc/
>>>
>>> It is not trivial to do by hand, so I would look at that code first if
>>> you want to do that.
>>>
>>>
>>>> Most of the numbering in the DMPlex DAG representation is local, but while
>>>> reading from a file in parallel, I only have access to the global numbering
>>>> of nodes. Do I need to reassign this to a local numbering? Is there a
>>>> data structure within PETSc that can help with this?
>>>>
>>>>
>>>> Thank you.
>>>> Sincerely,
>>>> Prateek Gupta, PhD
>>>>
>>> --
>>> What most experimenters take for granted before they begin their
>>> experiments is infinitely more interesting than any results to which their
>>> experiments lead.
>>> -- Norbert Wiener
>>>
>>> https://www.cse.buffalo.edu/~knepley/
>>> <http://www.cse.buffalo.edu/~knepley/>
>>>
>>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
> <http://www.cse.buffalo.edu/~knepley/>
>