[petsc-users] Creating dmplex in every rank and redistributing

Prateek Gupta prateekgupta1709 at gmail.com
Mon Jul 11 05:54:18 CDT 2022


Thanks Matt.
While looking into that function, I noticed PetscHSetI and related functions,
but I couldn't find their documentation (the pages return a 404 error). Have
these functions and data types been deprecated?

Thank you.
Sincerely,
Prateek Gupta
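[Editor's note: for readers of the archive, PetscHSetI is a hash set of
PetscInt keys declared in the semi-private header petsc/private/hashseti.h.
A minimal usage sketch follows; the function names are from recent PETSc
versions and should be checked against the installed release:

```c
#include <petsc/private/hashseti.h> /* semi-private header providing PetscHSetI */

int main(int argc, char **argv)
{
  PetscHSetI set;
  PetscBool  has;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));

  /* Create a hash set of PetscInt keys, insert one, and query membership */
  PetscCall(PetscHSetICreate(&set));
  PetscCall(PetscHSetIAdd(set, 42));
  PetscCall(PetscHSetIHas(set, 42, &has)); /* has should be PETSC_TRUE */
  PetscCall(PetscHSetIDestroy(&set));

  PetscCall(PetscFinalize());
  return 0;
}
```
]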


On Mon, Jul 4, 2022 at 6:09 PM Matthew Knepley <knepley at gmail.com> wrote:

> On Mon, Jul 4, 2022 at 2:29 AM Prateek Gupta <prateekgupta1709 at gmail.com>
> wrote:
>
>> Hi,
>>
>> Using dmplex, I am trying to create an example where I can start with a
>> poor distribution of an unstructured mesh (reading from a file in parallel)
>> and then use redistribution to optimize it.
>>
>
> You can call DMPlexDistribute() on any mesh, including a parallel one.
>
>
>> I know that I can call DMPlexRebalanceSharedPoints on a distribution
>> already created by DMPlexDistribute. But is it possible to initialize a
>> DMPlex from each rank (each rank initializes its own chunk of the mesh) and
>> then call this function?
>>
>
> Yes, this is done in
>
>
> https://petsc.org/main/docs/manualpages/DMPLEX/DMPlexCreateFromCellListParallelPetsc/
>
> It is not trivial to do by hand, so I would look at that code first if you
> want to do that.
>
>
>> Most of the numbering in the DMPlex DAG representation is local, but while
>> reading from a file in parallel, I only have access to the global numbering
>> of nodes. Do I need to reassign this to a local numbering? Is there a data
>> structure within PETSc that can help with this?
>>
>>
>> Thank you.
>> Sincerely,
>> Prateek Gupta, PhD
>>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
>
>
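[Editor's note: the flow Matt describes above (each rank contributes its own
chunk of cells with vertices in global numbering, then the mesh is
redistributed or rebalanced) might be sketched roughly as below. The argument
lists are abbreviated from memory and should be checked against the
DMPlexCreateFromCellListParallelPetsc and DMPlexRebalanceSharedPoints man
pages for the installed PETSc version:

```c
#include <petscdmplex.h>

/* Sketch only: cells[] holds each rank's chunk using GLOBAL vertex numbers;
   PETSc constructs the local numbering and the vertex star forest (SF)
   internally, which answers the local-vs-global numbering question. */
static PetscErrorCode BuildAndRebalance(MPI_Comm comm, PetscInt dim,
                                        PetscInt numCells, PetscInt numVertices,
                                        PetscInt NVertices, PetscInt numCorners,
                                        const PetscInt cells[],
                                        const PetscReal coords[], DM *dmOut)
{
  DM        dm, dmDist;
  PetscSF   vertexSF;
  PetscBool success;

  PetscFunctionBeginUser;
  /* Each rank passes its own (possibly poorly distributed) chunk of the mesh */
  PetscCall(DMPlexCreateFromCellListParallelPetsc(comm, dim, numCells,
                                                  numVertices, NVertices,
                                                  numCorners, PETSC_TRUE, cells,
                                                  dim, coords, &vertexSF,
                                                  NULL, &dm));
  /* Option 1: fully repartition the parallel mesh ... */
  PetscCall(DMPlexDistribute(dm, 0, NULL, &dmDist));
  if (dmDist) {
    PetscCall(DMDestroy(&dm));
    dm = dmDist;
  }
  /* ... Option 2: only rebalance ownership of shared points (vertices) */
  PetscCall(DMPlexRebalanceSharedPoints(dm, 0, PETSC_FALSE, PETSC_TRUE, &success));
  PetscCall(PetscSFDestroy(&vertexSF));
  *dmOut = dm;
  PetscFunctionReturn(0);
}
```
]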
