[petsc-users] DMPlex Distribution

Mark Adams mfadams at lbl.gov
Tue Sep 17 11:43:05 CDT 2019


On Tue, Sep 17, 2019 at 12:07 PM Mohammad Hassan via petsc-users <
petsc-users at mcs.anl.gov> wrote:

> Thanks for the suggestion. I am going to use block-based AMR. I think I
> need to know exactly how the blocks are distributed across the processors
> in order to implement the AMR.
>
> And as a general question, can we set the block size of a vector on each rank?
>

I don't understand exactly what you mean by AMR in this context. And I'm
not sure what you mean by block size. The block size is the number of dof
per vertex (e.g., 3), and it is constant for the whole vector.
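For illustration, here is a minimal sketch of setting a block size on a Vec
(the local length used below is made up). The same block size applies on
every rank, and the local length on each rank must be a multiple of it:

  #include <petscvec.h>

  int main(int argc, char **argv)
  {
    Vec            v;
    PetscInt       bs     = 3;   /* e.g. 3 dof per vertex */
    PetscInt       nlocal = 12;  /* hypothetical local length; must be a multiple of bs */
    PetscErrorCode ierr;

    ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
    ierr = VecCreate(PETSC_COMM_WORLD, &v);CHKERRQ(ierr);
    ierr = VecSetSizes(v, nlocal, PETSC_DETERMINE);CHKERRQ(ierr);
    /* One block size for the entire Vec; it cannot differ from rank to rank */
    ierr = VecSetBlockSize(v, bs);CHKERRQ(ierr);
    ierr = VecSetFromOptions(v);CHKERRQ(ierr);
    ierr = VecDestroy(&v);CHKERRQ(ierr);
    ierr = PetscFinalize();
    return ierr;
  }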


> Thanks
>
> Amir
>
> From: Matthew Knepley [mailto:knepley at gmail.com]
> Sent: Tuesday, September 17, 2019 11:04 PM
> To: Mohammad Hassan <mhbaghaei at mail.sjtu.edu.cn>
> Cc: PETSc <petsc-users at mcs.anl.gov>
> Subject: Re: [petsc-users] DMPlex Distribution
>
> On Tue, Sep 17, 2019 at 9:27 AM Mohammad Hassan via petsc-users <
> petsc-users at mcs.anl.gov> wrote:
>
> Hi
>
> I am using DMPlexCreateFromDAG() to construct my DM. Is it possible to set
> the distribution across processors manually? That is, how can I set the
> share of the DM that each rank owns (its local part)?
>
> You could make a Shell partitioner and tell it the entire partition:
>
> https://www.mcs.anl.gov/petsc/petsc-master/docs/manualpages/DMPLEX/PetscPartitionerShellSetPartition.html
>
> However, I would be surprised if you could do this. It is likely that you
> just want to mess with the weights in ParMetis.
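For illustration, a rough sketch of that shell-partitioner route, assuming
the DM has already been built (e.g. with DMPlexCreateFromDAG()) and the
desired partition is known. The function name and arrays here are
hypothetical; sizes[r] is the number of mesh points assigned to rank r and
points[] lists those points grouped by rank, as described on the man page
linked above:

  #include <petscdmplex.h>

  /* Distribute an existing DMPlex according to a user-supplied partition.
     nranks is the number of target ranks, sizes[r] the number of points
     assigned to rank r, and points[] those points grouped by rank. */
  PetscErrorCode DistributeWithShellPartition(DM dm, PetscInt nranks,
                                              const PetscInt sizes[],
                                              const PetscInt points[],
                                              DM *dmDist)
  {
    PetscPartitioner part;
    PetscErrorCode   ierr;

    PetscFunctionBeginUser;
    ierr = DMPlexGetPartitioner(dm, &part);CHKERRQ(ierr);
    ierr = PetscPartitionerSetType(part, PETSCPARTITIONERSHELL);CHKERRQ(ierr);
    ierr = PetscPartitionerShellSetPartition(part, nranks, sizes, points);CHKERRQ(ierr);
    ierr = DMPlexDistribute(dm, 0, NULL, dmDist);CHKERRQ(ierr); /* 0 = no overlap */
    PetscFunctionReturn(0);
  }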
>
>   Thanks,
>
>     Matt
>
> Thanks
>
> Amir
>
> --
>
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/