[petsc-users] DMPlex Distribution

Mohammad Hassan mhbaghaei at mail.sjtu.edu.cn
Wed Sep 18 10:01:57 CDT 2019


Thanks for your suggestion, Matthew. I will certainly look into DMForest for refining my base DMPlex dm.

 

From: Matthew Knepley [mailto:knepley at gmail.com] 
Sent: Wednesday, September 18, 2019 10:35 PM
To: Mohammad Hassan <mhbaghaei at mail.sjtu.edu.cn>
Cc: PETSc <petsc-users at mcs.anl.gov>
Subject: Re: [petsc-users] DMPlex Distribution

 

On Wed, Sep 18, 2019 at 10:27 AM Mohammad Hassan <mhbaghaei at mail.sjtu.edu.cn> wrote:

I want to implement block-based AMR, which turns my conformal base mesh into a non-conformal one. My question is how DMPlex can handle such a mesh if it cannot support non-conformal meshes.

 

Mark misspoke. Plex _does_ support geometrically non-conforming meshing, e.g. "hanging nodes". The easiest way to use Plex this way is to use DMForest, which uses Plex underneath.

 

There are excellent p4est tutorials. What you would do is create your conformal mesh, using Plex if you want, and use that for the p4est base mesh (you would have the base mesh be the forest roots).
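
For concreteness, a minimal sketch of that wiring, assuming a recent PETSc (PetscCall() error checking) and an existing conformal Plex; the function name and refinement level are illustrative, not taken from any PETSc example:

#include <petscdmplex.h>
#include <petscdmforest.h>

/* Wrap an existing conformal Plex in a p4est forest; the base cells become the forest roots. */
static PetscErrorCode BuildForestFromPlex(DM base, PetscInt initRefine, DM *forest)
{
  DM plex;

  PetscFunctionBeginUser;
  PetscCall(DMCreate(PetscObjectComm((PetscObject)base), forest));
  PetscCall(DMSetType(*forest, DMP4EST));                 /* DMP8EST for a 3D base mesh */
  PetscCall(DMForestSetBaseDM(*forest, base));            /* conformal base mesh = forest roots */
  PetscCall(DMForestSetInitialRefinement(*forest, initRefine));
  PetscCall(DMSetUp(*forest));
  /* If a Plex view of the refined, non-conformal mesh is needed: */
  PetscCall(DMConvert(*forest, DMPLEX, &plex));
  PetscCall(DMDestroy(&plex));                            /* or keep it and discretize on it */
  PetscFunctionReturn(PETSC_SUCCESS);
}

Later adaptation would then go through the DMForest adaptivity interface (e.g. DMForestSetAdaptivityLabel()); the sketch above only covers the base-mesh setup.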

 

  Thanks,

 

     Matt

 

If DMPlex does not work, I will try to use DMForest.  

 

From: Matthew Knepley [mailto:knepley at gmail.com] 
Sent: Wednesday, September 18, 2019 9:50 PM
To: Mohammad Hassan <mhbaghaei at mail.sjtu.edu.cn>
Cc: Mark Adams <mfadams at lbl.gov>; PETSc <petsc-users at mcs.anl.gov>
Subject: Re: [petsc-users] DMPlex Distribution

 

On Wed, Sep 18, 2019 at 9:35 AM Mohammad Hassan via petsc-users <petsc-users at mcs.anl.gov> wrote:

If DMPlex does not support it, I may need to use PARAMESH or CHOMBO. Is there any way that we can construct a non-conformal layout for a DM in PETSc?

 

Let's see. Plex does support geometrically non-conforming meshes. This is how we support p4est. However, if you want that, you can just use DMForest, I think. So you just want structured AMR?

 

  Thanks,

 

    Matt

 

 

From: Mark Adams [mailto:mfadams at lbl.gov] 
Sent: Wednesday, September 18, 2019 9:23 PM
To: Mohammad Hassan <mhbaghaei at mail.sjtu.edu.cn>
Cc: Matthew Knepley <knepley at gmail.com>; PETSc users list <petsc-users at mcs.anl.gov>
Subject: Re: [petsc-users] DMPlex Distribution

 

I'm puzzled. It sounds like you are doing non-conforming AMR (structured block AMR), but Plex does not support that.

 

On Tue, Sep 17, 2019 at 11:41 PM Mohammad Hassan via petsc-users <petsc-users at mcs.anl.gov> wrote:

Mark is right. The AMR functionality is not related to its parallelization. The vector size (global or local) does not conflict with the AMR functions.

Thanks

 

Amir

 

From: Matthew Knepley [mailto:knepley at gmail.com] 
Sent: Wednesday, September 18, 2019 12:59 AM
To: Mohammad Hassan <mhbaghaei at mail.sjtu.edu.cn>
Cc: PETSc <petsc-maint at mcs.anl.gov>
Subject: Re: [petsc-users] DMPlex Distribution

 

On Tue, Sep 17, 2019 at 12:03 PM Mohammad Hassan <mhbaghaei at mail.sjtu.edu.cn> wrote:

Thanks for the suggestion. I am going to use block-based AMR. I think I need to know exactly how the blocks of the mesh are distributed across the different processors to implement the AMR.

 

Hi Amir,

 

How are you using Plex if the block-AMR is coming from somewhere else? This will help me tell you what would be best.

 

And as a general question, can we set the block size of a vector on each rank?

 

I think, as Mark says, that you are using "blocksize" in a different way than PETSc does.
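
For what it is worth, in PETSc the block size is the number of interleaved components per point and must be the same on every rank; the share each rank owns is its local size. A minimal sketch of that distinction (the names nLocalPoints and dof are purely illustrative, and recent PetscCall() error checking is assumed):

#include <petscvec.h>

/* Create a vector with dof interleaved components per point and nLocalPoints points on this rank. */
static PetscErrorCode CreatePointVector(MPI_Comm comm, PetscInt nLocalPoints, PetscInt dof, Vec *v)
{
  PetscFunctionBeginUser;
  PetscCall(VecCreate(comm, v));
  PetscCall(VecSetSizes(*v, nLocalPoints * dof, PETSC_DECIDE)); /* per-rank local length */
  PetscCall(VecSetBlockSize(*v, dof));                          /* components per point, same on all ranks */
  PetscCall(VecSetFromOptions(*v));
  PetscFunctionReturn(PETSC_SUCCESS);
}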

 

  Thanks,

 

    Matt

 

Thanks

Amir

 

From: Matthew Knepley [mailto:knepley at gmail.com] 
Sent: Tuesday, September 17, 2019 11:04 PM
To: Mohammad Hassan <mhbaghaei at mail.sjtu.edu.cn>
Cc: PETSc <petsc-users at mcs.anl.gov>
Subject: Re: [petsc-users] DMPlex Distribution

 

On Tue, Sep 17, 2019 at 9:27 AM Mohammad Hassan via petsc-users <petsc-users at mcs.anl.gov> wrote:

Hi

I am using DMPlexCreateFromDAG() to construct my DM. Is it possible to set the distribution across processors manually? I mean, how can I set the share of the dm on each rank (the local part)?

 

You could make a Shell partitioner and tell it the entire partition:

 

  https://www.mcs.anl.gov/petsc/petsc-master/docs/manualpages/DMPLEX/PetscPartitionerShellSetPartition.html

 

However, I would be surprised if you could do this. It is likely that you just want to mess with the weights in ParMetis.
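
A sketch of that Shell-partitioner route, in case it helps; the helper name and the arrays describing the block layout are assumptions you would fill in from your own AMR data, and recent PetscCall() error checking is assumed:

#include <petscdmplex.h>

/*
  Distribute a serial Plex according to a user-given partition:
  cellsPerRank[r] is the number of cells assigned to rank r, and cellOrder[]
  lists the cell numbers in rank order (rank 0 cells first, then rank 1, ...).
*/
static PetscErrorCode DistributeWithShell(DM dm, PetscInt nranks, const PetscInt cellsPerRank[], const PetscInt cellOrder[], DM *dmDist)
{
  PetscPartitioner part;

  PetscFunctionBeginUser;
  PetscCall(DMPlexGetPartitioner(dm, &part));
  PetscCall(PetscPartitionerSetType(part, PETSCPARTITIONERSHELL));
  PetscCall(PetscPartitionerShellSetPartition(part, nranks, cellsPerRank, cellOrder));
  PetscCall(DMPlexDistribute(dm, 0, NULL, dmDist)); /* overlap = 0 */
  PetscFunctionReturn(PETSC_SUCCESS);
}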

 

  Thanks,

 

    Matt

 

Thanks

Amir




 

-- 

What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

 

https://www.cse.buffalo.edu/~knepley/




 

