[petsc-users] parallel dual porosity

Adrian Croucher a.croucher at auckland.ac.nz
Mon May 27 18:26:14 CDT 2019


hi

A couple of years back I was asking questions here about implementing 
"dual porosity" finite volume methods via PETSc (in which flow in 
fractured media is represented by adding extra "matrix" cells nested 
inside the original mesh cells).

At the time I was asking how to solve the resulting linear 
equations more efficiently (I still haven't worked on that part, so at 
present it just uses a naive linear solve which doesn't take advantage 
of the particular sparsity pattern), and how to add the extra cells 
into the DMPlex mesh, which I did figure out.

It is working OK, except that strong scaling performance is not very 
good when dual porosity is applied over only part of the mesh. I think 
the reason is that I read the mesh in and distribute it, then add the 
dual porosity cells in parallel on each process. Some processes can 
therefore end up with more cells than others, and the load balance suffers.

I'm considering changing it so that I add the dual porosity cells to 
the DMPlex in serial, before distribution, to restore decent load 
balancing (roughly as in the sketch below).
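
The order of operations I have in mind is something like the following 
(just a sketch, not tested; AddDualPorosityCells() is only a stand-in 
for my own cell-insertion routine):

  PetscErrorCode ierr;
  DM             dmDist = NULL;
  PetscSF        migrationSF;

  /* dm is the serial DMPlex, straight from DMPlexCreateFromFile() or similar */

  /* add the extra "matrix" cells while the mesh is still serial */
  ierr = AddDualPorosityCells(dm);CHKERRQ(ierr);

  /* distribute afterwards, keeping the migration SF so serial fields can follow */
  ierr = DMPlexDistribute(dm, 0, &migrationSF, &dmDist);CHKERRQ(ierr);
  if (dmDist) {
    ierr = DMDestroy(&dm);CHKERRQ(ierr);
    dm = dmDist;
  }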

To do that, I'd also need to compute the cell centroids in serial (as 
they are often used to identify which cells should have dual porosity 
applied), using DMPlexComputeGeometryFVM(). The geometry vectors would 
then have to be distributed afterwards, I guess using something like 
DMPlexDistributeField(), as in the second sketch below.
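
Something like this is what I had in mind (again untested, and I may 
have some of the details wrong; dm, dmDist and migrationSF are the 
serial mesh, distributed mesh and migration SF from the first sketch):

  Vec          cellGeomSerial, faceGeomSerial, cellGeomDist;
  PetscSection cgSectionSerial, cgSectionDist;
  DM           cgDM;

  /* centroids and volumes computed on the serial mesh, before distribution */
  ierr = DMPlexComputeGeometryFVM(dm, &cellGeomSerial, &faceGeomSerial);CHKERRQ(ierr);

  /* the cell geometry Vec carries its own layout (a section on a cloned DM) */
  ierr = VecGetDM(cellGeomSerial, &cgDM);CHKERRQ(ierr);
  ierr = DMGetSection(cgDM, &cgSectionSerial);CHKERRQ(ierr);

  /* after DMPlexDistribute(), push the geometry across to the distributed mesh */
  ierr = PetscSectionCreate(PetscObjectComm((PetscObject)dmDist), &cgSectionDist);CHKERRQ(ierr);
  ierr = VecCreate(PetscObjectComm((PetscObject)dmDist), &cellGeomDist);CHKERRQ(ierr);
  ierr = DMPlexDistributeField(dm, migrationSF, cgSectionSerial, cellGeomSerial,
                               cgSectionDist, cellGeomDist);CHKERRQ(ierr);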

Should I expect a significant performance hit from calling 
DMPlexComputeGeometryFVM() on the serial mesh, compared with doing it 
(as now) on the distributed mesh? It will increase the serial fraction 
of the code, but as it's only done once at start-up, I'm hoping the 
benefits will outweigh the costs.

- Adrian

-- 
Dr Adrian Croucher
Senior Research Fellow
Department of Engineering Science
University of Auckland, New Zealand
email: a.croucher at auckland.ac.nz
tel: +64 (0)9 923 4611


