[petsc-users] Software for load balancing to improve parallel performance to be used with PETSc

TAY wee-beng zonexo at gmail.com
Mon Jan 9 13:11:00 CST 2012


Hi Jed,

On 9/1/2012 5:07 PM, Jed Brown wrote:
> On Mon, Jan 9, 2012 at 09:57, TAY wee-beng <zonexo at gmail.com 
> <mailto:zonexo at gmail.com>> wrote:
>
>     I only compute rows which the processor owns. Could it be the memory
>     allocation? I'll check on that.
>
>
> Usually memory allocation mistakes cost much more.
>
>>
>>     Most of your solve time is going into PCSetUp() and PCApply(), both
>>     of which are getting more expensive as you add processes. These
>>     take more than 10x the time spent in MatMult(), and MatMult()
>>     takes slightly less time on more processes, so the increase isn't
>>     entirely due to memory issues.
>>
>>     What methods are you using?
>
>     What do you mean by methods? I am doing Cartesian grid 3D CFD,
>
>
> Then just use a regular partition. If your meshes have boundary 
> layers, you might want to adjust the subdomain aspect ratios so that 
> strongly coupled cells tend to reside on the same process, but don't 
> bother with general graph partitions (like ParMETIS) because you will 
> end up writing most of an unstructured CFD code in order to use those 
> partitions efficiently.

Can you explain a bit more about how to adjust the subdomain aspect 
ratios so that strongly coupled cells tend to reside on the same process?
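For example, is the idea something like the following, where I fix the
process grid passed to DMDACreate3d so that the wall-normal direction (say
z, where a boundary layer sits) is never split across processes? The grid
size and process counts below are just made-up numbers for illustration,
and the exact argument list differs slightly between PETSc versions:

#include <petscdmda.h>

int main(int argc, char **argv)
{
  DM             da;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  /* 128^3 Cartesian grid on 8 processes, laid out as a 4 x 2 x 1 process
     grid so the z direction stays entirely within each subdomain; the
     subdomains end up elongated in z, keeping the strongly coupled
     boundary-layer cells on the same process.                           */
  ierr = DMDACreate3d(PETSC_COMM_WORLD,
                      DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                      DMDA_STENCIL_STAR,
                      128, 128, 128,    /* global grid size M, N, P       */
                      4, 2, 1,          /* process grid m, n, p           */
                      1, 1,             /* dof per node, stencil width    */
                      NULL, NULL, NULL, /* default ownership ranges       */
                      &da);CHKERRQ(ierr);
  ierr = DMSetFromOptions(da);CHKERRQ(ierr);
  ierr = DMSetUp(da);CHKERRQ(ierr);
  ierr = DMDestroy(&da);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return 0;
}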
>
>     using a fractional step method which solves the momentum and Poisson
>     eqns. I construct the linear eqn matrices and insert them into PETSc
>     matrices/vectors. Then I solve using BiCGSTAB and hypre AMG
>     respectively. Why are PCSetUp() and PCApply() using more time?
>
>
> It is expensive because BoomerAMG setup and apply are expensive.

So this is normal? Are there any suggestions to improve performance?
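For reference, the solver setup I described above is roughly the following
sketch (the matrix A and vectors x, b are assembled elsewhere, and the
KSPSetOperators argument list differs slightly between PETSc versions):

  KSP ksp;
  PC  pc;

  ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp, A, A);CHKERRQ(ierr);  /* A assembled elsewhere */
  ierr = KSPSetType(ksp, KSPBCGS);CHKERRQ(ierr);    /* BiCGSTAB              */
  ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
  ierr = PCSetType(pc, PCHYPRE);CHKERRQ(ierr);
  ierr = PCHYPRESetType(pc, "boomeramg");CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);      /* pick up -pc_hypre_* options */
  ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);

With KSPSetFromOptions in place I can at least experiment with the
BoomerAMG run-time options (e.g. -pc_hypre_boomeramg_strong_threshold) and
compare the PCSetUp()/PCApply() timings from -log_summary.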

Thanks!