[PETSC #17426] DA object and dynamic load balancing

Barry Smith bsmith at mcs.anl.gov
Fri Mar 7 19:49:35 CST 2008


   At this point I would say it is time to move on to using a
dynamically partitioned unstructured mesh representation (note
that the discretization mesh might still be logically rectangular;
it is just that the partition information is stored as unstructured).

   With modern processors (not vector machines) it is a myth that a
properly written structured grid computation is much faster (in
terms of flop rate) than the same discretization on a properly
written unstructured representation.

    Barry


On Mar 7, 2008, at 5:21 PM, Mehdi Bostandoost wrote:

> Hi
>
> I used the DA object to write our 3D reacting flow solver, and it
> was quite useful.
> Because of the reaction inside the domain, it takes different
> amounts of time to solve the stiff equations. In the parts of the
> domain without reaction, solving the equations is fast, but in the
> parts with reaction it takes much longer.
> Therefore, there is an inherent load imbalance across the domain.
>
> When I use the DA object to decompose the mesh, I cannot find a
> way to address this imbalance through the DA functions.
>
> it would be great if you can have a function like
>
> DACreate1d(MPI_Comm comm,DAPeriodicType wrap,PetscInt M,PetscInt  
> dof,PetscInt s,PetscInt weight,PetscInt *lc,DA *inra)
>
> Through the weight argument, or something like it, you could
> dynamically balance your domain as you march in time. (For example,
> after every 50 iterations you could time your code for each
> subdomain and define weight = time_subdomain/time_max_subdomains.)
>
> Is it possible to have something like this? Or do you have any
> suggestions?
>
> thanks
>
> mehdi


