[petsc-users] Direct solver for 3D-FEM

Thomas Witkowski thomas.witkowski at tu-dresden.de
Fri May 20 08:59:55 CDT 2011


Jed Brown wrote:
> On Fri, May 20, 2011 at 15:16, Thomas Witkowski
> <thomas.witkowski at tu-dresden.de> wrote:
>
>     Thanks for the information. One related question: The system that
>     is solved here is a 3x3 block matrix: two of these blocks are
>     empty, three are the discretized (FEM) identity, three are the
>     discretized Laplace operator
>
>
> What equation? Have you assumed special boundary conditions? Problems
> like elasticity and Stokes normally need a symmetric stress tensor.
> Only for constant coefficients and certain boundary conditions is it
> okay to use the "block Laplacian" form.
It is the so-called phase field crystal equation (PFC, a 6th-order 
nonlinear parabolic PDE). The block structure results because my code 
can solve only systems of 2nd-order PDEs. Because there is a Laplace^3 
in the original equation, I get all the Laplacians through operator 
splitting. The whole block matrix looks as follows: [-L, 0, I;  I, -L, 
0; C, I, -L], where I is the identity operator, L the Laplace operator, 
and C = 2 L - eps I + v I, with eps a number and v the solution from 
the previous timestep. There are many different ways to write the 
system. This one is not symmetric, but it is more diagonally dominant. 
Boundary conditions are zero-flux on all boundaries.
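For reference, the same system in standard notation (a reconstruction 
from the description above; the unknowns u_1, u_2, u_3 and the 
right-hand side f are my labels, not from the original message):

\[
\begin{pmatrix} -L & 0 & I \\ I & -L & 0 \\ C & I & -L \end{pmatrix}
\begin{pmatrix} u_1 \\ u_2 \\ u_3 \end{pmatrix} = f,
\qquad C = 2L - \varepsilon I + vI .
\]

Eliminating u_2 and u_3 produces a single equation for u_1 containing 
L^3, i.e. the 6th-order operator; the splitting keeps every block 
2nd order.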
>  
>
>     and one block is the discretization of a complex sum of operators.
>     In total, there are only three different nonzero blocks. Can I
>     somehow take advantage of this structure? I could create the
>     inverses of the three different blocks, but is there then a
>     cheap/simple way to produce the action of the inverse of the whole
>     3x3 block matrix?
>
>
> For the special case of a Kronecker product, you can solve the whole
> system using solves with the blocks, but not in the general case.
> Consider the structured-grid 2D Laplacian [A, -I; -I, A] where A is
> the tridiagonal matrix of a 1D Laplacian plus 2*I on the diagonal. If
> you could build a direct solver for the 2D problem using direct
> solvers for the 1D problem, then you would have an O(n) algorithm for
> a direct solve of a 2D Laplacian.
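
To make the distinction concrete in standard notation: for a genuine 
Kronecker product the inverse factors,

\[
(B \otimes C)^{-1} = B^{-1} \otimes C^{-1},
\]

so solves with the blocks compose into a solve for the whole system. 
The 2D Laplacian, however, is a Kronecker sum,

\[
L_{2D} = L_{1D} \otimes I + I \otimes L_{1D},
\]

whose inverse does not factor into solves with L_{1D}.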
>
> You can, however, build an iterative method from the blocks. If each
> block is built separately, you can (with petsc-dev) put them into a
> MatNest and use PCFieldSplit with direct solves in the splits.
I will check this.
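
A minimal sketch of what that MatNest/PCFieldSplit setup could look 
like (assuming a recent PETSc API; the calls have changed slightly 
since 2011, and the submatrices mL, mI, mC and the index sets isu, 
isv, isw are placeholders for the assembled blocks and field layouts):

  /* mL holds the -L block, mI the identity, mC the C block;
     the a[] array of MatCreateNest is row-major */
  Mat blocks[9] = { mL, NULL, mI,      /* row 1: -L,  0,  I */
                    mI, mL,   NULL,    /* row 2:  I, -L,  0 */
                    mC, mI,   mL };    /* row 3:  C,  I, -L */
  IS  fields[3] = { isu, isv, isw };
  Mat A;
  KSP ksp;
  PC  pc;

  MatCreateNest(PETSC_COMM_WORLD, 3, fields, 3, fields, blocks, &A);

  KSPCreate(PETSC_COMM_WORLD, &ksp);
  KSPSetOperators(ksp, A, A);
  KSPGetPC(ksp, &pc);
  PCSetType(pc, PCFIELDSPLIT);
  PCFieldSplitSetIS(pc, "0", isu);
  PCFieldSplitSetIS(pc, "1", isv);
  PCFieldSplitSetIS(pc, "2", isw);
  KSPSetFromOptions(ksp);

Direct solves in the splits can then be requested at run time, e.g.

  -fieldsplit_0_ksp_type preonly -fieldsplit_0_pc_type lu

(and similarly for splits 1 and 2).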
>
> An alternative is to interlace the degrees of freedom and use the 
> (S)BAIJ matrix format. This will reduce the amount of memory needed 
> for your matrix (by using a more efficient storage format) and will 
> improve the memory performance. Unfortunately, there are no 
> third-party direct solvers for BAIJ. Note that your assembly code can
> always call MatSetValuesBlocked() (or MatSetValuesBlockedLocal())
> regardless of the matrix format, so you can switch between AIJ and
> BAIJ at run-time.
>
> Why bother with a direct solver if your problem has constant 
> coefficients? 
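
And a minimal sketch of such interlaced blocked assembly (again a 
sketch, not the poster's code: n stands for the number of mesh nodes, 
the element loop is only indicated, and preallocation is omitted for 
brevity):

  Mat         A;
  PetscInt    brow, bcol;   /* block row/column from the element loop */
  PetscScalar v[9] = {0};   /* one 3x3 block, row-major; fill from the
                               element matrix */

  MatCreate(PETSC_COMM_WORLD, &A);
  MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, 3*n, 3*n);
  MatSetBlockSize(A, 3);
  MatSetFromOptions(A);     /* honors -mat_type aij / baij / sbaij */
  MatSetUp(A);

  /* inside the element loop: insert one 3x3 block per node pair */
  MatSetValuesBlocked(A, 1, &brow, 1, &bcol, v, ADD_VALUES);

  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

Because the assembly only calls MatSetValuesBlocked(), the storage 
format is then chosen entirely by -mat_type at run time, as suggested 
above.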


