[petsc-users] Using petsc for banded matrices and 2D finite differences

Brandt Belson bbelson at princeton.edu
Tue Nov 15 09:43:48 CST 2011


Hi all,
I'm writing a 3D incompressible fluid solver for transitional and
turbulent boundary layers, and would like to make use of PETSc if possible.
At each time step I'll need to solve matrix equations arising from finite
differences in two dimensions (x and y) on a structured grid. Each matrix is
block tri-/penta-diagonal, depending on the stencil, and the blocks themselves
are tri-/penta-diagonal. Correct me if I'm wrong, but I believe systems with
this banded structure can be solved directly and cheaply on one node.
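
Roughly, for a single x-y plane I have something like the sketch below in
mind, using scipy.linalg.solve_banded (LAPACK's dgbsv). The coefficient
values are placeholders, and I'm assuming y varies fastest, so a
block-pentadiagonal matrix with pentadiagonal blocks has a bandwidth of
about 2*ny + 2:

import numpy as np
from scipy.linalg import solve_banded

# One x-y plane system with placeholder coefficients.  With y varying
# fastest, a block-pentadiagonal matrix whose blocks are themselves
# pentadiagonal has upper/lower bandwidth of roughly 2*ny + 2.
nx, ny = 1000, 100
n = nx * ny
kl = ku = 2 * ny + 2

# LAPACK (dgbsv) banded storage: ab[ku + i - j, j] = A[i, j]
ab = np.zeros((kl + ku + 1, n))
ab[ku, :] = 4.0                        # placeholder main diagonal
ab[ku - 1, 1:] = -1.0                  # placeholder superdiagonal
ab[ku + 1, :-1] = -1.0                 # placeholder subdiagonal

rhs = np.ones(n)
u = solve_banded((kl, ku), ab, rhs)    # direct banded solve; cost ~ n * bandwidth**2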

The two options I'm comparing are:
1. Distribute the data in z and solve the x-y plane matrices with LAPACK or
another serial or shared-memory library, then do an MPI all-to-all to
redistribute the data in x and/or y and do all computations in z (a rough
sketch of this transpose is included after the two options). This method keeps
all the data needed by a given calculation on one node, so serial or
shared-memory algorithms can be used. The disadvantages are that the MPI
all-to-all can be expensive and that the number of nodes is limited by the
number of points in the z direction.

2. Distribute the data in x-y only and use PETSc to do the matrix solves in
the x-y plane across nodes (a petsc4py sketch also follows below). The data
would always be contiguous in z. The possible disadvantage is that the
distributed x-y plane solves could be slower. However, there is no need for an
all-to-all, and the number of nodes is roughly limited by nx*ny instead of nz.
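
For option 1, the all-to-all I have in mind is basically a pencil transpose.
A rough sketch with mpi4py and numpy, assuming nx and nz divide evenly among
the ranks and using placeholder array contents:

from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
size = comm.Get_size()

# Placeholder sizes; assume nx and nz divide evenly by the number of ranks.
nx, ny, nz = 1000, 100, 500
nx_loc, nz_loc = nx // size, nz // size

# Start from a z-slab: this rank owns all of x-y for nz_loc z planes.
zslab = np.zeros((nx, ny, nz_loc))
# ... solve the nz_loc x-y plane systems here, e.g. with the banded solve above ...

# Repartition to x-slabs so that z becomes local (for the Fourier transforms).
send = zslab.reshape(size, nx_loc, ny, nz_loc)   # chunk p is sent to rank p
recv = np.empty_like(send)
comm.Alltoall(send, recv)                        # recv[p] came from rank p
xslab = np.concatenate(recv, axis=-1)            # shape (nx_loc, ny, nz)

For option 2, I imagine something along the lines of the following petsc4py
sketch, with a 2D DMDA handling the x-y decomposition and one KSP solve per
z plane or Fourier mode. The assembly is omitted, and I'm not sure of the
exact method names across petsc4py versions:

from petsc4py import PETSc

nx, ny = 1000, 100
# 2D structured grid distributed over all ranks; stencil_width=2 for a
# pentadiagonal stencil.
da = PETSc.DMDA().create([nx, ny], stencil_width=2)
A = da.createMat()        # called createMatrix()/getMatrix() in older petsc4py
b = da.createGlobalVec()
x = da.createGlobalVec()

# ... assemble A and b here, once per z plane or Fourier mode ...

ksp = PETSc.KSP().create()
ksp.setOperators(A)
ksp.setFromOptions()      # choose the solver at run time, e.g. -ksp_type, -pc_type
ksp.solve(b, x)

Run-time options would then let me compare iterative solvers with parallel
direct solvers available through PETSc (e.g. MUMPS or SuperLU_DIST) without
changing the code.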

The size of the grid will be about 1000 x 100 x 500 in x, y, and z, so each
x-y plane matrix would be about 100,000 x 100,000, but the grid size could
vary.

For anyone interested, the derivatives in x and y are done with compact
finite differences, and in z with discrete Fourier transforms. I also hope to
make use of petsc4py and Python.
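
For example, with the data contiguous in z, the z derivative would look
something like this (assuming a periodic z domain of length Lz; the names and
sizes are placeholders):

import numpy as np

nz, Lz = 500, 2.0 * np.pi                            # placeholder domain length
k = 2.0j * np.pi * np.fft.rfftfreq(nz, d=Lz / nz)    # ik for each rfft mode

def dfdz(f):
    # Spectral z-derivative of a real array whose last axis is z.
    return np.fft.irfft(k * np.fft.rfft(f, axis=-1), n=nz, axis=-1)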

Thanks for your help,
Brandt