[petsc-users] Block Tridiagonal Solver

Jed Brown jed at jedbrown.org
Fri Sep 6 16:32:48 CDT 2019


Where do your tridiagonal systems come from?  Do you need to solve one
at a time, or batches of tridiagonal problems?

Although it is not in PETSc, we have some work on solving the sort of
tridiagonal systems that arise in compact discretizations; it turns
out these can be solved much faster than generic tridiagonal problems.

  https://tridiaglu.github.io/index.html
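
For reference, the usual serial baseline for a generic tridiagonal
solve is the Thomas algorithm, an O(n) elimination without pivoting.
A minimal sketch (it assumes pivoting is unnecessary, e.g. diagonal
dominance):

  /* Thomas algorithm: solve a scalar tridiagonal system in O(n).
     a = sub-diagonal (a[0] unused), b = diagonal,
     c = super-diagonal (c[n-1] unused), d = right-hand side.
     Overwrites b and d; the solution is returned in d. */
  static void thomas(int n, const double *a, double *b,
                     const double *c, double *d)
  {
    int i;
    for (i = 1; i < n; i++) {          /* forward elimination */
      double m = a[i] / b[i-1];
      b[i] -= m * c[i-1];
      d[i] -= m * d[i-1];
    }
    d[n-1] /= b[n-1];                  /* back substitution */
    for (i = n-2; i >= 0; i--)
      d[i] = (d[i] - c[i] * d[i+1]) / b[i];
  }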

"John L. Papp via petsc-users" <petsc-users at mcs.anl.gov> writes:

> Hello,
>
> I need a parallel block tridiagonal solver and thought PETSc would be 
> perfect.  However, there seems to be no specific example showing exactly 
> which VecCreate and MatCreate functions to use.  I searched the archive 
> and the web, and there are no explicit block tridiagonal examples 
> (although the ex23.c example solves a tridiagonal system), and the 
> manual is vague on the subject.  So a couple of questions:
>
>  1. Is it better to create a monolithic matrix (MatCreateAIJ) and vector
>     (VecCreate)?
>  2. Is it better to create a block matrix (MatCreateBAIJ) and vector
>     (VecCreate followed by VecSetBlockSize, or is there an equivalent
>     block vector create)?
>  3. Which parallel solver(s) work best for solving Dx=b when D is a
>     block tridiagonal matrix?
>
> If it helps, every block in a given block row will be owned by the 
> same process.  In other words, the data used to fill the [A], [B], and 
> [C] blocks in a row of the block tridiagonal matrix D will reside on 
> the same process.  Hence, I don't need to store the individual [A], 
> [B], and [C] blocks in parallel, just the overall block tridiagonal 
> matrix on a row-by-row basis.  A rough sketch of what I have in mind 
> follows below.
>
> Thanks in advance,
>
> John
>
> -- 
> **************************************************************
> Dr. John Papp
> Senior Research Scientist
> CRAFT Tech.
> 6210 Kellers Church Road
> Pipersville, PA 18947
>
> Email:  jpapp at craft-tech.com
> Phone:  (215) 766-1520
> Fax  :  (215) 766-1524
> Web  :  http://www.craft-tech.com
>
> **************************************************************

