[petsc-users] Block Tridiagonal Solver

John L. Papp jpapp at craft-tech.com
Mon Sep 9 08:30:43 CDT 2019


Hello,

The block matrices tend to be dense and can be large, depending on the 
number of unknowns.  The overall block tridiagonal matrix can be large as 
well, since its size depends on the number of grid points in a given index 
direction.  It would not be unheard of to have more than 100 block rows in 
the global block tridiagonal matrix, with more than 25 unknowns in each 
block element.  Based on this, would you suggest BAIJ, or would I be 
pushing the limits of a parallel matrix solve?
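
For concreteness, here is roughly what I have in mind on the creation side 
(a minimal sketch only, assuming 25 unknowns per block and 100 block rows, 
with error checking omitted):

    #include <petscksp.h>

    int main(int argc, char **argv)
    {
      PetscInt bs = 25, nblocks = 100;   /* hypothetical sizes               */
      PetscInt M  = bs*nblocks;          /* global size of D                 */
      Mat      D;
      Vec      x, b;

      PetscInitialize(&argc, &argv, NULL, NULL);

      /* Block AIJ storage: each block row of D holds at most three dense
         bs x bs blocks ([A] [B] [C]), so preallocating 3 blocks in the
         diagonal portion and 2 in the off-process portion is a safe
         overestimate. */
      MatCreateBAIJ(PETSC_COMM_WORLD, bs, PETSC_DECIDE, PETSC_DECIDE, M, M,
                    3, NULL, 2, NULL, &D);

      /* Vectors whose layout (and block size) match D:
         x for the solution, b for the right-hand side. */
      MatCreateVecs(D, &x, &b);

      /* ... fill D and b, then solve (see the later sketches) ... */

      VecDestroy(&x); VecDestroy(&b); MatDestroy(&D);
      PetscFinalize();
      return 0;
    }

I assume the alternative for the vectors would be VecCreateMPI with local 
sizes matching the matrix rows, followed by VecSetBlockSize.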

Thanks,

John

On 9/8/2019 12:51 AM, Smith, Barry F. wrote:
>     John,
>
>        How large are your blocks and are they dense? Also generally how many blocks do you have? The BAIJ formats are for when the blocks are dense.
>
>        As Jed notes, we don't have specific parallel block tridiagonal solvers. You can use the parallel direct solvers such as MUMPS, SuperLU_DIST, or PastiX from PETSc, or standard iterative methods such as block Jacobi or the overlapping additive Schwarz method. Depending on your needs, any of these may be suitable.
>
>     For MPI parallelism
>
>      -pc_type lu -pc_factor_mat_solver_type <mumps | superlu_dist | pastix | mkl_cpardiso>  (you need to ./configure PETSc with --download-mumps --download-scalapack, --download-superlu_dist, --download-pastix, or --with-mkl_cpardiso)
>
>     -pc_type bjacobi     or   -pc_type asm
>
>    For OpenMP parallelism of the linear solver you can use --with-mkl_pardiso or --download-mumps --with-mumps-serial
>
>    Barry
>
>
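
(Noting for my own reference how these flags get consumed: if I understand 
correctly, they are picked up by an ordinary KSP solve via 
KSPSetFromOptions.  A minimal sketch, assuming the matrix D and vectors x, b 
from my sketch above are already assembled, in place of the "fill and solve" 
comment there:)

    KSP ksp;

    KSPCreate(PETSC_COMM_WORLD, &ksp);
    KSPSetOperators(ksp, D, D);
    KSPSetFromOptions(ksp);   /* reads -pc_type, -pc_factor_mat_solver_type, ... */
    KSPSolve(ksp, b, x);
    KSPDestroy(&ksp);

so that a run would look something like (executable name and process count 
are placeholders):

    mpiexec -n 4 ./solver -pc_type lu -pc_factor_mat_solver_type mumps

or, for the iterative alternatives Barry mentions, -pc_type bjacobi or 
-pc_type asm in place of the direct-solver flags.
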
>> On Sep 6, 2019, at 4:32 PM, Jed Brown via petsc-users <petsc-users at mcs.anl.gov> wrote:
>>
>> Where do your tridiagonal systems come from?  Do you need to solve one
>> at a time, or batches of tridiagonal problems?
>>
>> Although it is not in PETSc, we have some work on solving the sort of
>> tridiagonal systems that arise in compact discretizations, which it
>> turns out can be solved much faster than generic tridiagonal problems.
>>
>>   https://tridiaglu.github.io/index.html
>>
>> "John L. Papp via petsc-users" <petsc-users at mcs.anl.gov> writes:
>>
>>> Hello,
>>>
>>> I need a parallel block tridiagonal solver and thought PETSc would be
>>> perfect.  However, there seems to be no specific example showing exactly
>>> which VecCreate and MatCreate functions to use.  I searched the archive
>>> and the web and found no explicit block tridiagonal examples
>>> (although the ex23.c example solves a tridiagonal matrix), and the manual is
>>> vague on the subject.  So a couple of questions:
>>>
>>> 1. Is it better to create a monolithic matrix (MatCreateAIJ) and vector
>>>     (VecCreate)?
>>> 2. Is it better to create a block matrix (MatCreateBAIJ) and vector
>>>     (VecCreate and then VecSetBlockSize, or is there an equivalent block
>>>     vector create)?
>>> 3. What is the best parallel solver (or solvers) for Dx=b when D is a
>>>     block tridiagonal matrix?
>>>
>>> If this helps, each block row will be owned by a single process.  In other
>>> words, the data used to fill the [A] [B] [C] block matrices in a row of
>>> the D block tridiagonal matrix will reside on the same process.  Hence,
>>> I don't need to store the individual [A], [B], and [C] block matrices in
>>> parallel, just the overall block tridiagonal matrix on a row-by-row basis.
>>>
>>> Thanks in advance,
>>>
>>> John
>>>
>>> -- 
>>> **************************************************************
>>> Dr. John Papp
>>> Senior Research Scientist
>>> CRAFT Tech.
>>> 6210 Kellers Church Road
>>> Pipersville, PA 18947
>>>
>>> Email:  jpapp at craft-tech.com
>>> Phone:  (215) 766-1520
>>> Fax  :  (215) 766-1524
>>> Web  :  http://www.craft-tech.com
>>>
>>> **************************************************************
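
Regarding question 2 above and the block-row ownership: my rough picture of 
filling the interior block rows of the BAIJ matrix D, continuing from my 
first sketch (so D, bs, and nblocks are in scope; Ablk, Bblk, and Cblk are 
hypothetical application arrays holding the dense 25 x 25 blocks for block 
row i, and error checking is omitted):

    PetscInt    rstart, rend, i, r, c, cols[3];
    PetscScalar Ablk[25][25], Bblk[25][25], Cblk[25][25]; /* filled by the application */
    PetscScalar vals[25*3*25];                            /* one bs x 3*bs row-major slab */

    MatGetOwnershipRange(D, &rstart, &rend);              /* point-row range            */
    for (i = rstart/bs; i < rend/bs; i++) {               /* locally owned block rows   */
      if (i == 0 || i == nblocks-1) continue;             /* boundary rows have only 2 blocks; handled separately */
      cols[0] = i-1; cols[1] = i; cols[2] = i+1;          /* block columns of [A] [B] [C] */

      /* ... fill Ablk, Bblk, Cblk for block row i ... */

      /* MatSetValuesBlocked expects the blocks packed as one dense
         (1*bs) x (3*bs) array, row-major by default: [A | B | C]. */
      for (r = 0; r < bs; r++) {
        for (c = 0; c < bs; c++) {
          vals[r*3*bs +        c] = Ablk[r][c];
          vals[r*3*bs +   bs + c] = Bblk[r][c];
          vals[r*3*bs + 2*bs + c] = Cblk[r][c];
        }
      }
      MatSetValuesBlocked(D, 1, &i, 3, cols, vals, INSERT_VALUES);
    }

    /* collective, after every process has set its block rows */
    MatAssemblyBegin(D, MAT_FINAL_ASSEMBLY);
    MatAssemblyEnd(D, MAT_FINAL_ASSEMBLY);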

-- 
**************************************************************
Dr. John Papp
Senior Research Scientist
CRAFT Tech.
6210 Kellers Church Road
Pipersville, PA 18947

Email:  jpapp at craft-tech.com
Phone:  (215) 766-1520
Fax  :  (215) 766-1524
Web  :  http://www.craft-tech.com

**************************************************************


