[petsc-users] Sparse triangular solver

Hoang-Vu Dang dang.hvu at gmail.com
Sun Mar 8 22:40:37 CDT 2015


Sorry for causing the confusion; I should have clarified the term
"triangular solver".

What I mean is that I do not need a factorization from a general matrix into
LU form. I already have the matrix in lower/upper triangular form, so the
solver is really just backward/forward substitution.

I got the terminology, "sparse triangular solve", from
http://www.mcs.anl.gov/papers/P1658.pdf, unless I misunderstood the paper.
The paper specifically mentions PETSc, so that is where I am starting from.
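
To make this concrete, here is roughly what I picture on the PETSc side.
This is only a sketch of my reading of the man pages: MatSolve() and
MatForwardSolve()/MatBackwardSolve() seem to operate on a *factored*
matrix, so the triangular matrix apparently still has to go through the
factorization interface, even though there should be nothing left to
eliminate. It assumes an assembled lower-triangular Mat L and Vecs b, x,
with petscksp.h included and PetscInitialize() already called; error
checking and cleanup omitted:

    Mat           F;
    IS            rperm, cperm;
    MatFactorInfo info;

    /* "Factor" the already-triangular L; this should produce no fill. */
    MatFactorInfoInitialize(&info);
    MatGetOrdering(L, MATORDERINGNATURAL, &rperm, &cperm);
    MatGetFactor(L, MATSOLVERPETSC, MAT_FACTOR_LU, &F);
    MatLUFactorSymbolic(F, L, rperm, cperm, &info);
    MatLUFactorNumeric(F, L, &info);

    /* With the factor in hand, the solve is the substitution sweep I am
       after.  Note MATSOLVERPETSC's LU is sequential; in parallel an
       external package (e.g. SuperLU_Dist) would be needed instead. */
    MatSolve(F, b, x);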

I hope that makes things clearer,

Cheers,
Vu

On Sun, Mar 8, 2015 at 9:28 PM, Hong <hzhang at mcs.anl.gov> wrote:

> Hoang-Vu:
>>
>> If I do not need the full solver/factorization but just the backward
>> subs, do I need any special treatment? Is there a way to hint the solver
>> to apply only the last step to reduce overhead?
>>
> What do you mean by "do not need the full solver/factorization"?
> Do you need an incomplete matrix factorization, e.g., ILU, instead of a
> full factorization?
> The backward subs are steps AFTER matrix factorization.
>
> Hong
>
> On Mar 8, 2015 6:26 PM, "Barry Smith" <bsmith at mcs.anl.gov> wrote:
>>
>>>
>>>   PETSc provides sparse parallel LU (and Cholesky) factorizations and
>>> solves via the external packages SuperLU_Dist, MUMPS, and PaStiX. You
>>> first need to configure PETSc to use one or more of those packages, for
>>> example: ./configure --download-superlu_dist --download-metis --download-parmetis.
>>>
>>>   It is generally best to use the linear solvers via the PETSc KSP
>>> interface (even for direct solvers such as LU). So you create a KSP object,
>>> provide the matrix object and call KSPSolve(). You can control the solver
>>> used via the options database; to use the installed SuperLU_Dist you would
>>> use -pc_type lu -pc_factor_mat_solver_package superlu_dist
>>>
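In code, the KSP usage described above would look roughly like the
following. A minimal sketch, assuming an assembled parallel Mat A and Vecs
b, x, with petscksp.h included and PetscInitialize() already called; error
checking omitted:

    KSP ksp;
    PC  pc;

    KSPCreate(PETSC_COMM_WORLD, &ksp);
    KSPSetOperators(ksp, A, A);
    KSPSetType(ksp, KSPPREONLY);  /* one direct solve, no Krylov iterations */
    KSPGetPC(ksp, &pc);
    PCSetType(pc, PCLU);
    /* programmatic equivalent of -pc_factor_mat_solver_package superlu_dist */
    PCFactorSetMatSolverPackage(pc, MATSOLVERSUPERLU_DIST);
    KSPSetFromOptions(ksp);       /* let the options database override */
    KSPSolve(ksp, b, x);
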
>>>   The MatrixMarket format is no good for parallel computing, so you must
>>> first convert the file from MatrixMarket format to the PETSc binary format
>>> (see
>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#sparse-matrix-ascii-format
>>> ) and then use MatLoad() to load the matrix in parallel and pass it to the
>>> KSP solver. For example, src/ksp/ksp/examples/tutorials/ex10.c does this.
>>>
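The MatLoad() step itself is short. A sketch, with "A.petsc" as a
hypothetical name for the converted binary file:

    Mat         A;
    PetscViewer viewer;

    PetscViewerBinaryOpen(PETSC_COMM_WORLD, "A.petsc", FILE_MODE_READ, &viewer);
    MatCreate(PETSC_COMM_WORLD, &A);
    MatSetType(A, MATMPIAIJ);  /* distributed sparse AIJ storage */
    MatLoad(A, viewer);        /* rows are spread across the processes */
    PetscViewerDestroy(&viewer);
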
>>>
>>>   Barry
>>>
>>> > On Mar 8, 2015, at 6:08 PM, Hoang-Vu Dang <dang.hvu at gmail.com> wrote:
>>> >
>>> > Hi,
>>> >
>>> > I would like to use PETSc to perform parallel backward/forward
>>> substitution for sparse triangular matrices on a distributed-memory
>>> cluster (with MPI).
>>> >
>>> > Could someone give me some pointers on how to do this, or on whether
>>> PETSc is a good fit for this task?
>>> >
>>> > I think there is a MatSolve() method, but I am unsure whether it uses a
>>> good algorithm for sparse triangular matrices, and how to provide input in
>>> MatrixMarket / CSR format.
>>> >
>>> > Thank you
>>> > Vu
>>>
>>>
>
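
Regarding the CSR part of the original question: a matrix that already
lives in distributed CSR arrays can also be handed to PETSc directly,
without going through a file. A sketch, where nlocal, ia, ja, and vals are
hypothetical names for the per-process row count and the usual 0-based CSR
arrays (error checking omitted):

    Mat A;

    /* Each process contributes its local block of rows; PETSc copies
       the arrays into a MATMPIAIJ matrix and derives the global sizes. */
    MatCreateMPIAIJWithArrays(PETSC_COMM_WORLD,
                              nlocal,           /* local rows */
                              PETSC_DECIDE,     /* local columns */
                              PETSC_DETERMINE, PETSC_DETERMINE, /* global sizes */
                              ia, ja, vals, &A);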