[petsc-dev] PETSc LU, Lapack and Preconditioning Matrices

Barry Smith bsmith at mcs.anl.gov
Fri Dec 16 20:57:16 CST 2011


   Dave,

    Band solvers (like those in LAPACK) treat every entry within the band as nonzero (even though in your case the vast majority of those values are zero).  General-purpose sparse solvers like PETSc, MUMPS, SuperLU, etc. explicitly handle only the nonzero values and the fill induced by those nonzero values. By first reordering the matrix, sparse direct solvers end up with much, much less fill than a band solver and hence are much faster. Band solvers only make sense when the matrix is dense within the band, not mostly empty as with PDE problems.
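
   For example, the following runtime options (a minimal sketch; the exact option and ordering names depend on the PETSc version) let you compare a fill-reducing ordering against the natural one with the built-in sparse LU:

    -ksp_type preonly -pc_type lu -pc_factor_mat_ordering_type nd
    -ksp_type preonly -pc_type lu -pc_factor_mat_ordering_type rcm

   Nested dissection (nd), for instance, typically generates much less fill than the natural ordering for matrices coming from 2d meshes.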

   Barry

On Dec 16, 2011, at 6:12 PM, Dave Nystrom wrote:

> Matthew Knepley writes:
>> On Fri, Dec 16, 2011 at 9:37 AM, Dave Nystrom <dnystrom1 at comcast.net> wrote:
>> 
>>> I'm trying to figure out whether I can do a couple of things with petsc.
>>> 
>>> 1.  It looks like the preconditioning matrix can actually be different from
>>> the full problem matrix.  So I'm wondering if I could provide a different
>>> preconditioning matrix for my problem and then use an LU solve of that
>>> preconditioning matrix (via -pc_type lu) as my preconditioner.
>> 
>> Yes, that is what it is for.
> 
> Thanks.  I think I will try that and see what sort of results I get.  This
> sounds like a very encouraging discovery to me.
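
For reference, a minimal sketch of wiring that up in code, assuming the 2011-era KSPSetOperators() signature that still takes a MatStructure flag, and with the matrices A (true operator) and P (preconditioning matrix) plus vectors b and x assumed created and assembled elsewhere:

    #include <petscksp.h>

    KSP ksp;
    PC  pc;
    KSPCreate(PETSC_COMM_WORLD, &ksp);
    /* A defines the operator the Krylov method applies; P is the
       (possibly simpler) matrix that actually gets factored. */
    KSPSetOperators(ksp, A, P, DIFFERENT_NONZERO_PATTERN);
    KSPGetPC(ksp, &pc);
    PCSetType(pc, PCLU);           /* or -pc_type lu on the command line */
    KSPSetFromOptions(ksp);
    KSPSolve(ksp, b, x);

Note that with -ksp_type preonly and P different from A, the "solve" is just one application of P^-1 to the right-hand side; to actually solve with A, keep a Krylov method (e.g. -ksp_type gmres or cg) and let the exact LU of P act as the preconditioner.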
> 
>>> 2.  When I build petsc, I use the --download-f-blas-lapack=yes option.  I'm
>>> wondering if petsc uses lapack under the hood or has the capability to use
>>> lapack under the hood when one uses the -pc_type lu option.  In particular,
>>> since my matrices are band matrices from doing a discretization on a 2d
>>> regular mesh, I'm wondering if the petsc lu solve has the ability to use
>>> the lapack band solver dgbsv or dgbsvx.  Or is it possible to use the
>>> lapack band solver through one of the external packages that petsc can
>>> interface with?  I'm interested in this capability for smaller problem
>>> sizes that fit on a single node and that make sense.
>> 
>> We do not have any banded matrix stuff. It's either dense or sparse right
>> now.
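
Nothing in PETSc wraps the LAPACK band solvers, but for the small single-node cases calling dgbsv directly, outside PETSc, is one possible route. A hypothetical sketch in C, assuming a Fortran LAPACK with trailing-underscore name mangling and 32-bit integers:

    #include <stdlib.h>

    /* Prototype for the Fortran LAPACK routine; the trailing underscore is
       an assumption about the name mangling of the LAPACK build linked in. */
    void dgbsv_(int *n, int *kl, int *ku, int *nrhs, double *ab, int *ldab,
                int *ipiv, double *b, int *ldb, int *info);

    /* Solve the banded system A x = b; b is overwritten with x.
       n unknowns, kl subdiagonals, ku superdiagonals;
       get(i,j) returns A(i,j) with 0-based indices. */
    int band_solve(int n, int kl, int ku, double *b,
                   double (*get)(int i, int j))
    {
      int ldab = 2*kl + ku + 1, nrhs = 1, ldb = n, info, i, j;
      double *ab   = calloc((size_t)ldab * n, sizeof(double));
      int    *ipiv = malloc((size_t)n * sizeof(int));

      /* LAPACK band storage (column-major): A(i,j) goes to
         ab[kl+ku+i-j + j*ldab]; the top kl rows are left as workspace
         for the fill produced by the factorization. */
      for (j = 0; j < n; j++)
        for (i = (j-ku > 0 ? j-ku : 0); i <= (j+kl < n-1 ? j+kl : n-1); i++)
          ab[(kl + ku + i - j) + j*ldab] = get(i, j);

      dgbsv_(&n, &kl, &ku, &nrhs, ab, &ldab, ipiv, b, &ldb, &info);

      free(ab); free(ipiv);
      return info;   /* 0 on success, >0 if an exactly zero pivot occurred */
    }

As Barry notes above, though, this treats every entry in the band as nonzero, so for a 2d PDE matrix the reordered sparse LU will generally be both faster and leaner.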
> 
> OK.  I had always thought of a banded system as sparse, relatively speaking,
> compared to a full system.  Based also on Barry's response, I guess I am not
> well enough educated on the nuances of sparse versus banded.  For instance,
> when I use "-ksp_type preonly -pc_type lu" to solve one of my systems, I had
> assumed that the LU factorization computed by petsc was really filling in the
> full 2*nx+1 bandwidth, even though petsc might not be explicitly exploiting
> the banded nature of the matrix.  So I am not sure at all what is going on
> under the hood in petsc for this set of solver options, nor do I really know
> how to find out without reading the source code, which could be fairly
> daunting.
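
One way to see what the sparse LU is actually doing, without reading the source, is the usual runtime reporting options (a sketch; exact output varies with the PETSc version):

    -ksp_view     (prints the solver configuration, including the ordering
                   the LU factorization used and its fill ratio)
    -info         (verbose logging from the factorization, including the
                   fill it actually needed)
    -log_summary  (timing and memory for the factorization and solves)

These should show whether the factors are being stored as general sparse matrices with a fill-reducing ordering rather than as a full band.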
> 
>>> 3.  I'm also wondering how I might be able to learn more about the petsc
>>> ilu capability.  My impression is that it does ilu(k) and I have tried
>>> it with k>0 but am wondering if one of the options might allow it to do
>>> ilut and whether, as k gets big, ilu(k) approximates lu.  I
>>> currently do not understand the petsc ilu well enough to know how much
>>> extra fill I get as I increase k and where that extra fill might be
>>> located for the case of a band matrix that one gets from discretization
>>> on a regular 2d mesh.
>> 
>> We do not do ilu(dt). It's complicated, and we determined that it was not
>> worth the effort. You can get that from Hypre if you want. Certainly, for
>> big enough k, ilu(k) is lu, but it's a slow way to do it.
> 
> Thanks.  I need to experiment more with ilu(k) on a couple of my linear
> systems.
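
For those experiments, the relevant runtime options are roughly (names as of the PETSc 3.2 era; the hypre ones assume PETSc was configured with --download-hypre):

    -pc_type ilu -pc_factor_levels 2       (ILU(k) with k = 2)
    -pc_factor_fill 5                      (hint for the expected fill ratio)
    -pc_type hypre -pc_hypre_type pilut    (threshold-based ILUT via hypre,
                                            as Matt suggests)

-info should again report the fill each choice of k actually produces.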
> 
>> Matt
>> 
>> 
>>> Thanks,
>>> 
>>> Dave



