[petsc-users] Direct inversion methods in parallel

Dave May dave.mayhem23 at gmail.com
Fri Sep 25 00:34:56 CDT 2015


On 25 September 2015 at 07:24, Timothée Nicolas <timothee.nicolas at gmail.com>
wrote:

> Hi all, from the manual, I get that the options
>
> -ksp_type preonly -pc_type lu
>
> to solve a problem by direct LU inversion
>
This performs an LU factorization; the inverse matrix is never assembled
explicitly.
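While debugging it can also help to add a couple of standard monitoring
options (nothing beyond stock PETSc assumed here), which report which
factorization is actually being used and whether the solve succeeded, e.g.

  -ksp_type preonly -pc_type lu -ksp_view -ksp_converged_reason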


> are available only for sequential matrices. Should I conclude that there
> is no method to try a direct inversion of a big problem in parallel?
>

The packages
  superlu_dist, mumps and pastix
provide support for parallel LU factorization.
These packages can be installed via PETSc's configure system.
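As a sketch (the --download flags are the usual ones and ./yourprog is just
a placeholder for your executable; check ./configure --help and -help if the
names differ on your PETSc version):

  ./configure --download-superlu_dist --download-mumps --download-scalapack
  mpiexec -n 4 ./yourprog -ksp_type preonly -pc_type lu \
      -pc_factor_mat_solver_package mumps

MUMPS additionally needs ScaLAPACK, hence the extra --download-scalapack;
replace mumps with superlu_dist to try the other package.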


> I plan to use the direct inversion only as a check that my approximation
> to the inverse problem is OK, because so far my algorithm, which should work,
> is not working at all and I need to debug what is going on. Namely, I use an
> approximation to the linear problem based on an approximate Schur complement,
> and I want to know whether my approximation is wrong or whether my matrices
> are wrong from the start.
>
> I have tried a direct inversion on one process with the above options for
> a quite small problem (12x12x40 with 8 dof), but it did not work, I suppose
> because of memory limitation (output with log_summary at the end attached
> just in case).
>

From the output it appears you are running a debug build of PETSc.
If you want to see an immediate gain in performance, profile your algorithm
with an optimized build of PETSc.
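A typical optimized-build configure line looks something like the following
(a sketch; adjust the flags to your compilers, and keep a separate PETSC_ARCH
so you can switch back to the debug build when hunting bugs):

  ./configure --with-debugging=0 COPTFLAGS='-O3' FOPTFLAGS='-O3'
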
Also, if you want to get better performance from sequential sparse direct
solvers, consider using the packages
  umfpack (or cholmod if the matrix is symmetric positive definite)
available from SuiteSparse.
These libraries are great.
Their implementations also leverage multi-threaded BLAS, so they will be
much faster than PETSc's default LU.
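For example (again a sketch, with ./yourprog a placeholder):

  ./configure --download-suitesparse
  ./yourprog -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package umfpack

or, for the symmetric positive definite case,

  ./yourprog -ksp_type preonly -pc_type cholesky -pc_factor_mat_solver_package cholmod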

Cheers
   Dave

> Best
>
> Timothee NICOLAS
>

