[petsc-users] Solving a sequence of linear systems stored on disk with MUMPS

Zhang, Hong hzhang at mcs.anl.gov
Tue Jul 23 11:02:42 CDT 2019


Thibaut:
How large is your A_m? Writing/reading to/from disk is not scalable; it is a last resort, for when the matrix factors are too large for memory. The PETSc/MUMPS interface does not support this feature.

If the A_m are not huge, you can create one KSP with the block-diagonal matrix
A = diag(A_1, ..., A_M).
Keep the LU factorization of A, i.e. diag(LU factor of A_1, ..., LU factor of A_M), then solve A x = b repeatedly with changing b.
You can use '-pc_type bjacobi -pc_bjacobi_blocks M -sub_pc_type lu'.
Hong
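
For illustration, a minimal (untested) C sketch of the block-Jacobi setup Hong describes, assuming the M systems have already been assembled into a single parallel matrix A whose diagonal blocks are the A_m; the function and variable names below are placeholders, not from the thread:

#include <petscksp.h>

/* Build one KSP for the block-diagonal matrix A = diag(A_1,...,A_M).
   The first KSPSolve factorizes every block; later solves with new
   right-hand sides reuse those factors. */
PetscErrorCode create_blockdiag_solver(Mat A, PetscInt M, KSP *ksp)
{
  PetscErrorCode ierr;
  PC             pc;

  PetscFunctionBeginUser;
  ierr = KSPCreate(PETSC_COMM_WORLD, ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(*ksp, A, A);CHKERRQ(ierr);
  ierr = KSPSetType(*ksp, KSPPREONLY);CHKERRQ(ierr);          /* no outer Krylov iterations */
  ierr = KSPGetPC(*ksp, &pc);CHKERRQ(ierr);
  ierr = PCSetType(pc, PCBJACOBI);CHKERRQ(ierr);              /* one block per A_m */
  ierr = PCBJacobiSetTotalBlocks(pc, M, NULL);CHKERRQ(ierr);  /* same as -pc_bjacobi_blocks M */
  ierr = KSPSetFromOptions(*ksp);CHKERRQ(ierr);               /* picks up -sub_pc_type lu */
  PetscFunctionReturn(0);
}

/* Inside the iteration, simply call KSPSolve(ksp, b, x) with each new b. */

Running with -sub_pc_type lu (and, if desired, -sub_pc_factor_mat_solver_type mumps) makes each block solve a direct LU solve; since A is exactly block diagonal, the first KSPSolve then solves exactly and its factors are reused for every later right-hand side. If the A_m have different sizes, pass their row counts as the third argument of PCBJacobiSetTotalBlocks instead of NULL.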

Dear PETSc users,

I need to solve several linear systems successively, with LU
factorization, as part of an iterative process in my Fortran application
code.

The process would solve M systems (A_m)(x_m,i) = (b_m,i), for m = 1,...,M, at
each iteration i, while computing the LU factorization of each A_m only once.
The RHSs (b_m,i+1) are computed from all of the (x_m,i), so the systems all
depend upon each other.

The way I envisage doing this is to use MUMPS to compute, successively,
each of the LU factorizations in parallel and store the factors on disk,
creating/assembling/destroying the matrices A_m on the fly.
Then, whenever needed, read the factors back in parallel to solve the systems.
Since version 5.2, MUMPS has a save/restore feature that allows this;
see http://mumps.enseeiht.fr/doc/userguide_5.2.1.pdf, p. 20, 24 and 58.
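
For reference, a rough and untested sketch of what the save/restore calls look like through the plain MUMPS 5.2 C interface, outside of PETSc. The JOB values (7 = save, 8 = restore) and the save_dir/save_prefix fields are as I read them in the user guide linked above; the directory, prefix and function names are placeholders, so check everything against the manual:

#include <string.h>
#include "dmumps_c.h"            /* double-precision MUMPS C interface */
#define USE_COMM_WORLD -987654   /* as in the MUMPS c_example.c */

/* After analysis + factorization of A_m (JOB=4), dump the factors to disk
   and free the in-memory instance. */
void save_factors(DMUMPS_STRUC_C *id)
{
  strcpy(id->save_dir, "/scratch/mumps_factors");   /* placeholder path */
  strcpy(id->save_prefix, "A_m");                   /* placeholder prefix */
  id->job = 7;                  /* JOB=7: save the instance (incl. factors) to disk */
  dmumps_c(id);
  id->job = -2;                 /* terminate the in-memory instance */
  dmumps_c(id);
}

/* Later: restore the saved instance and solve with a new right-hand side. */
void restore_and_solve(DMUMPS_STRUC_C *id, double *rhs)
{
  id->comm_fortran = USE_COMM_WORLD;
  id->par = 1;
  id->sym = 0;
  id->job = -1;                 /* initialize a fresh instance first */
  dmumps_c(id);
  strcpy(id->save_dir, "/scratch/mumps_factors");
  strcpy(id->save_prefix, "A_m");
  id->job = 8;                  /* JOB=8: restore the instance from disk */
  dmumps_c(id);
  id->rhs = rhs;                /* centralized RHS on the host process */
  id->job = 3;                  /* solve using the restored factors */
  dmumps_c(id);
}

This is essentially the call sequence PETSc would have to drive internally if the feature were exposed through its MUMPS interface.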

In its current state, the PETSc/MUMPS interface does not incorporate
that feature. I'm an advanced Fortran programmer but not in C, so I don't
think I would do an amazing job having a go inside
src/mat/impls/aij/mpi/mumps/mumps.c.

I was picturing something like creating as many KSP objects as linear
systems to be solved, with some sort of flag to force the storage of LU
factors on disk after the first call to KSPSolve. Then keep calling
KSPSolve as many times as needed.
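
Setting aside the on-disk part (the flag pictured above does not exist), the in-memory version of this already works: one KSP per A_m, with MUMPS selected as the factorization package, factorizes at the first KSPSolve and reuses the factors for every later solve. A hypothetical C sketch, with placeholder names:

#include <petscksp.h>

/* One KSP per system A_m; MUMPS factorizes each A_m at the first KSPSolve
   and keeps the factors (in memory) for all subsequent solves. */
PetscErrorCode setup_solvers(Mat A[], KSP ksp[], PetscInt M)
{
  PetscErrorCode ierr;
  PC             pc;
  PetscInt       m;

  PetscFunctionBeginUser;
  for (m = 0; m < M; m++) {
    ierr = KSPCreate(PETSC_COMM_WORLD, &ksp[m]);CHKERRQ(ierr);
    ierr = KSPSetOperators(ksp[m], A[m], A[m]);CHKERRQ(ierr);
    ierr = KSPSetType(ksp[m], KSPPREONLY);CHKERRQ(ierr);                /* direct solve only */
    ierr = KSPGetPC(ksp[m], &pc);CHKERRQ(ierr);
    ierr = PCSetType(pc, PCLU);CHKERRQ(ierr);
    ierr = PCFactorSetMatSolverType(pc, MATSOLVERMUMPS);CHKERRQ(ierr);  /* use MUMPS for the LU */
    ierr = KSPSetFromOptions(ksp[m]);CHKERRQ(ierr);
  }
  PetscFunctionReturn(0);
}

/* Each outer iteration i then loops over m and calls KSPSolve(ksp[m], b[m], x[m]). */

The missing piece, as described above, is pushing those factors to disk between solves instead of holding all of them in memory at once.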

Would you support such a feature?

Thanks for your support,

Thibaut