[petsc-users] SuperLU MPI-problem

Xiaoye S. Li xsli at lbl.gov
Mon Jul 20 11:12:04 CDT 2015


The default SuperLU_DIST setting is to use serial symbolic factorization.
Therefore, what matters is how much memory you have per MPI task.

The code failed to malloc memory during the redistribution of matrix A into
the {L\U} data structure (using the result of the serial symbolic
factorization).

You can switch to parallel symbolic factorization with the runtime option:
'-mat_superlu_dist_parsymbfact'
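
For example, a run might look something like the following (the executable
name and process count are placeholders; apart from
-mat_superlu_dist_parsymbfact, the options shown are just the usual PETSc
LU / solver-package settings of the 3.6 release):

  mpiexec -n 8 ./your_solver -ksp_type preonly -pc_type lu \
      -pc_factor_mat_solver_package superlu_dist \
      -mat_superlu_dist_parsymbfact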

Sherry Li


On Mon, Jul 20, 2015 at 8:59 AM, Mahir.Ulker-Kaustell at tyrens.se <
Mahir.Ulker-Kaustell at tyrens.se> wrote:

>  Hong:
>
>
>
> Previous experience with this equation has shown that it is very
> difficult to solve iteratively. Hence the use of a direct solver.
>
>
>
> The large test problem I am trying to solve has slightly fewer than 10^6
> degrees of freedom. The matrices are derived from finite elements, so they
> are sparse.
>
> The machine I am working on has 128 GB of RAM. I have estimated the memory
> needed at less than 20 GB, so even if the solver needs twice or three times
> as much, it should still work well. Or have I completely misunderstood
> something here?
>
>
>
> Mahir
>
> *From:* Hong [mailto:hzhang at mcs.anl.gov]
> *Sent:* 20 July 2015 17:39
> *To:* Ülker-Kaustell, Mahir
> *Cc:* petsc-users
> *Subject:* Re: [petsc-users] SuperLU MPI-problem
>
>
>
> Mahir:
>
> Direct solvers consume a large amount of memory. I suggest trying the
> following:
>
>
>
> 1. A sparse iterative solver, if [-omega^2 M + K] is not too
> ill-conditioned. You can test this on the small matrix.
>
>
>
> 2. Incrementally increase your matrix size. Try different matrix
> orderings.
>
> Do you get the memory crash in the first symbolic factorization?
>
> In your case, the matrix data structure stays the same when omega changes,
> so you only need to do the symbolic factorization once and reuse it (see
> the sketch after this list).
>
>
>
> 3. Use a machine with more memory.
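
A minimal sketch of the reuse pattern in point 2, under the following
assumptions (none of which come from the thread itself): PETSc built with
complex scalars, M and K sharing the same nonzero pattern, M, K, F, u and
the frequency list assembled elsewhere, and function/option names as of the
PETSc 3.6 release:

#include <petscksp.h>

/* Hypothetical helper (names made up for illustration): solve
 * [-omega^2 M + K] u = F for each frequency in omegas[], reusing one
 * KSP/LU object so the symbolic factorization is done only once. */
PetscErrorCode SolveFrequencySweep(Mat K, Mat M, Vec F, Vec u,
                                   const PetscScalar *omegas, PetscInt nfreq)
{
  Mat            A;
  KSP            ksp;
  PC             pc;
  PetscInt       i;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = MatDuplicate(K, MAT_COPY_VALUES, &A);CHKERRQ(ierr); /* A gets K's pattern */
  ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
  ierr = KSPSetType(ksp, KSPPREONLY);CHKERRQ(ierr);          /* factor + solve only */
  ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
  ierr = PCSetType(pc, PCLU);CHKERRQ(ierr);
  ierr = PCFactorSetMatSolverPackage(pc, MATSOLVERSUPERLU_DIST);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr); /* picks up -mat_superlu_dist_* options */

  for (i = 0; i < nfreq; i++) {
    /* A = K - omega^2 M; the nonzero pattern of A never changes, so only
     * the numeric factorization is redone at each frequency. */
    ierr = MatCopy(K, A, SAME_NONZERO_PATTERN);CHKERRQ(ierr);
    ierr = MatAXPY(A, -omegas[i]*omegas[i], M, SAME_NONZERO_PATTERN);CHKERRQ(ierr);
    ierr = KSPSetOperators(ksp, A, A);CHKERRQ(ierr);
    ierr = KSPSolve(ksp, F, u);CHKERRQ(ierr);
    /* ... store or post-process u for this omega ... */
  }

  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  ierr = MatDestroy(&A);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

Since KSPSetFromOptions() is called, the same routine can also be used for
point 1 on the small problem, e.g. by overriding the LU settings with
-ksp_type gmres and a suitable preconditioner at run time.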
>
>
>
> Hong
>
>
>
> Dear Petsc-Users,
>
>
>
> I am trying to use PETSc to solve a set of linear equations arising from
> Navier's equation (elastodynamics) in the frequency domain.
>
> The frequency dependence of the problem requires that the system
>
>
>
>                              [-omega^2 M + K] u = F
>
>
>
> where M and K are constant, square, positive definite matrices (mass and
> stiffness, respectively), be solved for each frequency omega of interest.
>
> K is a complex matrix, as it includes material damping.
>
>
>
> I have written a PETSc program which solves a small test problem (1000
> degrees of freedom) on one or several processors, but it keeps crashing
> when I try it on my full-scale problem (on the order of 10^6 degrees of
> freedom).
>
>
>
> The program crashes at KSPSetUp(), and from what I can see in the error
> messages, it appears to consume too much memory.
>
>
>
> I would guess that similar problems have come up on this mailing list
> before, so I am hoping that someone can push me in the right direction…
>
>
>
> Mahir
>