[petsc-users] SuperLU MPI-problem
Mahir.Ulker-Kaustell at tyrens.se
Mon Jul 20 12:00:06 CDT 2015
Ok, I realize that I have overlooked the fact that the space needed for the LU factorization of a sparse matrix can be far greater than that needed to store the matrix itself.
So, if I can't find an iterative solver that works, I have to distribute the work over several nodes to ensure that there is enough memory?
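Concretely, would it be enough to launch on more processes with the same options? Something like the following sketch (the executable name and process count are placeholders; -pc_factor_mat_solver_package is the option name in the petsc-3.5/3.6 series, later renamed -pc_factor_mat_solver_type):

    mpiexec -n 64 ./myapp \
        -ksp_type preonly -pc_type lu \
        -pc_factor_mat_solver_package superlu_dist \
        -log_summary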
Mahir
From: Matthew Knepley [mailto:knepley at gmail.com]
Sent: den 20 juli 2015 18:17
To: Ülker-Kaustell, Mahir
Cc: Hong; petsc-users
Subject: Re: [petsc-users] SuperLU MPI-problem
On Mon, Jul 20, 2015 at 11:14 AM, Mahir.Ulker-Kaustell at tyrens.se <Mahir.Ulker-Kaustell at tyrens.se> wrote:
I am roughly guessing that my sparse matrix will have 1000 non-zeros on each row. As the matrix is symmetric, I divide by half:
1000*1000000(rows)*8(bytes)*2(complex numbers)/2 = 8GB
1) You have missed the memory needed to represent the column indices, so multiply by 1.5
2) Are you using a symmetric format?
3) This is only the space to represent the matrix, not its factors.
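As a rough sketch (assuming non-symmetric AIJ format with complex double-precision values and 32-bit column indices; your actual numbers will differ):

    1000 nnz/row * 10^6 rows = 10^9 nonzeros
    values:  10^9 * 16 bytes = 16 GB
    indices: 10^9 *  4 bytes =  4 GB
    matrix alone             ~ 20 GB  (roughly halved in SBAIJ symmetric format)

and the LU factors can require many times this, depending on fill-in.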
Matt
However, it turns out to be much less: I create the matrices in another program and the binary file holding my matrix K is only 800MB. M is a lumped mass matrix…
By the way, I tried using MUMPS and have the same problem there.
Mahir
From: Matthew Knepley [mailto:knepley at gmail.com]
Sent: den 20 juli 2015 18:05
To: Ülker-Kaustell, Mahir
Cc: Hong; petsc-users
Subject: Re: [petsc-users] SuperLU MPI-problem
On Mon, Jul 20, 2015 at 10:59 AM, Mahir.Ulker-Kaustell at tyrens.se <Mahir.Ulker-Kaustell at tyrens.se> wrote:
Hong:
Previous experience with this equation has shown that it is very difficult to solve iteratively. Hence the use of a direct solver.
The large test problem I am trying to solve has slightly less than 10^6 degrees of freedom. The matrices are derived from finite elements so they are sparse.
Estimated how? It is very difficult to estimate fill-in.
Matt
The machine I am working on has 128GB of RAM. I have estimated the memory needed at less than 20GB, so even if the solver needs twice or three times as much, it should still work well. Or have I completely misunderstood something here?
Mahir
From: Hong [mailto:hzhang at mcs.anl.gov]
Sent: den 20 juli 2015 17:39
To: Ülker-Kaustell, Mahir
Cc: petsc-users
Subject: Re: [petsc-users] SuperLU MPI-problem
Mahir:
Direct solvers consume a large amount of memory. I suggest trying the following:
1. A sparse iterative solver, if [-omega^2 M + K] is not too ill-conditioned. You may test it using the small matrix.
2. Incrementally increase your matrix sizes. Try different matrix orderings.
Does the memory crash occur in the first symbolic factorization?
In your case, the matrix data structure stays the same when omega changes, so you only need to do the symbolic factorization once and reuse it (see the sketch after point 3 below).
3. Use a machine that has more memory.
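A minimal sketch of such a frequency loop (untested; K, M, F, u, ksp, nfreq and the omega array are placeholders; MatAXPY uses SUBSET_NONZERO_PATTERN since a lumped M only touches the diagonal of K's pattern):

    /* A = -omega^2 M + K keeps the same nonzero pattern for every
       omega, so PETSc can reuse the symbolic factorization and
       redo only the numeric factorization inside KSPSolve(). */
    Mat      A;
    PetscInt i;
    ierr = MatDuplicate(K, MAT_COPY_VALUES, &A);CHKERRQ(ierr);
    for (i = 0; i < nfreq; i++) {
      ierr = MatCopy(K, A, SAME_NONZERO_PATTERN);CHKERRQ(ierr);
      ierr = MatAXPY(A, -omega[i]*omega[i], M, SUBSET_NONZERO_PATTERN);CHKERRQ(ierr);
      ierr = KSPSetOperators(ksp, A, A);CHKERRQ(ierr);
      ierr = KSPSolve(ksp, F, u);CHKERRQ(ierr);
    }
    ierr = MatDestroy(&A);CHKERRQ(ierr);

Orderings can be switched from the command line, e.g. -mat_superlu_dist_colperm for SuperLU_DIST or -mat_mumps_icntl_7 for MUMPS.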
Hong
Dear Petsc-Users,
I am trying to use PETSc to solve a set of linear equations arising from Navier's equation (elastodynamics) in the frequency domain.
The frequency dependency of the problem requires that the system
[-omega^2 M + K]u = F,
where M and K are constant, square, positive definite matrices (mass and stiffness, respectively), be solved for each frequency omega of interest.
K is a complex matrix, including material damping.
I have written a PETSc program which solves this problem for a small test case (1000 degrees of freedom) on one or several processors, but it keeps crashing when I try it on my full-scale problem (on the order of 10^6 degrees of freedom).
The program crashes in KSPSetUp(), and from what I can see in the error messages, it appears to consume too much memory.
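For reference, the solver setup is roughly equivalent to the following sketch (variable names are placeholders; PCFactorSetMatSolverPackage is the routine name in the petsc-3.5/3.6 series):

    ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
    ierr = KSPSetOperators(ksp, A, A);CHKERRQ(ierr);
    ierr = KSPSetType(ksp, KSPPREONLY);CHKERRQ(ierr);
    ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
    ierr = PCSetType(pc, PCLU);CHKERRQ(ierr);
    ierr = PCFactorSetMatSolverPackage(pc, MATSOLVERSUPERLU_DIST);CHKERRQ(ierr);
    ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
    ierr = KSPSetUp(ksp);CHKERRQ(ierr);  /* crashes here: this is where
                                            the LU factorization is done */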
I would guess that similar problems have come up on this mailing list before, so I am hoping that someone can push me in the right direction…
Mahir
--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener