[petsc-users] SuperLU MPI-problem

Mahir.Ulker-Kaustell at tyrens.se
Mon Jul 20 12:03:41 CDT 2015


Ok. So I have been creating the full factorization on each process. That gives me some hope!

I followed your suggestion and tried to use the runtime option ‘-mat_superlu_dist_parsymbfact’.
However, now the program crashes with:

Invalid ISPEC at line 484 in file get_perm_c.c

And so on…

From the SuperLU manual, I should give the option the value YES or NO; however, -mat_superlu_dist_parsymbfact YES makes the program crash in the same way as above.
Also, I cannot find any reference to -mat_superlu_dist_parsymbfact in the PETSc documentation.

Mahir

________________________________
Mahir Ülker-Kaustell, Kompetenssamordnare, Brokonstruktör, Tekn. Dr, Tyréns AB
010 452 30 82, Mahir.Ulker-Kaustell at tyrens.se
________________________________

From: Xiaoye S. Li [mailto:xsli at lbl.gov]
Sent: den 20 juli 2015 18:12
To: Ülker-Kaustell, Mahir
Cc: Hong; petsc-users
Subject: Re: [petsc-users] SuperLU MPI-problem

The default SuperLU_DIST setting is to use serial symbolic factorization. Therefore, what matters is how much memory you have per MPI task.
The code failed to malloc memory during the redistribution of matrix A into the {L\U} data structure (using the result of the serial symbolic factorization).

You can use parallel symbolic factorization via the runtime option '-mat_superlu_dist_parsymbfact'.
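
For reference, this is roughly how the package and the option above would be selected from the program itself. This is a minimal sketch only, with placeholder names, using the PETSc 3.x-era call PCFactorSetMatSolverPackage, and error checking omitted:

    /* Sketch: direct solve with LU, factorization delegated to SuperLU_DIST.
       "ksp" is assumed to be a KSP already attached to the system matrix. */
    PC pc;
    KSPSetType(ksp, KSPPREONLY);      /* no Krylov iterations, pure direct solve */
    KSPGetPC(ksp, &pc);
    PCSetType(pc, PCLU);
    PCFactorSetMatSolverPackage(pc, MATSOLVERSUPERLU_DIST);
    KSPSetFromOptions(ksp);           /* picks up -mat_superlu_dist_parsymbfact  */

    /* Equivalent runtime options:
         -ksp_type preonly -pc_type lu
         -pc_factor_mat_solver_package superlu_dist
         -mat_superlu_dist_parsymbfact                                           */
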
Sherry Li

On Mon, Jul 20, 2015 at 8:59 AM, Mahir.Ulker-Kaustell at tyrens.se wrote:
Hong:

Previous experiences with this equation have shown that it is very difficult to solve it iteratively. Hence the use of a direct solver.

The large test problem I am trying to solve has slightly less than 10^6 degrees of freedom. The matrices are derived from finite elements, so they are sparse.
The machine I am working on has 128 GB of RAM. I have estimated the memory needed at less than 20 GB, so even if the solver needs twice or three times as much, it should still work. Or have I completely misunderstood something here?

Mahir



From: Hong [mailto:hzhang at mcs.anl.gov]
Sent: den 20 juli 2015 17:39
To: Ülker-Kaustell, Mahir
Cc: petsc-users
Subject: Re: [petsc-users] SuperLU MPI-problem

Mahir:
Direct solvers consume a large amount of memory. I suggest trying the following:

1. A sparse iterative solver, if [-omega^2 M + K] is not too ill-conditioned. You can test this on the small matrix.

2. Incrementally increase your matrix sizes. Try different matrix orderings.
Do you get the memory crash in the first symbolic factorization?
In your case, the matrix data structure stays the same when omega changes, so you only need to do the symbolic factorization once and reuse it (see the sketch after this list).

3. Use a machine with more memory.
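
A minimal sketch of point 2, assuming K and M were assembled with the same nonzero pattern; ksp, K, M, F, u and the frequency array are placeholders for objects created elsewhere, and error checking is omitted:

    Mat      A;
    PetscInt i;
    MatDuplicate(K, MAT_COPY_VALUES, &A);   /* fixes the nonzero pattern of A once */
    KSPSetOperators(ksp, A, A);
    for (i = 0; i < nfreq; i++) {
      PetscScalar w2 = omega[i]*omega[i];
      MatCopy(K, A, SAME_NONZERO_PATTERN);        /* A = K             */
      MatAXPY(A, -w2, M, SAME_NONZERO_PATTERN);   /* A = K - omega^2 M */
      /* The nonzero pattern of A is unchanged, so PETSc should redo only the
         numeric factorization and reuse the symbolic factorization from the
         first solve. */
      KSPSolve(ksp, F, u);
    }
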

Hong

Dear Petsc-Users,

I am trying to use PETSc to solve a set of linear equations arising from Navier's equation (elastodynamics) in the frequency domain.
The frequency dependency of the problem requires that the system

                             [-omega^2M + K]u = F

where M and K are constant, square, positive definite matrices (mass and stiffness, respectively), is solved for each frequency omega of interest.
K is a complex matrix, since it includes material damping.
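
For concreteness, the system for a single frequency could be set up along the following lines. This is an illustrative sketch only, assuming a complex-scalar PETSc build (since K is complex), that K and M share the same nonzero pattern, and placeholder names throughout:

    Mat A;
    KSP ksp;
    MatDuplicate(M, MAT_COPY_VALUES, &A);              /* A = M              */
    MatAYPX(A, -omega*omega, K, SAME_NONZERO_PATTERN); /* A = -omega^2 M + K */
    KSPCreate(PETSC_COMM_WORLD, &ksp);
    KSPSetOperators(ksp, A, A);
    KSPSetFromOptions(ksp);   /* e.g. -ksp_type preonly -pc_type lu
                                 -pc_factor_mat_solver_package superlu_dist */
    KSPSolve(ksp, F, u);
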

I have written a PETSc program which solves this problem for a small test case (1000 degrees of freedom) on one or several processes, but it keeps crashing when I try it on my full-scale problem (on the order of 10^6 degrees of freedom).

The program crashes in KSPSetUp() and, from what I can see in the error messages, it appears to be consuming too much memory.

I would guess that similar problems have come up on this mailing list before, so I am hoping that someone can push me in the right direction…

Mahir







