[petsc-users] SuperLU MPI-problem

Hong hzhang at mcs.anl.gov
Mon Jul 20 10:38:45 CDT 2015


Mahir:
Direct solvers consume a large amount of memory. I suggest trying the following:

1. A sparse iterative solver, if [-omega^2 M + K] is not too
ill-conditioned. You can test this on your small matrix first; see the
first sketch after this list.

2. Incrementally increase your matrix size, and try different matrix
orderings. Do you get the memory crash in the first symbolic factorization?
In your case the matrix data structure stays the same when omega changes,
so you only need to do the symbolic factorization once and can reuse it for
every frequency; see the second sketch below.

3. Use a machine with more memory.
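
For 1, a minimal sketch of the usual PETSc pattern (A, b, x and the function
name are placeholders for your assembled matrix and vectors): if the code
calls KSPSetFromOptions(), you can switch between SuperLU_DIST and an
iterative solver on the command line without recompiling.

#include <petscksp.h>

/* Solve A x = b with the solver chosen at run time, e.g.
 *   -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package superlu_dist
 * for the direct solver, or, for a far less memory-hungry iterative solver,
 * something like
 *   -ksp_type gmres -pc_type asm -sub_pc_type ilu                          */
PetscErrorCode SolveOnce(Mat A,Vec b,Vec x)
{
  KSP            ksp;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp,A,A);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr); /* reads -ksp_type, -pc_type, ... */
  ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);
  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}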

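For 2, a sketch of the frequency loop (again with placeholder names),
assuming M and K share the same nonzero pattern, which is typical for FEM
mass and stiffness matrices; otherwise pass SUBSET_NONZERO_PATTERN or
DIFFERENT_NONZERO_PATTERN to MatAXPY(). Because A = -omega^2 M + K keeps
the same pattern for every omega, the symbolic factorization is done once
and only the numeric factorization is repeated. Orderings can be varied
with, e.g., -pc_factor_mat_ordering_type, or -mat_superlu_dist_colperm when
using SuperLU_DIST.

#include <petscksp.h>

PetscErrorCode FrequencySweep(Mat M,Mat K,Vec F,Vec u,const PetscReal *omega,PetscInt nfreq)
{
  Mat            A;
  KSP            ksp;
  PetscInt       i;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = MatDuplicate(K,MAT_COPY_VALUES,&A);CHKERRQ(ierr);
  ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp,A,A);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);

  for (i=0; i<nfreq; i++) {
    /* A = K - omega^2 M; the nonzero pattern is unchanged, so KSPSolve()
       reuses the symbolic factorization and only refactors numerically. */
    ierr = MatCopy(K,A,SAME_NONZERO_PATTERN);CHKERRQ(ierr);
    ierr = MatAXPY(A,-omega[i]*omega[i],M,SAME_NONZERO_PATTERN);CHKERRQ(ierr);
    ierr = KSPSolve(ksp,F,u);CHKERRQ(ierr);
    /* ... store or postprocess u for this omega ... */
  }

  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  ierr = MatDestroy(&A);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}
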
Hong

> Dear Petsc-Users,
>
> I am trying to use PETSc to solve a set of linear equations arising from
> Navier's equation (elastodynamics) in the frequency domain.
>
> The frequency dependency of the problem requires that the system
>
>                              [-omega^2 M + K] u = F
>
> where M and K are constant, square, positive-definite matrices (mass and
> stiffness, respectively), is solved for each frequency omega of interest.
>
> K is a complex matrix, including material damping.
>
> I have written a PETSc program which solves this problem for a small (1000
> degrees of freedom) test case on one or several processors, but it keeps
> crashing when I try it on my full-scale problem (on the order of 10^6
> degrees of freedom).
>
> The program crashes at KSPSetUp() and, from what I can see in the error
> messages, it appears to consume too much memory.
>
> I would guess that similar problems have come up on this mailing list, so I
> am hoping that someone can push me in the right direction…
>
> Mahir