[petsc-users] Can't expand MemType 1: jcol 16104
Anthony Paul Haas
aph at email.arizona.edu
Tue Jul 7 14:37:53 CDT 2015
Hi Jose,
In my code, I use PETSc once to solve a linear system to get the baseflow
(without using SLEPc), and then I use SLEPc to do the stability analysis of
that baseflow. This is why some SLEPc options are not used in
test.out-superlu_dist-151x151 (when I am solving for the baseflow with PETSc
only). I have attached a 101x101 case for which I do get the eigenvalues;
that case works fine. However, if I increase to 151x151, I get the error
that you can see in test.out-superlu_dist-151x151 (a similar error occurs
with MUMPS: see test.out-mumps-151x151, line 2918). If you look at the very
end of the files test.out-superlu_dist-151x151 and test.out-mumps-151x151,
you will see that the last info message printed is:
    On Processor (after EPSSetFromOptions)  0  memory:  0.65073152000E+08   =====> (see line 807 of module_petsc.F90)
This means that the memory error probably occurs in the call to EPSSolve
(see module_petsc.F90, line 810). I would like to evaluate how much memory
is required by the most memory-intensive operation within EPSSolve. Since I
am solving a generalized EVP, I would imagine that it is the LU
decomposition. Is there an accurate way of measuring that?
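To be concrete, this is roughly what I have in mind (just a rough sketch, not
my actual module_petsc.F90; the subroutine name is made up, and the
include/use lines depend on the PETSc/SLEPc version):

   ! Bracket EPSSolve with PETSc's memory queries to see how much the
   ! factorization done inside EPSSolve adds to the resident memory.
   ! PetscMemorySetGetMaximumUsage() must be called once, right after
   ! PetscInitialize, for the high-water mark below to be meaningful.
#include <petsc/finclude/petscsys.h>
#include <slepc/finclude/slepceps.h>
   subroutine solve_evp_report_memory(eps, ierr)
     use petscsys
     use slepceps
     implicit none
     EPS            :: eps
     PetscErrorCode :: ierr
     PetscLogDouble :: mem_before, mem_after, mem_max
     PetscMPIInt    :: rank

     call MPI_Comm_rank(PETSC_COMM_WORLD, rank, ierr)

     call PetscMemoryGetCurrentUsage(mem_before, ierr)
     call EPSSolve(eps, ierr)
     call PetscMemoryGetCurrentUsage(mem_after, ierr)
     call PetscMemoryGetMaximumUsage(mem_max, ierr)

     if (rank == 0) then
        print *, 'memory before EPSSolve (bytes):', mem_before
        print *, 'memory after  EPSSolve (bytes):', mem_after
        print *, 'high-water mark so far (bytes):', mem_max
     end if
   end subroutine solve_evp_report_memory

If the LU decomposition is indeed the dominant cost, I suppose I could also
look at the factorization package's own estimates, e.g. MUMPS prints its
estimated working memory when its diagnostic output is turned up
(-mat_mumps_icntl_4 in PETSc), if I understand correctly.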
Before turning to iterative solvers, I would like to exploit direct solvers
as much as I can. I tried GMRES with the default preconditioner at some
point, but I had convergence problems. What solver/preconditioner would you
recommend for a generalized non-Hermitian (EPS_GNHEP) EVP?
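If I do go back to an iterative solver, I suppose the setup would be roughly
along these lines (again only a sketch; the object names and the GMRES/ASM
choice are just an example, not what is in module_petsc.F90):

   ! Configure the EPS for a generalized non-Hermitian problem and make the
   ! spectral transform's inner linear solves use GMRES + additive Schwarz
   ! instead of a direct LU factorization.
#include <petsc/finclude/petscksp.h>
#include <slepc/finclude/slepceps.h>
   subroutine configure_sinvert_iterative(eps, ierr)
     use petscksp
     use slepceps
     implicit none
     EPS            :: eps
     PetscErrorCode :: ierr
     ST             :: st
     KSP            :: ksp
     PC             :: pc
     PetscScalar    :: sigma

     sigma = 0.0
     call EPSSetProblemType(eps, EPS_GNHEP, ierr)
     call EPSSetTarget(eps, sigma, ierr)              ! shift near the eigenvalues of interest
     call EPSSetWhichEigenpairs(eps, EPS_TARGET_MAGNITUDE, ierr)
     call EPSGetST(eps, st, ierr)
     call STSetType(st, STSINVERT, ierr)              ! shift-and-invert
     call STGetKSP(st, ksp, ierr)
     call KSPSetType(ksp, KSPGMRES, ierr)
     call KSPGetPC(ksp, pc, ierr)
     call PCSetType(pc, PCASM, ierr)
     call EPSSetFromOptions(eps, ierr)                ! still overridable at run time
   end subroutine configure_sinvert_iterative

I believe the same thing can be selected from the command line with something
like -eps_target 0.0 -st_type sinvert -st_ksp_type gmres -st_pc_type asm, but
I may be wrong about the exact option names.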
Thanks,
Anthony
On Tue, Jul 7, 2015 at 12:17 AM, Jose E. Roman <jroman at dsic.upv.es> wrote:
>
> On 07/07/2015, at 02:33, Anthony Haas wrote:
>
> > Hi,
> >
> > I am computing eigenvalues using PETSc/SLEPc and superlu_dist for the LU
> > decomposition (my problem is a generalized eigenvalue problem). The code
> > runs fine for a 101x101 grid, but when I increase to 151x151 I get the
> > following error:
> >
> > Can't expand MemType 1: jcol 16104 (and then [NID 00037] 2015-07-06
> 19:19:17 Apid 31025976: OOM killer terminated this process.)
> >
> > It seems to be a memory problem. I monitor the memory usage as far as I
> > can, and it seems that memory usage is pretty low. The most memory-intensive
> > part of the program is probably the LU decomposition in the context of the
> > generalized EVP. Is there a way to evaluate how much memory will be
> > required for that step? I am currently running the debug version of the
> > code, which I assume uses more memory?
> >
> > I have attached the output of the job. Note that the program uses PETSc
> > twice: 1) to solve a linear system, for which no problem occurs, and 2) to
> > solve the generalized EVP with SLEPc, which is where I get the error.
> >
> > Thanks
> >
> > Anthony
> > <test.out-superlu_dist-151x151>
>
> In the output you are attaching there are no SLEPc objects in the report,
> and the SLEPc options are not used. It seems that the SLEPc calls are being
> skipped?
>
> Do you get the same error with MUMPS? Have you tried to solve linear
> systems with a preconditioned iterative solver?
>
> Jose
>
>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: module_petsc.F90
Type: text/x-fortran
Size: 37154 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20150707/b04515f5/attachment-0001.bin>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: test.out-mumps-151x151
Type: application/octet-stream
Size: 128414 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20150707/b04515f5/attachment-0003.obj>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: test.out_superlu_dist-101x101
Type: application/octet-stream
Size: 150123 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20150707/b04515f5/attachment-0004.obj>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: test.out-superlu_dist-151x151
Type: application/octet-stream
Size: 115086 bytes
Desc: not available
URL: <http://lists.mcs.anl.gov/pipermail/petsc-users/attachments/20150707/b04515f5/attachment-0005.obj>