[petsc-users] mumps running out of memory, depending on an overall numerical factor?
Matthew Knepley
knepley at gmail.com
Sat Feb 1 13:17:47 CST 2014
On Sat, Feb 1, 2014 at 1:16 PM, David Liu <daveliu at mit.edu> wrote:
> I see. Does that s^2 memory scaling mean that sparse direct solvers are
> not meant to be used beyond a certain point? I.e. if the supercomputer I'm
> using doesn't have enough memory per core to store even a single row of the
> factored matrix, then I'm out of luck?
>
Yes, exactly.
Matt
> On Fri, Jan 31, 2014 at 9:51 PM, Jed Brown <jed at jedbrown.org> wrote:
>
>> David Liu <daveliu at mit.edu> writes:
>>
>> > Hi, I'm solving a 3d problem with mumps. When I increased the grid size
>> > to 70x60x20 with 6 unknowns per point, I started noticing that the program
>> > was crashing at runtime at the factoring stage, with the mumps error code:
>> >
>> > -17 The internal send buffer that was allocated dynamically by MUMPS on
>> > the processor is too small.
>> > The user should increase the value of ICNTL(14) before calling MUMPS again.
>> >
>> > However, when I increase the grid spacing in the z direction by about 50%,
>> > this crash does not happen.
>> >
>> > Why would the amount of memory an LU factorization uses depend on an
>> > overall numerical factor (for part of the matrix, at least) like this?
>>
>> I'm not sure exactly what you're asking, but the complexity of direct
>> solves depends on the minimal vertex separators in the sparse
>> matrix/graph. Yours will be s=60*20*6 (more if your stencil needs
>> second neighbors). The memory usage scales with s^2 and the
>> factorization time scales with s^3.
>>
>
>
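As a back-of-the-envelope illustration of the s^2 estimate quoted above
(assuming the minimal separator is the smallest cross-section of the grid,
a 60x20 plane with 6 unknowns per point):

  s   ~ 60 * 20 * 6       = 7200
  s^2 ~ 7200^2            ~ 5.2e7 entries
      ~ 5.2e7 * 8 bytes   ~ 0.4 GB in double precision (twice that if complex)

That is the final dense front alone; the total factor memory for a 3D problem
is typically a modest multiple of this, so the factorization is far larger
than the sparse matrix itself and grows rapidly with the grid's cross-section.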
--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener
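For reference, the ICNTL(14) workspace margin mentioned in the quoted MUMPS
error can be raised from PETSc at runtime with -mat_mumps_icntl_14 <percent>
(the MUMPS default is around 20), or in code. Below is a minimal sketch,
assuming a PETSc build with MUMPS and a KSP whose operators have already been
set with KSPSetOperators(); function names are those of the PETSc releases of
that time.

#include <petscksp.h>

/* Sketch: configure a KSP to use MUMPS LU and enlarge the ICNTL(14)
   workspace margin before any factorization is performed.  Assumes
   KSPSetOperators() has already been called on ksp. */
PetscErrorCode SetupMumpsWithExtraWorkspace(KSP ksp)
{
  PC             pc;
  Mat            F;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
  ierr = PCSetType(pc, PCLU);CHKERRQ(ierr);
  ierr = PCFactorSetMatSolverPackage(pc, MATSOLVERMUMPS);CHKERRQ(ierr);
  ierr = PCFactorSetUpMatSolverPackage(pc);CHKERRQ(ierr);  /* creates the factor matrix F, no numeric factorization yet */
  ierr = PCFactorGetMatrix(pc, &F);CHKERRQ(ierr);
  ierr = MatMumpsSetIcntl(F, 14, 50);CHKERRQ(ierr);        /* ICNTL(14) = 50: allow 50% extra estimated workspace */
  PetscFunctionReturn(0);
}

Note that ICNTL(14) only adds a safety margin on top of MUMPS's memory
estimate; if the factors genuinely exceed the memory available per process,
increasing it will not help, which is the point of the exchange above.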