<div dir="ltr">Hi Barry,<div>Your suggestion of removing the "<span style="font-size:12.8px">if (mumps->CleanUpMUMPS)" in mumps.c did resolve the problem for me.</span></div><div><span style="font-size:12.8px">Thanks,</span></div><div><span style="font-size:12.8px">-Matt</span></div></div><div class="gmail_extra"><br><div class="gmail_quote">On Wed, Sep 30, 2015 at 6:28 PM, Barry Smith <span dir="ltr"><<a href="mailto:bsmith@mcs.anl.gov" target="_blank">bsmith@mcs.anl.gov</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><br>
Matt,

Please try the following: edit MatDestroy_MUMPS() in mumps.c, which begins

#undef __FUNCT__
#define __FUNCT__ "MatDestroy_MUMPS"
PetscErrorCode MatDestroy_MUMPS(Mat A)
{
  Mat_MUMPS      *mumps = (Mat_MUMPS*)A->spptr;
  PetscErrorCode ierr;

  PetscFunctionBegin;
  if (mumps->CleanUpMUMPS) {

Remove this if () test and just always run the lines of clean-up code after it. Let us know if this resolves the problem.

Thanks

Barry

This CleanUpMUMPS flag has always been goofy and definitely needs to be removed; the only question is whether some other changes are needed when it is removed.
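
With the test removed, the function would look roughly like this (a sketch only; the clean-up statements themselves are elided here and come straight from the existing petsc-3.6.0 mumps.c):

#undef __FUNCT__
#define __FUNCT__ "MatDestroy_MUMPS"
PetscErrorCode MatDestroy_MUMPS(Mat A)
{
  Mat_MUMPS *mumps = (Mat_MUMPS*)A->spptr;

  PetscFunctionBegin;
  /* The clean-up statements formerly guarded by
     "if (mumps->CleanUpMUMPS) { ... }" go here, now executed
     unconditionally, even when a factorization failed part way. */
  PetscFunctionReturn(0);
}
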
<div class="HOEnZb"><div class="h5"><br>
<br>
> On Sep 30, 2015, at 4:59 PM, Barry Smith <bsmith@mcs.anl.gov> wrote:
>
>
> Matt,
>
> Yes, you must be right. The MatDestroy() on the "partially factored" matrix should clean up everything properly, but it sounds like it is not. I'll look at it right now, but I only have a few minutes; if I can't resolve it really quickly it may take a day or two.
>
>
> Barry
>
>> On Sep 30, 2015, at 4:10 PM, Matt Landreman <matt.landreman@gmail.com> wrote:
>>
>> Hi Barry,
>> I tried adding PetscMallocDump after SNESDestroy as you suggested. When mumps fails, PetscMallocDump shows a number of mallocs which are absent when mumps succeeds, the largest being in MatConvertToTriples_mpiaij_mpiaij() (line 638 in petsc-3.6.0/src/mat/impls/aij/mpi/mumps/mumps.c). The total memory reported by PetscMallocDump after SNESDestroy is substantially (>20x) larger when mumps fails than when mumps succeeds, and this amount increases uniformly with each mumps failure. So I think some of the mumps-related structures are not being deallocated by SNESDestroy if mumps generates an error.
>> Thanks,
>> -Matt
>>
>> On Wed, Sep 30, 2015 at 2:16 PM, Barry Smith <bsmith@mcs.anl.gov> wrote:
>>
>>> On Sep 30, 2015, at 1:06 PM, Matt Landreman <matt.landreman@gmail.com> wrote:
>>>
>>> PETSc developers,
>>>
>>> I tried implementing a system for automatically increasing MUMPS ICNTL(14), along the lines described in this recent thread. If SNESSolve returns ierr .ne. 0 due to MUMPS error -9, I call SNESDestroy, re-initialize SNES, call MatMumpsSetIcntl with a larger value of ICNTL(14), call SNESSolve again, and repeat as needed. The procedure works, but the peak memory required (as measured by the HPC system) is 50%-100% higher when the MUMPS solve has to be repeated than when MUMPS works on the first try (by starting with a large ICNTL(14)), even though SNESDestroy is called between the attempts. Are there some PETSc or MUMPS structures which would not be deallocated immediately by SNESDestroy? If so, how do I deallocate them?
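>>>
>>> In outline, the retry logic is essentially the following (an untested C sketch: comm, x, and maxTries are placeholders, 20 is the MUMPS default for ICNTL(14), the usual SNESSetFunction()/SNESSetJacobian() setup is elided, and the option is set through the options database rather than MatMumpsSetIcntl(), since the factored matrix only exists once the preconditioner has been set up):
>>>
>>> PetscErrorCode ierr;
>>> PetscInt       icntl14 = 20;   /* extra-workspace percentage for MUMPS */
>>> PetscInt       i;
>>> char           value[16];
>>> SNES           snes;
>>>
>>> for (i = 0; i < maxTries; i++) {
>>>   ierr = PetscSNPrintf(value,sizeof(value),"%d",(int)icntl14);CHKERRQ(ierr);
>>>   ierr = PetscOptionsSetValue("-mat_mumps_icntl_14",value);CHKERRQ(ierr);
>>>   ierr = SNESCreate(comm,&snes);CHKERRQ(ierr);
>>>   /* ... SNESSetFunction(), SNESSetJacobian(), SNESSetFromOptions() ... */
>>>   ierr = SNESSolve(snes,NULL,x);            /* no CHKERRQ: inspect ierr */
>>>   if (!ierr) break;                         /* solve succeeded */
>>>   ierr = SNESDestroy(&snes);CHKERRQ(ierr);  /* discard everything ...  */
>>>   icntl14 *= 2;                             /* ... and retry with more workspace */
>>> }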
>>
>> They should all be destroyed automatically for you. You can use PetscMallocDump() after the SNES is destroyed to verify that all the memory has been properly freed.
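>>
>> For example (assuming a debug build, where PETSc's malloc logging is on by default):
>>
>>   ierr = SNESDestroy(&snes);CHKERRQ(ierr);
>>   /* print every allocation that has not yet been freed */
>>   ierr = PetscMallocDump(PETSC_STDOUT);CHKERRQ(ierr);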
>>
>> My guess is that your new malloc() with the bigger workspace cannot "reuse" the space that was previously freed, so to the OS it looks like you are using a lot more space, but in terms of physical memory you are not using more.
>>
>> Barry
>>
>>>
>>> Thanks,
>>> Matt Landreman
>>>
>>>
>>> On Tue, Sep 15, 2015 at 7:47 AM, David Knezevic <david.knezevic@akselos.com> wrote:
>>> On Tue, Sep 15, 2015 at 7:29 PM, Matthew Knepley <knepley@gmail.com> wrote:
>>> On Tue, Sep 15, 2015 at 4:30 AM, David Knezevic <david.knezevic@akselos.com> wrote:
>>> In some cases, I get MUMPS error -9, i.e.:
>>> [2]PETSC ERROR: Error reported by MUMPS in numerical factorization phase: INFO(1)=-9, INFO(2)=98927
>>>
>>> This is easily fixed by re-running the executable with a larger value of -mat_mumps_icntl_14 on the command line.
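>>> (For example, something like ./myapp -mat_mumps_icntl_14 50; the executable name and the value 50 are purely illustrative.)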
>>>
>>> However, I would like to update my code to do this automatically, i.e. detect the -9 error and re-run with the appropriate option. Is there a recommended way to do this? It seems to me that I could do this with a PETSc error handler (e.g. PetscPushErrorHandler) that calls a function which sets the appropriate option and solves again; is that right? Are there any examples that illustrate this type of thing?
>>>
>>> I would not use the error handler. I would just check the ierr return code from the solver. I think you need the INFO output, for which you can use MatMumpsGetInfo().
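>>>
>>> Something like this (a sketch; F is assumed to be the factored matrix, obtained via SNESGetKSP()/KSPGetPC()/PCFactorGetMatrix() after setup):
>>>
>>> PetscInt info1;
>>> ierr = SNESSolve(snes,NULL,x);       /* no CHKERRQ: keep the return code */
>>> if (ierr) {
>>>   ierr = MatMumpsGetInfo(F,1,&info1);CHKERRQ(ierr);  /* MUMPS INFO(1) */
>>>   if (info1 == -9) {
>>>     /* workspace too small: raise ICNTL(14) and solve again */
>>>   }
>>> }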
>>>
>>>
>>> OK, that sounds good (and much simpler than what I had in mind), thanks for the help!
>>>
>>> David
>>>
>>>
>>
>>
>