<html><head></head><body><div style="font-family: Verdana;font-size: 12.0px;"><div> </div>
<div>
<div>
<p style="margin-bottom:11px"><span style="font-size:11pt"><span style="line-height:107%"><span style="font-family:Calibri,sans-serif">Thanks for the swift help.</span></span></span></p>
<p style="margin-bottom:11px"><span style="font-size:11pt"><span style="line-height:107%"><span style="font-family:Calibri,sans-serif">I switched from OpenMPI to MPICH, and now it seems to work.</span></span></span></p>
<p style="margin-bottom:11px"> </p>
<p style="margin-bottom:11px"><span style="font-size:11pt"><span style="line-height:107%"><span style="font-family:Calibri,sans-serif">Marius</span></span></span></p>
<div name="quote" style="margin:10px 5px 5px 10px; padding: 10px 0 10px 10px; border-left:2px solid #C3D9E5; word-wrap: break-word; -webkit-nbsp-mode: space; -webkit-line-break: after-white-space;">
<div style="margin:0 0 10px 0;"><b>Sent:</b> Wednesday, 25 December 2019 at 02:25<br/>
<b>From:</b> "Smith, Barry F." <bsmith@mcs.anl.gov><br/>
<b>To:</b> "Marius Buerkle" <mbuerkle@web.de><br/>
<b>Cc:</b> "Mark Adams" <mfadams@lbl.gov>, "petsc-users@mcs.anl.gov" <petsc-users@mcs.anl.gov><br/>
<b>Subject:</b> Re: [petsc-users] possible memory leak</div>
<div name="quoted-content"><br/>
I see,<br/>
<br/>
What version of MPI are you using? Some versions of OpenMPI have serious memory leaks. Maybe try the exact same version of OpenMPI you used previously with success?<br/>
<br/>
You can also use valgrind with some options to track the memory usage.<br/>
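A concrete invocation along those lines might look like the sketch below. The options shown are the standard flags for valgrind's massif heap profiler; `./app` and the process count are placeholders for the actual executable and run size:

```shell
# Track heap usage over time with valgrind's massif tool, writing one
# output file per MPI rank (%p expands to the process id).
# ./app is a placeholder for the real PETSc executable.
mpiexec -n 2 valgrind --tool=massif --massif-out-file=massif.out.%p ./app

# Afterwards, summarize one of the per-rank snapshot files:
ms_print massif.out.12345   # substitute an actual pid from the run
```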
<br/>
Barry<br/>
<br/>
<br/>
> On Dec 24, 2019, at 10:03 AM, Marius Buerkle <mbuerkle@web.de> wrote:<br/>
><br/>
> I see. For actual production runs on 512 processes with larger matrices, this eventually causes a seg fault due to running out of memory. I am not sure, but I think this did not happen with some "older" version of PETSc. I am not sure which one (or which revision); I will check it.<br/>
><br/>
><br/>
> Sent: Wednesday, 25 December 2019 at 00:53<br/>
> From: "Smith, Barry F." <bsmith@mcs.anl.gov><br/>
> To: "Marius Buerkle" <mbuerkle@web.de><br/>
> Cc: "Mark Adams" <mfadams@lbl.gov>, "petsc-users@mcs.anl.gov" <petsc-users@mcs.anl.gov><br/>
> Subject: Re: [petsc-users] possible memory leak<br/>
><br/>
> There are no leaks, but it appears that rather than recycling the memory PETSc returns to it, the system is allocating new space as needed. Since the old PETSc pages are never used again, this should be harmless.<br/>
><br/>
> Barry<br/>
><br/>
><br/>
><br/>
> > On Dec 24, 2019, at 9:47 AM, Marius Buerkle <mbuerkle@web.de> wrote:<br/>
> ><br/>
> > Thanks for the swift reply. This didn't give any output.<br/>
> ><br/>
> ><br/>
> > Sent: Tuesday, 24 December 2019 at 23:56<br/>
> > From: "Mark Adams" <mfadams@lbl.gov><br/>
> > To: "Marius Buerkle" <mbuerkle@web.de><br/>
> > Cc: "petsc-users@mcs.anl.gov" <petsc-users@mcs.anl.gov><br/>
> > Subject: Re: [petsc-users] possible memory leak<br/>
> > Try running with -malloc_debug<br/>
> ><br/>
> > This should print out where unfreed memory was allocated. See what pops up.<br/>
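As a sketch, such a run might look like the following; the executable name and process count are placeholders. The related PETSc option -malloc_dump, which prints all still-allocated PETSc memory at PetscFinalize(), is included here as an assumption that it may also be useful:

```shell
# -malloc_debug enables PETSc's tracking of memory obtained via
# PetscMalloc; -malloc_dump prints any of it still unfreed when
# PetscFinalize() is called. ./app is a placeholder executable name.
mpiexec -n 4 ./app -malloc_debug -malloc_dump
```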
> ><br/>
> > Mark<br/>
> ><br/>
> > On Tue, Dec 24, 2019 at 2:27 AM Marius Buerkle <mbuerkle@web.de> wrote:<br/>
> > Hi,<br/>
> ><br/>
> ><br/>
> > In my code I create and destroy a lot of MPIAIJ matrices during one run, and it seems that not all memory is freed when I destroy them. Moreover, the amount of “disappearing” memory appears to increase with the number of processes. I am not sure whether I am doing something wrong or whether it is really a memory leak. I have attached a simple reproducer.<br/>
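The actual attachment is not included in this thread; a minimal create/destroy loop along those lines could look like this sketch (matrix size, iteration count, and preallocation values are arbitrary, and the error-checking style follows the PETSc conventions of that era):

```c
/* Repeatedly create and destroy a parallel AIJ matrix; resident memory
   can be watched from outside (e.g. with top) while this runs. */
#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat            mat;
  PetscInt       i, n = 1000;       /* arbitrary global matrix size */
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  for (i = 0; i < 100; i++) {
    /* 5 nonzeros per row preallocated in both diagonal and
       off-diagonal blocks; values chosen only for illustration. */
    ierr = MatCreateAIJ(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, n, n,
                        5, NULL, 5, NULL, &mat); CHKERRQ(ierr);
    ierr = MatAssemblyBegin(mat, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
    ierr = MatAssemblyEnd(mat, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
    ierr = MatDestroy(&mat); CHKERRQ(ierr);
  }
  ierr = PetscFinalize();
  return ierr;
}
```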
> ><br/>
> ><br/>
> > Best,<br/>
> ><br/>
> > Marius<br/>
> ><br/>
><br/>
</div>
</div>
</div>
</div></div></body></html>