<div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr">This kind of issue is difficult to untangle because there are potentially three pieces of software that might have changed between v3.9 and v3.12, namely<div>PETSc, SLEPc and SuperLU_DIST. </div><div>You need to isolate which software component is responsible for the 2x increase in memory.</div><div><br></div><div>When I look at the memory usage in the log files, things look very similar for the raw PETSc objects.</div><div><br></div><div>[v3.9]</div><div><span style="font-variant-ligatures:no-common-ligatures;font-family:Menlo;font-size:11px">--- Event Stage 0: Main Stage</span><br></div><div>
<p style="margin:0px;font:11px Menlo;min-height:13px"><span style="font-variant-ligatures:no-common-ligatures"></span><br></p>
<p style="margin:0px;font:11px Menlo"><span style="font-variant-ligatures:no-common-ligatures">Object Type    Creations    Destructions    Memory    Descendants' Mem.</span></p>
<p style="margin:0px;font:11px Menlo"><span style="font-variant-ligatures:no-common-ligatures"><span>              </span>Viewer <span>    </span>4<span>              </span>3 <span>        </span>2520 <span>    </span>0.</span></p>
<p style="margin:0px;font:11px Menlo"><span style="font-variant-ligatures:no-common-ligatures"><span>              </span>Matrix<span>    </span>15 <span>            </span>15<span>    </span>125236536 <span>    </span>0.</span></p>
<p style="margin:0px;font:11px Menlo"><span style="font-variant-ligatures:no-common-ligatures"><span>              </span>Vector<span>    </span>22 <span>            </span>22 <span>    </span>19713856 <span>    </span>0.</span></p>
<p style="margin:0px;font:11px Menlo"><span style="font-variant-ligatures:no-common-ligatures"><span>           </span>Index Set<span>    </span>10 <span>            </span>10 <span>      </span>995280 <span>    </span>0.</span></p>
<p style="margin:0px;font:11px Menlo"><span style="font-variant-ligatures:no-common-ligatures"><span>         </span>Vec Scatter <span>    </span>4<span>              </span>4 <span>        </span>4928 <span>    </span>0.</span></p>
<p style="margin:0px;font:11px Menlo"><span style="font-variant-ligatures:no-common-ligatures"><span>          </span>EPS Solver <span>    </span>1<span>              </span>1 <span>        </span>2276 <span>    </span>0.</span></p>
<p style="margin:0px;font:11px Menlo"><span style="font-variant-ligatures:no-common-ligatures"><span>  </span>Spectral Transform <span>    </span>1<span>              </span>1<span>          </span>848 <span>    </span>0.</span></p>
<p style="margin:0px;font:11px Menlo"><span style="font-variant-ligatures:no-common-ligatures"><span>       </span>Basis Vectors <span>    </span>1<span>              </span>1 <span>        </span>2168 <span>    </span>0.</span></p>
<p style="margin:0px;font:11px Menlo"><span style="font-variant-ligatures:no-common-ligatures"><span>         </span>PetscRandom <span>    </span>1<span>              </span>1<span>          </span>662 <span>    </span>0.</span></p>
<p style="margin:0px;font:11px Menlo"><span style="font-variant-ligatures:no-common-ligatures"><span>              </span>Region <span>    </span>1<span>              </span>1<span>          </span>672 <span>    </span>0.</span></p>
<p style="margin:0px;font:11px Menlo"><span style="font-variant-ligatures:no-common-ligatures"><span>       </span>Direct Solver <span>    </span>1<span>              </span>1<span>        </span>17440 <span>    </span>0.</span></p>
<p style="margin:0px;font:11px Menlo"><span style="font-variant-ligatures:no-common-ligatures"><span>       </span>Krylov Solver <span>    </span>1<span>              </span>1 <span>        </span>1176 <span>    </span>0.</span></p>
<p style="margin:0px;font:11px Menlo"><span style="font-variant-ligatures:no-common-ligatures"><span>      </span>Preconditioner <span>    </span>1<span>              </span>1 <span>        </span>1000 <span>    </span>0.</span></p></div><div><br></div><div>versus </div><div><br></div><div>[v3.12]</div><div>







<p style="margin:0px;font:11px Menlo"><span style="font-variant-ligatures:no-common-ligatures">--- Event Stage 0: Main Stage</span></p>
<p style="margin:0px;font:11px Menlo;min-height:13px"><span style="font-variant-ligatures:no-common-ligatures"></span><br></p>
<p style="margin:0px;font:11px Menlo"><span style="font-variant-ligatures:no-common-ligatures">Object Type    Creations    Destructions    Memory    Descendants' Mem.</span></p>
<p style="margin:0px;font:11px Menlo"><span style="font-variant-ligatures:no-common-ligatures"><span>              </span>Viewer <span>    </span>4<span>              </span>3 <span>        </span>2520 <span>    </span>0.</span></p>
<p style="margin:0px;font:11px Menlo"><span style="font-variant-ligatures:no-common-ligatures"><span>              </span>Matrix<span>    </span>15 <span>            </span>15<span>    </span>125237144 <span>    </span>0.</span></p>
<p style="margin:0px;font:11px Menlo"><span style="font-variant-ligatures:no-common-ligatures"><span>              </span>Vector<span>    </span>22 <span>            </span>22 <span>    </span>19714528 <span>    </span>0.</span></p>
<p style="margin:0px;font:11px Menlo"><span style="font-variant-ligatures:no-common-ligatures"><span>           </span>Index Set<span>    </span>10 <span>            </span>10 <span>      </span>995096 <span>    </span>0.</span></p>
<p style="margin:0px;font:11px Menlo"><span style="font-variant-ligatures:no-common-ligatures"><span>         </span>Vec Scatter <span>    </span>4<span>              </span>4 <span>        </span>3168 <span>    </span>0.</span></p>
<p style="margin:0px;font:11px Menlo"><span style="font-variant-ligatures:no-common-ligatures"><span>   </span>Star Forest Graph <span>    </span>4<span>              </span>4 <span>        </span>3936 <span>    </span>0.</span></p>
<p style="margin:0px;font:11px Menlo"><span style="font-variant-ligatures:no-common-ligatures"><span>          </span>EPS Solver <span>    </span>1<span>              </span>1 <span>        </span>2292 <span>    </span>0.</span></p>
<p style="margin:0px;font:11px Menlo"><span style="font-variant-ligatures:no-common-ligatures"><span>  </span>Spectral Transform <span>    </span>1<span>              </span>1<span>          </span>848 <span>    </span>0.</span></p>
<p style="margin:0px;font:11px Menlo"><span style="font-variant-ligatures:no-common-ligatures"><span>       </span>Basis Vectors <span>    </span>1<span>              </span>1 <span>        </span>2184 <span>    </span>0.</span></p>
<p style="margin:0px;font:11px Menlo"><span style="font-variant-ligatures:no-common-ligatures"><span>         </span>PetscRandom <span>    </span>1<span>              </span>1<span>          </span>662 <span>    </span>0.</span></p>
<p style="margin:0px;font:11px Menlo"><span style="font-variant-ligatures:no-common-ligatures"><span>              </span>Region <span>    </span>1<span>              </span>1<span>          </span>672 <span>    </span>0.</span></p>
<p style="margin:0px;font:11px Menlo"><span style="font-variant-ligatures:no-common-ligatures"><span>       </span>Direct Solver <span>    </span>1<span>              </span>1<span>        </span>17456 <span>    </span>0.</span></p>
<p style="margin:0px;font:11px Menlo"><span style="font-variant-ligatures:no-common-ligatures"><span>       </span>Krylov Solver <span>    </span>1<span>              </span>1 <span>        </span>1400 <span>    </span>0.</span></p>
<p style="margin:0px;font:11px Menlo"><span style="font-variant-ligatures:no-common-ligatures"><span>      </span>Preconditioner <span>    </span>1<span>              </span>1 <span>        </span>1000 <span>    </span>0.</span></p></div><div><br></div><div>Certainly there is no apparent 2x increase in memory usage in the underlying PETSc objects themselves.</div><div>Furthermore, the creation counts of PETSc objects in toobig.log and justfine.log match, indicating that none of the implementations used in either PETSc or SLEPc has fundamentally changed with respect to its usage of the native PETSc objects.</div><div><br></div><div>It is also curious that VecNorm is called 3 times in "justfine.log" and 19 times in "toobig.log" - although I don't see how that could be related to your problem...</div><div><br></div><div>The above at least gives me the impression that the memory increase is likely not coming from PETSc.</div><div>I just read Barry's useful email, which is even more compelling and also indicates that SLEPc is not the likely culprit either, as it uses PetscMalloc() internally (so its allocations would show up in the PetscMalloc totals, which barely changed).</div><div><br></div><div>Some options to identify the problem:</div><div><br></div><div>1/ Eliminate SLEPc as a possible culprit by not calling EPSSolve(), and instead just call KSPSolve() with some RHS vector.</div><div>* If you still see a 2x increase, switch the preconditioner to -pc_type bjacobi -ksp_max_it 10 rather than superlu_dist.</div><div>If the memory usage is then good, you can be pretty certain the issue arises internally to superlu_dist.</div><div><br></div><div>2/ Leave your code as is and perform your profiling using mumps rather than superlu_dist. 
<br></div><div>This is a less reliable test than 1/ since the mumps implementation used with v3.9 and v3.12 may differ...</div><div><br></div><div>Thanks</div><div>Dave</div><div><br></div></div><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Thu, 9 Jan 2020 at 20:17, Santiago Andres Triana <<a href="mailto:repepo@gmail.com" target="_blank">repepo@gmail.com</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr">Dear all,<div><br></div><div>I think parmetis is not involved since I still run out of memory if I use the following options:</div><div><font face="monospace">export opts='-st_type sinvert -st_ksp_type preonly -st_pc_type lu -st_pc_factor_mat_solver_type superlu_dist -eps_true_residual 1'</font><br></div><div>and issuing:</div><div><font face="monospace">mpiexec -n 24 ./ex7 -f1 A.petsc -f2 B.petsc -eps_nev 1 -eps_target -4.008e-3+1.57142i $opts -eps_target_magnitude -eps_tol 1e-14 -memory_view</font><br></div><div><br></div><div>Bottom line is that the memory usage of petsc-3.9.4 / slepc-3.9.2 is much lower than in the current version. I can only solve relatively small problems using the 3.12 series :(</div><div>I have an example with smaller matrices that will likely fail in a 32 GB RAM machine with petsc-3.12 but runs just fine with petsc-3.9. 
The -memory_view output is</div><div><br></div><div>with petsc-3.9.4: (log 'justfine.log' attached)</div><div><br></div><div><font face="monospace">Summary of Memory Usage in PETSc<br>Maximum (over computational time) process memory:        total 1.6665e+10 max 7.5674e+08 min 6.4215e+08<br>Current process memory:                                  total 1.5841e+10 max 7.2881e+08 min 6.0905e+08<br>Maximum (over computational time) space PetscMalloc()ed: total 3.1290e+09 max 1.5868e+08 min 1.0179e+08<br>Current space PetscMalloc()ed:                           total 1.8808e+06 max 7.8368e+04 min 7.8368e+04</font><br></div><div><br></div><div><br></div><div>with petsc-3.12.2: (log 'toobig.log' attached)</div><div><br></div><div><font face="monospace">Summary of Memory Usage in PETSc<br>Maximum (over computational time) process memory:        total 3.1564e+10 max 1.3662e+09 min 1.2604e+09<br>Current process memory:                                  total 3.0355e+10 max 1.3082e+09 min 1.2254e+09<br>Maximum (over computational time) space PetscMalloc()ed: total 2.7618e+09 max 1.4339e+08 min 8.6493e+07<br>Current space PetscMalloc()ed:                           total 3.6127e+06 max 1.5053e+05 min 1.5053e+05</font><br></div><div><br></div><div>Strangely, monitoring with 'top' I can see *appreciably higher* peak memory use, usually twice what -memory_view ends up reporting, both for petsc-3.9.4 and current. 
The program usually fails at this peak if not enough RAM is available.</div><div><br></div><div>The matrices for the example quoted above can be downloaded here (I use slepc's tutorial ex7.c to solve the problem):</div><div><a href="https://www.dropbox.com/s/as9bec9iurjra6r/A.petsc?dl=0" target="_blank">https://www.dropbox.com/s/as9bec9iurjra6r/A.petsc?dl=0</a>  (about 600 Mb)<br></div><div><a href="https://www.dropbox.com/s/u2bbmng23rp8l91/B.petsc?dl=0" target="_blank">https://www.dropbox.com/s/u2bbmng23rp8l91/B.petsc?dl=0</a>  (about 210 Mb)<br></div><div><br></div><div>I haven't been able to use a debugger successfully since I am using a compute node without the possibility of an xterm ... note that I have no experience using a debugger so any help on that will also be appreciated!</div><div>Hope I can switch to the current petsc/slepc version for my production runs soon...</div><div><br></div><div>Thanks again!</div><div>Santiago</div><div><br></div><div><br></div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Thu, Jan 9, 2020 at 4:25 PM Stefano Zampini <<a href="mailto:stefano.zampini@gmail.com" target="_blank">stefano.zampini@gmail.com</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div>Can you reproduce the issue with smaller matrices? Or with a debug build (i.e. using --with-debugging=1 and compilation flags -O2 -g)? 
<div><br></div><div>The only changes in parmetis between the two PETSc releases are these below, but I don’t see how they could cause issues</div><div><br></div><div><div style="margin:0px;font-stretch:normal;line-height:normal"><span style="font-variant-ligatures:no-common-ligatures">kl-18448:pkg-parmetis szampini$ git log -2</span></div><div style="margin:0px;font-stretch:normal;line-height:normal;color:rgb(159,160,28)"><span style="font-variant-ligatures:no-common-ligatures">commit ab4fedc6db1f2e3b506be136e3710fcf89ce16ea (</span><span style="font-variant-ligatures:no-common-ligatures;color:rgb(46,174,187)"><b>HEAD -> </b></span><span style="font-variant-ligatures:no-common-ligatures;color:rgb(47,180,29)"><b>master</b></span><span style="font-variant-ligatures:no-common-ligatures">, <b>tag: v4.0.3-p5</b>, </span><span style="font-variant-ligatures:no-common-ligatures;color:rgb(180,36,25)"><b>origin/master</b></span><span style="font-variant-ligatures:no-common-ligatures">, </span><span style="font-variant-ligatures:no-common-ligatures;color:rgb(180,36,25)"><b>origin/dalcinl/random</b></span><span style="font-variant-ligatures:no-common-ligatures">, </span><span style="font-variant-ligatures:no-common-ligatures;color:rgb(180,36,25)"><b>origin/HEAD</b></span><span style="font-variant-ligatures:no-common-ligatures">)</span></div><div style="margin:0px;font-stretch:normal;line-height:normal"><span style="font-variant-ligatures:no-common-ligatures">Author: Lisandro Dalcin <<a href="mailto:dalcinl@gmail.com" target="_blank">dalcinl@gmail.com</a>></span></div><div style="margin:0px;font-stretch:normal;line-height:normal"><span style="font-variant-ligatures:no-common-ligatures">Date:   Thu May 9 18:44:10 2019 +0300</span></div><div style="margin:0px;font-stretch:normal;line-height:normal;min-height:24px"><span style="font-variant-ligatures:no-common-ligatures"></span><br></div><div style="margin:0px;font-stretch:normal;line-height:normal"><span 
style="font-variant-ligatures:no-common-ligatures">    GKLib: Make FPRFX##randInRange() portable for 32bit/64bit indices</span></div><div style="margin:0px;font-stretch:normal;line-height:normal;min-height:24px"><span style="font-variant-ligatures:no-common-ligatures"></span><br></div><div style="margin:0px;font-stretch:normal;line-height:normal;color:rgb(159,160,28)"><span style="font-variant-ligatures:no-common-ligatures">commit 2b4afc79a79ef063f369c43da2617fdb64746dd7</span></div><div style="margin:0px;font-stretch:normal;line-height:normal"><span style="font-variant-ligatures:no-common-ligatures">Author: Lisandro Dalcin <<a href="mailto:dalcinl@gmail.com" target="_blank">dalcinl@gmail.com</a>></span></div><div style="margin:0px;font-stretch:normal;line-height:normal"><span style="font-variant-ligatures:no-common-ligatures">Date:   Sat May 4 17:22:19 2019 +0300</span></div><div style="margin:0px;font-stretch:normal;line-height:normal;min-height:24px"><span style="font-variant-ligatures:no-common-ligatures"></span><br></div><div style="margin:0px;font-stretch:normal;line-height:normal"><span style="font-variant-ligatures:no-common-ligatures">    GKlib: Use gk_randint32() to define the RandomInRange() macro</span></div><div><span style="font-variant-ligatures:no-common-ligatures"><br></span></div><div><span style="font-variant-ligatures:no-common-ligatures"><br></span></div><div><br><blockquote type="cite"><div>On Jan 9, 2020, at 4:31 AM, Smith, Barry F. 
via petsc-users <<a href="mailto:petsc-users@mcs.anl.gov" target="_blank">petsc-users@mcs.anl.gov</a>> wrote:</div><br><div><div><br>  This is extremely worrisome:<br><br>==23361== Use of uninitialised value of size 8<br>==23361==    at 0x847E939: gk_randint64 (random.c:99)<br>==23361==    by 0x847EF88: gk_randint32 (random.c:128)<br>==23361==    by 0x81EBF0B: libparmetis__Match_Global (in /space/hpc-home/trianas/petsc-3.12.3/arch-linux2-c-debug/lib/libparmetis.so)<br><br>do you get that with PETSc-3.9.4 or only with 3.12.3?  <br><br>   This may result in Parmetis using non-random numbers and then giving back an inappropriate ordering that requires more memory for SuperLU_DIST.<br><br>  Suggest looking at the code, or running in the debugger to see what is going on there. We use parmetis all the time and don't see this.<br><br>  Barry<br><br><br><br><br><br><br><blockquote type="cite">On Jan 8, 2020, at 4:34 PM, Santiago Andres Triana <<a href="mailto:repepo@gmail.com" target="_blank">repepo@gmail.com</a>> wrote:<br><br>Dear Matt, petsc-users:<br><br>Finally back after the holidays to try to solve this issue, thanks for your patience!<br>I compiled the latest petsc (3.12.3) with debugging enabled, the same problem appears: relatively large matrices result in out of memory errors. 
This is not the case for petsc-3.9.4, all fine there.<br>This is a non-Hermitian, generalized eigenvalue problem. I generate the A and B matrices myself and then I use example 7 (from the slepc tutorial at $SLEPC_DIR/src/eps/examples/tutorials/ex7.c ) to solve the problem:<br><br>mpiexec -n 24 valgrind --tool=memcheck -q --num-callers=20 --log-file=valgrind.log.%p ./ex7 -malloc off -f1 A.petsc -f2 B.petsc -eps_nev 1 -eps_target -2.5e-4+1.56524i -eps_target_magnitude -eps_tol 1e-14 $opts<br><br>where the $opts variable is:<br>export opts='-st_type sinvert -st_ksp_type preonly -st_pc_type lu -eps_error_relative ::ascii_info_detail -st_pc_factor_mat_solver_type superlu_dist -mat_superlu_dist_iterrefine 1 -mat_superlu_dist_colperm PARMETIS -mat_superlu_dist_parsymbfact 1 -eps_converged_reason -eps_conv_rel -eps_monitor_conv -eps_true_residual 1'<br><br>the output from valgrind (sample from one processor) and from the program are attached.<br>If it's of any use the matrices are here (might need at least 180 Gb of ram to solve the problem successfully under petsc-3.9.4):<br><br><a href="https://www.dropbox.com/s/as9bec9iurjra6r/A.petsc?dl=0" target="_blank">https://www.dropbox.com/s/as9bec9iurjra6r/A.petsc?dl=0</a><br><a href="https://www.dropbox.com/s/u2bbmng23rp8l91/B.petsc?dl=0" target="_blank">https://www.dropbox.com/s/u2bbmng23rp8l91/B.petsc?dl=0</a><br><br>With petsc-3.9.4 and slepc-3.9.2 I can use matrices up to 10Gb (with 240 Gb ram), but only up to 3Gb with the latest petsc/slepc.<br>Any suggestions, comments or any other help are very much appreciated!<br><br>Cheers,<br>Santiago<br><br><br><br>On Mon, Dec 23, 2019 at 11:19 PM Matthew Knepley <<a href="mailto:knepley@gmail.com" target="_blank">knepley@gmail.com</a>> wrote:<br>On Mon, Dec 23, 2019 at 3:14 PM Santiago Andres Triana <<a href="mailto:repepo@gmail.com" target="_blank">repepo@gmail.com</a>> wrote:<br>Dear all,<br><br>After upgrading to petsc 3.12.2 my solver program crashes consistently. 
Before the upgrade I was using petsc 3.9.4 with no problems.<br><br>My application deals with a complex-valued, generalized eigenvalue problem. The matrices involved are relatively large, typically 2 to 10 Gb in size, which is no problem for petsc 3.9.4.<br><br>Are you sure that your indices do not exceed 4B? If so, you need to configure using<br><br>  --with-64-bit-indices<br><br>Also, it would be nice if you ran with the debugger so we can get a stack trace for the SEGV.<br><br>  Thanks,<br><br>    Matt<br><br>However, after the upgrade I can only obtain solutions when the matrices are small, the solver crashes when the matrices' size exceed about 1.5 Gb:<br><br>[0]PETSC ERROR: ------------------------------------------------------------------------<br>[0]PETSC ERROR: Caught signal number 15 Terminate: Some process (or the batch system) has told this process to end<br>[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger<br>[0]PETSC ERROR: or see <a href="https://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind" target="_blank">https://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind</a><br>[0]PETSC ERROR: or try <a href="http://valgrind.org" target="_blank">http://valgrind.org</a> on GNU/linux and Apple Mac OS X to find memory corruption errors<br>[0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run <br>[0]PETSC ERROR: to get more information on the crash.<br><br>and so on for each cpu.<br><br><br>I tried using valgrind and this is the typical output:<br><br>==2874== Conditional jump or move depends on uninitialised value(s)<br>==2874==    at 0x4018178: index (in /lib64/<a href="http://ld-2.22.so" target="_blank">ld-2.22.so</a>)<br>==2874==    by 0x400752D: expand_dynamic_string_token (in /lib64/<a href="http://ld-2.22.so" target="_blank">ld-2.22.so</a>)<br>==2874==    by 0x4008009: _dl_map_object (in /lib64/<a href="http://ld-2.22.so" target="_blank">ld-2.22.so</a>)<br>==2874==    by 0x40013E4: map_doit 
(in /lib64/<a href="http://ld-2.22.so" target="_blank">ld-2.22.so</a>)<br>==2874==    by 0x400EA53: _dl_catch_error (in /lib64/<a href="http://ld-2.22.so" target="_blank">ld-2.22.so</a>)<br>==2874==    by 0x4000ABE: do_preload (in /lib64/<a href="http://ld-2.22.so" target="_blank">ld-2.22.so</a>)<br>==2874==    by 0x4000EC0: handle_ld_preload (in /lib64/<a href="http://ld-2.22.so" target="_blank">ld-2.22.so</a>)<br>==2874==    by 0x40034F0: dl_main (in /lib64/<a href="http://ld-2.22.so" target="_blank">ld-2.22.so</a>)<br>==2874==    by 0x4016274: _dl_sysdep_start (in /lib64/<a href="http://ld-2.22.so" target="_blank">ld-2.22.so</a>)<br>==2874==    by 0x4004A99: _dl_start (in /lib64/<a href="http://ld-2.22.so" target="_blank">ld-2.22.so</a>)<br>==2874==    by 0x40011F7: ??? (in /lib64/<a href="http://ld-2.22.so" target="_blank">ld-2.22.so</a>)<br>==2874==    by 0x12: ???<br>==2874== <br><br><br>These are my configuration options. Identical for both petsc 3.9.4 and 3.12.2:<br><br>./configure --with-scalar-type=complex --download-mumps --download-parmetis --download-metis --download-scalapack=1 --download-fblaslapack=1 --with-debugging=0 --download-superlu_dist=1 --download-ptscotch=1 CXXOPTFLAGS='-O3 -march=native' FOPTFLAGS='-O3 -march=native' COPTFLAGS='-O3 -march=native'<br><br><br>Thanks in advance for any comments or ideas!<br><br>Cheers,<br>Santiago<br><br><br>-- <br>What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>-- Norbert Wiener<br><br><a href="https://www.cse.buffalo.edu/~knepley/" target="_blank">https://www.cse.buffalo.edu/~knepley/</a><br><test1.e6034496><valgrind.log.23361><br></blockquote><br></div></div></blockquote></div><br></div></div></blockquote></div>
</blockquote></div></div></div></div>
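PS (editor's note): Dave's suggestion 1/ above could be sketched roughly as below. This is an untested sketch, not code from the thread: error checking is omitted, and it assumes the PETSc 3.12 C API (PCFactorSetMatSolverType() etc.). It loads the A matrix the way ex7.c loads -f1, then performs a single KSPSolve() with the same LU/superlu_dist setup the shift-invert ST would use, without ever calling EPSSolve().

```c
/* Sketch: KSP-only memory test, bypassing SLEPc entirely.
 * Run with -memory_view and compare against the full EPS run; for the
 * second test, override on the command line with
 *   -ksp_type gmres -pc_type bjacobi -ksp_max_it 10                    */
#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat         A;
  Vec         b, x;
  KSP         ksp;
  PC          pc;
  PetscViewer viewer;
  char        file[PETSC_MAX_PATH_LEN];
  PetscBool   flg;

  PetscInitialize(&argc, &argv, NULL, NULL);
  PetscOptionsGetString(NULL, NULL, "-f1", file, sizeof(file), &flg);

  /* Load A as ex7.c does for -f1 */
  MatCreate(PETSC_COMM_WORLD, &A);
  MatSetFromOptions(A);
  PetscViewerBinaryOpen(PETSC_COMM_WORLD, file, FILE_MODE_READ, &viewer);
  MatLoad(A, viewer);
  PetscViewerDestroy(&viewer);

  MatCreateVecs(A, &x, &b);
  VecSet(b, 1.0);                 /* any RHS will do for a memory test */

  /* Same solver configuration the ST would use: preonly + LU via superlu_dist */
  KSPCreate(PETSC_COMM_WORLD, &ksp);
  KSPSetOperators(ksp, A, A);
  KSPSetType(ksp, KSPPREONLY);
  KSPGetPC(ksp, &pc);
  PCSetType(pc, PCLU);
  PCFactorSetMatSolverType(pc, MATSOLVERSUPERLU_DIST);
  KSPSetFromOptions(ksp);         /* lets command-line options override the above */
  KSPSolve(ksp, b, x);

  KSPDestroy(&ksp);
  VecDestroy(&x);
  VecDestroy(&b);
  MatDestroy(&A);
  PetscFinalize();
  return 0;
}
```

If the 2x growth still shows up here, SLEPc is exonerated; if it then disappears with the bjacobi override, the growth is almost certainly internal to superlu_dist, as Dave describes.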