<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=utf-8">
<style type="text/css" style="display:none"><!--P{margin-top:0;margin-bottom:0;} --></style>
</head>
<body dir="ltr" style="font-size:12pt;color:#000000;background-color:#FFFFFF;font-family:Calibri,Arial,Helvetica,sans-serif;">
<p>Ok, thank you. The solve takes 1-2 minutes in a commercial code, and I was of course hoping that PETSc/SuperLU would be at least that fast.</p>
<p>Currently PETSc/SuperLU is around 10 times slower, so I have to dig a little deeper, but now I know it will be worthwhile...
</p>
<p><br>
</p>
<p>Mahir<br>
</p>
<p><br>
</p>
<p><br>
</p>
<div style="color: rgb(33, 33, 33);">
<hr tabindex="-1" style="display:inline-block; width:98%">
<div id="divRplyFwdMsg" dir="ltr"><font style="font-size:11pt" face="Calibri, sans-serif" color="#000000"><b>Från:</b> Xiaoye S. Li <xsli@lbl.gov><br>
<b>Skickat:</b> den 12 augusti 2015 01:49<br>
<b>Till:</b> Ülker-Kaustell, Mahir<br>
<b>Kopia:</b> petsc-users<br>
<b>Ämne:</b> Re: [petsc-users] SuperLU MPI-problem</font>
<div> </div>
</div>
<div>
<div dir="ltr">
<div class="gmail_default" style="font-family:arial,helvetica,sans-serif">It's hard to say. For 3D problems, you may get a fill factor about 30x-50x (can be larger or smaller depending on problem.) The time may be in seconds, or minutes at most.</div>
<div class="gmail_default" style="font-family:arial,helvetica,sans-serif"><br>
</div>
<div class="gmail_default" style="font-family:arial,helvetica,sans-serif">Sherry</div>
</div>
<div class="gmail_extra"><br>
<div class="gmail_quote">On Tue, Aug 11, 2015 at 7:31 AM, <a href="mailto:Mahir.Ulker-Kaustell@tyrens.se">
Mahir.Ulker-Kaustell@tyrens.se</a> <span dir="ltr"><<a href="mailto:Mahir.Ulker-Kaustell@tyrens.se" target="_blank">Mahir.Ulker-Kaustell@tyrens.se</a>></span> wrote:<br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex; border-left:1px #ccc solid; padding-left:1ex">
Yes! Doing:<br>
<br>
$PETSC_DIR/$PETSC_ARCH/bin/mpiexec<br>
<br>
instead of<br>
<br>
mpiexec<br>
<br>
makes the program run as expected.<br>
<br>
Thank you all for your patience and encouragement.<br>
<br>
Sherry: I have noticed that you have been involved in some publications related to my current work, i.e. wave propagation in elastic solids. What computation time would you expect using SuperLU to solve one linear system with, say, 800000 degrees of freedom and
4-8 processes (on a single node) with a finite element discretization?<br>
<span class="HOEnZb"><font color="#888888"><br>
Mahir<br>
</font></span><span class="im HOEnZb"><br>
<br>
<br>
<br>
<br>
-----Original Message-----<br>
From: Satish Balay [mailto:<a href="mailto:balay@mcs.anl.gov">balay@mcs.anl.gov</a>]<br>
Sent: 7 August 2015 18:09<br>
To: Ülker-Kaustell, Mahir<br>
</span>
<div class="HOEnZb">
<div class="h5">Cc: Hong; PETSc users list<br>
Subject: Re: [petsc-users] SuperLU MPI-problem<br>
<br>
This usually happens if you use the wrong MPIEXEC<br>
<br>
i.e. use the mpiexec from the MPI you built PETSc with.<br>
<br>
Satish<br>
<br>
On Fri, 7 Aug 2015, <a href="mailto:Mahir.Ulker-Kaustell@tyrens.se">Mahir.Ulker-Kaustell@tyrens.se</a> wrote:<br>
<br>
> Hong,<br>
><br>
> Running example 2 with the command line given below gives me two uniprocessor runs!?<br>
><br>
> $ mpiexec -n 2 ./ex2 -pc_type lu -pc_factor_mat_solver_package superlu_dist -ksp_view<br>
> KSP Object: 1 MPI processes<br>
> type: gmres<br>
> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement<br>
> GMRES: happy breakdown tolerance 1e-30<br>
> maximum iterations=10000, initial guess is zero<br>
> tolerances: relative=0.000138889, absolute=1e-50, divergence=10000<br>
> left preconditioning<br>
> using PRECONDITIONED norm type for convergence test<br>
> PC Object: 1 MPI processes<br>
> type: lu<br>
> LU: out-of-place factorization<br>
> tolerance for zero pivot 2.22045e-14<br>
> matrix ordering: nd<br>
> factor fill ratio given 0, needed 0<br>
> Factored matrix follows:<br>
> Mat Object: 1 MPI processes<br>
> type: seqaij<br>
> rows=56, cols=56<br>
> package used to perform factorization: superlu_dist<br>
> total: nonzeros=0, allocated nonzeros=0<br>
> total number of mallocs used during MatSetValues calls =0<br>
> SuperLU_DIST run parameters:<br>
> Process grid nprow 1 x npcol 1<br>
> Equilibrate matrix TRUE<br>
> Matrix input mode 0<br>
> Replace tiny pivots TRUE<br>
> Use iterative refinement FALSE<br>
> Processors in row 1 col partition 1<br>
> Row permutation LargeDiag<br>
> Column permutation METIS_AT_PLUS_A<br>
> Parallel symbolic factorization FALSE<br>
> Repeated factorization SamePattern_SameRowPerm<br>
> linear system matrix = precond matrix:<br>
> Mat Object: 1 MPI processes<br>
> type: seqaij<br>
> rows=56, cols=56<br>
> total: nonzeros=250, allocated nonzeros=280<br>
> total number of mallocs used during MatSetValues calls =0<br>
> not using I-node routines<br>
> Norm of error 5.21214e-15 iterations 1<br>
> KSP Object: 1 MPI processes<br>
> type: gmres<br>
> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement<br>
> GMRES: happy breakdown tolerance 1e-30<br>
> maximum iterations=10000, initial guess is zero<br>
> tolerances: relative=0.000138889, absolute=1e-50, divergence=10000<br>
> left preconditioning<br>
> using PRECONDITIONED norm type for convergence test<br>
> PC Object: 1 MPI processes<br>
> type: lu<br>
> LU: out-of-place factorization<br>
> tolerance for zero pivot 2.22045e-14<br>
> matrix ordering: nd<br>
> factor fill ratio given 0, needed 0<br>
> Factored matrix follows:<br>
> Mat Object: 1 MPI processes<br>
> type: seqaij<br>
> rows=56, cols=56<br>
> package used to perform factorization: superlu_dist<br>
> total: nonzeros=0, allocated nonzeros=0<br>
> total number of mallocs used during MatSetValues calls =0<br>
> SuperLU_DIST run parameters:<br>
> Process grid nprow 1 x npcol 1<br>
> Equilibrate matrix TRUE<br>
> Matrix input mode 0<br>
> Replace tiny pivots TRUE<br>
> Use iterative refinement FALSE<br>
> Processors in row 1 col partition 1<br>
> Row permutation LargeDiag<br>
> Column permutation METIS_AT_PLUS_A<br>
> Parallel symbolic factorization FALSE<br>
> Repeated factorization SamePattern_SameRowPerm<br>
> linear system matrix = precond matrix:<br>
> Mat Object: 1 MPI processes<br>
> type: seqaij<br>
> rows=56, cols=56<br>
> total: nonzeros=250, allocated nonzeros=280<br>
> total number of mallocs used during MatSetValues calls =0<br>
> not using I-node routines<br>
> Norm of error 5.21214e-15 iterations 1<br>
><br>
> Mahir<br>
><br>
> From: Hong [mailto:<a href="mailto:hzhang@mcs.anl.gov">hzhang@mcs.anl.gov</a>]<br>
> Sent: 6 August 2015 16:36<br>
> To: Ülker-Kaustell, Mahir<br>
> Cc: Hong; Xiaoye S. Li; PETSc users list<br>
> Subject: Re: [petsc-users] SuperLU MPI-problem<br>
><br>
> Mahir:<br>
><br>
> I have been using PETSC_COMM_WORLD.<br>
><br>
> What do you get by running a petsc example, e.g.,<br>
> petsc/src/ksp/ksp/examples/tutorials<br>
> mpiexec -n 2 ./ex2 -pc_type lu -pc_factor_mat_solver_package superlu_dist -ksp_view<br>
><br>
> KSP Object: 2 MPI processes<br>
> type: gmres<br>
> ...<br>
><br>
> Hong<br>
><br>
> From: Hong [mailto:<a href="mailto:hzhang@mcs.anl.gov">hzhang@mcs.anl.gov</a>]<br>
> Sent: 5 August 2015 17:11<br>
> To: Ülker-Kaustell, Mahir<br>
> Cc: Hong; Xiaoye S. Li; PETSc users list<br>
> Subject: Re: [petsc-users] SuperLU MPI-problem<br>
><br>
> Mahir:<br>
> As you noticed, you ran the code in serial mode, not parallel.<br>
> Check the input communicator in your code, e.g., what communicator do you pass to<br>
> KSPCreate(comm,&ksp)?<br>
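><br>
> For illustration, a minimal sketch (not your code; everything here is assumed) of the difference the communicator makes. With PETSC_COMM_WORLD the KSP is shared by all ranks started by mpiexec; with PETSC_COMM_SELF each rank would get its own independent uniprocessor solver:<br>
><br>
> #include &lt;petscksp.h&gt;<br>
><br>
> int main(int argc, char **argv)<br>
> {<br>
>   KSP ksp;<br>
>   PetscInitialize(&argc, &argv, NULL, NULL);<br>
>   /* one parallel solver shared across all MPI ranks */<br>
>   KSPCreate(PETSC_COMM_WORLD, &ksp);<br>
>   /* KSPCreate(PETSC_COMM_SELF, &ksp) would instead create a separate<br>
>      sequential solver on every rank -- one symptom is -ksp_view<br>
>      reporting "1 MPI processes" even under mpiexec -n 2 */<br>
>   KSPDestroy(&ksp);<br>
>   PetscFinalize();<br>
>   return 0;<br>
> }<br>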
><br>
> I have added an error flag to the superlu_dist interface (released version). When a user gives '-mat_superlu_dist_parsymbfact'<br>
> in serial mode, the option is now ignored with a warning.<br>
><br>
> Hong<br>
><br>
> Hong,<br>
><br>
> If I set parsymbfact:<br>
><br>
> $ mpiexec -n 2 ./solve -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package superlu_dist -mat_superlu_dist_matinput DISTRIBUTED -mat_superlu_dist_parsymbfact -ksp_view<br>
> Invalid ISPEC at line 484 in file get_perm_c.c<br>
> Invalid ISPEC at line 484 in file get_perm_c.c<br>
> -------------------------------------------------------<br>
> Primary job terminated normally, but 1 process returned<br>
> a non-zero exit code.. Per user-direction, the job has been aborted.<br>
> -------------------------------------------------------<br>
> --------------------------------------------------------------------------<br>
> mpiexec detected that one or more processes exited with non-zero status, thus causing<br>
> the job to be terminated. The first process to do so was:<br>
><br>
> Process name: [[63679,1],0]<br>
> Exit code: 255<br>
> --------------------------------------------------------------------------<br>
><br>
> Since the program does not finish the call to KSPSolve(), we do not get any information about the KSP from -ksp_view.<br>
><br>
> If I do not set it, I get a serial run even if I specify -n 2:<br>
><br>
> mpiexec -n 2 ./solve -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package superlu_dist -ksp_view<br>
> …<br>
> KSP Object: 1 MPI processes<br>
> type: preonly<br>
> maximum iterations=10000, initial guess is zero<br>
> tolerances: relative=1e-05, absolute=1e-50, divergence=10000<br>
> left preconditioning<br>
> using NONE norm type for convergence test<br>
> PC Object: 1 MPI processes<br>
> type: lu<br>
> LU: out-of-place factorization<br>
> tolerance for zero pivot 2.22045e-14<br>
> matrix ordering: nd<br>
> factor fill ratio given 0, needed 0<br>
> Factored matrix follows:<br>
> Mat Object: 1 MPI processes<br>
> type: seqaij<br>
> rows=954, cols=954<br>
> package used to perform factorization: superlu_dist<br>
> total: nonzeros=0, allocated nonzeros=0<br>
> total number of mallocs used during MatSetValues calls =0<br>
> SuperLU_DIST run parameters:<br>
> Process grid nprow 1 x npcol 1<br>
> Equilibrate matrix TRUE<br>
> Matrix input mode 0<br>
> Replace tiny pivots TRUE<br>
> Use iterative refinement FALSE<br>
> Processors in row 1 col partition 1<br>
> Row permutation LargeDiag<br>
> Column permutation METIS_AT_PLUS_A<br>
> Parallel symbolic factorization FALSE<br>
> Repeated factorization SamePattern_SameRowPerm<br>
> linear system matrix = precond matrix:<br>
> Mat Object: 1 MPI processes<br>
> type: seqaij<br>
> rows=954, cols=954<br>
> total: nonzeros=34223, allocated nonzeros=34223<br>
> total number of mallocs used during MatSetValues calls =0<br>
> using I-node routines: found 668 nodes, limit used is 5<br>
><br>
> I am running PETSc via Cygwin on a Windows machine.<br>
> When I installed PETSc the tests with different numbers of processes ran well.<br>
><br>
> Mahir<br>
><br>
><br>
> From: Hong [mailto:<a href="mailto:hzhang@mcs.anl.gov">hzhang@mcs.anl.gov</a>]<br>
> Sent: 3 August 2015 19:06<br>
> To: Ülker-Kaustell, Mahir<br>
> Cc: Hong; Xiaoye S. Li; PETSc users list<br>
> Subject: Re: [petsc-users] SuperLU MPI-problem<br>
><br>
> Mahir,<br>
><br>
><br>
> I have not used …parsymbfact in sequential runs or set matinput=GLOBAL for parallel runs.<br>
><br>
> If I use 2 processors, the program runs if I use –mat_superlu_dist_parsymbfact=1:<br>
> mpiexec -n 2 ./solve -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package superlu_dist -mat_superlu_dist_matinput GLOBAL -mat_superlu_dist_parsymbfact=1<br>
><br>
> The incorrect option '-mat_superlu_dist_parsymbfact=1' is not accepted, so your code runs well without parsymbfact.<br>
><br>
> Please run it with '-ksp_view' and see what<br>
> 'SuperLU_DIST run parameters:' are being used, e.g.<br>
> petsc/src/ksp/ksp/examples/tutorials (maint)<br>
> $ mpiexec -n 2 ./ex2 -pc_type lu -pc_factor_mat_solver_package superlu_dist -mat_superlu_dist_parsymbfact=1 -ksp_view<br>
><br>
> ...<br>
> SuperLU_DIST run parameters:<br>
> Process grid nprow 2 x npcol 1<br>
> Equilibrate matrix TRUE<br>
> Matrix input mode 1<br>
> Replace tiny pivots TRUE<br>
> Use iterative refinement FALSE<br>
> Processors in row 2 col partition 1<br>
> Row permutation LargeDiag<br>
> Column permutation METIS_AT_PLUS_A<br>
> Parallel symbolic factorization FALSE<br>
> Repeated factorization SamePattern_SameRowPerm<br>
><br>
> I do not understand why your code uses matrix input mode = global.<br>
><br>
> Hong<br>
><br>
><br>
><br>
> From: Hong [mailto:<a href="mailto:hzhang@mcs.anl.gov">hzhang@mcs.anl.gov</a>]<br>
> Sent: 3 August 2015 16:46<br>
> To: Xiaoye S. Li<br>
> Cc: Ülker-Kaustell, Mahir; Hong; PETSc users list<br>
><br>
> Subject: Re: [petsc-users] SuperLU MPI-problem<br>
><br>
> Mahir,<br>
><br>
> Sherry found the culprit. I can reproduce it:<br>
> petsc/src/ksp/ksp/examples/tutorials<br>
> mpiexec -n 2 ./ex2 -pc_type lu -pc_factor_mat_solver_package superlu_dist -mat_superlu_dist_matinput GLOBAL -mat_superlu_dist_parsymbfact<br>
><br>
> Invalid ISPEC at line 484 in file get_perm_c.c<br>
> Invalid ISPEC at line 484 in file get_perm_c.c<br>
> -------------------------------------------------------<br>
> Primary job terminated normally, but 1 process returned<br>
> a non-zero exit code.. Per user-direction, the job has been aborted.<br>
> -------------------------------------------------------<br>
> ...<br>
><br>
> The PETSc-superlu_dist interface sets matinput=DISTRIBUTED as the default when using more than one process.<br>
> Did you either use '-mat_superlu_dist_parsymbfact' for a sequential run or set matinput=GLOBAL for a parallel run?<br>
><br>
> I'll add an error flag for these use cases.<br>
><br>
> Hong<br>
><br>
> On Mon, Aug 3, 2015 at 9:17 AM, Xiaoye S. Li <<a href="mailto:xsli@lbl.gov">xsli@lbl.gov</a>> wrote:<br>
> I think I know the problem. Since zdistribute.c is called, I guess you are using the global (replicated) matrix input interface, pzgssvx_ABglobal(). This interface does not allow you to use parallel symbolic factorization (since the matrix is centralized).<br>
><br>
> That's why you get the following error:<br>
> Invalid ISPEC at line 484 in file get_perm_c.c<br>
><br>
> You need to use the distributed matrix input interface pzgssvx() (without ABglobal).<br>
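><br>
> For reference, the two call signatures differ roughly as below in SuperLU_DIST v4.x (a from-memory sketch; check superlu_zdefs.h for the exact prototypes). Only pzgssvx(), which takes a distributed A and the extra SOLVEstruct argument, can use parallel symbolic factorization:<br>
><br>
> /* replicated input: every process holds all of A */<br>
> void pzgssvx_ABglobal(superlu_options_t *options, SuperMatrix *A,<br>
>                       ScalePermstruct_t *ScalePermstruct,<br>
>                       doublecomplex B[], int ldb, int nrhs, gridinfo_t *grid,<br>
>                       LUstruct_t *LUstruct, double *berr,<br>
>                       SuperLUStat_t *stat, int *info);<br>
><br>
> /* distributed input: each process holds a block of rows of A */<br>
> void pzgssvx(superlu_options_t *options, SuperMatrix *A,<br>
>              ScalePermstruct_t *ScalePermstruct,<br>
>              doublecomplex B[], int ldb, int nrhs, gridinfo_t *grid,<br>
>              LUstruct_t *LUstruct, SOLVEstruct_t *SOLVEstruct,<br>
>              double *berr, SuperLUStat_t *stat, int *info);<br>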
><br>
> Sherry<br>
><br>
><br>
> On Mon, Aug 3, 2015 at 5:02 AM, <a href="mailto:Mahir.Ulker-Kaustell@tyrens.se">Mahir.Ulker-Kaustell@tyrens.se</a> <<a href="mailto:Mahir.Ulker-Kaustell@tyrens.se">Mahir.Ulker-Kaustell@tyrens.se</a>> wrote:<br>
> Hong and Sherry,<br>
><br>
> I have rebuilt PETSc with SuperLU 4.1. Unfortunately, the problem remains:<br>
><br>
> If I use -mat_superlu_dist_parsymbfact, the program crashes with: Invalid ISPEC at line 484 in file get_perm_c.c<br>
> If I use -mat_superlu_dist_parsymbfact=1 or leave this flag out, the program crashes with: Calloc fails for SPA dense[]. at line 438 in file zdistribute.c<br>
><br>
> Mahir<br>
><br>
> From: Hong [mailto:<a href="mailto:hzhang@mcs.anl.gov">hzhang@mcs.anl.gov</a>]<br>
> Sent: 30 July 2015 02:58<br>
> To: Ülker-Kaustell, Mahir<br>
> Cc: Xiaoye Li; PETSc users list<br>
><br>
> Subject: Fwd: [petsc-users] SuperLU MPI-problem<br>
><br>
> Mahir,<br>
><br>
> Sherry fixed several bugs in superlu_dist-v4.1.<br>
> The current petsc-release interfaces with superlu_dist-v4.0.<br>
> We do not know whether the reported issue (attached below) has been resolved. Can you test it with the latest superlu_dist-v4.1?<br>
><br>
> Here is how to do it:<br>
> 1. download superlu_dist v4.1<br>
> 2. remove existing PETSC_ARCH directory, then configure petsc with<br>
> '--download-superlu_dist=superlu_dist_4.1.tar.gz'<br>
> 3. build petsc<br>
><br>
> Let us know if the issue remains.<br>
><br>
> Hong<br>
><br>
><br>
> ---------- Forwarded message ----------<br>
> From: Xiaoye S. Li <<a href="mailto:xsli@lbl.gov">xsli@lbl.gov</a>><br>
> Date: Wed, Jul 29, 2015 at 2:24 PM<br>
> Subject: Fwd: [petsc-users] SuperLU MPI-problem<br>
> To: Hong Zhang <<a href="mailto:hzhang@mcs.anl.gov">hzhang@mcs.anl.gov</a>><br>
> Hong,<br>
> I am cleaning the mailbox, and saw this unresolved issue. I am not sure whether the new fix to parallel symbolic factorization solves the problem. What bothers me is that he is getting the following error:<br>
><br>
> Invalid ISPEC at line 484 in file get_perm_c.c<br>
> This has nothing to do with my bug fix.<br>
> Shall we ask him to try the new version, or try to get his matrix?<br>
> Sherry<br>
> <br>
><br>
> ---------- Forwarded message ----------<br>
> From: <a href="mailto:Mahir.Ulker-Kaustell@tyrens.se">Mahir.Ulker-Kaustell@tyrens.se</a> <<a href="mailto:Mahir.Ulker-Kaustell@tyrens.se">Mahir.Ulker-Kaustell@tyrens.se</a>><br>
> Date: Wed, Jul 22, 2015 at 1:32 PM<br>
> Subject: RE: [petsc-users] SuperLU MPI-problem<br>
> To: Hong <<a href="mailto:hzhang@mcs.anl.gov">hzhang@mcs.anl.gov</a>>, "Xiaoye S. Li" <<a href="mailto:xsli@lbl.gov">xsli@lbl.gov</a>><br>
> Cc: petsc-users <<a href="mailto:petsc-users@mcs.anl.gov">petsc-users@mcs.anl.gov</a>><br>
> The 1000 was just a conservative guess. The number of non-zeros per row is in the tens in general, but certain constraints lead to off-diagonal streaks in the sparsity pattern.<br>
> Is it the reordering of the matrix that is killing me here? How can I set options.ColPerm?<br>
><br>
> If I use -mat_superlu_dist_parsymbfact the program crashes with<br>
><br>
> Invalid ISPEC at line 484 in file get_perm_c.c<br>
> -------------------------------------------------------<br>
> Primary job terminated normally, but 1 process returned<br>
> a non-zero exit code.. Per user-direction, the job has been aborted.<br>
> -------------------------------------------------------<br>
> [0]PETSC ERROR: ------------------------------------------------------------------------<br>
> [0]PETSC ERROR: Caught signal number 15 Terminate: Some process (or the batch system) has told this process to end<br>
> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger<br>
> [0]PETSC ERROR: or see <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind" rel="noreferrer" target="_blank">
http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind</a><br>
> [0]PETSC ERROR: or try <a href="http://valgrind.org" rel="noreferrer" target="_blank">
http://valgrind.org</a> on GNU/linux and Apple Mac OS X to find memory corruption errors<br>
> [0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run<br>
> [0]PETSC ERROR: to get more information on the crash.<br>
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------<br>
> [0]PETSC ERROR: Signal received<br>
> [0]PETSC ERROR: See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html" rel="noreferrer" target="_blank">
http://www.mcs.anl.gov/petsc/documentation/faq.html</a> for trouble shooting.<br>
> [0]PETSC ERROR: Petsc Release Version 3.6.0, Jun, 09, 2015<br>
> [0]PETSC ERROR: ./solve on a cygwin-complex-nodebug named CZC5202SM2 by muk Wed Jul 22 21:59:23 2015<br>
> [0]PETSC ERROR: Configure options PETSC_DIR=/packages/petsc-3.6.0 PETSC_ARCH=cygwin-complex-nodebug --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --with-debugging=0 --with-fortran-kernels=1 --with-scalar-type=complex --download-fblaspack --download-mpich
--download-scalapack --download-mumps --download-metis --download-parmetis --download-superlu --download-superlu_dist --download-fftw<br>
> [0]PETSC ERROR: #1 User provided function() line 0 in unknown file<br>
> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0<br>
> [unset]: aborting job:<br>
> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0<br>
> [0]PETSC ERROR: ------------------------------------------------------------------------<br>
><br>
> If I use -mat_superlu_dist_parsymbfact=1 the program crashes (somewhat later) with<br>
><br>
> Malloc fails for Lnzval_bc_ptr[*][] at line 626 in file zdistribute.c<br>
> col block 3006 -------------------------------------------------------<br>
> Primary job terminated normally, but 1 process returned<br>
> a non-zero exit code.. Per user-direction, the job has been aborted.<br>
> -------------------------------------------------------<br>
> col block 1924 [0]PETSC ERROR: ------------------------------------------------------------------------<br>
> [0]PETSC ERROR: Caught signal number 15 Terminate: Some process (or the batch system) has told this process to end<br>
> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger<br>
> [0]PETSC ERROR: or see <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind" rel="noreferrer" target="_blank">
http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind</a><br>
> [0]PETSC ERROR: or try <a href="http://valgrind.org" rel="noreferrer" target="_blank">
http://valgrind.org</a> on GNU/linux and Apple Mac OS X to find memory corruption errors<br>
> [0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run<br>
> [0]PETSC ERROR: to get more information on the crash.<br>
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------<br>
> [0]PETSC ERROR: Signal received<br>
> [0]PETSC ERROR: See <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html" rel="noreferrer" target="_blank">
http://www.mcs.anl.gov/petsc/documentation/faq.html</a> for trouble shooting.<br>
> [0]PETSC ERROR: Petsc Release Version 3.6.0, Jun, 09, 2015<br>
> [0]PETSC ERROR: ./solve on a cygwin-complex-nodebug named CZC5202SM2 by muk Wed Jul 22 21:59:58 2015<br>
> [0]PETSC ERROR: Configure options PETSC_DIR=/packages/petsc-3.6.0 PETSC_ARCH=cygwin-complex-nodebug --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --with-debugging=0 --with-fortran-kernels=1 --with-scalar-type=complex --download-fblaspack --download-mpich
--download-scalapack --download-mumps --download-metis --download-parmetis --download-superlu --download-superlu_dist --download-fftw<br>
> [0]PETSC ERROR: #1 User provided function() line 0 in unknown file<br>
> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0<br>
> [unset]: aborting job:<br>
> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0<br>
> [0]PETSC ERROR: ------------------------------------------------------------------------<br>
><br>
><br>
> /Mahir<br>
><br>
><br>
> From: Hong [mailto:<a href="mailto:hzhang@mcs.anl.gov">hzhang@mcs.anl.gov</a>]<br>
> Sent: 22 July 2015 21:34<br>
> To: Xiaoye S. Li<br>
> Cc: Ülker-Kaustell, Mahir; petsc-users<br>
><br>
> Subject: Re: [petsc-users] SuperLU MPI-problem<br>
><br>
> In the PETSc/superlu_dist interface, we set the default<br>
><br>
> options.ParSymbFact = NO;<br>
><br>
> When the user raises the flag "-mat_superlu_dist_parsymbfact",<br>
> we set<br>
><br>
> options.ParSymbFact = YES;<br>
> options.ColPerm = PARMETIS; /* in v2.2, PARMETIS is forced for ParSymbFact regardless of user ordering setting */<br>
><br>
> We do not change anything else.<br>
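><br>
> For reference, a minimal sketch of the same settings made directly against the SuperLU_DIST C API (assuming v4.x; the field and enum names are assumptions to be checked against superlu_defs.h):<br>
><br>
> #include "superlu_zdefs.h"<br>
><br>
> void enable_parsymbfact(superlu_options_t *options)<br>
> {<br>
>     set_default_options_dist(options);  /* ParSymbFact = NO by default */<br>
>     options-&gt;ParSymbFact = YES;       /* what -mat_superlu_dist_parsymbfact raises */<br>
>     options-&gt;ColPerm     = PARMETIS;  /* forced together with ParSymbFact */<br>
> }<br>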
><br>
> Hong<br>
><br>
> On Wed, Jul 22, 2015 at 2:19 PM, Xiaoye S. Li <<a href="mailto:xsli@lbl.gov">xsli@lbl.gov</a>> wrote:<br>
> I am trying to understand your problem. You said you are solving Navier's equation (elastodynamics) in the frequency domain, using a finite element discretization. I wonder why you have about 1000 nonzeros per row. Usually in many PDE discretized matrices, the number of nonzeros per row is in the tens (even for 3D problems), not in the thousands. So, your matrix is quite a bit denser than many sparse matrices we deal with.<br>
><br>
> The number of nonzeros in the L and U factors is much larger than that in the original matrix A -- typically we see a 10-20x fill ratio for 2D, and it can be as bad as 50-100x for 3D. But since your matrix starts out much denser (i.e., the underlying graph has many connections), it may not lend itself to any good ordering strategy for preserving the sparsity of L and U; that is, the L and U fill ratio may be large.<br>
><br>
> I don't understand why you get the following error when you use<br>
> ‘-mat_superlu_dist_parsymbfact’.<br>
><br>
> Invalid ISPEC at line 484 in file get_perm_c.c<br>
><br>
> Perhaps Hong Zhang knows; she built the SuperLU_DIST interface for PETSc.<br>
><br>
> Hong -- in order to use parallel symbolic factorization, is it sufficient to specify only<br>
> ‘-mat_superlu_dist_parsymbfact’? (The default is to use sequential symbolic factorization.)<br>
><br>
><br>
> Sherry<br>
><br>
> On Wed, Jul 22, 2015 at 9:11 AM, <a href="mailto:Mahir.Ulker-Kaustell@tyrens.se">Mahir.Ulker-Kaustell@tyrens.se</a> <<a href="mailto:Mahir.Ulker-Kaustell@tyrens.se">Mahir.Ulker-Kaustell@tyrens.se</a>> wrote:<br>
> Thank you for your reply.<br>
><br>
> As you have probably figured out already, I am not a computational scientist. I am a researcher in civil engineering (railways for high-speed traffic), trying to produce some, from my perspective, fairly large parametric studies based on finite element discretizations.<br>
><br>
> I am working in a Windows environment and have installed PETSc through Cygwin.<br>
> Apparently, there is no support for Valgrind in this OS.<br>
><br>
> If I have understood you correctly, the memory issues are related to SuperLU and, given my background, there is not much I can do. Is this correct?<br>
><br>
><br>
> Best regards,<br>
> Mahir<br>
><br>
> ______________________________________________<br>
> Mahir Ülker-Kaustell, Competence Coordinator, Bridge Designer, Tekn. Dr, Tyréns AB<br>
> 010 452 30 82, <a href="mailto:Mahir.Ulker-Kaustell@tyrens.se">Mahir.Ulker-Kaustell@tyrens.se</a><br>
> ______________________________________________<br>
><br>
> -----Original Message-----<br>
> From: Barry Smith [mailto:<a href="mailto:bsmith@mcs.anl.gov">bsmith@mcs.anl.gov</a>]<br>
> Sent: 22 July 2015 02:57<br>
> To: Ülker-Kaustell, Mahir<br>
> Cc: Xiaoye S. Li; petsc-users<br>
> Subject: Re: [petsc-users] SuperLU MPI-problem<br>
><br>
><br>
> Run the program under valgrind <a href="http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind" rel="noreferrer" target="_blank">
http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind</a>. When I use the option -mat_superlu_dist_parsymbfact I get many scary memory problems, some involving, for example, ddist_psymbtonum (pdsymbfact_distdata.c:1332).<br>
><br>
> Note that I consider it unacceptable for running programs to EVER use uninitialized values; until these are all cleaned up I won't trust any runs like this.<br>
><br>
> Barry<br>
><br>
><br>
><br>
><br>
> ==42050== Conditional jump or move depends on uninitialised value(s)<br>
> ==42050== at 0x10274C436: MPI_Allgatherv (allgatherv.c:1053)<br>
> ==42050== by 0x101557F60: get_perm_c_parmetis (get_perm_c_parmetis.c:285)<br>
> ==42050== by 0x101501192: pdgssvx (pdgssvx.c:934)<br>
> ==42050== by 0x1009CFE7A: MatLUFactorNumeric_SuperLU_DIST (superlu_dist.c:414)<br>
> ==42050== by 0x10046CC5C: MatLUFactorNumeric (matrix.c:2946)<br>
> ==42050== by 0x100F09F2C: PCSetUp_LU (lu.c:152)<br>
> ==42050== by 0x100FF9036: PCSetUp (precon.c:982)<br>
> ==42050== by 0x1010F54EB: KSPSetUp (itfunc.c:332)<br>
> ==42050== by 0x1010F7985: KSPSolve (itfunc.c:546)<br>
> ==42050== by 0x10125541E: SNESSolve_NEWTONLS (ls.c:233)<br>
> ==42050== by 0x1011C49B7: SNESSolve (snes.c:3906)<br>
> ==42050== by 0x100001B3C: main (in ./ex19)<br>
> ==42050== Uninitialised value was created by a stack allocation<br>
> ==42050== at 0x10155751B: get_perm_c_parmetis (get_perm_c_parmetis.c:96)<br>
> ==42050==<br>
> ==42050== Conditional jump or move depends on uninitialised value(s)<br>
> ==42050== at 0x102851C61: MPIR_Allgatherv_intra (allgatherv.c:651)<br>
> ==42050== by 0x102853EC7: MPIR_Allgatherv (allgatherv.c:903)<br>
> ==42050== by 0x102853F84: MPIR_Allgatherv_impl (allgatherv.c:944)<br>
> ==42050== by 0x10274CA41: MPI_Allgatherv (allgatherv.c:1107)<br>
> ==42050== by 0x101557F60: get_perm_c_parmetis (get_perm_c_parmetis.c:285)<br>
> ==42050== by 0x101501192: pdgssvx (pdgssvx.c:934)<br>
> ==42050== by 0x1009CFE7A: MatLUFactorNumeric_SuperLU_DIST (superlu_dist.c:414)<br>
> ==42050== by 0x10046CC5C: MatLUFactorNumeric (matrix.c:2946)<br>
> ==42050== by 0x100F09F2C: PCSetUp_LU (lu.c:152)<br>
> ==42050== by 0x100FF9036: PCSetUp (precon.c:982)<br>
> ==42050== by 0x1010F54EB: KSPSetUp (itfunc.c:332)<br>
> ==42050== by 0x1010F7985: KSPSolve (itfunc.c:546)<br>
> ==42050== by 0x10125541E: SNESSolve_NEWTONLS (ls.c:233)<br>
> ==42050== by 0x1011C49B7: SNESSolve (snes.c:3906)<br>
> ==42050== by 0x100001B3C: main (in ./ex19)<br>
> ==42050== Uninitialised value was created by a stack allocation<br>
> ==42050== at 0x10155751B: get_perm_c_parmetis (get_perm_c_parmetis.c:96)<br>
> ==42050==<br>
> ==42049== Syscall param writev(vector[...]) points to uninitialised byte(s)<br>
> ==42049== at 0x102DA1C3A: writev (in /usr/lib/system/libsystem_kernel.dylib)<br>
> ==42049== by 0x10296A0DC: MPL_large_writev (mplsock.c:32)<br>
> ==42049== by 0x10295F6AD: MPIDU_Sock_writev (sock_immed.i:610)<br>
> ==42049== by 0x102943FCA: MPIDI_CH3_iSendv (ch3_isendv.c:84)<br>
> ==42049== by 0x102934361: MPIDI_CH3_EagerContigIsend (ch3u_eager.c:556)<br>
> ==42049== by 0x102939531: MPID_Isend (mpid_isend.c:138)<br>
> ==42049== by 0x10277656E: MPI_Isend (isend.c:125)<br>
> ==42049== by 0x102088B66: libparmetis__gkMPI_Isend (gkmpi.c:63)<br>
> ==42049== by 0x10208140F: libparmetis__CommInterfaceData (comm.c:298)<br>
> ==42049== by 0x1020A8758: libparmetis__CompactGraph (ometis.c:553)<br>
> ==42049== by 0x1020A77BB: libparmetis__MultilevelOrder (ometis.c:225)<br>
> ==42049== by 0x1020A7493: ParMETIS_V32_NodeND (ometis.c:151)<br>
> ==42049== by 0x1020A6AFB: ParMETIS_V3_NodeND (ometis.c:34)<br>
> ==42049== by 0x101557CFC: get_perm_c_parmetis (get_perm_c_parmetis.c:241)<br>
> ==42049== by 0x101501192: pdgssvx (pdgssvx.c:934)<br>
> ==42049== by 0x1009CFE7A: MatLUFactorNumeric_SuperLU_DIST (superlu_dist.c:414)<br>
> ==42049== by 0x10046CC5C: MatLUFactorNumeric (matrix.c:2946)<br>
> ==42049== by 0x100F09F2C: PCSetUp_LU (lu.c:152)<br>
> ==42049== by 0x100FF9036: PCSetUp (precon.c:982)<br>
> ==42048== Syscall param writev(vector[...]) points to uninitialised byte(s)<br>
> ==42049== by 0x1010F54EB: KSPSetUp (itfunc.c:332)<br>
> ==42049== Address 0x105edff70 is 1,424 bytes inside a block of size 752,720 alloc'd<br>
> ==42049== at 0x1000183B1: malloc (vg_replace_malloc.c:303)<br>
> ==42049== by 0x1020EB90C: gk_malloc (memory.c:147)<br>
> ==42049== by 0x1020EAA28: gk_mcoreCreate (mcore.c:28)<br>
> ==42048== at 0x102DA1C3A: writev (in /usr/lib/system/libsystem_kernel.dylib)<br>
> ==42048== by 0x10296A0DC: MPL_large_writev (mplsock.c:32)<br>
> ==42049== by 0x1020BA5CF: libparmetis__AllocateWSpace (wspace.c:23)<br>
> ==42049== by 0x1020A6E84: ParMETIS_V32_NodeND (ometis.c:98)<br>
> ==42048== by 0x10295F6AD: MPIDU_Sock_writev (sock_immed.i:610)<br>
> ==42048== by 0x102943FCA: MPIDI_CH3_iSendv (ch3_isendv.c:84)<br>
> ==42048== by 0x102934361: MPIDI_CH3_EagerContigIsend (ch3u_eager.c:556)<br>
> ==42049== by 0x1020A6AFB: ParMETIS_V3_NodeND (ometis.c:34)<br>
> ==42049== by 0x101557CFC: get_perm_c_parmetis (get_perm_c_parmetis.c:241)<br>
> ==42049== by 0x101501192: pdgssvx (pdgssvx.c:934)<br>
> ==42048== by 0x102939531: MPID_Isend (mpid_isend.c:138)<br>
> ==42048== by 0x10277656E: MPI_Isend (isend.c:125)<br>
> ==42049== by 0x1009CFE7A: MatLUFactorNumeric_SuperLU_DIST (superlu_dist.c:414)<br>
> ==42049== by 0x10046CC5C: MatLUFactorNumeric (matrix.c:2946)<br>
> ==42049== by 0x100F09F2C: PCSetUp_LU (lu.c:152)<br>
> ==42049== by 0x100FF9036: PCSetUp (precon.c:982)<br>
> ==42049== by 0x1010F54EB: KSPSetUp (itfunc.c:332)<br>
> ==42048== by 0x102088B66: libparmetis__gkMPI_Isend (gkmpi.c:63)<br>
> ==42048== by 0x10208140F: libparmetis__CommInterfaceData (comm.c:298)<br>
> ==42049== by 0x1010F7985: KSPSolve (itfunc.c:546)<br>
> ==42049== by 0x10125541E: SNESSolve_NEWTONLS (ls.c:233)<br>
> ==42048== by 0x1020A8758: libparmetis__CompactGraph (ometis.c:553)<br>
> ==42048== by 0x1020A77BB: libparmetis__MultilevelOrder (ometis.c:225)<br>
> ==42048== by 0x1020A7493: ParMETIS_V32_NodeND (ometis.c:151)<br>
> ==42049== by 0x1011C49B7: SNESSolve (snes.c:3906)<br>
> ==42049== by 0x100001B3C: main (in ./ex19)<br>
> ==42049== Uninitialised value was created by a heap allocation<br>
> ==42049== at 0x1000183B1: malloc (vg_replace_malloc.c:303)<br>
> ==42049== by 0x1020EB90C: gk_malloc (memory.c:147)<br>
> ==42048== by 0x1020A6AFB: ParMETIS_V3_NodeND (ometis.c:34)<br>
> ==42048== by 0x101557CFC: get_perm_c_parmetis (get_perm_c_parmetis.c:241)<br>
> ==42048== by 0x101501192: pdgssvx (pdgssvx.c:934)<br>
> ==42048== by 0x1009CFE7A: MatLUFactorNumeric_SuperLU_DIST (superlu_dist.c:414)<br>
> ==42049== by 0x10211C50B: libmetis__imalloc (gklib.c:24)<br>
> ==42049== by 0x1020A8566: libparmetis__CompactGraph (ometis.c:519)<br>
> ==42049== by 0x1020A77BB: libparmetis__MultilevelOrder (ometis.c:225)<br>
> ==42048== by 0x10046CC5C: MatLUFactorNumeric (matrix.c:2946)<br>
> ==42049== by 0x1020A7493: ParMETIS_V32_NodeND (ometis.c:151)<br>
> ==42049== by 0x1020A6AFB: ParMETIS_V3_NodeND (ometis.c:34)<br>
> ==42049== by 0x101557CFC: get_perm_c_parmetis (get_perm_c_parmetis.c:241)<br>
> ==42049== by 0x101501192: pdgssvx (pdgssvx.c:934)<br>
> ==42049== by 0x1009CFE7A: MatLUFactorNumeric_SuperLU_DIST (superlu_dist.c:414)<br>
> ==42049== by 0x10046CC5C: MatLUFactorNumeric (matrix.c:2946)<br>
> ==42048== by 0x100F09F2C: PCSetUp_LU (lu.c:152)<br>
> ==42049== by 0x100F09F2C: PCSetUp_LU (lu.c:152)<br>
> ==42049== by 0x100FF9036: PCSetUp (precon.c:982)<br>
> ==42049== by 0x1010F54EB: KSPSetUp (itfunc.c:332)<br>
> ==42049== by 0x1010F7985: KSPSolve (itfunc.c:546)<br>
> ==42049== by 0x10125541E: SNESSolve_NEWTONLS (ls.c:233)<br>
> ==42048== by 0x100FF9036: PCSetUp (precon.c:982)<br>
> ==42048== by 0x1010F54EB: KSPSetUp (itfunc.c:332)<br>
> ==42048== Address 0x10597a860 is 1,408 bytes inside a block of size 752,720 alloc'd<br>
> ==42049== by 0x1011C49B7: SNESSolve (snes.c:3906)<br>
> ==42049== by 0x100001B3C: main (in ./ex19)<br>
> ==42049==<br>
> ==42048== at 0x1000183B1: malloc (vg_replace_malloc.c:303)<br>
> ==42048== by 0x1020EB90C: gk_malloc (memory.c:147)<br>
> ==42048== by 0x1020EAA28: gk_mcoreCreate (mcore.c:28)<br>
> ==42048== by 0x1020BA5CF: libparmetis__AllocateWSpace (wspace.c:23)<br>
> ==42048== by 0x1020A6E84: ParMETIS_V32_NodeND (ometis.c:98)<br>
> ==42048== by 0x1020A6AFB: ParMETIS_V3_NodeND (ometis.c:34)<br>
> ==42048== by 0x101557CFC: get_perm_c_parmetis (get_perm_c_parmetis.c:241)<br>
> ==42048== by 0x101501192: pdgssvx (pdgssvx.c:934)<br>
> ==42048== by 0x1009CFE7A: MatLUFactorNumeric_SuperLU_DIST (superlu_dist.c:414)<br>
> ==42048== by 0x10046CC5C: MatLUFactorNumeric (matrix.c:2946)<br>
> ==42048== by 0x100F09F2C: PCSetUp_LU (lu.c:152)<br>
> ==42048== by 0x100FF9036: PCSetUp (precon.c:982)<br>
> ==42048== by 0x1010F54EB: KSPSetUp (itfunc.c:332)<br>
> ==42048== by 0x1010F7985: KSPSolve (itfunc.c:546)<br>
> ==42048== by 0x10125541E: SNESSolve_NEWTONLS (ls.c:233)<br>
> ==42048== by 0x1011C49B7: SNESSolve (snes.c:3906)<br>
> ==42048== by 0x100001B3C: main (in ./ex19)<br>
> ==42048== Uninitialised value was created by a heap allocation<br>
> ==42048== at 0x1000183B1: malloc (vg_replace_malloc.c:303)<br>
> ==42048== by 0x1020EB90C: gk_malloc (memory.c:147)<br>
> ==42048== by 0x10211C50B: libmetis__imalloc (gklib.c:24)<br>
> ==42048== by 0x1020A8566: libparmetis__CompactGraph (ometis.c:519)<br>
> ==42048== by 0x1020A77BB: libparmetis__MultilevelOrder (ometis.c:225)<br>
> ==42048== by 0x1020A7493: ParMETIS_V32_NodeND (ometis.c:151)<br>
> ==42048== by 0x1020A6AFB: ParMETIS_V3_NodeND (ometis.c:34)<br>
> ==42048== by 0x101557CFC: get_perm_c_parmetis (get_perm_c_parmetis.c:241)<br>
> ==42048== by 0x101501192: pdgssvx (pdgssvx.c:934)<br>
> ==42048== by 0x1009CFE7A: MatLUFactorNumeric_SuperLU_DIST (superlu_dist.c:414)<br>
> ==42048== by 0x10046CC5C: MatLUFactorNumeric (matrix.c:2946)<br>
> ==42048== by 0x100F09F2C: PCSetUp_LU (lu.c:152)<br>
> ==42048== by 0x100FF9036: PCSetUp (precon.c:982)<br>
> ==42048== by 0x1010F54EB: KSPSetUp (itfunc.c:332)<br>
> ==42048== by 0x1010F7985: KSPSolve (itfunc.c:546)<br>
> ==42048== by 0x10125541E: SNESSolve_NEWTONLS (ls.c:233)<br>
> ==42048== by 0x1011C49B7: SNESSolve (snes.c:3906)<br>
> ==42048== by 0x100001B3C: main (in ./ex19)<br>
> ==42048==<br>
> ==42048== Syscall param write(buf) points to uninitialised byte(s)<br>
> ==42048== at 0x102DA1C22: write (in /usr/lib/system/libsystem_kernel.dylib)<br>
> ==42048== by 0x10295F5BD: MPIDU_Sock_write (sock_immed.i:525)<br>
> ==42048== by 0x102944839: MPIDI_CH3_iStartMsg (ch3_istartmsg.c:86)<br>
> ==42048== by 0x102933B80: MPIDI_CH3_EagerContigShortSend (ch3u_eager.c:257)<br>
> ==42048== by 0x10293ADBA: MPID_Send (mpid_send.c:130)<br>
> ==42048== by 0x10277A1FA: MPI_Send (send.c:127)<br>
> ==42048== by 0x10155802F: get_perm_c_parmetis (get_perm_c_parmetis.c:299)<br>
> ==42048== by 0x101501192: pdgssvx (pdgssvx.c:934)<br>
> ==42048== by 0x1009CFE7A: MatLUFactorNumeric_SuperLU_DIST (superlu_dist.c:414)<br>
> ==42048== by 0x10046CC5C: MatLUFactorNumeric (matrix.c:2946)<br>
> ==42048== by 0x100F09F2C: PCSetUp_LU (lu.c:152)<br>
> ==42048== by 0x100FF9036: PCSetUp (precon.c:982)<br>
> ==42048== by 0x1010F54EB: KSPSetUp (itfunc.c:332)<br>
> ==42048== by 0x1010F7985: KSPSolve (itfunc.c:546)<br>
> ==42048== by 0x10125541E: SNESSolve_NEWTONLS (ls.c:233)<br>
> ==42048== by 0x1011C49B7: SNESSolve (snes.c:3906)<br>
> ==42048== by 0x100001B3C: main (in ./ex19)<br>
> ==42048== Address 0x104810704 is on thread 1's stack<br>
> ==42048== in frame #3, created by MPIDI_CH3_EagerContigShortSend (ch3u_eager.c:218)<br>
> ==42048== Uninitialised value was created by a heap allocation<br>
> ==42048== at 0x1000183B1: malloc (vg_replace_malloc.c:303)<br>
> ==42048== by 0x10153B704: superlu_malloc_dist (memory.c:108)<br>
> ==42048== by 0x101557AB9: get_perm_c_parmetis (get_perm_c_parmetis.c:185)<br>
> ==42048== by 0x101501192: pdgssvx (pdgssvx.c:934)<br>
> ==42048== by 0x1009CFE7A: MatLUFactorNumeric_SuperLU_DIST (superlu_dist.c:414)<br>
> ==42048== by 0x10046CC5C: MatLUFactorNumeric (matrix.c:2946)<br>
> ==42048== by 0x100F09F2C: PCSetUp_LU (lu.c:152)<br>
> ==42048== by 0x100FF9036: PCSetUp (precon.c:982)<br>
> ==42048== by 0x1010F54EB: KSPSetUp (itfunc.c:332)<br>
> ==42048== by 0x1010F7985: KSPSolve (itfunc.c:546)<br>
> ==42048== by 0x10125541E: SNESSolve_NEWTONLS (ls.c:233)<br>
> ==42048== by 0x1011C49B7: SNESSolve (snes.c:3906)<br>
> ==42048== by 0x100001B3C: main (in ./ex19)<br>
> ==42048==<br>
> ==42050== Conditional jump or move depends on uninitialised value(s)<br>
> ==42050== at 0x102744CB8: MPI_Alltoallv (alltoallv.c:480)<br>
> ==42050== by 0x101510B3E: dist_symbLU (pdsymbfact_distdata.c:539)<br>
> ==42050== by 0x10150A5C6: ddist_psymbtonum (pdsymbfact_distdata.c:1275)<br>
> ==42050== by 0x1015018C2: pdgssvx (pdgssvx.c:1057)<br>
> ==42050== by 0x1009CFE7A: MatLUFactorNumeric_SuperLU_DIST (superlu_dist.c:414)<br>
> ==42050== by 0x10046CC5C: MatLUFactorNumeric (matrix.c:2946)<br>
> ==42050== by 0x100F09F2C: PCSetUp_LU (lu.c:152)<br>
> ==42050== by 0x100FF9036: PCSetUp (precon.c:982)<br>
> ==42050== by 0x1010F54EB: KSPSetUp (itfunc.c:332)<br>
> ==42050== by 0x1010F7985: KSPSolve (itfunc.c:546)<br>
> ==42050== by 0x10125541E: SNESSolve_NEWTONLS (ls.c:233)<br>
> ==42050== by 0x1011C49B7: SNESSolve (snes.c:3906)<br>
> ==42050== by 0x100001B3C: main (in ./ex19)<br>
> ==42050== Uninitialised value was created by a stack allocation<br>
> ==42050== at 0x10150E4C4: dist_symbLU (pdsymbfact_distdata.c:96)<br>
> ==42050==<br>
> ==42050== Conditional jump or move depends on uninitialised value(s)<br>
> ==42050== at 0x102744E43: MPI_Alltoallv (alltoallv.c:490)<br>
> ==42050== by 0x101510B3E: dist_symbLU (pdsymbfact_distdata.c:539)<br>
> ==42050== by 0x10150A5C6: ddist_psymbtonum (pdsymbfact_distdata.c:1275)<br>
> ==42050== by 0x1015018C2: pdgssvx (pdgssvx.c:1057)<br>
> ==42050== by 0x1009CFE7A: MatLUFactorNumeric_SuperLU_DIST (superlu_dist.c:414)<br>
> ==42050== by 0x10046CC5C: MatLUFactorNumeric (matrix.c:2946)<br>
> ==42050== by 0x100F09F2C: PCSetUp_LU (lu.c:152)<br>
> ==42050== by 0x100FF9036: PCSetUp (precon.c:982)<br>
> ==42050== by 0x1010F54EB: KSPSetUp (itfunc.c:332)<br>
> ==42050== by 0x1010F7985: KSPSolve (itfunc.c:546)<br>
> ==42050== by 0x10125541E: SNESSolve_NEWTONLS (ls.c:233)<br>
> ==42050== by 0x1011C49B7: SNESSolve (snes.c:3906)<br>
> ==42050== by 0x100001B3C: main (in ./ex19)<br>
> ==42050== Uninitialised value was created by a stack allocation<br>
> ==42050== at 0x10150E4C4: dist_symbLU (pdsymbfact_distdata.c:96)<br>
> ==42050==<br>
> ==42050== Conditional jump or move depends on uninitialised value(s)<br>
> ==42050== at 0x102744EBF: MPI_Alltoallv (alltoallv.c:497)<br>
> ==42050== by 0x101510B3E: dist_symbLU (pdsymbfact_distdata.c:539)<br>
> ==42050== by 0x10150A5C6: ddist_psymbtonum (pdsymbfact_distdata.c:1275)<br>
> ==42050== by 0x1015018C2: pdgssvx (pdgssvx.c:1057)<br>
> ==42050== by 0x1009CFE7A: MatLUFactorNumeric_SuperLU_DIST (superlu_dist.c:414)<br>
> ==42050== by 0x10046CC5C: MatLUFactorNumeric (matrix.c:2946)<br>
> ==42050== by 0x100F09F2C: PCSetUp_LU (lu.c:152)<br>
> ==42050== by 0x100FF9036: PCSetUp (precon.c:982)<br>
> ==42050== by 0x1010F54EB: KSPSetUp (itfunc.c:332)<br>
> ==42050== by 0x1010F7985: KSPSolve (itfunc.c:546)<br>
> ==42050== by 0x10125541E: SNESSolve_NEWTONLS (ls.c:233)<br>
> ==42050== by 0x1011C49B7: SNESSolve (snes.c:3906)<br>
> ==42050== by 0x100001B3C: main (in ./ex19)<br>
> ==42050== Uninitialised value was created by a stack allocation<br>
> ==42050== at 0x10150E4C4: dist_symbLU (pdsymbfact_distdata.c:96)<br>
> ==42050==<br>
> ==42050== Conditional jump or move depends on uninitialised value(s)<br>
> ==42050== at 0x1027450B1: MPI_Alltoallv (alltoallv.c:512)<br>
> ==42050== by 0x101510B3E: dist_symbLU (pdsymbfact_distdata.c:539)<br>
> ==42050== by 0x10150A5C6: ddist_psymbtonum (pdsymbfact_distdata.c:1275)<br>
> ==42050== by 0x1015018C2: pdgssvx (pdgssvx.c:1057)<br>
> ==42050== by 0x1009CFE7A: MatLUFactorNumeric_SuperLU_DIST (superlu_dist.c:414)<br>
> ==42050== by 0x10046CC5C: MatLUFactorNumeric (matrix.c:2946)<br>
> ==42050== by 0x100F09F2C: PCSetUp_LU (lu.c:152)<br>
> ==42050== by 0x100FF9036: PCSetUp (precon.c:982)<br>
> ==42050== by 0x1010F54EB: KSPSetUp (itfunc.c:332)<br>
> ==42050== by 0x1010F7985: KSPSolve (itfunc.c:546)<br>
> ==42050== by 0x10125541E: SNESSolve_NEWTONLS (ls.c:233)<br>
> ==42050== by 0x1011C49B7: SNESSolve (snes.c:3906)<br>
> ==42050== by 0x100001B3C: main (in ./ex19)<br>
> ==42050== Uninitialised value was created by a stack allocation<br>
> ==42050== at 0x10150E4C4: dist_symbLU (pdsymbfact_distdata.c:96)<br>
> ==42050==<br>
> ==42050== Conditional jump or move depends on uninitialised value(s)<br>
> ==42050== at 0x10283FB06: MPIR_Alltoallv_intra (alltoallv.c:92)<br>
> ==42050== by 0x1028407B6: MPIR_Alltoallv (alltoallv.c:343)<br>
> ==42050== by 0x102840884: MPIR_Alltoallv_impl (alltoallv.c:380)<br>
> ==42050== by 0x10274541B: MPI_Alltoallv (alltoallv.c:531)<br>
> ==42050== by 0x101510B3E: dist_symbLU (pdsymbfact_distdata.c:539)<br>
> ==42050== by 0x10150A5C6: ddist_psymbtonum (pdsymbfact_distdata.c:1275)<br>
> ==42050== by 0x1015018C2: pdgssvx (pdgssvx.c:1057)<br>
> ==42050== by 0x1009CFE7A: MatLUFactorNumeric_SuperLU_DIST (superlu_dist.c:414)<br>
> ==42050== by 0x10046CC5C: MatLUFactorNumeric (matrix.c:2946)<br>
> ==42050== by 0x100F09F2C: PCSetUp_LU (lu.c:152)<br>
> ==42050== by 0x100FF9036: PCSetUp (precon.c:982)<br>
> ==42050== by 0x1010F54EB: KSPSetUp (itfunc.c:332)<br>
> ==42050== by 0x1010F7985: KSPSolve (itfunc.c:546)<br>
> ==42050== by 0x10125541E: SNESSolve_NEWTONLS (ls.c:233)<br>
> ==42050== by 0x1011C49B7: SNESSolve (snes.c:3906)<br>
> ==42050== by 0x100001B3C: main (in ./ex19)<br>
> ==42050== Uninitialised value was created by a stack allocation<br>
> ==42050== at 0x10150E4C4: dist_symbLU (pdsymbfact_distdata.c:96)<br>
> ==42050==<br>
> ==42050== Syscall param writev(vector[...]) points to uninitialised byte(s)<br>
> ==42050== at 0x102DA1C3A: writev (in /usr/lib/system/libsystem_kernel.dylib)<br>
> ==42050== by 0x10296A0DC: MPL_large_writev (mplsock.c:32)<br>
> ==42050== by 0x10295F6AD: MPIDU_Sock_writev (sock_immed.i:610)<br>
> ==42050== by 0x102943FCA: MPIDI_CH3_iSendv (ch3_isendv.c:84)<br>
> ==42050== by 0x102934361: MPIDI_CH3_EagerContigIsend (ch3u_eager.c:556)<br>
> ==42050== by 0x102939531: MPID_Isend (mpid_isend.c:138)<br>
> ==42050== by 0x10277656E: MPI_Isend (isend.c:125)<br>
> ==42050== by 0x101524C41: pdgstrf2_trsm (pdgstrf2.c:201)<br>
> ==42050== by 0x10151ECBF: pdgstrf (pdgstrf.c:1082)<br>
> ==42050== by 0x1015019A5: pdgssvx (pdgssvx.c:1069)<br>
> ==42050== by 0x1009CFE7A: MatLUFactorNumeric_SuperLU_DIST (superlu_dist.c:414)<br>
> ==42050== by 0x10046CC5C: MatLUFactorNumeric (matrix.c:2946)<br>
> ==42050== by 0x100F09F2C: PCSetUp_LU (lu.c:152)<br>
> ==42050== by 0x100FF9036: PCSetUp (precon.c:982)<br>
> ==42050== by 0x1010F54EB: KSPSetUp (itfunc.c:332)<br>
> ==42050== by 0x1010F7985: KSPSolve (itfunc.c:546)<br>
> ==42050== by 0x10125541E: SNESSolve_NEWTONLS (ls.c:233)<br>
> ==42050== by 0x1011C49B7: SNESSolve (snes.c:3906)<br>
> ==42050== by 0x100001B3C: main (in ./ex19)<br>
> ==42050== Address 0x1060144d0 is 1,168 bytes inside a block of size 131,072 alloc'd<br>
> ==42050== at 0x1000183B1: malloc (vg_replace_malloc.c:303)<br>
> ==42050== by 0x10153B704: superlu_malloc_dist (memory.c:108)<br>
> ==42050== by 0x1014FD7AD: doubleMalloc_dist (dmemory.c:145)<br>
> ==42050== by 0x10151DA7D: pdgstrf (pdgstrf.c:735)<br>
> ==42050== by 0x1015019A5: pdgssvx (pdgssvx.c:1069)<br>
> ==42050== by 0x1009CFE7A: MatLUFactorNumeric_SuperLU_DIST (superlu_dist.c:414)<br>
> ==42050== by 0x10046CC5C: MatLUFactorNumeric (matrix.c:2946)<br>
> ==42050== by 0x100F09F2C: PCSetUp_LU (lu.c:152)<br>
> ==42050== by 0x100FF9036: PCSetUp (precon.c:982)<br>
> ==42050== by 0x1010F54EB: KSPSetUp (itfunc.c:332)<br>
> ==42050== by 0x1010F7985: KSPSolve (itfunc.c:546)<br>
> ==42050== by 0x10125541E: SNESSolve_NEWTONLS (ls.c:233)<br>
> ==42050== by 0x1011C49B7: SNESSolve (snes.c:3906)<br>
> ==42050== by 0x100001B3C: main (in ./ex19)<br>
> ==42050== Uninitialised value was created by a heap allocation<br>
> ==42050== at 0x1000183B1: malloc (vg_replace_malloc.c:303)<br>
> ==42050== by 0x10153B704: superlu_malloc_dist (memory.c:108)<br>
> ==42050== by 0x1014FD7AD: doubleMalloc_dist (dmemory.c:145)<br>
> ==42050== by 0x10151DA7D: pdgstrf (pdgstrf.c:735)<br>
> ==42050== by 0x1015019A5: pdgssvx (pdgssvx.c:1069)<br>
> ==42050== by 0x1009CFE7A: MatLUFactorNumeric_SuperLU_DIST (superlu_dist.c:414)<br>
> ==42050== by 0x10046CC5C: MatLUFactorNumeric (matrix.c:2946)<br>
> ==42050== by 0x100F09F2C: PCSetUp_LU (lu.c:152)<br>
> ==42050== by 0x100FF9036: PCSetUp (precon.c:982)<br>
> ==42050== by 0x1010F54EB: KSPSetUp (itfunc.c:332)<br>
> ==42050== by 0x1010F7985: KSPSolve (itfunc.c:546)<br>
> ==42050== by 0x10125541E: SNESSolve_NEWTONLS (ls.c:233)<br>
> ==42050== by 0x1011C49B7: SNESSolve (snes.c:3906)<br>
> ==42050== by 0x100001B3C: main (in ./ex19)<br>
> ==42050==<br>
> ==42048== Conditional jump or move depends on uninitialised value(s)<br>
> ==42048== at 0x10151F141: pdgstrf (pdgstrf.c:1139)<br>
> ==42048== by 0x1015019A5: pdgssvx (pdgssvx.c:1069)<br>
> ==42048== by 0x1009CFE7A: MatLUFactorNumeric_SuperLU_DIST (superlu_dist.c:414)<br>
> ==42048== by 0x10046CC5C: MatLUFactorNumeric (matrix.c:2946)<br>
> ==42048== by 0x100F09F2C: PCSetUp_LU (lu.c:152)<br>
> ==42048== by 0x100FF9036: PCSetUp (precon.c:982)<br>
> ==42048== by 0x1010F54EB: KSPSetUp (itfunc.c:332)<br>
> ==42048== by 0x1010F7985: KSPSolve (itfunc.c:546)<br>
> ==42048== by 0x10125541E: SNESSolve_NEWTONLS (ls.c:233)<br>
> ==42048== by 0x1011C49B7: SNESSolve (snes.c:3906)<br>
> ==42048== by 0x100001B3C: main (in ./ex19)<br>
> ==42048== Uninitialised value was created by a heap allocation<br>
> ==42048== at 0x1000183B1: malloc (vg_replace_malloc.c:303)<br>
> ==42048== by 0x10153B704: superlu_malloc_dist (memory.c:108)<br>
> ==42048== by 0x10150ABE2: ddist_psymbtonum (pdsymbfact_distdata.c:1332)<br>
> ==42048== by 0x1015018C2: pdgssvx (pdgssvx.c:1057)<br>
> ==42048== by 0x1009CFE7A: MatLUFactorNumeric_SuperLU_DIST (superlu_dist.c:414)<br>
> ==42048== by 0x10046CC5C: MatLUFactorNumeric (matrix.c:2946)<br>
> ==42048== by 0x100F09F2C: PCSetUp_LU (lu.c:152)<br>
> ==42048== by 0x100FF9036: PCSetUp (precon.c:982)<br>
> ==42048== by 0x1010F54EB: KSPSetUp (itfunc.c:332)<br>
> ==42048== by 0x1010F7985: KSPSolve (itfunc.c:546)<br>
> ==42048== by 0x10125541E: SNESSolve_NEWTONLS (ls.c:233)<br>
> ==42048== by 0x1011C49B7: SNESSolve (snes.c:3906)<br>
> ==42048== by 0x100001B3C: main (in ./ex19)<br>
> ==42048==<br>
> ==42049== Conditional jump or move depends on uninitialised value(s)<br>
> ==42049== at 0x10151F141: pdgstrf (pdgstrf.c:1139)<br>
> ==42049== by 0x1015019A5: pdgssvx (pdgssvx.c:1069)<br>
> ==42049== by 0x1009CFE7A: MatLUFactorNumeric_SuperLU_DIST (superlu_dist.c:414)<br>
> ==42049== by 0x10046CC5C: MatLUFactorNumeric (matrix.c:2946)<br>
> ==42049== by 0x100F09F2C: PCSetUp_LU (lu.c:152)<br>
> ==42049== by 0x100FF9036: PCSetUp (precon.c:982)<br>
> ==42049== by 0x1010F54EB: KSPSetUp (itfunc.c:332)<br>
> ==42049== by 0x1010F7985: KSPSolve (itfunc.c:546)<br>
> ==42049== by 0x10125541E: SNESSolve_NEWTONLS (ls.c:233)<br>
> ==42049== by 0x1011C49B7: SNESSolve (snes.c:3906)<br>
> ==42049== by 0x100001B3C: main (in ./ex19)<br>
> ==42049== Uninitialised value was created by a heap allocation<br>
> ==42049== at 0x1000183B1: malloc (vg_replace_malloc.c:303)<br>
> ==42049== by 0x10153B704: superlu_malloc_dist (memory.c:108)<br>
> ==42049== by 0x10150ABE2: ddist_psymbtonum (pdsymbfact_distdata.c:1332)<br>
> ==42049== by 0x1015018C2: pdgssvx (pdgssvx.c:1057)<br>
> ==42049== by 0x1009CFE7A: MatLUFactorNumeric_SuperLU_DIST (superlu_dist.c:414)<br>
> ==42049== by 0x10046CC5C: MatLUFactorNumeric (matrix.c:2946)<br>
> ==42049== by 0x100F09F2C: PCSetUp_LU (lu.c:152)<br>
> ==42049== by 0x100FF9036: PCSetUp (precon.c:982)<br>
> ==42049== by 0x1010F54EB: KSPSetUp (itfunc.c:332)<br>
> ==42049== by 0x1010F7985: KSPSolve (itfunc.c:546)<br>
> ==42049== by 0x10125541E: SNESSolve_NEWTONLS (ls.c:233)<br>
> ==42049== by 0x1011C49B7: SNESSolve (snes.c:3906)<br>
> ==42049== by 0x100001B3C: main (in ./ex19)<br>
> ==42049==<br>
> ==42048== Conditional jump or move depends on uninitialised value(s)<br>
> ==42048== at 0x101520054: pdgstrf (pdgstrf.c:1429)<br>
> ==42048== by 0x1015019A5: pdgssvx (pdgssvx.c:1069)<br>
> ==42048== by 0x1009CFE7A: MatLUFactorNumeric_SuperLU_DIST (superlu_dist.c:414)<br>
> ==42048== by 0x10046CC5C: MatLUFactorNumeric (matrix.c:2946)<br>
> ==42048== by 0x100F09F2C: PCSetUp_LU (lu.c:152)<br>
> ==42048== by 0x100FF9036: PCSetUp (precon.c:982)<br>
> ==42048== by 0x1010F54EB: KSPSetUp (itfunc.c:332)<br>
> ==42048== by 0x1010F7985: KSPSolve (itfunc.c:546)<br>
> ==42049== Conditional jump or move depends on uninitialised value(s)<br>
> ==42048== by 0x10125541E: SNESSolve_NEWTONLS (ls.c:233)<br>
> ==42048== by 0x1011C49B7: SNESSolve (snes.c:3906)<br>
> ==42048== by 0x100001B3C: main (in ./ex19)<br>
> ==42048== Uninitialised value was created by a heap allocation<br>
> ==42049== at 0x101520054: pdgstrf (pdgstrf.c:1429)<br>
> ==42048== at 0x1000183B1: malloc (vg_replace_malloc.c:303)<br>
> ==42048== by 0x10153B704: superlu_malloc_dist (memory.c:108)<br>
> ==42049== by 0x1015019A5: pdgssvx (pdgssvx.c:1069)<br>
> ==42049== by 0x1009CFE7A: MatLUFactorNumeric_SuperLU_DIST (superlu_dist.c:414)<br>
> ==42048== by 0x10150ABE2: ddist_psymbtonum (pdsymbfact_distdata.c:1332)<br>
> ==42048== by 0x1015018C2: pdgssvx (pdgssvx.c:1057)<br>
> ==42048== by 0x1009CFE7A: MatLUFactorNumeric_SuperLU_DIST (superlu_dist.c:414)<br>
> ==42049== by 0x10046CC5C: MatLUFactorNumeric (matrix.c:2946)<br>
> ==42049== by 0x100F09F2C: PCSetUp_LU (lu.c:152)<br>
> ==42048== by 0x10046CC5C: MatLUFactorNumeric (matrix.c:2946)<br>
> ==42048== by 0x100F09F2C: PCSetUp_LU (lu.c:152)<br>
> ==42049== by 0x100FF9036: PCSetUp (precon.c:982)<br>
> ==42049== by 0x1010F54EB: KSPSetUp (itfunc.c:332)<br>
> ==42049== by 0x1010F7985: KSPSolve (itfunc.c:546)<br>
> ==42048== by 0x100FF9036: PCSetUp (precon.c:982)<br>
> ==42048== by 0x1010F54EB: KSPSetUp (itfunc.c:332)<br>
> ==42048== by 0x1010F7985: KSPSolve (itfunc.c:546)<br>
> ==42049== by 0x10125541E: SNESSolve_NEWTONLS (ls.c:233)<br>
> ==42049== by 0x1011C49B7: SNESSolve (snes.c:3906)<br>
> ==42048== by 0x10125541E: SNESSolve_NEWTONLS (ls.c:233)<br>
> ==42048== by 0x1011C49B7: SNESSolve (snes.c:3906)<br>
> ==42049== by 0x100001B3C: main (in ./ex19)<br>
> ==42049== Uninitialised value was created by a heap allocation<br>
> ==42049== at 0x1000183B1: malloc (vg_replace_malloc.c:303)<br>
> ==42048== by 0x100001B3C: main (in ./ex19)<br>
> ==42048==<br>
> ==42049== by 0x10153B704: superlu_malloc_dist (memory.c:108)<br>
> ==42049== by 0x10150ABE2: ddist_psymbtonum (pdsymbfact_distdata.c:1332)<br>
> ==42049== by 0x1015018C2: pdgssvx (pdgssvx.c:1057)<br>
> ==42049== by 0x1009CFE7A: MatLUFactorNumeric_SuperLU_DIST (superlu_dist.c:414)<br>
> ==42049== by 0x10046CC5C: MatLUFactorNumeric (matrix.c:2946)<br>
> ==42049== by 0x100F09F2C: PCSetUp_LU (lu.c:152)<br>
> ==42049== by 0x100FF9036: PCSetUp (precon.c:982)<br>
> ==42049== by 0x1010F54EB: KSPSetUp (itfunc.c:332)<br>
> ==42049== by 0x1010F7985: KSPSolve (itfunc.c:546)<br>
> ==42049== by 0x10125541E: SNESSolve_NEWTONLS (ls.c:233)<br>
> ==42049== by 0x1011C49B7: SNESSolve (snes.c:3906)<br>
> ==42049== by 0x100001B3C: main (in ./ex19)<br>
> ==42049==<br>
> ==42050== Conditional jump or move depends on uninitialised value(s)<br>
> ==42050== at 0x10151FDE6: pdgstrf (pdgstrf.c:1382)<br>
> ==42050== by 0x1015019A5: pdgssvx (pdgssvx.c:1069)<br>
> ==42050== by 0x1009CFE7A: MatLUFactorNumeric_SuperLU_DIST (superlu_dist.c:414)<br>
> ==42050== by 0x10046CC5C: MatLUFactorNumeric (matrix.c:2946)<br>
> ==42050== by 0x100F09F2C: PCSetUp_LU (lu.c:152)<br>
> ==42050== by 0x100FF9036: PCSetUp (precon.c:982)<br>
> ==42050== by 0x1010F54EB: KSPSetUp (itfunc.c:332)<br>
> ==42050== by 0x1010F7985: KSPSolve (itfunc.c:546)<br>
> ==42050== by 0x10125541E: SNESSolve_NEWTONLS (ls.c:233)<br>
> ==42050== by 0x1011C49B7: SNESSolve (snes.c:3906)<br>
> ==42050== by 0x100001B3C: main (in ./ex19)<br>
> ==42050== Uninitialised value was created by a heap allocation<br>
> ==42050== at 0x1000183B1: malloc (vg_replace_malloc.c:303)<br>
> ==42050== by 0x10153B704: superlu_malloc_dist (memory.c:108)<br>
> ==42050== by 0x10150B241: ddist_psymbtonum (pdsymbfact_distdata.c:1389)<br>
> ==42050== by 0x1015018C2: pdgssvx (pdgssvx.c:1057)<br>
> ==42050== by 0x1009CFE7A: MatLUFactorNumeric_SuperLU_DIST (superlu_dist.c:414)<br>
> ==42050== by 0x10046CC5C: MatLUFactorNumeric (matrix.c:2946)<br>
> ==42050== by 0x100F09F2C: PCSetUp_LU (lu.c:152)<br>
> ==42050== by 0x100FF9036: PCSetUp (precon.c:982)<br>
> ==42050== by 0x1010F54EB: KSPSetUp (itfunc.c:332)<br>
> ==42050== by 0x1010F7985: KSPSolve (itfunc.c:546)<br>
> ==42050== by 0x10125541E: SNESSolve_NEWTONLS (ls.c:233)<br>
> ==42050== by 0x1011C49B7: SNESSolve (snes.c:3906)<br>
> ==42050== by 0x100001B3C: main (in ./ex19)<br>
> ==42050==<br>
><br>
><br>
> > On Jul 20, 2015, at 12:03 PM, <a href="mailto:Mahir.Ulker-Kaustell@tyrens.se">Mahir.Ulker-Kaustell@tyrens.se</a> wrote:<br>
> ><br>
> > Ok. So I have been creating the full factorization on each process. That gives me some hope!<br>
> ><br>
> > I followed your suggestion and tried to use the runtime option ‘-mat_superlu_dist_parsymbfact’.<br>
> > However, now the program crashes with:<br>
> ><br>
> > Invalid ISPEC at line 484 in file get_perm_c.c<br>
> ><br>
> > And so on…<br>
> ><br>
> > From the SuperLU manual, I should give the option either YES or NO; however, -mat_superlu_dist_parsymbfact YES makes the program crash in the same way as above.<br>
> > Also, I can’t find any reference to -mat_superlu_dist_parsymbfact in the PETSc documentation.<br>
> ><br>
> > Mahir<br>
> ><br>
> > Mahir Ülker-Kaustell, Competence Coordinator, Bridge Designer, Tekn. Dr, Tyréns AB<br>
> > 010 452 30 82, <a href="mailto:Mahir.Ulker-Kaustell@tyrens.se">Mahir.Ulker-Kaustell@tyrens.se</a><br>
> ><br>
> > From: Xiaoye S. Li [mailto:<a href="mailto:xsli@lbl.gov">xsli@lbl.gov</a>]<br>
> > Sent: 20 July 2015 18:12<br>
> > To: Ülker-Kaustell, Mahir<br>
> > Cc: Hong; petsc-users<br>
> > Subject: Re: [petsc-users] SuperLU MPI-problem<br>
> ><br>
> > The default SuperLU_DIST setting is serial symbolic factorization. Therefore, what matters is how much memory you have per MPI task.<br>
> ><br>
> > The code failed to malloc memory during the redistribution of matrix A to the {L\U} data structure (using the result of the serial symbolic factorization).<br>
> ><br>
> > You can use parallel symbolic factorization via the runtime option '-mat_superlu_dist_parsymbfact'.<br>
> ><br>
> > Sherry Li<br>
> ><br>
> ><br>
> > On Mon, Jul 20, 2015 at 8:59 AM, <a href="mailto:Mahir.Ulker-Kaustell@tyrens.se">Mahir.Ulker-Kaustell@tyrens.se</a> <<a href="mailto:Mahir.Ulker-Kaustell@tyrens.se">Mahir.Ulker-Kaustell@tyrens.se</a>> wrote:<br>
> > Hong:<br>
> ><br>
> > Previous experience with this equation has shown that it is very difficult to solve iteratively. Hence the use of a direct solver.<br>
> ><br>
> > The large test problem I am trying to solve has slightly less than 10^6 degrees of freedom. The matrices are derived from finite elements, so they are sparse.<br>
> > The machine I am working on has 128 GB of RAM. I have estimated the memory needed at less than 20 GB, so if the solver needs twice or even three times as much, it should still work well. Or have I completely misunderstood something here?<br>
> ><br>
> > Mahir<br>
> ><br>
> ><br>
> ><br>
> > From: Hong [mailto:<a href="mailto:hzhang@mcs.anl.gov">hzhang@mcs.anl.gov</a>]<br>
> > Sent: 20 July 2015 17:39<br>
> > To: Ülker-Kaustell, Mahir<br>
> > Cc: petsc-users<br>
> > Subject: Re: [petsc-users] SuperLU MPI-problem<br>
> ><br>
> > Mahir:<br>
> > Direct solvers consume a large amount of memory. I suggest trying the following:<br>
> ><br>
> > 1. A sparse iterative solver if [-omega^2M + K] is not too ill-conditioned. You may test it using the small matrix.<br>
> ><br>
> > 2. Incrementally increase your matrix sizes. Try different matrix orderings.<br>
> > Do you get the memory crash in the 1st symbolic factorization?<br>
> > In your case, the matrix data structure stays the same when omega changes, so you only need to do one symbolic factorization and can reuse it (see the sketch after this list).<br>
> ><br>
> > 3. Use a machine with more memory.<br>
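> ><br>
> > Regarding point 2, a minimal sketch (hypothetical problem and assembly; your K and M belong where the diagonal refill is) of a frequency sweep that keeps one Mat and one KSP, so the symbolic factorization is done only once; the ksp_view output elsewhere in this thread shows the superlu_dist interface refactoring with SamePattern_SameRowPerm:<br>
> ><br>
> > #include &lt;petscksp.h&gt;<br>
> ><br>
> > int main(int argc, char **argv)<br>
> > {<br>
> >   Mat      A;<br>
> >   Vec      u, F;<br>
> >   KSP      ksp;<br>
> >   PetscInt i, j, n = 100, Istart, Iend;<br>
> ><br>
> >   PetscInitialize(&argc, &argv, NULL, NULL);<br>
> >   MatCreate(PETSC_COMM_WORLD, &A);<br>
> >   MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);<br>
> >   MatSetFromOptions(A);<br>
> >   MatSetUp(A);<br>
> >   MatGetOwnershipRange(A, &Istart, &Iend);<br>
> >   MatCreateVecs(A, &u, &F);<br>
> >   VecSet(F, 1.0);<br>
> ><br>
> >   KSPCreate(PETSC_COMM_WORLD, &ksp);<br>
> >   KSPSetOperators(ksp, A, A);<br>
> >   KSPSetFromOptions(ksp); /* e.g. -pc_type lu -pc_factor_mat_solver_package superlu_dist */<br>
> ><br>
> >   for (j = 0; j &lt; 5; j++) {            /* loop over frequencies omega_j */<br>
> >     PetscScalar omega = 1.0 + j;<br>
> >     for (i = Istart; i &lt; Iend; i++) {  /* stand-in for A = K - omega^2 M: */<br>
> >       PetscScalar v = 2.0 - omega*omega; /* same nonzero pattern every time */<br>
> >       MatSetValues(A, 1, &i, 1, &i, &v, INSERT_VALUES);<br>
> >     }<br>
> >     MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);<br>
> >     MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);<br>
> >     KSPSolve(ksp, F, u); /* symbolic factorization once, numeric redone per omega */<br>
> >   }<br>
> ><br>
> >   KSPDestroy(&ksp); VecDestroy(&u); VecDestroy(&F); MatDestroy(&A);<br>
> >   PetscFinalize();<br>
> >   return 0;<br>
> > }<br>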
> ><br>
> > Hong<br>
> ><br>
> > Dear Petsc-Users,<br>
> ><br>
> > I am trying to use PETSc to solve a set of linear equations arising from Navier's equation (elastodynamics) in the frequency domain.<br>
> > The frequency dependence of the problem requires that the system<br>
> ><br>
> > [-omega^2M + K]u = F<br>
> ><br>
> > where M and K are constant, square, positive definite matrices (mass and stiffness, respectively) be solved for each frequency omega of interest.<br>
> > K is a complex matrix, including material damping.<br>
> ><br>
> > I have written a PETSc program which solves this problem for a small (1000 degrees of freedom) test problem on one or several processors, but it keeps crashing when I try it on my full-scale (on the order of 10^6 degrees of freedom) problem.<br>
> ><br>
> > The program crashes at KSPSetUp(), and from what I can see in the error messages, it appears to consume too much memory.<br>
> ><br>
> > I would guess that similar problems have occurred on this mailing list, so I am hoping that someone can push me in the right direction…<br>
> ><br>
> > Mahir<br>
</div>
</div>
</blockquote>
</div>
<br>
</div>
</div>
</div>
</body>
</html>