From lizs at mail.uc.edu Wed Sep 1 15:54:54 2010 From: lizs at mail.uc.edu (Li, Zhisong (lizs)) Date: Wed, 1 Sep 2010 20:54:54 +0000 Subject: [petsc-users] Does PETSc have any broadcast function? Message-ID: <88D7E3BB7E1960428303E760100374510FA679C4@BL2PRD0103MB060.prod.exchangelabs.com> Hi, Petsc Team, I wonder if Petsc has any function like MPI_Bcast() which can broadcast a value to all processes. If we directly add MPI functions into the Petsc program, the MPI datatype may be incompatible with the Petsc datatype. Does Petsc have any easy way to handle this? Thank you. Zhisong Li -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at 59A2.org Wed Sep 1 16:02:12 2010 From: jed at 59A2.org (Jed Brown) Date: Wed, 01 Sep 2010 23:02:12 +0200 Subject: [petsc-users] Does PETSc have any broadcast function? In-Reply-To: <88D7E3BB7E1960428303E760100374510FA679C4@BL2PRD0103MB060.prod.exchangelabs.com> References: <88D7E3BB7E1960428303E760100374510FA679C4@BL2PRD0103MB060.prod.exchangelabs.com> Message-ID: <8762ypdvor.fsf@59A2.org> On Wed, 1 Sep 2010 20:54:54 +0000, "Li, Zhisong (lizs)" wrote: > Hi, Petsc Team, > > I wonder if Petsc has any function like MPI_Bcast() which can > broadcast a value to all processes. If we directly add MPI functions > into the Petsc program, the MPI datatype may be incompatible with the > Petsc datatype. Does Petsc have any easy way to handle this? PETSc registers the types MPIU_SCALAR, MPIU_REAL, and MPIU_INT so you can use these to send PetscScalar, PetscReal, and PetscInt respectively using any MPI functions. Jed From B.Sanderse at cwi.nl Wed Sep 1 16:34:57 2010 From: B.Sanderse at cwi.nl (Benjamin Sanderse) Date: Wed, 1 Sep 2010 15:34:57 -0600 Subject: [petsc-users] solving singular system Message-ID: <1D7DA60F-2616-41F3-BC17-FD797E9A9AF2@cwi.nl> Hi guys, I am trying to solve a singular matrix that results from the discretization of a Poisson equation with Neumann boundary conditions. In this case the null space consists of a constant vector. According to the manual MatNullSpaceCreate should be used to construct the null space. Since the constant functions are not needed when providing basis vectors, I am wondering what I should put as basis vectors? My code is now: PetscInt zero=0 MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,zero,???,&nsp); KSPSetNullSpace(ksp,nsp); If anybody knows what I should put at the question marks, that would be of great help. Thanks! Ben From jed at 59A2.org Wed Sep 1 16:58:07 2010 From: jed at 59A2.org (Jed Brown) Date: Wed, 01 Sep 2010 23:58:07 +0200 Subject: [petsc-users] solving singular system In-Reply-To: <1D7DA60F-2616-41F3-BC17-FD797E9A9AF2@cwi.nl> References: <1D7DA60F-2616-41F3-BC17-FD797E9A9AF2@cwi.nl> Message-ID: <87zkw1cej4.fsf@59A2.org> On Wed, 1 Sep 2010 15:34:57 -0600, Benjamin Sanderse wrote: > Hi guys, > > I am trying to solve a singular matrix that results from the discretization of a Poisson equation with Neumann boundary conditions. In this case the null space consists of a constant vector. > According to the manual MatNullSpaceCreate should be used to construct the null space. Since the constant functions are not needed when providing basis vectors, I am wondering what I should put as basis vectors? My code is now: > > PetscInt zero=0 > > MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,zero,???,&nsp); You can just pass PETSC_NULL (0), that argument is never looked at because you specify that there are zero vectors. 
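For example, a minimal sketch of the whole sequence could look like this (error checking omitted; ksp is the already-created solver for the singular Poisson operator, and nsp, b, x are assumed names):

   MatNullSpace nsp;
   /* PETSC_TRUE: the null space contains the constant vector, so zero
      extra basis vectors are needed and the array argument is PETSC_NULL */
   MatNullSpaceCreate(PETSC_COMM_WORLD,PETSC_TRUE,0,PETSC_NULL,&nsp);
   KSPSetNullSpace(ksp,nsp);
   /* ... KSPSolve(ksp,b,x); ... */
   MatNullSpaceDestroy(nsp);
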
Jed From B.Sanderse at cwi.nl Thu Sep 2 10:51:45 2010 From: B.Sanderse at cwi.nl (Benjamin Sanderse) Date: Thu, 2 Sep 2010 09:51:45 -0600 Subject: [petsc-users] coupling with Matlab and parallel solution In-Reply-To: <87zkw1cej4.fsf@59A2.org> References: <1D7DA60F-2616-41F3-BC17-FD797E9A9AF2@cwi.nl> <87zkw1cej4.fsf@59A2.org> Message-ID: <96BE91D1-0EFF-4CF1-B1DA-0C5A2D13BFF3@cwi.nl> Hello all, I figured out the coupling with Matlab and I can send back and forth matrices and vectors between Petsc and Matlab. Actually, I send only once a matrix from Matlab to Petsc and then repeatedly send new right hand sides from Matlab->Petsc and the solution vector from Petsc->Matlab. That works great. I know want to see if the matrix that is send from (serial) Matlab to Petsc can be stored as a parallel matrix in Petsc so that subsequent computations with different right hand sides can be performed in parallel by Petsc. Does this simply work by using MatLoad and setting Mattype MPIAIJ? Or is something more fancy required? Thanks, Ben From bsmith at mcs.anl.gov Thu Sep 2 11:09:52 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Thu, 2 Sep 2010 11:09:52 -0500 Subject: [petsc-users] coupling with Matlab and parallel solution In-Reply-To: <96BE91D1-0EFF-4CF1-B1DA-0C5A2D13BFF3@cwi.nl> References: <1D7DA60F-2616-41F3-BC17-FD797E9A9AF2@cwi.nl> <87zkw1cej4.fsf@59A2.org> <96BE91D1-0EFF-4CF1-B1DA-0C5A2D13BFF3@cwi.nl> Message-ID: <6C3C8604-E079-40DC-ADCC-FACB91A277B2@mcs.anl.gov> On Sep 2, 2010, at 10:51 AM, Benjamin Sanderse wrote: > Hello all, > > I figured out the coupling with Matlab and I can send back and forth matrices and vectors between Petsc and Matlab. Actually, I send only once a matrix from Matlab to Petsc and then repeatedly send new right hand sides from Matlab->Petsc and the solution vector from Petsc->Matlab. That works great. > I know want to see if the matrix that is send from (serial) Matlab to Petsc can be stored as a parallel matrix in Petsc so that subsequent computations with different right hand sides can be performed in parallel by Petsc. Does this simply work by using MatLoad and setting Mattype MPIAIJ? Or is something more fancy required? In theory this can be done using the same code as sequential only with parallel vectors VECMPI and matrices. MATMPIAIJ Barry > > Thanks, > > Ben From B.Sanderse at cwi.nl Thu Sep 2 14:07:19 2010 From: B.Sanderse at cwi.nl (Benjamin Sanderse) Date: Thu, 2 Sep 2010 13:07:19 -0600 Subject: [petsc-users] coupling with Matlab and parallel solution In-Reply-To: <6C3C8604-E079-40DC-ADCC-FACB91A277B2@mcs.anl.gov> References: <1D7DA60F-2616-41F3-BC17-FD797E9A9AF2@cwi.nl> <87zkw1cej4.fsf@59A2.org> <96BE91D1-0EFF-4CF1-B1DA-0C5A2D13BFF3@cwi.nl> <6C3C8604-E079-40DC-ADCC-FACB91A277B2@mcs.anl.gov> Message-ID: <070410A8-7B2E-405E-B65D-0CC8521BB17C@cwi.nl> That sounds great, but there is one issue I am encountering. I switched vector types to VECMPI and matrix type to MATMPIAIJ, but when running Matlab I get the following error: Found unrecogonized header 0 in file. If your file contains complex numbers then call PetscBinaryRead() with "complex" as the second argument Error in ==> PetscBinaryRead at 27 if nargin < 2 ??? Output argument "varargout" (and maybe others) not assigned during call to "/ufs/sanderse/Software/petsc-3.1-p4/bin/matlab/PetscBinaryRead.m>PetscBinaryRead". Error in ==> test_petsc_par at 57 x4 = PetscBinaryReady(PS); Could it be that Matlab does not understand the "parallel" vector which is returned by Petsc? 
Currently I have this done with VecView as follows: fd = PETSC_VIEWER_SOCKET_WORLD; ... KSPSolve(ksp,b,x); ... VecView(fd,x); Thanks for the help! Ben Op 2 sep 2010, om 10:09 heeft Barry Smith het volgende geschreven: > > On Sep 2, 2010, at 10:51 AM, Benjamin Sanderse wrote: > >> Hello all, >> >> I figured out the coupling with Matlab and I can send back and forth matrices and vectors between Petsc and Matlab. Actually, I send only once a matrix from Matlab to Petsc and then repeatedly send new right hand sides from Matlab->Petsc and the solution vector from Petsc->Matlab. That works great. >> I know want to see if the matrix that is send from (serial) Matlab to Petsc can be stored as a parallel matrix in Petsc so that subsequent computations with different right hand sides can be performed in parallel by Petsc. Does this simply work by using MatLoad and setting Mattype MPIAIJ? Or is something more fancy required? > > In theory this can be done using the same code as sequential only with parallel vectors VECMPI and matrices. MATMPIAIJ > > Barry > >> >> Thanks, >> >> Ben > From bsmith at mcs.anl.gov Thu Sep 2 14:45:06 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Thu, 2 Sep 2010 14:45:06 -0500 Subject: [petsc-users] coupling with Matlab and parallel solution In-Reply-To: <070410A8-7B2E-405E-B65D-0CC8521BB17C@cwi.nl> References: <1D7DA60F-2616-41F3-BC17-FD797E9A9AF2@cwi.nl> <87zkw1cej4.fsf@59A2.org> <96BE91D1-0EFF-4CF1-B1DA-0C5A2D13BFF3@cwi.nl> <6C3C8604-E079-40DC-ADCC-FACB91A277B2@mcs.anl.gov> <070410A8-7B2E-405E-B65D-0CC8521BB17C@cwi.nl> Message-ID: Matlab is never aware the vector is parallel. Please send me the code and I'll figure out what is going on. Barry On Sep 2, 2010, at 2:07 PM, Benjamin Sanderse wrote: > That sounds great, but there is one issue I am encountering. I switched vector types to VECMPI and matrix type to MATMPIAIJ, but when running Matlab I get the following error: > > Found unrecogonized header 0 in file. If your file contains complex numbers > then call PetscBinaryRead() with "complex" as the second argument > Error in ==> PetscBinaryRead at 27 > if nargin < 2 > > ??? Output argument "varargout" (and maybe others) not assigned during call to > "/ufs/sanderse/Software/petsc-3.1-p4/bin/matlab/PetscBinaryRead.m>PetscBinaryRead". > > Error in ==> test_petsc_par at 57 > x4 = PetscBinaryReady(PS); > > Could it be that Matlab does not understand the "parallel" vector which is returned by Petsc? Currently I have this done with VecView as follows: > > fd = PETSC_VIEWER_SOCKET_WORLD; > ... > KSPSolve(ksp,b,x); > ... > VecView(fd,x); > > Thanks for the help! > > Ben > > Op 2 sep 2010, om 10:09 heeft Barry Smith het volgende geschreven: > >> >> On Sep 2, 2010, at 10:51 AM, Benjamin Sanderse wrote: >> >>> Hello all, >>> >>> I figured out the coupling with Matlab and I can send back and forth matrices and vectors between Petsc and Matlab. Actually, I send only once a matrix from Matlab to Petsc and then repeatedly send new right hand sides from Matlab->Petsc and the solution vector from Petsc->Matlab. That works great. >>> I know want to see if the matrix that is send from (serial) Matlab to Petsc can be stored as a parallel matrix in Petsc so that subsequent computations with different right hand sides can be performed in parallel by Petsc. Does this simply work by using MatLoad and setting Mattype MPIAIJ? Or is something more fancy required? >> >> In theory this can be done using the same code as sequential only with parallel vectors VECMPI and matrices. 
MATMPIAIJ >> >> Barry >> >>> >>> Thanks, >>> >>> Ben >> > From sixthseason at gmail.com Fri Sep 3 09:45:35 2010 From: sixthseason at gmail.com (=?GB2312?B?TGl1IExpbiDB9cHW?=) Date: Fri, 3 Sep 2010 22:45:35 +0800 Subject: [petsc-users] how to set the parameters of FGMRES inner iterative Message-ID: HI, every one! How to set the parameters of FGMRES inner iterative? such as the restart number, the preconditioner method ? -- ------------------------------------------------------------------------------- -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Fri Sep 3 09:53:33 2010 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 3 Sep 2010 16:53:33 +0200 Subject: [petsc-users] how to set the parameters of FGMRES inner iterative In-Reply-To: References: Message-ID: On Fri, Sep 3, 2010 at 4:45 PM, Liu Lin ?? wrote: > HI, every one! > How to set the parameters of FGMRES inner iterative? such as the restart > number, the preconditioner method ? 1) There is no inner iterative method in FGMRES 2) http://www.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-current/docs/manualpages/KSP/KSPFGMRES.html 3) -pc_type Matt > > -- > > ------------------------------------------------------------------------------- > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From gdiso at ustc.edu Fri Sep 3 09:55:15 2010 From: gdiso at ustc.edu (Gong Ding) Date: Fri, 3 Sep 2010 22:55:15 +0800 Subject: [petsc-users] BoomerAMG Howto Message-ID: <674C26569A314F619BC897C21BBCC2CD@cogendaeda> Dear all, I am trying to use Boomer AMG of hypre + GMRES for my nonlinear problem, which is solved by MUMPS or GMRES +ILU previously. However AMG does not convergence. The -ksp_monitor shows that there are huge residual during inner GMRES iteration, as large as 1e30. I only change PC from ILU to boomeramg. Can anyone tell me what's wrong with it? Gong Ding From B.Sanderse at cwi.nl Fri Sep 3 10:08:29 2010 From: B.Sanderse at cwi.nl (Benjamin Sanderse) Date: Fri, 3 Sep 2010 09:08:29 -0600 Subject: [petsc-users] coupling with Matlab and parallel solution In-Reply-To: References: <1D7DA60F-2616-41F3-BC17-FD797E9A9AF2@cwi.nl> <87zkw1cej4.fsf@59A2.org> <96BE91D1-0EFF-4CF1-B1DA-0C5A2D13BFF3@cwi.nl> <6C3C8604-E079-40DC-ADCC-FACB91A277B2@mcs.anl.gov> <070410A8-7B2E-405E-B65D-0CC8521BB17C@cwi.nl> Message-ID: Hi Barry, Could you figure out something with the codes I sent you? Thanks, Ben Op 2 sep 2010, om 13:45 heeft Barry Smith het volgende geschreven: > > > Matlab is never aware the vector is parallel. Please send me the code and I'll figure out what is going on. > > Barry > > On Sep 2, 2010, at 2:07 PM, Benjamin Sanderse wrote: > >> That sounds great, but there is one issue I am encountering. I switched vector types to VECMPI and matrix type to MATMPIAIJ, but when running Matlab I get the following error: >> >> Found unrecogonized header 0 in file. If your file contains complex numbers >> then call PetscBinaryRead() with "complex" as the second argument >> Error in ==> PetscBinaryRead at 27 >> if nargin < 2 >> >> ??? Output argument "varargout" (and maybe others) not assigned during call to >> "/ufs/sanderse/Software/petsc-3.1-p4/bin/matlab/PetscBinaryRead.m>PetscBinaryRead". 
>> >> Error in ==> test_petsc_par at 57 >> x4 = PetscBinaryReady(PS); >> >> Could it be that Matlab does not understand the "parallel" vector which is returned by Petsc? Currently I have this done with VecView as follows: >> >> fd = PETSC_VIEWER_SOCKET_WORLD; >> ... >> KSPSolve(ksp,b,x); >> ... >> VecView(fd,x); >> >> Thanks for the help! >> >> Ben >> >> Op 2 sep 2010, om 10:09 heeft Barry Smith het volgende geschreven: >> >>> >>> On Sep 2, 2010, at 10:51 AM, Benjamin Sanderse wrote: >>> >>>> Hello all, >>>> >>>> I figured out the coupling with Matlab and I can send back and forth matrices and vectors between Petsc and Matlab. Actually, I send only once a matrix from Matlab to Petsc and then repeatedly send new right hand sides from Matlab->Petsc and the solution vector from Petsc->Matlab. That works great. >>>> I know want to see if the matrix that is send from (serial) Matlab to Petsc can be stored as a parallel matrix in Petsc so that subsequent computations with different right hand sides can be performed in parallel by Petsc. Does this simply work by using MatLoad and setting Mattype MPIAIJ? Or is something more fancy required? >>> >>> In theory this can be done using the same code as sequential only with parallel vectors VECMPI and matrices. MATMPIAIJ >>> >>> Barry >>> >>>> >>>> Thanks, >>>> >>>> Ben >>> >> > From bsmith at mcs.anl.gov Fri Sep 3 10:19:37 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Fri, 3 Sep 2010 10:19:37 -0500 Subject: [petsc-users] how to set the parameters of FGMRES inner iterative In-Reply-To: References: Message-ID: If you want to use, for example, GMRES as the preconditioner in FGMRES then you use -pc_type ksp -ksp_ksp_type gmres -ksp_ksp_max_it 10 -ksp_pc_type ilu etc The best way to find options is to run with -help The reason for the "extra" ksp in front of the last four options is that it is setting the options for the inner ksp Note that this is all automatically recursive to any level of imbedded solvers. I will this example to the KSPFGMRES manual page. Barry On Sep 3, 2010, at 9:53 AM, Matthew Knepley wrote: > On Fri, Sep 3, 2010 at 4:45 PM, Liu Lin ?? wrote: > HI, every one! > How to set the parameters of FGMRES inner iterative? such as the restart number, the preconditioner method ? > > 1) There is no inner iterative method in FGMRES > > 2) http://www.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-current/docs/manualpages/KSP/KSPFGMRES.html > > 3) -pc_type > > Matt > > > -- > ------------------------------------------------------------------------------- > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Fri Sep 3 10:22:25 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Fri, 3 Sep 2010 10:22:25 -0500 Subject: [petsc-users] BoomerAMG Howto In-Reply-To: <674C26569A314F619BC897C21BBCC2CD@cogendaeda> References: <674C26569A314F619BC897C21BBCC2CD@cogendaeda> Message-ID: <67B8F263-30B2-4043-94FE-362C35E611EE@mcs.anl.gov> BoomerAMG can not handle the linear system you are giving it. Each algebraic multigrid solver can only handle a certain class of problems. Barry On Sep 3, 2010, at 9:55 AM, Gong Ding wrote: > Dear all, > I am trying to use Boomer AMG of hypre + GMRES for my nonlinear problem, which is solved by MUMPS or GMRES +ILU previously. > However AMG does not convergence. 
> > The -ksp_monitor shows that there are huge residual during inner GMRES iteration, as large as 1e30. > I only change PC from ILU to boomeramg. > Can anyone tell me what's wrong with it? > > Gong Ding From bsmith at mcs.anl.gov Fri Sep 3 10:25:18 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Fri, 3 Sep 2010 10:25:18 -0500 Subject: [petsc-users] coupling with Matlab and parallel solution In-Reply-To: References: <1D7DA60F-2616-41F3-BC17-FD797E9A9AF2@cwi.nl> <87zkw1cej4.fsf@59A2.org> <96BE91D1-0EFF-4CF1-B1DA-0C5A2D13BFF3@cwi.nl> <6C3C8604-E079-40DC-ADCC-FACB91A277B2@mcs.anl.gov> <070410A8-7B2E-405E-B65D-0CC8521BB17C@cwi.nl> Message-ID: <553525B0-C02E-4874-AE1E-5786037A35FA@mcs.anl.gov> On Sep 3, 2010, at 10:08 AM, Benjamin Sanderse wrote: > Hi Barry, > > Could you figure out something with the codes I sent you? > > Thanks, > > Ben I've built my PETSc with Matlab in preparation but have to devote at least 10 minutes a day to my 8 children and needy spouse. Sorry for the joke. If you are lucky I'll have time this afternoon to try it. Barry > > Op 2 sep 2010, om 13:45 heeft Barry Smith het volgende geschreven: > >> >> >> Matlab is never aware the vector is parallel. Please send me the code and I'll figure out what is going on. >> >> Barry >> >> On Sep 2, 2010, at 2:07 PM, Benjamin Sanderse wrote: >> >>> That sounds great, but there is one issue I am encountering. I switched vector types to VECMPI and matrix type to MATMPIAIJ, but when running Matlab I get the following error: >>> >>> Found unrecogonized header 0 in file. If your file contains complex numbers >>> then call PetscBinaryRead() with "complex" as the second argument >>> Error in ==> PetscBinaryRead at 27 >>> if nargin < 2 >>> >>> ??? Output argument "varargout" (and maybe others) not assigned during call to >>> "/ufs/sanderse/Software/petsc-3.1-p4/bin/matlab/PetscBinaryRead.m>PetscBinaryRead". >>> >>> Error in ==> test_petsc_par at 57 >>> x4 = PetscBinaryReady(PS); >>> >>> Could it be that Matlab does not understand the "parallel" vector which is returned by Petsc? Currently I have this done with VecView as follows: >>> >>> fd = PETSC_VIEWER_SOCKET_WORLD; >>> ... >>> KSPSolve(ksp,b,x); >>> ... >>> VecView(fd,x); >>> >>> Thanks for the help! >>> >>> Ben >>> >>> Op 2 sep 2010, om 10:09 heeft Barry Smith het volgende geschreven: >>> >>>> >>>> On Sep 2, 2010, at 10:51 AM, Benjamin Sanderse wrote: >>>> >>>>> Hello all, >>>>> >>>>> I figured out the coupling with Matlab and I can send back and forth matrices and vectors between Petsc and Matlab. Actually, I send only once a matrix from Matlab to Petsc and then repeatedly send new right hand sides from Matlab->Petsc and the solution vector from Petsc->Matlab. That works great. >>>>> I know want to see if the matrix that is send from (serial) Matlab to Petsc can be stored as a parallel matrix in Petsc so that subsequent computations with different right hand sides can be performed in parallel by Petsc. Does this simply work by using MatLoad and setting Mattype MPIAIJ? Or is something more fancy required? >>>> >>>> In theory this can be done using the same code as sequential only with parallel vectors VECMPI and matrices. 
MATMPIAIJ >>>> >>>> Barry >>>> >>>>> >>>>> Thanks, >>>>> >>>>> Ben >>>> >>> >> > From B.Sanderse at cwi.nl Fri Sep 3 12:02:16 2010 From: B.Sanderse at cwi.nl (Benjamin Sanderse) Date: Fri, 3 Sep 2010 11:02:16 -0600 Subject: [petsc-users] coupling with Matlab and parallel solution In-Reply-To: <553525B0-C02E-4874-AE1E-5786037A35FA@mcs.anl.gov> References: <1D7DA60F-2616-41F3-BC17-FD797E9A9AF2@cwi.nl> <87zkw1cej4.fsf@59A2.org> <96BE91D1-0EFF-4CF1-B1DA-0C5A2D13BFF3@cwi.nl> <6C3C8604-E079-40DC-ADCC-FACB91A277B2@mcs.anl.gov> <070410A8-7B2E-405E-B65D-0CC8521BB17C@cwi.nl> <553525B0-C02E-4874-AE1E-5786037A35FA@mcs.anl.gov> Message-ID: Smiles... :) Op 3 sep 2010, om 09:25 heeft Barry Smith het volgende geschreven: > > On Sep 3, 2010, at 10:08 AM, Benjamin Sanderse wrote: > >> Hi Barry, >> >> Could you figure out something with the codes I sent you? >> >> Thanks, >> >> Ben > > I've built my PETSc with Matlab in preparation but have to devote at least 10 minutes a day to my 8 children and needy spouse. Sorry for the joke. If you are lucky I'll have time this afternoon to try it. > > Barry > >> >> Op 2 sep 2010, om 13:45 heeft Barry Smith het volgende geschreven: >> >>> >>> >>> Matlab is never aware the vector is parallel. Please send me the code and I'll figure out what is going on. >>> >>> Barry >>> >>> On Sep 2, 2010, at 2:07 PM, Benjamin Sanderse wrote: >>> >>>> That sounds great, but there is one issue I am encountering. I switched vector types to VECMPI and matrix type to MATMPIAIJ, but when running Matlab I get the following error: >>>> >>>> Found unrecogonized header 0 in file. If your file contains complex numbers >>>> then call PetscBinaryRead() with "complex" as the second argument >>>> Error in ==> PetscBinaryRead at 27 >>>> if nargin < 2 >>>> >>>> ??? Output argument "varargout" (and maybe others) not assigned during call to >>>> "/ufs/sanderse/Software/petsc-3.1-p4/bin/matlab/PetscBinaryRead.m>PetscBinaryRead". >>>> >>>> Error in ==> test_petsc_par at 57 >>>> x4 = PetscBinaryReady(PS); >>>> >>>> Could it be that Matlab does not understand the "parallel" vector which is returned by Petsc? Currently I have this done with VecView as follows: >>>> >>>> fd = PETSC_VIEWER_SOCKET_WORLD; >>>> ... >>>> KSPSolve(ksp,b,x); >>>> ... >>>> VecView(fd,x); >>>> >>>> Thanks for the help! >>>> >>>> Ben >>>> >>>> Op 2 sep 2010, om 10:09 heeft Barry Smith het volgende geschreven: >>>> >>>>> >>>>> On Sep 2, 2010, at 10:51 AM, Benjamin Sanderse wrote: >>>>> >>>>>> Hello all, >>>>>> >>>>>> I figured out the coupling with Matlab and I can send back and forth matrices and vectors between Petsc and Matlab. Actually, I send only once a matrix from Matlab to Petsc and then repeatedly send new right hand sides from Matlab->Petsc and the solution vector from Petsc->Matlab. That works great. >>>>>> I know want to see if the matrix that is send from (serial) Matlab to Petsc can be stored as a parallel matrix in Petsc so that subsequent computations with different right hand sides can be performed in parallel by Petsc. Does this simply work by using MatLoad and setting Mattype MPIAIJ? Or is something more fancy required? >>>>> >>>>> In theory this can be done using the same code as sequential only with parallel vectors VECMPI and matrices. 
MATMPIAIJ >>>>> >>>>> Barry >>>>> >>>>>> >>>>>> Thanks, >>>>>> >>>>>> Ben >>>>> >>>> >>> >> > From bsmith at mcs.anl.gov Fri Sep 3 14:25:56 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Fri, 3 Sep 2010 14:25:56 -0500 Subject: [petsc-users] coupling with Matlab and parallel solution In-Reply-To: <209DE1A7-D893-450F-89BC-DFBE7B2F2C5D@cwi.nl> References: <1D7DA60F-2616-41F3-BC17-FD797E9A9AF2@cwi.nl> <87zkw1cej4.fsf@59A2.org> <96BE91D1-0EFF-4CF1-B1DA-0C5A2D13BFF3@cwi.nl> <6C3C8604-E079-40DC-ADCC-FACB91A277B2@mcs.anl.gov> <070410A8-7B2E-405E-B65D-0CC8521BB17C@cwi.nl> <209DE1A7-D893-450F-89BC-DFBE7B2F2C5D@cwi.nl> Message-ID: <6A83D0A7-1C42-4A91-909F-6B93509C5BF5@mcs.anl.gov> Ben Ok, I figured out the problem. It is not fundamental and mostly comes from not having a create way to debug this. The test vector you create is sequential then you try to view it back to Matlab with the parallel fd viewer. If you change to ierr = VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,1,&test);CHKERRQ(ierr); then the code runs. I've found (just now) that when I use launch all the output from the .c program gets lost which makes it impossible to figure out what has gone wrong. You can debug by running the two parts of the computation in two different windows. So comment out the launch from the matlab script and then in Matlab run the script (it will hang waiting for the socket to work) and in a separate terminal window run the .c program; for example petscmpiexec -n 2 ./ex1 -info Now you see exactly what is happening in the PETSc program. You can even use -start_in_debugger on the PETSc side to run the debugger on crashes. I'll add this to the docs for launch Barry On Sep 2, 2010, at 3:28 PM, Benjamin Sanderse wrote: > Hi Barry, > > I attached my matlab file, c file and makefile. First I generate the executable with 'make petsc_poisson_par_barry' and then I run test_petsc_par_barry.m. > If you change MATMPIAIJ to MATAIJ and VECMPI to VECSEQ the code works fine. > > Thanks a lot, > > Benjamin > > > > Op 2 sep 2010, om 13:45 heeft Barry Smith het volgende geschreven: > >> >> >> Matlab is never aware the vector is parallel. Please send me the code and I'll figure out what is going on. >> >> Barry >> >> On Sep 2, 2010, at 2:07 PM, Benjamin Sanderse wrote: >> >>> That sounds great, but there is one issue I am encountering. I switched vector types to VECMPI and matrix type to MATMPIAIJ, but when running Matlab I get the following error: >>> >>> Found unrecogonized header 0 in file. If your file contains complex numbers >>> then call PetscBinaryRead() with "complex" as the second argument >>> Error in ==> PetscBinaryRead at 27 >>> if nargin < 2 >>> >>> ??? Output argument "varargout" (and maybe others) not assigned during call to >>> "/ufs/sanderse/Software/petsc-3.1-p4/bin/matlab/PetscBinaryRead.m>PetscBinaryRead". >>> >>> Error in ==> test_petsc_par at 57 >>> x4 = PetscBinaryReady(PS); >>> >>> Could it be that Matlab does not understand the "parallel" vector which is returned by Petsc? Currently I have this done with VecView as follows: >>> >>> fd = PETSC_VIEWER_SOCKET_WORLD; >>> ... >>> KSPSolve(ksp,b,x); >>> ... >>> VecView(fd,x); >>> >>> Thanks for the help! >>> >>> Ben >>> >>> Op 2 sep 2010, om 10:09 heeft Barry Smith het volgende geschreven: >>> >>>> >>>> On Sep 2, 2010, at 10:51 AM, Benjamin Sanderse wrote: >>>> >>>>> Hello all, >>>>> >>>>> I figured out the coupling with Matlab and I can send back and forth matrices and vectors between Petsc and Matlab. 
Actually, I send only once a matrix from Matlab to Petsc and then repeatedly send new right hand sides from Matlab->Petsc and the solution vector from Petsc->Matlab. That works great. >>>>> I know want to see if the matrix that is send from (serial) Matlab to Petsc can be stored as a parallel matrix in Petsc so that subsequent computations with different right hand sides can be performed in parallel by Petsc. Does this simply work by using MatLoad and setting Mattype MPIAIJ? Or is something more fancy required? >>>> >>>> In theory this can be done using the same code as sequential only with parallel vectors VECMPI and matrices. MATMPIAIJ >>>> >>>> Barry >>>> >>>>> >>>>> Thanks, >>>>> >>>>> Ben >>>> >>> >> > From B.Sanderse at cwi.nl Fri Sep 3 16:32:00 2010 From: B.Sanderse at cwi.nl (Benjamin Sanderse) Date: Fri, 3 Sep 2010 15:32:00 -0600 Subject: [petsc-users] coupling with Matlab and parallel solution In-Reply-To: <6A83D0A7-1C42-4A91-909F-6B93509C5BF5@mcs.anl.gov> References: <1D7DA60F-2616-41F3-BC17-FD797E9A9AF2@cwi.nl> <87zkw1cej4.fsf@59A2.org> <96BE91D1-0EFF-4CF1-B1DA-0C5A2D13BFF3@cwi.nl> <6C3C8604-E079-40DC-ADCC-FACB91A277B2@mcs.anl.gov> <070410A8-7B2E-405E-B65D-0CC8521BB17C@cwi.nl> <209DE1A7-D893-450F-89BC-DFBE7B2F2C5D@cwi.nl> <6A83D0A7-1C42-4A91-909F-6B93509C5BF5@mcs.anl.gov> Message-ID: <61D23634-FF5A-4916-9055-9561F69CC36A@cwi.nl> Hi Barry, Thanks for your help! However, there are still some issues left. In other to test things, I simplified the program even more and now I am just sending a vector back and forth: matlab->petsc->matlab: fd = PETSC_VIEWER_SOCKET_WORLD; // load rhs vector ierr = VecLoad(fd,VECMPI,&b);CHKERRQ(ierr); // send to matlab ierr = VecView(b,fd);CHKERRQ(ierr); ierr = VecDestroy(b);CHKERRQ(ierr); - Your approach with two windows works *sometimes*. I removed the 'launch' statement and I executed my program 10 times, the first 2 times worked, and in all other cases I got this: petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info [1] PetscInitialize(): PETSc successfully started: number of processors = 2 [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl [0] PetscInitialize(): PETSc successfully started: number of processors = 2 [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 [1] PetscCommDuplicate(): returning tag 2147483647 [0] PetscCommDuplicate(): returning tag 2147483647 [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl [0] PetscOpenSocket(): Connection refused in attaching socket, trying again[0] PetscOpenSocket(): Connection refused in attaching socket, trying again[0] PetscOpenSocket(): Connection refused in attaching socket, trying again [0] PetscOpenSocket(): Connection refused in attaching socket, trying again^C -bash-4.0$ [0]0:Return code = 0, signaled with Interrupt [0]1:Return code = 0, signaled with Interrupt Every time I start the program I use close(socket) and clear all in Matlab, so the socket from the previous run should not be present anymore. It seems that the port gets corrupted after a couple of times? Matlab does not respond and I have to kill it and restart it manually. - If I include the launch statement, or just type system('mpiexec -n 2 ./petsc_poisson_par_barry2 &') the program never works. Hope you can figure out what is going wrong. 
Ben Op 3 sep 2010, om 13:25 heeft Barry Smith het volgende geschreven: > > Ben > > Ok, I figured out the problem. It is not fundamental and mostly comes from not having a create way to debug this. > > The test vector you create is sequential then you try to view it back to Matlab with the parallel fd viewer. If you change to > ierr = VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,1,&test);CHKERRQ(ierr); > then the code runs. > > I've found (just now) that when I use launch all the output from the .c program gets lost which makes it impossible to figure out what has gone wrong. You can debug by running the two parts of the computation in two different windows. So comment out the launch from the matlab script and then in Matlab run the script (it will hang waiting for the socket to work) and in a separate terminal window run the .c program; for example petscmpiexec -n 2 ./ex1 -info Now you see exactly what is happening in the PETSc program. You can even use -start_in_debugger on the PETSc side to run the debugger on crashes. > > I'll add this to the docs for launch > > Barry > > > On Sep 2, 2010, at 3:28 PM, Benjamin Sanderse wrote: > >> Hi Barry, >> >> I attached my matlab file, c file and makefile. First I generate the executable with 'make petsc_poisson_par_barry' and then I run test_petsc_par_barry.m. >> If you change MATMPIAIJ to MATAIJ and VECMPI to VECSEQ the code works fine. >> >> Thanks a lot, >> >> Benjamin >> >> >> >> Op 2 sep 2010, om 13:45 heeft Barry Smith het volgende geschreven: >> >>> >>> >>> Matlab is never aware the vector is parallel. Please send me the code and I'll figure out what is going on. >>> >>> Barry >>> >>> On Sep 2, 2010, at 2:07 PM, Benjamin Sanderse wrote: >>> >>>> That sounds great, but there is one issue I am encountering. I switched vector types to VECMPI and matrix type to MATMPIAIJ, but when running Matlab I get the following error: >>>> >>>> Found unrecogonized header 0 in file. If your file contains complex numbers >>>> then call PetscBinaryRead() with "complex" as the second argument >>>> Error in ==> PetscBinaryRead at 27 >>>> if nargin < 2 >>>> >>>> ??? Output argument "varargout" (and maybe others) not assigned during call to >>>> "/ufs/sanderse/Software/petsc-3.1-p4/bin/matlab/PetscBinaryRead.m>PetscBinaryRead". >>>> >>>> Error in ==> test_petsc_par at 57 >>>> x4 = PetscBinaryReady(PS); >>>> >>>> Could it be that Matlab does not understand the "parallel" vector which is returned by Petsc? Currently I have this done with VecView as follows: >>>> >>>> fd = PETSC_VIEWER_SOCKET_WORLD; >>>> ... >>>> KSPSolve(ksp,b,x); >>>> ... >>>> VecView(fd,x); >>>> >>>> Thanks for the help! >>>> >>>> Ben >>>> >>>> Op 2 sep 2010, om 10:09 heeft Barry Smith het volgende geschreven: >>>> >>>>> >>>>> On Sep 2, 2010, at 10:51 AM, Benjamin Sanderse wrote: >>>>> >>>>>> Hello all, >>>>>> >>>>>> I figured out the coupling with Matlab and I can send back and forth matrices and vectors between Petsc and Matlab. Actually, I send only once a matrix from Matlab to Petsc and then repeatedly send new right hand sides from Matlab->Petsc and the solution vector from Petsc->Matlab. That works great. >>>>>> I know want to see if the matrix that is send from (serial) Matlab to Petsc can be stored as a parallel matrix in Petsc so that subsequent computations with different right hand sides can be performed in parallel by Petsc. Does this simply work by using MatLoad and setting Mattype MPIAIJ? Or is something more fancy required? 
>>>>> >>>>> In theory this can be done using the same code as sequential only with parallel vectors VECMPI and matrices. MATMPIAIJ >>>>> >>>>> Barry >>>>> >>>>>> >>>>>> Thanks, >>>>>> >>>>>> Ben >>>>> >>>> >>> >> > From rlmackie862 at gmail.com Fri Sep 3 17:20:53 2010 From: rlmackie862 at gmail.com (Randall Mackie) Date: Fri, 3 Sep 2010 15:20:53 -0700 Subject: [petsc-users] Trying to compile PETSC in cygwin Message-ID: I'm trying to compile a uniprocessor version of my PETSc code on Windows under cygwin. I just downloaded and installed cygwin plus gcc, g++, and gfortran, and lapack. Whenever I run configure, I get the following error message: $ ./configure PETSC_ARCH=windows-gfortran-serial --with-fortran --with-fortran-kernels=1 --with-scalar-type=co mplex --with-debugging=0 --with-mpi=0 --with-c-language=cxx =============================================================================== Configuring PETSc to compile on your system =============================================================================== =============================================================================== WARNING! Compiling PETSc with no debugging, this should only be done for timing and production runs. All development should be done when configured using --with-debugging=1 =============================================================================== TESTING: configureScalarType from PETSc.utilities.scalarTypes(config/PETSc/utilities/scalarTypes.py:36) ******************************************************************************* UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details): ------------------------------------------------------------------------------- C Compiler provided doest not support C99 complex ******************************************************************************* I've even tried adding the option --CFLAGS='-std=c99', but I get the same error message. Anybody have any suggestions for how to get past this? Thanks, Randy M. -------------- next part -------------- An HTML attachment was scrubbed... URL: From rlmackie862 at gmail.com Fri Sep 3 17:53:14 2010 From: rlmackie862 at gmail.com (Randall Mackie) Date: Fri, 3 Sep 2010 15:53:14 -0700 Subject: [petsc-users] Trying to compile PETSC in cygwin In-Reply-To: References: Message-ID: I figured it out myself - I had the wrong option below, should have been --with-clanguage=cxx although I'm not sure why it didn't work with just gcc. Randy M. On Fri, Sep 3, 2010 at 3:20 PM, Randall Mackie wrote: > I'm trying to compile a uniprocessor version of my PETSc code on Windows > under cygwin. > > I just downloaded and installed cygwin plus gcc, g++, and gfortran, and > lapack. > > Whenever I run configure, I get the following error message: > > $ ./configure PETSC_ARCH=windows-gfortran-serial --with-fortran > --with-fortran-kernels=1 --with-scalar-type=co > mplex --with-debugging=0 --with-mpi=0 --with-c-language=cxx > > =============================================================================== > Configuring PETSc to compile on your > system > > =============================================================================== > =============================================================================== > WARNING! Compiling PETSc with no debugging, this > should only be > done for timing and production runs. 
All development > should be done when configured > using --with-debugging=1 > =============================================================================== > TESTING: configureScalarType from > PETSc.utilities.scalarTypes(config/PETSc/utilities/scalarTypes.py:36) > ******************************************************************************* > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for > details): > > ------------------------------------------------------------------------------- > C Compiler provided doest not support C99 complex > > ******************************************************************************* > > > I've even tried adding the option --CFLAGS='-std=c99', but I get the same > error message. > > Anybody have any suggestions for how to get past this? > > Thanks, > > Randy M. > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Fri Sep 3 22:11:48 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Fri, 3 Sep 2010 22:11:48 -0500 Subject: [petsc-users] coupling with Matlab and parallel solution In-Reply-To: <61D23634-FF5A-4916-9055-9561F69CC36A@cwi.nl> References: <1D7DA60F-2616-41F3-BC17-FD797E9A9AF2@cwi.nl> <87zkw1cej4.fsf@59A2.org> <96BE91D1-0EFF-4CF1-B1DA-0C5A2D13BFF3@cwi.nl> <6C3C8604-E079-40DC-ADCC-FACB91A277B2@mcs.anl.gov> <070410A8-7B2E-405E-B65D-0CC8521BB17C@cwi.nl> <209DE1A7-D893-450F-89BC-DFBE7B2F2C5D@cwi.nl> <6A83D0A7-1C42-4A91-909F-6B93509C5BF5@mcs.anl.gov> <61D23634-FF5A-4916-9055-9561F69CC36A@cwi.nl> Message-ID: <30D58A3F-D8D9-4692-8BE9-BB8168201BD3@mcs.anl.gov> On Sep 3, 2010, at 4:32 PM, Benjamin Sanderse wrote: > Hi Barry, > > Thanks for your help! However, there are still some issues left. In other to test things, I simplified the program even more and now I am just sending a vector back and forth: matlab->petsc->matlab: > > fd = PETSC_VIEWER_SOCKET_WORLD; > > // load rhs vector > ierr = VecLoad(fd,VECMPI,&b);CHKERRQ(ierr); > > // send to matlab > ierr = VecView(b,fd);CHKERRQ(ierr); > ierr = VecDestroy(b);CHKERRQ(ierr); > > > - Your approach with two windows works *sometimes*. 
I removed the 'launch' statement and I executed my program 10 times, the first 2 times worked, and in all other cases I got this: > > petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info > [1] PetscInitialize(): PETSc successfully started: number of processors = 2 > [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl > [0] PetscInitialize(): PETSc successfully started: number of processors = 2 > [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl > [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 > [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 > [1] PetscCommDuplicate(): returning tag 2147483647 > [0] PetscCommDuplicate(): returning tag 2147483647 > [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl > [0] PetscOpenSocket(): Connection refused in attaching socket, trying again[0] PetscOpenSocket(): Connection refused in attaching socket, trying again[0] PetscOpenSocket(): Connection refused in attaching socket, trying again > [0] PetscOpenSocket(): Connection refused in attaching socket, trying again^C > -bash-4.0$ [0]0:Return code = 0, signaled with Interrupt > [0]1:Return code = 0, signaled with Interrupt > > Every time I start the program I use close(socket) and clear all in Matlab, so the socket from the previous run should not be present anymore. It seems that the port gets corrupted after a couple of times? Matlab does not respond and I have to kill it and restart it manually. Sometimes when you close a socket connection it doesn't close for a very long time so that if you try to open it again it doesn't work. When it appears the socket can not be used try using netstat | grep 5005 to see if the socket is still active. > > - If I include the launch statement, or just type > system('mpiexec -n 2 ./petsc_poisson_par_barry2 &') > the program never works. Are you sure mpiexec is in the path of system and it is the right one? The problem is that we are kind of cheating with system because we start a new job in the background and have no idea what the output is. Are you using unix and running Matlab on the command line or in a GUI? Barry > > Hope you can figure out what is going wrong. > > Ben > > > Op 3 sep 2010, om 13:25 heeft Barry Smith het volgende geschreven: > >> >> Ben >> >> Ok, I figured out the problem. It is not fundamental and mostly comes from not having a create way to debug this. >> >> The test vector you create is sequential then you try to view it back to Matlab with the parallel fd viewer. If you change to >> ierr = VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,1,&test);CHKERRQ(ierr); >> then the code runs. >> >> I've found (just now) that when I use launch all the output from the .c program gets lost which makes it impossible to figure out what has gone wrong. You can debug by running the two parts of the computation in two different windows. So comment out the launch from the matlab script and then in Matlab run the script (it will hang waiting for the socket to work) and in a separate terminal window run the .c program; for example petscmpiexec -n 2 ./ex1 -info Now you see exactly what is happening in the PETSc program. You can even use -start_in_debugger on the PETSc side to run the debugger on crashes. >> >> I'll add this to the docs for launch >> >> Barry >> >> >> On Sep 2, 2010, at 3:28 PM, Benjamin Sanderse wrote: >> >>> Hi Barry, >>> >>> I attached my matlab file, c file and makefile. 
First I generate the executable with 'make petsc_poisson_par_barry' and then I run test_petsc_par_barry.m. >>> If you change MATMPIAIJ to MATAIJ and VECMPI to VECSEQ the code works fine. >>> >>> Thanks a lot, >>> >>> Benjamin >>> >>> >>> >>> Op 2 sep 2010, om 13:45 heeft Barry Smith het volgende geschreven: >>> >>>> >>>> >>>> Matlab is never aware the vector is parallel. Please send me the code and I'll figure out what is going on. >>>> >>>> Barry >>>> >>>> On Sep 2, 2010, at 2:07 PM, Benjamin Sanderse wrote: >>>> >>>>> That sounds great, but there is one issue I am encountering. I switched vector types to VECMPI and matrix type to MATMPIAIJ, but when running Matlab I get the following error: >>>>> >>>>> Found unrecogonized header 0 in file. If your file contains complex numbers >>>>> then call PetscBinaryRead() with "complex" as the second argument >>>>> Error in ==> PetscBinaryRead at 27 >>>>> if nargin < 2 >>>>> >>>>> ??? Output argument "varargout" (and maybe others) not assigned during call to >>>>> "/ufs/sanderse/Software/petsc-3.1-p4/bin/matlab/PetscBinaryRead.m>PetscBinaryRead". >>>>> >>>>> Error in ==> test_petsc_par at 57 >>>>> x4 = PetscBinaryReady(PS); >>>>> >>>>> Could it be that Matlab does not understand the "parallel" vector which is returned by Petsc? Currently I have this done with VecView as follows: >>>>> >>>>> fd = PETSC_VIEWER_SOCKET_WORLD; >>>>> ... >>>>> KSPSolve(ksp,b,x); >>>>> ... >>>>> VecView(fd,x); >>>>> >>>>> Thanks for the help! >>>>> >>>>> Ben >>>>> >>>>> Op 2 sep 2010, om 10:09 heeft Barry Smith het volgende geschreven: >>>>> >>>>>> >>>>>> On Sep 2, 2010, at 10:51 AM, Benjamin Sanderse wrote: >>>>>> >>>>>>> Hello all, >>>>>>> >>>>>>> I figured out the coupling with Matlab and I can send back and forth matrices and vectors between Petsc and Matlab. Actually, I send only once a matrix from Matlab to Petsc and then repeatedly send new right hand sides from Matlab->Petsc and the solution vector from Petsc->Matlab. That works great. >>>>>>> I know want to see if the matrix that is send from (serial) Matlab to Petsc can be stored as a parallel matrix in Petsc so that subsequent computations with different right hand sides can be performed in parallel by Petsc. Does this simply work by using MatLoad and setting Mattype MPIAIJ? Or is something more fancy required? >>>>>> >>>>>> In theory this can be done using the same code as sequential only with parallel vectors VECMPI and matrices. MATMPIAIJ >>>>>> >>>>>> Barry >>>>>> >>>>>>> >>>>>>> Thanks, >>>>>>> >>>>>>> Ben >>>>>> >>>>> >>>> >>> >> > From wujinshan at yahoo.com Sat Sep 4 03:26:41 2010 From: wujinshan at yahoo.com (jinshan wu) Date: Sat, 4 Sep 2010 01:26:41 -0700 (PDT) Subject: [petsc-users] looking for examples on solving linear systems of matrix-free matrices Message-ID: <514938.13838.qm@web112616.mail.gq1.yahoo.com> Hi all, I am new to this list. But it seems to have a lot of resources and friendly people here. I need to solve a huge linear system. In order to save some memory, I am thinking to store the matrices in its own data structure other than in real matrices. I understand that petsc can do that via GMRES (and others) without accessing matrix elements . So my question is where can we find some example? I am perfectly fine with supplying the matrix-vector product part by my own. I need more infor on how to connect it to the solver in petsc. Also another technical question: do we have to perform the preconditioner? 
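For reference, a rough sketch of how a user-supplied matrix-vector product can be connected to a KSP through a shell matrix (the names MyCtx and MyMatMult are placeholders, not anything from PETSc; m,n are local and M,N global sizes, and A, ksp, b, x are assumed to be declared in the calling code):

   typedef struct { PetscInt n; /* plus whatever your operator needs */ } MyCtx;

   PetscErrorCode MyMatMult(Mat A,Vec x,Vec y)
   {
     MyCtx *ctx;
     MatShellGetContext(A,(void**)&ctx);
     /* compute y = A*x from your own data structure */
     return 0;
   }

   /* in the calling code */
   MatCreateShell(PETSC_COMM_WORLD,m,n,M,N,&ctx,&A);
   MatShellSetOperation(A,MATOP_MULT,(void(*)(void))MyMatMult);
   KSPCreate(PETSC_COMM_WORLD,&ksp);
   KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN);
   KSPSetFromOptions(ksp);   /* e.g. -ksp_type gmres -pc_type none */
   KSPSolve(ksp,b,x);

A preconditioner is not strictly required (-pc_type none works), but without one a Krylov method will usually converge slowly.
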
thanks, Jinshan From knepley at gmail.com Sat Sep 4 04:27:03 2010 From: knepley at gmail.com (Matthew Knepley) Date: Sat, 4 Sep 2010 11:27:03 +0200 Subject: [petsc-users] looking for examples on solving linear systems of matrix-free matrices In-Reply-To: <514938.13838.qm@web112616.mail.gq1.yahoo.com> References: <514938.13838.qm@web112616.mail.gq1.yahoo.com> Message-ID: On Sat, Sep 4, 2010 at 10:26 AM, jinshan wu wrote: > Hi all, I am new to this list. But it seems to have a lot of resources and > friendly people here. > > I need to solve a huge linear system. In order to save some memory, I am > thinking to store the matrices in its own data structure other than in real > matrices. I understand that petsc can do that via GMRES (and others) > without accessing matrix elements . So my question is where can we find > some > example? I am perfectly fine with supplying the matrix-vector product part > by my > own. I need more infor on how to connect it to the solver in petsc. > You can use http://www.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-current/docs/manualpages/Mat/MatCreateShell.html There are links to examples on that page. > Also another technical question: do we have to perform the preconditioner? > Usually, if you want it to be efficient. Matt > thanks, > Jinshan > > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From sixthseason at gmail.com Sat Sep 4 09:30:04 2010 From: sixthseason at gmail.com (=?GB2312?B?TGl1IExpbiDB9cHW?=) Date: Sat, 4 Sep 2010 22:30:04 +0800 Subject: [petsc-users] how to set the full gmres by the ksp gmres Message-ID: Hi, anywho! how to set the full gmres by the ksp gmres? -- -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at 59A2.org Sat Sep 4 09:40:38 2010 From: jed at 59A2.org (Jed Brown) Date: Sat, 04 Sep 2010 16:40:38 +0200 Subject: [petsc-users] how to set the full gmres by the ksp gmres In-Reply-To: References: Message-ID: <87lj7ha7x5.fsf@59A2.org> On Sat, 4 Sep 2010 22:30:04 +0800, Liu Lin ?? wrote: > Hi, anywho! > > how to set the full gmres by the ksp gmres? Do you mean GMRES without restart? Just set a big restart -ksp_gmres_restart <30>: Number of Krylov search directions (KSPGMRESSetRestart) Also useful if GMRES is having trouble for a nasty problem: -ksp_gmres_classicalgramschmidt: Classical (unmodified) Gram-Schmidt (fast) (KSPGMRESSetOrthogonalization) -ksp_gmres_modifiedgramschmidt: Modified Gram-Schmidt (slow,more stable) (KSPGMRESSetOrthogonalization) Jed From gdiso at ustc.edu Sat Sep 4 21:41:51 2010 From: gdiso at ustc.edu (Gong Ding) Date: Sun, 5 Sep 2010 10:41:51 +0800 Subject: [petsc-users] BoomerAMG Howto References: <674C26569A314F619BC897C21BBCC2CD@cogendaeda> <67B8F263-30B2-4043-94FE-362C35E611EE@mcs.anl.gov> Message-ID: <5441B5F93D83455DA918C02915B5239A@cogendaeda> I had tried BoomerAMG for poisson equation, which works well. But when I preform it on my semiconductor system, it does not convergence. I guess the reason is semiconductor equations has three variables, the potential, electron and hole density, the algebraic multigrid arithmetic does not consider this and treat coarsen/smooth between different variables. Is there any way to specify coarsen arithmetic by user? Since each mesh node has 3 variables, I would like to do point based coarsen operation. 
BoomerAMG can not handle the linear system you are giving it. Each algebraic multigrid solver can only handle a certain class of problems. Barry On Sep 3, 2010, at 9:55 AM, Gong Ding wrote: > Dear all, > I am trying to use Boomer AMG of hypre + GMRES for my nonlinear problem, which is solved by MUMPS or GMRES +ILU previously. > However AMG does not convergence. > > The -ksp_monitor shows that there are huge residual during inner GMRES iteration, as large as 1e30. > I only change PC from ILU to boomeramg. > Can anyone tell me what's wrong with it? > > Gong Ding From bsmith at mcs.anl.gov Sat Sep 4 22:10:24 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Sat, 4 Sep 2010 22:10:24 -0500 Subject: [petsc-users] BoomerAMG Howto In-Reply-To: <5441B5F93D83455DA918C02915B5239A@cogendaeda> References: <674C26569A314F619BC897C21BBCC2CD@cogendaeda> <67B8F263-30B2-4043-94FE-362C35E611EE@mcs.anl.gov> <5441B5F93D83455DA918C02915B5239A@cogendaeda> Message-ID: <6C3C2584-D27F-46AC-B1C3-2B1E976FC438@mcs.anl.gov> You can call MatSetBlockSize() on the matrix with a value of 3. This information will then be transmitted to BoomerAMG so that it will know it is a 3 component problem. It may work a bit better with this information. But I suspect it will still not work well semiconductor problems are notoriously difficult for iterative solvers. You may just go best with a direct solver. We recommend the Mumps solver. You can use it by configuring PETSc with --download-mumps --download-scalapack --download-blacs and then run the program with -pc_type lu -pc_factor_mat_solver_package mumps Good luck, Barry On Sep 4, 2010, at 9:41 PM, Gong Ding wrote: > > I had tried BoomerAMG for poisson equation, which works well. > But when I preform it on my semiconductor system, it does not convergence. > I guess the reason is semiconductor equations has three variables, the potential, electron and hole density, > the algebraic multigrid arithmetic does not consider this and treat coarsen/smooth between different variables. > > Is there any way to specify coarsen arithmetic by user? > Since each mesh node has 3 variables, I would like to do point based coarsen operation. > > > > > BoomerAMG can not handle the linear system you are giving it. Each algebraic multigrid solver can only handle a certain class of problems. > > > Barry > > On Sep 3, 2010, at 9:55 AM, Gong Ding wrote: > >> Dear all, >> I am trying to use Boomer AMG of hypre + GMRES for my nonlinear problem, which is solved by MUMPS or GMRES +ILU previously. >> However AMG does not convergence. >> >> The -ksp_monitor shows that there are huge residual during inner GMRES iteration, as large as 1e30. >> I only change PC from ILU to boomeramg. >> Can anyone tell me what's wrong with it? >> >> Gong Ding From gdiso at ustc.edu Sun Sep 5 00:10:10 2010 From: gdiso at ustc.edu (Gong Ding) Date: Sun, 5 Sep 2010 13:10:10 +0800 Subject: [petsc-users] BoomerAMG Howto References: <674C26569A314F619BC897C21BBCC2CD@cogendaeda><67B8F263-30B2-4043-94FE-362C35E611EE@mcs.anl.gov><5441B5F93D83455DA918C02915B5239A@cogendaeda> <6C3C2584-D27F-46AC-B1C3-2B1E976FC438@mcs.anl.gov> Message-ID: ----- Original Message ----- From: "Barry Smith" To: "PETSc users list" Sent: Sunday, September 05, 2010 11:10 AM Subject: Re: [petsc-users] BoomerAMG Howto >You can call MatSetBlockSize() on the matrix with a value of 3. This information will then be transmitted to BoomerAMG so that it will know it is a 3 component problem. It may work a bit better with this information. Thanks, I will try it. 
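For reference, a minimal sketch of that call (assumed names on my part; A is the assembled Jacobian, set before it is handed to the preconditioner):

   ierr = MatSetBlockSize(A,3);CHKERRQ(ierr);  /* 3 unknowns (potential, n, p) per mesh node */

with the solver still selected by -pc_type hypre -pc_hypre_type boomeramg.
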
> But I suspect it will still not work well semiconductor problems are notoriously difficult for iterative solvers. You may just go best with a direct solver. We recommend the Mumps solver. You can use it by configuring PETSc with --download-mumps --download->scalapack --download-blacs and then run the program with -pc_type lu -pc_factor_mat_solver_package mumps Yes, MUMPS is the default linear solver in my code. I found it is the most stable one in practical. However, direct solver become too slow for large 3D problem. I am trying to find some fast solver instead. On Sep 4, 2010, at 9:41 PM, Gong Ding wrote: > > I had tried BoomerAMG for poisson equation, which works well. > But when I preform it on my semiconductor system, it does not convergence. > I guess the reason is semiconductor equations has three variables, the potential, electron and hole density, > the algebraic multigrid arithmetic does not consider this and treat coarsen/smooth between different variables. > > Is there any way to specify coarsen arithmetic by user? > Since each mesh node has 3 variables, I would like to do point based coarsen operation. > > > > > BoomerAMG can not handle the linear system you are giving it. Each algebraic multigrid solver can only handle a certain class of problems. > > > Barry > > On Sep 3, 2010, at 9:55 AM, Gong Ding wrote: > >> Dear all, >> I am trying to use Boomer AMG of hypre + GMRES for my nonlinear problem, which is solved by MUMPS or GMRES +ILU previously. >> However AMG does not convergence. >> >> The -ksp_monitor shows that there are huge residual during inner GMRES iteration, as large as 1e30. >> I only change PC from ILU to boomeramg. >> Can anyone tell me what's wrong with it? >> >> Gong Ding From jordi.poblet at gmail.com Mon Sep 6 06:01:46 2010 From: jordi.poblet at gmail.com (jordi poblet) Date: Mon, 6 Sep 2010 13:01:46 +0200 Subject: [petsc-users] PETSc + SLEPc makefile Message-ID: Dear all, I am trying to use PETSc and SLEPc but I have problems with the makefiles. Could someone provide me an example of makefiles that compiles multiple C++ files and include external libraries different than PETSc and SLEPc? In the makefile that I am trying to use (attached): -main.cpp: is the main file where some tests are done -UsePETSc.cpp: A class that uses PETSc -UseSLEPc.cpp: A class that uses SLEPc I have no problem when using UsePETSc.cpp without SLEPc or when compiling single examples that call SLEPc functions (with the makefiles provided in the SLEPc examples). Sorry if this is not a purelly PETSc question but I have supposed that there is someone else here using also SLEPc. Thank you very much in advance, Jordi Poblet-Puig -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: MakefileExample Type: application/octet-stream Size: 597 bytes Desc: not available URL: From knepley at gmail.com Mon Sep 6 06:18:44 2010 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 6 Sep 2010 13:18:44 +0200 Subject: [petsc-users] PETSc + SLEPc makefile In-Reply-To: References: Message-ID: On Mon, Sep 6, 2010 at 1:01 PM, jordi poblet wrote: > Dear all, > > I am trying to use PETSc and SLEPc but I have problems with the makefiles. > Could someone provide me an example of makefiles that compiles multiple C++ > files and include external libraries different than PETSc and SLEPc? > I do not really understand your problem. What error are you getting? 
Matt > In the makefile that I am trying to use (attached): > > -main.cpp: is the main file where some tests are done > -UsePETSc.cpp: A class that uses PETSc > -UseSLEPc.cpp: A class that uses SLEPc > > I have no problem when using UsePETSc.cpp without SLEPc or when compiling > single examples that call SLEPc functions (with the makefiles provided in > the SLEPc examples). > Sorry if this is not a purely PETSc question, but I have supposed that > there is someone else here also using SLEPc. > > Thank you very much in advance, > > Jordi Poblet-Puig > > > > > > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From jroman at dsic.upv.es Mon Sep 6 06:29:10 2010 From: jroman at dsic.upv.es (Jose E. Roman) Date: Mon, 6 Sep 2010 13:29:10 +0200 Subject: [petsc-users] PETSc + SLEPc makefile In-Reply-To: References: Message-ID: On 06/09/2010, at 13:01, jordi poblet wrote: > Dear all, > > I am trying to use PETSc and SLEPc but I have problems with the makefiles. Could someone provide me an example of a makefile that compiles multiple C++ files and includes external libraries other than PETSc and SLEPc? > > In the makefile that I am trying to use (attached): > > -main.cpp: is the main file where some tests are done > -UsePETSc.cpp: A class that uses PETSc > -UseSLEPc.cpp: A class that uses SLEPc > > I have no problem when using UsePETSc.cpp without SLEPc or when compiling single examples that call SLEPc functions (with the makefiles provided in the SLEPc examples). > Sorry if this is not a purely PETSc question, but I have supposed that there is someone else here also using SLEPc. > > Thank you very much in advance, > > Jordi Poblet-Puig > Probably you need to add ${PETSC_INCLUDE} also to the UseSLEPc.o target. Anyway, your makefile does not seem to be well formed. Try the following:

CFLAGS = -I../MyIncludefiles -Wno-deprecated
MYLIB = -L../MyLibraries -lMyLibrary
MYOBJS = main.o UseSLEPc.o UsePETSc.o
EXE = MyExecutableFile

all: ${EXE}

include ${SLEPC_DIR}/conf/slepc_common

${EXE}: ${MYOBJS} chkopts
	-${CLINKER} -o ${EXE} ${MYOBJS} ${MYLIB} ${SLEPC_LIB}
	${RM} ${MYOBJS}

From jordi.poblet at gmail.com Mon Sep 6 09:13:38 2010 From: jordi.poblet at gmail.com (jordi poblet) Date: Mon, 6 Sep 2010 16:13:38 +0200 Subject: [petsc-users] PETSc + SLEPc makefile In-Reply-To: References: Message-ID: Dear Matt, Thank you for your answer. My problem was that I needed help with the makefiles because I was not having success when compiling a program with multiple C++ files using PETSc and SLEPc. In addition, I was trying to write a makefile that distinguishes between compilation and linking (rather than writing everything in the same makefile rule). To do it, I was trying to "guess" the correct variables to be used for the case of PETSc + SLEPc in order to specify the compilation options, include folder locations and library locations (${PETSC_INCLUDE}, ${SLEPC_LIB}, ${SLEPC_INCLUDE} ...), and which of them should be used when compiling and which when linking. I do not really understand what a PETSc or SLEPc makefile is doing, so I am often lost when I wish to modify something. Dear Jose, Thank you very much for your email and makefile example. Now I can compile the code. Best regards, Jordi Poblet-Puig
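For readers with the same question about keeping compilation and linking in separate rules, a sketch along the lines of Jose's makefile above might look as follows. The explicit compile rules, the ${CXX} compiler variable, and the exact ${PETSC_INCLUDE}/${SLEPC_INCLUDE} names are assumptions based on this thread (they can differ between PETSc/SLEPc versions), not a tested recipe, and recipe lines must be indented with a tab:

CFLAGS = -I../MyIncludefiles -Wno-deprecated
MYLIB  = -L../MyLibraries -lMyLibrary
MYOBJS = main.o UseSLEPc.o UsePETSc.o
EXE    = MyExecutableFile

all: ${EXE}

include ${SLEPC_DIR}/conf/slepc_common

# compilation: only compiler flags and include paths are needed here
UsePETSc.o: UsePETSc.cpp
	${CXX} ${CFLAGS} ${PETSC_INCLUDE} -c UsePETSc.cpp
UseSLEPc.o: UseSLEPc.cpp
	${CXX} ${CFLAGS} ${SLEPC_INCLUDE} ${PETSC_INCLUDE} -c UseSLEPc.cpp
main.o: main.cpp
	${CXX} ${CFLAGS} ${SLEPC_INCLUDE} ${PETSC_INCLUDE} -c main.cpp

# linking: the libraries are needed here
${EXE}: ${MYOBJS} chkopts
	-${CLINKER} -o ${EXE} ${MYOBJS} ${MYLIB} ${SLEPC_LIB}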
On Mon, Sep 6, 2010 at 1:29 PM, Jose E. Roman wrote: > > On 06/09/2010, at 13:01, jordi poblet wrote: > > > Dear all, > > > > I am trying to use PETSc and SLEPc but I have problems with the > makefiles. Could someone provide me an example of a makefile that compiles > multiple C++ files and includes external libraries other than PETSc and > SLEPc? > > > > In the makefile that I am trying to use (attached): > > > > -main.cpp: is the main file where some tests are done > > -UsePETSc.cpp: A class that uses PETSc > > -UseSLEPc.cpp: A class that uses SLEPc > > > > I have no problem when using UsePETSc.cpp without SLEPc or when compiling > single examples that call SLEPc functions (with the makefiles provided in > the SLEPc examples). > > Sorry if this is not a purely PETSc question, but I have supposed that > there is someone else here also using SLEPc. > > > > Thank you very much in advance, > > > > Jordi Poblet-Puig > > > > Probably you need to add ${PETSC_INCLUDE} also to the UseSLEPc.o target. > Anyway, your makefile does not seem to be well formed. Try the following:
>
> CFLAGS = -I../MyIncludefiles -Wno-deprecated
> MYLIB = -L../MyLibraries -lMyLibrary
> MYOBJS = main.o UseSLEPc.o UsePETSc.o
> EXE = MyExecutableFile
>
> all: ${EXE}
>
> include ${SLEPC_DIR}/conf/slepc_common
>
> ${EXE}: ${MYOBJS} chkopts
> -${CLINKER} -o ${EXE} ${MYOBJS} ${MYLIB} ${SLEPC_LIB}
> ${RM} ${MYOBJS}
>
>
>
-------------- next part -------------- An HTML attachment was scrubbed... URL: From xy2102 at columbia.edu Mon Sep 6 19:31:01 2010 From: xy2102 at columbia.edu (Rebecca Xuefei Yuan) Date: Mon, 06 Sep 2010 20:31:01 -0400 Subject: [petsc-users] valgrind error comes out when upgrade from ubuntu 8.04 LTS to 10.04 Message-ID: <20100906203101.9i11ikcdkos4gss4@cubmail.cc.columbia.edu> Dear all, I upgraded my laptop from Ubuntu 8.04 LTS to 10.04; after the upgrade I reinstalled PETSc, but there are tons of valgrind errors coming out even though the code is unchanged. Then I tried ~/soft/petsc-3.1-p4/src/snes/examples/tutorials/ex19.c with the command: ~/soft/petsc-3.1-p4/externalpackages/mpich2-1.0.8/bin/mpiexec -np 2 valgrind --tool=memcheck ./ex19.exe -malloc off -da_grid_x 30 -da_grid_y 30 as instructed in the FAQ. However, the errors are very long (I hope it is all right to post the full valgrind log here...), as rebecca at YuanWork:~/linux/code/twoway/twoway_brandnew/trunk/set_a$ ~/soft/petsc-3.1-p4/externalpackages/mpich2-1.0.8/bin/mpiexec -np 2 valgrind --tool=memcheck ./ex19.exe -malloc off -da_grid_x 30 -da_grid_y 30 ==2174== Memcheck, a memory error detector ==2175== Memcheck, a memory error detector ==2175== Copyright (C) 2002-2009, and GNU GPL'd, by Julian Seward et al. ==2175== Using Valgrind-3.5.0 and LibVEX; rerun with -h for copyright info ==2175== Command: ./ex19.exe -malloc off -da_grid_x 30 -da_grid_y 30 ==2175== ==2174== Copyright (C) 2002-2009, and GNU GPL'd, by Julian Seward et al. ==2174== Using Valgrind-3.5.0 and LibVEX; rerun with -h for copyright info ==2174== Command: ./ex19.exe -malloc off -da_grid_x 30 -da_grid_y 30 ==2174== ==2174== Conditional jump or move depends on uninitialised value(s) ==2174== at 0x400B217: _dl_relocate_object (do-rel.h:104) ==2174== by 0x40031D0: dl_main (rtld.c:2229) ==2174== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) ==2174== by 0x4000C6C: _dl_start (rtld.c:333) ==2174== by 0x4000856: ???
(in /lib/ld-2.11.1.so) ==2174== ==2174== Conditional jump or move depends on uninitialised value(s) ==2174== at 0x400AF0F: _dl_relocate_object (do-rel.h:117) ==2174== by 0x40031D0: dl_main (rtld.c:2229) ==2174== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) ==2174== by 0x4000C6C: _dl_start (rtld.c:333) ==2174== by 0x4000856: ??? (in /lib/ld-2.11.1.so) ==2174== ==2175== Conditional jump or move depends on uninitialised value(s) ==2175== at 0x400B217: _dl_relocate_object (do-rel.h:104) ==2175== by 0x40031D0: dl_main (rtld.c:2229) ==2175== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) ==2175== by 0x4000C6C: _dl_start (rtld.c:333) ==2175== by 0x4000856: ??? (in /lib/ld-2.11.1.so) ==2175== ==2175== Conditional jump or move depends on uninitialised value(s) ==2175== at 0x400AF0F: _dl_relocate_object (do-rel.h:117) ==2175== by 0x40031D0: dl_main (rtld.c:2229) ==2175== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) ==2175== by 0x4000C6C: _dl_start (rtld.c:333) ==2175== by 0x4000856: ??? (in /lib/ld-2.11.1.so) ==2175== ==2174== Conditional jump or move depends on uninitialised value(s) ==2174== at 0x400B27A: _dl_relocate_object (do-rel.h:127) ==2174== by 0x40031D0: dl_main (rtld.c:2229) ==2174== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) ==2174== by 0x4000C6C: _dl_start (rtld.c:333) ==2174== by 0x4000856: ??? (in /lib/ld-2.11.1.so) ==2174== ==2175== Conditional jump or move depends on uninitialised value(s) ==2175== at 0x400B27A: _dl_relocate_object (do-rel.h:127) ==2175== by 0x40031D0: dl_main (rtld.c:2229) ==2175== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) ==2175== by 0x4000C6C: _dl_start (rtld.c:333) ==2175== by 0x4000856: ??? (in /lib/ld-2.11.1.so) ==2175== ==2174== Conditional jump or move depends on uninitialised value(s) ==2174== at 0x400A5DE: _dl_relocate_object (do-rel.h:65) ==2174== by 0x40030FE: dl_main (rtld.c:2292) ==2174== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) ==2174== by 0x4000C6C: _dl_start (rtld.c:333) ==2174== by 0x4000856: ??? (in /lib/ld-2.11.1.so) ==2174== ==2174== Conditional jump or move depends on uninitialised value(s) ==2174== at 0x400A5E6: _dl_relocate_object (do-rel.h:68) ==2175== Conditional jump or move depends on uninitialised value(s) ==2175== at 0x400A5DE: _dl_relocate_object (do-rel.h:65) ==2175== by 0x40030FE: dl_main (rtld.c:2292) ==2175== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) ==2175== by 0x4000C6C: _dl_start (rtld.c:333) ==2175== by 0x4000856: ??? (in /lib/ld-2.11.1.so) ==2175== ==2175== Conditional jump or move depends on uninitialised value(s) ==2175== at 0x400A5E6: _dl_relocate_object (do-rel.h:68) ==2175== by 0x40030FE: dl_main (rtld.c:2292) ==2175== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) ==2175== by 0x4000C6C: _dl_start (rtld.c:333) ==2175== by 0x4000856: ??? (in /lib/ld-2.11.1.so) ==2175== ==2175== Conditional jump or move depends on uninitialised value(s) ==2175== at 0x400AF0F: _dl_relocate_object (do-rel.h:117) ==2175== by 0x40030FE: dl_main (rtld.c:2292) ==2175== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) ==2175== by 0x4000C6C: _dl_start (rtld.c:333) ==2175== by 0x4000856: ??? (in /lib/ld-2.11.1.so) ==2175== ==2174== by 0x40030FE: dl_main (rtld.c:2292) ==2174== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) ==2174== by 0x4000C6C: _dl_start (rtld.c:333) ==2174== by 0x4000856: ??? 
(in /lib/ld-2.11.1.so) ==2174== ==2174== Conditional jump or move depends on uninitialised value(s) ==2174== at 0x400AF0F: _dl_relocate_object (do-rel.h:117) ==2174== by 0x40030FE: dl_main (rtld.c:2292) ==2174== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) ==2174== by 0x4000C6C: _dl_start (rtld.c:333) ==2174== by 0x4000856: ??? (in /lib/ld-2.11.1.so) ==2174== ==2175== Conditional jump or move depends on uninitialised value(s) ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) ==2175== by 0x87E2BD2: T.206 (simple_pmi.c:985) ==2175== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) ==2175== by 0x87C742A: MPID_Init (mpid_init.c:331) ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2175== by 0x879F15A: PMPI_Init (init.c:106) ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2175== by 0x804BA0C: main (ex19.c:96) ==2175== ==2175== Conditional jump or move depends on uninitialised value(s) ==2175== at 0x43197CB: __strlen_sse2 (strlen.S:116) ==2175== by 0x87E2BD2: T.206 (simple_pmi.c:985) ==2175== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) ==2175== by 0x87C742A: MPID_Init (mpid_init.c:331) ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2175== by 0x879F15A: PMPI_Init (init.c:106) ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2175== by 0x804BA0C: main (ex19.c:96) ==2175== ==2175== Conditional jump or move depends on uninitialised value(s) ==2175== at 0x87F559E: PMIU_writeline (simple_pmiutil.c:180) ==2175== by 0x87E2BD2: T.206 (simple_pmi.c:985) ==2175== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) ==2175== by 0x87C742A: MPID_Init (mpid_init.c:331) ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2175== by 0x879F15A: PMPI_Init (init.c:106) ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2175== by 0x804BA0C: main (ex19.c:96) ==2175== ==2175== Use of uninitialised value of size 4 ==2175== at 0x87F55A4: PMIU_writeline (simple_pmiutil.c:184) ==2175== by 0x87E2BD2: T.206 (simple_pmi.c:985) ==2175== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) ==2175== by 0x87C742A: MPID_Init (mpid_init.c:331) ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2175== by 0x879F15A: PMPI_Init (init.c:106) ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2175== by 0x804BA0C: main (ex19.c:96) ==2175== ==2175== Syscall param write(count) contains uninitialised byte(s) ==2175== at 0x41B1EB3: __write_nocancel (syscall-template.S:82) ==2175== by 0x87E2BD2: T.206 (simple_pmi.c:985) ==2175== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) ==2175== by 0x87C742A: MPID_Init (mpid_init.c:331) ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2175== by 0x879F15A: PMPI_Init (init.c:106) ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2175== by 0x804BA0C: main (ex19.c:96) ==2175== ==2175== Conditional jump or move depends on uninitialised value(s) ==2175== at 0x87F566C: PMIU_writeline (simple_pmiutil.c:197) ==2175== by 0x87E2BD2: T.206 (simple_pmi.c:985) ==2175== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) ==2175== by 0x87C742A: MPID_Init (mpid_init.c:331) ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2175== by 0x879F15A: PMPI_Init (init.c:106) ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2175== by 0x804BA0C: main (ex19.c:96) ==2175== ==2174== Conditional jump or move depends on uninitialised value(s) ==2174== at 0x4319794: __strlen_sse2 (strlen.S:93) ==2174== by 0x87E2BD2: T.206 (simple_pmi.c:985) ==2174== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) ==2174== by 0x87C742A: MPID_Init (mpid_init.c:331) ==2174== by 0x879F65C: MPIR_Init_thread 
(initthread.c:288) ==2174== by 0x879F15A: PMPI_Init (init.c:106) ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2174== by 0x804BA0C: main (ex19.c:96) ==2174== ==2174== Conditional jump or move depends on uninitialised value(s) ==2174== at 0x43197CB: __strlen_sse2 (strlen.S:116) ==2174== by 0x87E2BD2: T.206 (simple_pmi.c:985) ==2174== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) ==2174== by 0x87C742A: MPID_Init (mpid_init.c:331) ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2174== by 0x879F15A: PMPI_Init (init.c:106) ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2174== by 0x804BA0C: main (ex19.c:96) ==2174== ==2174== Conditional jump or move depends on uninitialised value(s) ==2174== at 0x87F559E: PMIU_writeline (simple_pmiutil.c:180) ==2174== by 0x87E2BD2: T.206 (simple_pmi.c:985) ==2174== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) ==2174== by 0x87C742A: MPID_Init (mpid_init.c:331) ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2174== by 0x879F15A: PMPI_Init (init.c:106) ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2174== by 0x804BA0C: main (ex19.c:96) ==2174== ==2174== Use of uninitialised value of size 4 ==2174== at 0x87F55A4: PMIU_writeline (simple_pmiutil.c:184) ==2174== by 0x87E2BD2: T.206 (simple_pmi.c:985) ==2174== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) ==2174== by 0x87C742A: MPID_Init (mpid_init.c:331) ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2174== by 0x879F15A: PMPI_Init (init.c:106) ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2174== by 0x804BA0C: main (ex19.c:96) ==2174== ==2174== Syscall param write(count) contains uninitialised byte(s) ==2174== at 0x41B1EB3: __write_nocancel (syscall-template.S:82) ==2174== by 0x87E2BD2: T.206 (simple_pmi.c:985) ==2174== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) ==2174== by 0x87C742A: MPID_Init (mpid_init.c:331) ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2174== by 0x879F15A: PMPI_Init (init.c:106) ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2174== by 0x804BA0C: main (ex19.c:96) ==2174== ==2174== Conditional jump or move depends on uninitialised value(s) ==2174== at 0x87F566C: PMIU_writeline (simple_pmiutil.c:197) ==2174== by 0x87E2BD2: T.206 (simple_pmi.c:985) ==2174== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) ==2174== by 0x87C742A: MPID_Init (mpid_init.c:331) ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2174== by 0x879F15A: PMPI_Init (init.c:106) ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2174== by 0x804BA0C: main (ex19.c:96) ==2174== ==2175== Conditional jump or move depends on uninitialised value(s) ==2175== at 0x43BE348: __strcmp_ssse3 (strcmp-ssse3.S:150) ==2175== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) ==2175== by 0x87C74C7: MPID_Init (mpid_init.c:381) ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2175== by 0x879F15A: PMPI_Init (init.c:106) ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2175== by 0x804BA0C: main (ex19.c:96) ==2175== ==2175== Conditional jump or move depends on uninitialised value(s) ==2175== at 0x43BF5B7: __strcmp_ssse3 (strcmp-ssse3.S:1887) ==2175== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) ==2175== by 0x87C74C7: MPID_Init (mpid_init.c:381) ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2175== by 0x879F15A: PMPI_Init (init.c:106) ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2175== by 0x804BA0C: main (ex19.c:96) ==2175== ==2175== Conditional jump or move depends on uninitialised value(s) ==2175== at 0x43BF5C0: 
__strcmp_ssse3 (strcmp-ssse3.S:1890) ==2175== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) ==2175== by 0x87C74C7: MPID_Init (mpid_init.c:381) ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2175== by 0x879F15A: PMPI_Init (init.c:106) ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2175== by 0x804BA0C: main (ex19.c:96) ==2175== ==2175== Conditional jump or move depends on uninitialised value(s) ==2175== at 0x43BF5C5: __strcmp_ssse3 (strcmp-ssse3.S:1893) ==2175== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) ==2175== by 0x87C74C7: MPID_Init (mpid_init.c:381) ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2175== by 0x879F15A: PMPI_Init (init.c:106) ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2175== by 0x804BA0C: main (ex19.c:96) ==2175== ==2175== Conditional jump or move depends on uninitialised value(s) ==2175== at 0x43BF5CA: __strcmp_ssse3 (strcmp-ssse3.S:1896) ==2175== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) ==2175== by 0x87C74C7: MPID_Init (mpid_init.c:381) ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2175== by 0x879F15A: PMPI_Init (init.c:106) ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2175== by 0x804BA0C: main (ex19.c:96) ==2175== ==2175== Conditional jump or move depends on uninitialised value(s) ==2175== at 0x43BE348: __strcmp_ssse3 (strcmp-ssse3.S:150) ==2175== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) ==2175== by 0x87CABAB: MPIDI_PG_InitConnKVS (mpidi_pg.c:753) ==2175== by 0x87C7526: MPID_Init (mpid_init.c:417) ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2175== by 0x879F15A: PMPI_Init (init.c:106) ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2175== by 0x804BA0C: main (ex19.c:96) ==2175== ==2175== Conditional jump or move depends on uninitialised value(s) ==2175== at 0x43BF5B7: __strcmp_ssse3 (strcmp-ssse3.S:1887) ==2175== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) ==2175== by 0x87CABAB: MPIDI_PG_InitConnKVS (mpidi_pg.c:753) ==2175== by 0x87C7526: MPID_Init (mpid_init.c:417) ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2175== by 0x879F15A: PMPI_Init (init.c:106) ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2175== by 0x804BA0C: main (ex19.c:96) ==2175== ==2175== Conditional jump or move depends on uninitialised value(s) ==2175== at 0x43BF5C0: __strcmp_ssse3 (strcmp-ssse3.S:1890) ==2175== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) ==2175== by 0x87CABAB: MPIDI_PG_InitConnKVS (mpidi_pg.c:753) ==2175== by 0x87C7526: MPID_Init (mpid_init.c:417) ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2175== by 0x879F15A: PMPI_Init (init.c:106) ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2175== by 0x804BA0C: main (ex19.c:96) ==2175== ==2175== Conditional jump or move depends on uninitialised value(s) ==2175== at 0x43BF5C5: __strcmp_ssse3 (strcmp-ssse3.S:1893) ==2175== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) ==2175== by 0x87CABAB: MPIDI_PG_InitConnKVS (mpidi_pg.c:753) ==2175== by 0x87C7526: MPID_Init (mpid_init.c:417) ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2175== by 0x879F15A: PMPI_Init (init.c:106) ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2175== by 0x804BA0C: main (ex19.c:96) ==2175== ==2175== Conditional jump or move depends on uninitialised value(s) ==2175== at 0x43BF5CA: __strcmp_ssse3 (strcmp-ssse3.S:1896) ==2175== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) ==2175== by 0x87CABAB: MPIDI_PG_InitConnKVS (mpidi_pg.c:753) ==2175== by 
0x87C7526: MPID_Init (mpid_init.c:417) ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2175== by 0x879F15A: PMPI_Init (init.c:106) ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2175== by 0x804BA0C: main (ex19.c:96) ==2175== ==2174== Conditional jump or move depends on uninitialised value(s) ==2174== at 0x43BE348: __strcmp_ssse3 (strcmp-ssse3.S:150) ==2174== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) ==2174== by 0x87C74C7: MPID_Init (mpid_init.c:381) ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2174== by 0x879F15A: PMPI_Init (init.c:106) ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2174== by 0x804BA0C: main (ex19.c:96) ==2174== ==2174== Conditional jump or move depends on uninitialised value(s) ==2174== at 0x43BF5B7: __strcmp_ssse3 (strcmp-ssse3.S:1887) ==2174== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) ==2174== by 0x87C74C7: MPID_Init (mpid_init.c:381) ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2174== by 0x879F15A: PMPI_Init (init.c:106) ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2174== by 0x804BA0C: main (ex19.c:96) ==2174== ==2174== Conditional jump or move depends on uninitialised value(s) ==2174== at 0x43BF5C0: __strcmp_ssse3 (strcmp-ssse3.S:1890) ==2174== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) ==2174== by 0x87C74C7: MPID_Init (mpid_init.c:381) ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2174== by 0x879F15A: PMPI_Init (init.c:106) ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2174== by 0x804BA0C: main (ex19.c:96) ==2174== ==2174== Conditional jump or move depends on uninitialised value(s) ==2174== at 0x43BF5C5: __strcmp_ssse3 (strcmp-ssse3.S:1893) ==2174== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) ==2174== by 0x87C74C7: MPID_Init (mpid_init.c:381) ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2174== by 0x879F15A: PMPI_Init (init.c:106) ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2174== by 0x804BA0C: main (ex19.c:96) ==2174== ==2174== Conditional jump or move depends on uninitialised value(s) ==2174== at 0x43BF5CA: __strcmp_ssse3 (strcmp-ssse3.S:1896) ==2174== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) ==2174== by 0x87C74C7: MPID_Init (mpid_init.c:381) ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2174== by 0x879F15A: PMPI_Init (init.c:106) ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2174== by 0x804BA0C: main (ex19.c:96) ==2174== ==2174== Conditional jump or move depends on uninitialised value(s) ==2174== at 0x43BE348: __strcmp_ssse3 (strcmp-ssse3.S:150) ==2174== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) ==2174== by 0x87CABAB: MPIDI_PG_InitConnKVS (mpidi_pg.c:753) ==2174== by 0x87C7526: MPID_Init (mpid_init.c:417) ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2174== by 0x879F15A: PMPI_Init (init.c:106) ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2174== by 0x804BA0C: main (ex19.c:96) ==2174== ==2174== Conditional jump or move depends on uninitialised value(s) ==2174== at 0x43BF5B7: __strcmp_ssse3 (strcmp-ssse3.S:1887) ==2174== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) ==2174== by 0x87CABAB: MPIDI_PG_InitConnKVS (mpidi_pg.c:753) ==2174== by 0x87C7526: MPID_Init (mpid_init.c:417) ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2174== by 0x879F15A: PMPI_Init (init.c:106) ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2174== by 0x804BA0C: main (ex19.c:96) ==2174== ==2174== Conditional jump or move depends on 
uninitialised value(s) ==2174== at 0x43BF5C0: __strcmp_ssse3 (strcmp-ssse3.S:1890) ==2174== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) ==2174== by 0x87CABAB: MPIDI_PG_InitConnKVS (mpidi_pg.c:753) ==2174== by 0x87C7526: MPID_Init (mpid_init.c:417) ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2174== by 0x879F15A: PMPI_Init (init.c:106) ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2174== by 0x804BA0C: main (ex19.c:96) ==2174== ==2174== Conditional jump or move depends on uninitialised value(s) ==2174== at 0x43BF5C5: __strcmp_ssse3 (strcmp-ssse3.S:1893) ==2174== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) ==2174== by 0x87CABAB: MPIDI_PG_InitConnKVS (mpidi_pg.c:753) ==2174== by 0x87C7526: MPID_Init (mpid_init.c:417) ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2174== by 0x879F15A: PMPI_Init (init.c:106) ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2174== by 0x804BA0C: main (ex19.c:96) ==2174== ==2174== Conditional jump or move depends on uninitialised value(s) ==2174== at 0x43BF5CA: __strcmp_ssse3 (strcmp-ssse3.S:1896) ==2174== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) ==2174== by 0x87CABAB: MPIDI_PG_InitConnKVS (mpidi_pg.c:753) ==2174== by 0x87C7526: MPID_Init (mpid_init.c:417) ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2174== by 0x879F15A: PMPI_Init (init.c:106) ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2174== by 0x804BA0C: main (ex19.c:96) ==2174== ==2175== Conditional jump or move depends on uninitialised value(s) ==2175== at 0x4319863: __GI_strlen (strlen.S:138) ==2175== by 0x438AD43: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:191) ==2175== by 0x438A685: gethostbyname (getXXbyYY.c:117) ==2175== by 0x87CC54F: MPIDU_CH3U_GetSockInterfaceAddr (ch3u_getinterfaces.c:130) ==2175== by 0x87CBC56: MPIDI_CH3U_Get_business_card_sock (ch3u_connect_sock.c:428) ==2175== by 0x87F413B: MPIDI_CH3U_Init_sock (ch3u_init_sock.c:79) ==2175== by 0x87EE511: MPIDI_CH3_Init (ch3_init.c:43) ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2175== by 0x879F15A: PMPI_Init (init.c:106) ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2175== by 0x804BA0C: main (ex19.c:96) ==2175== ==2175== Conditional jump or move depends on uninitialised value(s) ==2175== at 0x431986D: __GI_strlen (strlen.S:144) ==2175== by 0x438AD43: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:191) ==2175== by 0x438A685: gethostbyname (getXXbyYY.c:117) ==2175== by 0x87CC54F: MPIDU_CH3U_GetSockInterfaceAddr (ch3u_getinterfaces.c:130) ==2175== by 0x87CBC56: MPIDI_CH3U_Get_business_card_sock (ch3u_connect_sock.c:428) ==2175== by 0x87F413B: MPIDI_CH3U_Init_sock (ch3u_init_sock.c:79) ==2175== by 0x87EE511: MPIDI_CH3_Init (ch3_init.c:43) ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2175== by 0x879F15A: PMPI_Init (init.c:106) ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2175== by 0x804BA0C: main (ex19.c:96) ==2175== ==2174== Conditional jump or move depends on uninitialised value(s) ==2174== at 0x4319863: __GI_strlen (strlen.S:138) ==2174== by 0x438AD43: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:191) ==2174== by 0x438A685: gethostbyname (getXXbyYY.c:117) ==2174== by 0x87CC54F: MPIDU_CH3U_GetSockInterfaceAddr (ch3u_getinterfaces.c:130) ==2174== by 0x87CBC56: MPIDI_CH3U_Get_business_card_sock (ch3u_connect_sock.c:428) ==2174== by 0x87F413B: MPIDI_CH3U_Init_sock (ch3u_init_sock.c:79) ==2174== by 0x87EE511: 
MPIDI_CH3_Init (ch3_init.c:43) ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2174== by 0x879F15A: PMPI_Init (init.c:106) ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2174== by 0x804BA0C: main (ex19.c:96) ==2174== ==2174== Conditional jump or move depends on uninitialised value(s) ==2174== at 0x431986D: __GI_strlen (strlen.S:144) ==2174== by 0x438AD43: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:191) ==2174== by 0x438A685: gethostbyname (getXXbyYY.c:117) ==2174== by 0x87CC54F: MPIDU_CH3U_GetSockInterfaceAddr (ch3u_getinterfaces.c:130) ==2174== by 0x87CBC56: MPIDI_CH3U_Get_business_card_sock (ch3u_connect_sock.c:428) ==2174== by 0x87F413B: MPIDI_CH3U_Init_sock (ch3u_init_sock.c:79) ==2174== by 0x87EE511: MPIDI_CH3_Init (ch3_init.c:43) ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2174== by 0x879F15A: PMPI_Init (init.c:106) ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2174== by 0x804BA0C: main (ex19.c:96) ==2174== ==2175== Invalid read of size 4 ==2175== at 0x431983B: __GI_strlen (strlen.S:115) ==2175== by 0x43843CE: __nss_lookup (nsswitch.c:191) ==2175== by 0x438596E: __nss_hosts_lookup2 (XXX-lookup.c:76) ==2175== by 0x438AF0F: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:200) ==2175== by 0x438A685: gethostbyname (getXXbyYY.c:117) ==2175== by 0x87CC54F: MPIDU_CH3U_GetSockInterfaceAddr (ch3u_getinterfaces.c:130) ==2175== by 0x87CBC56: MPIDI_CH3U_Get_business_card_sock (ch3u_connect_sock.c:428) ==2175== by 0x87F413B: MPIDI_CH3U_Init_sock (ch3u_init_sock.c:79) ==2175== by 0x87EE511: MPIDI_CH3_Init (ch3_init.c:43) ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2175== by 0x879F15A: PMPI_Init (init.c:106) ==2175== Address 0x44308f8 is 40 bytes inside a block of size 42 alloc'd ==2175== at 0x4023BF3: malloc (vg_replace_malloc.c:195) ==2175== by 0x4384583: nss_parse_service_list (nsswitch.c:622) ==2175== by 0x4384E71: __nss_database_lookup (nsswitch.c:775) ==2175== by 0x43859AD: __nss_hosts_lookup2 (XXX-lookup.c:71) ==2175== by 0x438AF0F: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:200) ==2175== by 0x438A685: gethostbyname (getXXbyYY.c:117) ==2175== by 0x87CC54F: MPIDU_CH3U_GetSockInterfaceAddr (ch3u_getinterfaces.c:130) ==2175== by 0x87CBC56: MPIDI_CH3U_Get_business_card_sock (ch3u_connect_sock.c:428) ==2175== by 0x87F413B: MPIDI_CH3U_Init_sock (ch3u_init_sock.c:79) ==2175== by 0x87EE511: MPIDI_CH3_Init (ch3_init.c:43) ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2175== ==2174== Invalid read of size 4 ==2174== at 0x431983B: __GI_strlen (strlen.S:115) ==2174== by 0x43843CE: __nss_lookup (nsswitch.c:191) ==2174== by 0x438596E: __nss_hosts_lookup2 (XXX-lookup.c:76) ==2174== by 0x438AF0F: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:200) ==2174== by 0x438A685: gethostbyname (getXXbyYY.c:117) ==2174== by 0x87CC54F: MPIDU_CH3U_GetSockInterfaceAddr (ch3u_getinterfaces.c:130) ==2174== by 0x87CBC56: MPIDI_CH3U_Get_business_card_sock (ch3u_connect_sock.c:428) ==2174== by 0x87F413B: MPIDI_CH3U_Init_sock (ch3u_init_sock.c:79) ==2174== by 0x87EE511: MPIDI_CH3_Init (ch3_init.c:43) ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2174== by 0x879F15A: PMPI_Init (init.c:106) ==2174== Address 0x44308f8 is 40 bytes inside a block of size 42 alloc'd ==2174== at 0x4023BF3: malloc 
(vg_replace_malloc.c:195) ==2174== by 0x4384583: nss_parse_service_list (nsswitch.c:622) ==2174== by 0x4384E71: __nss_database_lookup (nsswitch.c:775) ==2174== by 0x43859AD: __nss_hosts_lookup2 (XXX-lookup.c:71) ==2174== by 0x438AF0F: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:200) ==2174== by 0x438A685: gethostbyname (getXXbyYY.c:117) ==2174== by 0x87CC54F: MPIDU_CH3U_GetSockInterfaceAddr (ch3u_getinterfaces.c:130) ==2174== by 0x87CBC56: MPIDI_CH3U_Get_business_card_sock (ch3u_connect_sock.c:428) ==2174== by 0x87F413B: MPIDI_CH3U_Init_sock (ch3u_init_sock.c:79) ==2174== by 0x87EE511: MPIDI_CH3_Init (ch3_init.c:43) ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2174== ==2174== Conditional jump or move depends on uninitialised value(s) ==2174== at 0x400B217: _dl_relocate_object (do-rel.h:104) ==2174== by 0x4011D15: dl_open_worker (dl-open.c:367) ==2174== by 0x400D875: _dl_catch_error (dl-error.c:178) ==2174== by 0x4011675: _dl_open (dl-open.c:583) ==2174== by 0x43AA4A1: do_dlopen (dl-libc.c:86) ==2174== by 0x400D875: _dl_catch_error (dl-error.c:178) ==2174== by 0x43AA5A0: dlerror_run (dl-libc.c:47) ==2174== by 0x43AA6BA: __libc_dlopen_mode (dl-libc.c:160) ==2174== by 0x43842E4: __nss_lookup_function (nsswitch.c:405) ==2174== by 0x43843CE: __nss_lookup (nsswitch.c:191) ==2174== by 0x438596E: __nss_hosts_lookup2 (XXX-lookup.c:76) ==2174== by 0x438AF0F: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:200) ==2174== ==2174== Conditional jump or move depends on uninitialised value(s) ==2174== at 0x400AF0F: _dl_relocate_object (do-rel.h:117) ==2174== by 0x4011D15: dl_open_worker (dl-open.c:367) ==2174== by 0x400D875: _dl_catch_error (dl-error.c:178) ==2174== by 0x4011675: _dl_open (dl-open.c:583) ==2174== by 0x43AA4A1: do_dlopen (dl-libc.c:86) ==2174== by 0x400D875: _dl_catch_error (dl-error.c:178) ==2174== by 0x43AA5A0: dlerror_run (dl-libc.c:47) ==2174== by 0x43AA6BA: __libc_dlopen_mode (dl-libc.c:160) ==2174== by 0x43842E4: __nss_lookup_function (nsswitch.c:405) ==2174== by 0x43843CE: __nss_lookup (nsswitch.c:191) ==2174== by 0x438596E: __nss_hosts_lookup2 (XXX-lookup.c:76) ==2174== by 0x438AF0F: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:200) ==2174== ==2175== Conditional jump or move depends on uninitialised value(s) ==2175== at 0x400B217: _dl_relocate_object (do-rel.h:104) ==2175== by 0x4011D15: dl_open_worker (dl-open.c:367) ==2175== by 0x400D875: _dl_catch_error (dl-error.c:178) ==2175== by 0x4011675: _dl_open (dl-open.c:583) ==2175== by 0x43AA4A1: do_dlopen (dl-libc.c:86) ==2175== by 0x400D875: _dl_catch_error (dl-error.c:178) ==2175== by 0x43AA5A0: dlerror_run (dl-libc.c:47) ==2175== by 0x43AA6BA: __libc_dlopen_mode (dl-libc.c:160) ==2175== by 0x43842E4: __nss_lookup_function (nsswitch.c:405) ==2175== by 0x43843CE: __nss_lookup (nsswitch.c:191) ==2175== by 0x438596E: __nss_hosts_lookup2 (XXX-lookup.c:76) ==2175== by 0x438AF0F: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:200) ==2175== ==2175== Conditional jump or move depends on uninitialised value(s) ==2175== at 0x400AF0F: _dl_relocate_object (do-rel.h:117) ==2175== by 0x4011D15: dl_open_worker (dl-open.c:367) ==2175== by 0x400D875: _dl_catch_error (dl-error.c:178) ==2175== by 0x4011675: _dl_open (dl-open.c:583) ==2175== by 0x43AA4A1: do_dlopen (dl-libc.c:86) ==2175== by 0x400D875: _dl_catch_error (dl-error.c:178) ==2175== by 0x43AA5A0: dlerror_run (dl-libc.c:47) ==2175== by 0x43AA6BA: __libc_dlopen_mode (dl-libc.c:160) ==2175== by 0x43842E4: __nss_lookup_function 
(nsswitch.c:405) ==2175== by 0x43843CE: __nss_lookup (nsswitch.c:191) ==2175== by 0x438596E: __nss_hosts_lookup2 (XXX-lookup.c:76) ==2175== by 0x438AF0F: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:200) ==2175== ==2174== Conditional jump or move depends on uninitialised value(s) ==2174== at 0x4319794: __strlen_sse2 (strlen.S:93) ==2174== by 0x87E19C7: GetResponse (simple_pmi.c:1049) ==2174== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) ==2174== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2174== by 0x879F15A: PMPI_Init (init.c:106) ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2174== by 0x804BA0C: main (ex19.c:96) ==2174== ==2174== Conditional jump or move depends on uninitialised value(s) ==2174== at 0x43197CB: __strlen_sse2 (strlen.S:116) ==2174== by 0x87E19C7: GetResponse (simple_pmi.c:1049) ==2174== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) ==2174== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2174== by 0x879F15A: PMPI_Init (init.c:106) ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2174== by 0x804BA0C: main (ex19.c:96) ==2174== ==2174== Conditional jump or move depends on uninitialised value(s) ==2174== at 0x87F559E: PMIU_writeline (simple_pmiutil.c:180) ==2174== by 0x87E19C7: GetResponse (simple_pmi.c:1049) ==2174== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) ==2174== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2174== by 0x879F15A: PMPI_Init (init.c:106) ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2174== by 0x804BA0C: main (ex19.c:96) ==2174== ==2174== Use of uninitialised value of size 4 ==2174== at 0x87F55A4: PMIU_writeline (simple_pmiutil.c:184) ==2174== by 0x87E19C7: GetResponse (simple_pmi.c:1049) ==2174== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) ==2174== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2174== by 0x879F15A: PMPI_Init (init.c:106) ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2174== by 0x804BA0C: main (ex19.c:96) ==2174== ==2174== Syscall param write(count) contains uninitialised byte(s) ==2174== at 0x41B1EB3: __write_nocancel (syscall-template.S:82) ==2174== by 0x87E19C7: GetResponse (simple_pmi.c:1049) ==2174== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) ==2174== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2174== by 0x879F15A: PMPI_Init (init.c:106) ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2174== by 0x804BA0C: main (ex19.c:96) ==2174== ==2174== Conditional jump or move depends on uninitialised value(s) ==2174== at 0x87F566C: PMIU_writeline (simple_pmiutil.c:197) ==2174== by 0x87E19C7: GetResponse (simple_pmi.c:1049) ==2174== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) ==2174== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) ==2174== by 0x87EE522: 
MPIDI_CH3_Init (ch3_init.c:48) ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2174== by 0x879F15A: PMPI_Init (init.c:106) ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2174== by 0x804BA0C: main (ex19.c:96) ==2174== ==2174== Conditional jump or move depends on uninitialised value(s) ==2174== at 0x43BE348: __strcmp_ssse3 (strcmp-ssse3.S:150) ==2174== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) ==2174== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2174== by 0x879F15A: PMPI_Init (init.c:106) ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2174== by 0x804BA0C: main (ex19.c:96) ==2174== ==2174== Conditional jump or move depends on uninitialised value(s) ==2174== at 0x43BF5B7: __strcmp_ssse3 (strcmp-ssse3.S:1887) ==2174== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) ==2174== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2174== by 0x879F15A: PMPI_Init (init.c:106) ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2174== by 0x804BA0C: main (ex19.c:96) ==2174== ==2174== Conditional jump or move depends on uninitialised value(s) ==2174== at 0x43BF5C0: __strcmp_ssse3 (strcmp-ssse3.S:1890) ==2174== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) ==2174== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2174== by 0x879F15A: PMPI_Init (init.c:106) ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2174== by 0x804BA0C: main (ex19.c:96) ==2174== ==2174== Conditional jump or move depends on uninitialised value(s) ==2174== at 0x43BF5C5: __strcmp_ssse3 (strcmp-ssse3.S:1893) ==2174== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) ==2174== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2174== by 0x879F15A: PMPI_Init (init.c:106) ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2174== by 0x804BA0C: main (ex19.c:96) ==2174== ==2174== Conditional jump or move depends on uninitialised value(s) ==2174== at 0x43BF5CA: __strcmp_ssse3 (strcmp-ssse3.S:1896) ==2174== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) ==2174== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2174== by 0x879F15A: PMPI_Init (init.c:106) ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2174== by 0x804BA0C: main (ex19.c:96) ==2174== ==2175== Conditional jump or move depends on uninitialised value(s) ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) ==2175== by 0x87E19C7: GetResponse (simple_pmi.c:1049) ==2175== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) ==2175== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2175== by 0x879F15A: PMPI_Init (init.c:106) 
==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2175== by 0x804BA0C: main (ex19.c:96) ==2175== ==2175== Conditional jump or move depends on uninitialised value(s) ==2175== at 0x43197CB: __strlen_sse2 (strlen.S:116) ==2175== by 0x87E19C7: GetResponse (simple_pmi.c:1049) ==2175== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) ==2175== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2175== by 0x879F15A: PMPI_Init (init.c:106) ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2175== by 0x804BA0C: main (ex19.c:96) ==2175== ==2175== Conditional jump or move depends on uninitialised value(s) ==2175== at 0x87F559E: PMIU_writeline (simple_pmiutil.c:180) ==2175== by 0x87E19C7: GetResponse (simple_pmi.c:1049) ==2175== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) ==2175== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2175== by 0x879F15A: PMPI_Init (init.c:106) ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2175== by 0x804BA0C: main (ex19.c:96) ==2175== ==2175== Use of uninitialised value of size 4 ==2175== at 0x87F55A4: PMIU_writeline (simple_pmiutil.c:184) ==2175== by 0x87E19C7: GetResponse (simple_pmi.c:1049) ==2175== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) ==2175== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2175== by 0x879F15A: PMPI_Init (init.c:106) ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2175== by 0x804BA0C: main (ex19.c:96) ==2175== ==2175== Syscall param write(count) contains uninitialised byte(s) ==2175== at 0x41B1EB3: __write_nocancel (syscall-template.S:82) ==2175== by 0x87E19C7: GetResponse (simple_pmi.c:1049) ==2175== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) ==2175== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2175== by 0x879F15A: PMPI_Init (init.c:106) ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2175== by 0x804BA0C: main (ex19.c:96) ==2175== ==2175== Conditional jump or move depends on uninitialised value(s) ==2175== at 0x87F566C: PMIU_writeline (simple_pmiutil.c:197) ==2175== by 0x87E19C7: GetResponse (simple_pmi.c:1049) ==2175== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) ==2175== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2175== by 0x879F15A: PMPI_Init (init.c:106) ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2175== by 0x804BA0C: main (ex19.c:96) ==2175== ==2175== Conditional jump or move depends on uninitialised value(s) ==2175== at 0x43BE348: __strcmp_ssse3 (strcmp-ssse3.S:150) ==2175== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) ==2175== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2175== by 0x879F15A: 
PMPI_Init (init.c:106) ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2175== by 0x804BA0C: main (ex19.c:96) ==2175== ==2175== Conditional jump or move depends on uninitialised value(s) ==2175== at 0x43BF5B7: __strcmp_ssse3 (strcmp-ssse3.S:1887) ==2175== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) ==2175== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2175== by 0x879F15A: PMPI_Init (init.c:106) ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2175== by 0x804BA0C: main (ex19.c:96) ==2175== ==2175== Conditional jump or move depends on uninitialised value(s) ==2175== at 0x43BF5C0: __strcmp_ssse3 (strcmp-ssse3.S:1890) ==2175== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) ==2175== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2175== by 0x879F15A: PMPI_Init (init.c:106) ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2175== by 0x804BA0C: main (ex19.c:96) ==2175== ==2175== Conditional jump or move depends on uninitialised value(s) ==2175== at 0x43BF5C5: __strcmp_ssse3 (strcmp-ssse3.S:1893) ==2175== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) ==2175== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2175== by 0x879F15A: PMPI_Init (init.c:106) ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2175== by 0x804BA0C: main (ex19.c:96) ==2175== ==2175== Conditional jump or move depends on uninitialised value(s) ==2175== at 0x43BF5CA: __strcmp_ssse3 (strcmp-ssse3.S:1896) ==2175== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) ==2175== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2175== by 0x879F15A: PMPI_Init (init.c:106) ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2175== by 0x804BA0C: main (ex19.c:96) ==2175== ==2174== Conditional jump or move depends on uninitialised value(s) ==2174== at 0x43BE348: __strcmp_ssse3 (strcmp-ssse3.S:150) ==2174== by 0x87CAF44: MPIDI_PG_SetConnInfo (mpidi_pg.c:579) ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2174== by 0x879F15A: PMPI_Init (init.c:106) ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2174== by 0x804BA0C: main (ex19.c:96) ==2174== ==2174== Conditional jump or move depends on uninitialised value(s) ==2174== at 0x43BF5B7: __strcmp_ssse3 (strcmp-ssse3.S:1887) ==2174== by 0x87CAF44: MPIDI_PG_SetConnInfo (mpidi_pg.c:579) ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2174== by 0x879F15A: PMPI_Init (init.c:106) ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2174== by 0x804BA0C: main (ex19.c:96) ==2174== ==2174== Conditional jump or move depends on uninitialised value(s) ==2174== at 0x43BF5C0: __strcmp_ssse3 (strcmp-ssse3.S:1890) ==2174== by 0x87CAF44: MPIDI_PG_SetConnInfo (mpidi_pg.c:579) ==2174== by 0x87EE522: 
MPIDI_CH3_Init (ch3_init.c:48) ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2174== by 0x879F15A: PMPI_Init (init.c:106) ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2174== by 0x804BA0C: main (ex19.c:96) ==2174== ==2174== Conditional jump or move depends on uninitialised value(s) ==2174== at 0x43BF5C5: __strcmp_ssse3 (strcmp-ssse3.S:1893) ==2174== by 0x87CAF44: MPIDI_PG_SetConnInfo (mpidi_pg.c:579) ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2174== by 0x879F15A: PMPI_Init (init.c:106) ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2174== by 0x804BA0C: main (ex19.c:96) ==2174== ==2174== Conditional jump or move depends on uninitialised value(s) ==2174== at 0x43BF5CA: __strcmp_ssse3 (strcmp-ssse3.S:1896) ==2174== by 0x87CAF44: MPIDI_PG_SetConnInfo (mpidi_pg.c:579) ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2174== by 0x879F15A: PMPI_Init (init.c:106) ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2174== by 0x804BA0C: main (ex19.c:96) ==2174== ==2174== Conditional jump or move depends on uninitialised value(s) ==2174== at 0x43BF5CF: __strcmp_ssse3 (strcmp-ssse3.S:1899) ==2174== by 0x87CAF44: MPIDI_PG_SetConnInfo (mpidi_pg.c:579) ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2174== by 0x879F15A: PMPI_Init (init.c:106) ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2174== by 0x804BA0C: main (ex19.c:96) ==2174== ==2175== Conditional jump or move depends on uninitialised value(s) ==2175== at 0x43BE348: __strcmp_ssse3 (strcmp-ssse3.S:150) ==2175== by 0x87CAF44: MPIDI_PG_SetConnInfo (mpidi_pg.c:579) ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2175== by 0x879F15A: PMPI_Init (init.c:106) ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2175== by 0x804BA0C: main (ex19.c:96) ==2175== ==2175== Conditional jump or move depends on uninitialised value(s) ==2175== at 0x43BF5B7: __strcmp_ssse3 (strcmp-ssse3.S:1887) ==2175== by 0x87CAF44: MPIDI_PG_SetConnInfo (mpidi_pg.c:579) ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2175== by 0x879F15A: PMPI_Init (init.c:106) ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2175== by 0x804BA0C: main (ex19.c:96) ==2175== ==2175== Conditional jump or move depends on uninitialised value(s) ==2175== at 0x43BF5C0: __strcmp_ssse3 (strcmp-ssse3.S:1890) ==2175== by 0x87CAF44: MPIDI_PG_SetConnInfo (mpidi_pg.c:579) ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==2175== by 0x879F15A: PMPI_Init (init.c:106) ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) ==2175== by 0x804BA0C: main (ex19.c:96) ==2175== ==2175== Conditional jump or move depends on uninitialised value(s) ==2175== at 0x43BF5C5: __strcmp_ssse3 (strcmp-ssse3.S:1893) ==2175== by 0x87CAF44: MPIDI_PG_SetConnInfo (mpidi_pg.c:579) ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) ==2175== by 0x87C7554: 
MPID_Init (mpid_init.c:92)
==2175==    by 0x879F65C: MPIR_Init_thread (initthread.c:288)
==2175==    by 0x879F15A: PMPI_Init (init.c:106)
==2175==    by 0x80AEE12: PetscInitialize (pinit.c:561)
==2175==    by 0x804BA0C: main (ex19.c:96)
==2175==
==2175== Conditional jump or move depends on uninitialised value(s)
==2175==    at 0x43BF5CA: __strcmp_ssse3 (strcmp-ssse3.S:1896)
==2175==    by 0x87CAF44: MPIDI_PG_SetConnInfo (mpidi_pg.c:579)
==2175==    by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48)
==2175==    by 0x87C7554: MPID_Init (mpid_init.c:92)
==2175==    by 0x879F65C: MPIR_Init_thread (initthread.c:288)
==2175==    by 0x879F15A: PMPI_Init (init.c:106)
==2175==    by 0x80AEE12: PetscInitialize (pinit.c:561)
==2175==    by 0x804BA0C: main (ex19.c:96)
==2175==
[... further __strcmp_ssse3 reports from MPIDI_PG_SetConnInfo on both processes snipped ...]
==2174== Invalid read of size 4
==2174==    at 0x431983B: __GI_strlen (strlen.S:115)
==2174==    by 0x43843CE: __nss_lookup (nsswitch.c:191)
==2174==    by 0x438564E: __nss_passwd_lookup2 (XXX-lookup.c:76)
==2174==    by 0x433D0DE: getpwuid_r@@GLIBC_2.1.2 (getXXbyYY_r.c:200)
==2174==    by 0x433C92E: getpwuid (getXXbyYY.c:117)
==2174==    by 0x80BC53A: PetscGetUserName (fuser.c:66)
==2174==    by 0x80822FF: PetscErrorPrintfInitialize (errtrace.c:68)
==2174==    by 0x80AEED1: PetscInitialize (pinit.c:576)
==2174==    by 0x804BA0C: main (ex19.c:96)
==2174==  Address 0x4430700 is 40 bytes inside a block of size 43 alloc'd
==2174==    at 0x4023BF3: malloc (vg_replace_malloc.c:195)
==2174==    by 0x4384583: nss_parse_service_list (nsswitch.c:622)
==2174==    by 0x4384E71: __nss_database_lookup (nsswitch.c:775)
==2174==    by 0x43859AD: __nss_hosts_lookup2 (XXX-lookup.c:71)
==2174==    by 0x438AF0F: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:200)
==2174==    by 0x438A685: gethostbyname (getXXbyYY.c:117)
==2174==    by 0x87CC54F: MPIDU_CH3U_GetSockInterfaceAddr (ch3u_getinterfaces.c:130)
==2174==    by 0x87CBC56: MPIDI_CH3U_Get_business_card_sock (ch3u_connect_sock.c:428)
==2174==    by 0x87F413B: MPIDI_CH3U_Init_sock (ch3u_init_sock.c:79)
==2174==    by 0x87EE511: MPIDI_CH3_Init (ch3_init.c:43)
==2174==    by 0x87C7554: MPID_Init (mpid_init.c:92)
==2174==    by 0x879F65C: MPIR_Init_thread (initthread.c:288)
==2174==
[... the same report appears for process ==2175== ...]
==2174== Conditional jump or move depends on uninitialised value(s)
==2174==    at 0x4319794: __strlen_sse2 (strlen.S:93)
==2174==    by 0x80885F9: PetscVSNPrintf (mprint.c:95)
==2174==    by 0x8088B22: PetscSNPrintf (mprint.c:228)
==2174==    by 0x8082456: PetscErrorPrintfInitialize (errtrace.c:71)
==2174==    by 0x80AEED1: PetscInitialize (pinit.c:576)
==2174==    by 0x804BA0C: main (ex19.c:96)
==2174==
==2174== Conditional jump or move depends on uninitialised value(s)
==2174==    at 0x4319794: __strlen_sse2 (strlen.S:93)
==2174==    by 0x80A0906: PetscOptionsFindPair_Private (options.c:987)
==2174==    by 0x80A1E0E: PetscOptionsGetTruth (options.c:1304)
==2174==    by 0x80824DE: PetscErrorPrintfInitialize (errtrace.c:73)
==2174==    by 0x80AEED1: PetscInitialize (pinit.c:576)
==2174==    by 0x804BA0C: main (ex19.c:96)
==2174==
[... dozens of near-identical __strlen_sse2/strncat reports from PetscOptionsFindPair_Private, triggered by the PetscOptionsGetTruth/HasName/GetString/GetReal calls in PetscOptionsCheckInitial_Private, PetscSetDisplay, PetscLogBegin_Private, PetscInitialize_DynamicLibraries and PetscOptionsSetFromOptions, on both processes, snipped ...]
==2174== Conditional jump or move depends on uninitialised value(s)
==2174==    at 0x87F559E: PMIU_writeline (simple_pmiutil.c:180)
==2174==    by 0x87E19C7: GetResponse (simple_pmi.c:1049)
==2174==    by 0x87E420A: PMI_KVS_Get (simple_pmi.c:572)
==2174==    by 0x87CA9F9: getConnInfoKVS (mpidi_pg.c:622)
==2174==    by 0x87CC11A: MPIDI_CH3I_VC_post_sockconnect (ch3u_connect_sock.c:1097)
==2174==    by 0x87EEF6B: MPIDI_CH3_iStartMsg (ch3_istartmsg.c:177)
==2174==    by 0x87C6341: MPIDI_CH3_EagerContigShortSend (ch3u_eager.c:250)
==2174==    by 0x87C8AC6: MPID_Send (mpid_send.c:115)
==2174==    by 0x879376B: MPIC_Send (helper_fns.c:34)
==2174==    by 0x878BB6A: MPIR_Bcast (bcast.c:227)
==2174==    by 0x878C6A1: PMPI_Bcast (bcast.c:761)
==2174==    by 0x809C467: PetscOptionsInsertFile (options.c:436)
==2174==
[... related "Use of uninitialised value", "Syscall param write(count) contains uninitialised byte(s)" and "Syscall param writev(vector) points to uninitialised byte(s)" reports from PMIU_writeline and MPIDU_Sock_wait during the same PMPI_Bcast snipped ...]
==2174== Invalid read of size 8
==2174==    at 0x43BE324: __strcmp_ssse3 (strcmp-ssse3.S:142)
==2174==    by 0x809F18B: PetscOptionsSetValue (options.c:803)
==2174==    by 0x809DD95: PetscOptionsInsert (options.c:588)
==2174==    by 0x80AF4BD: PetscInitialize (pinit.c:629)
==2174==    by 0x804BA0C: main (ex19.c:96)
==2174==  Address 0x44c0d88 is 8 bytes inside a block of size 10 alloc'd
==2174==    at 0x4023BF3: malloc (vg_replace_malloc.c:195)
==2174==    by 0x809F5F2: PetscOptionsSetValue (options.c:829)
==2174==    by 0x809DD95: PetscOptionsInsert (options.c:588)
==2174==    by 0x80AF4BD: PetscInitialize (pinit.c:629)
==2174==    by 0x804BA0C: main (ex19.c:96)
==2174==
==2174== More than 100 errors detected.  Subsequent errors
==2174== will still be recorded, but in less detail than before.
==2175== More than 100 errors detected.  Subsequent errors
==2175== will still be recorded, but in less detail than before.
==2174== Conditional jump or move depends on uninitialised value(s)
==2174==    at 0x4319794: __strlen_sse2 (strlen.S:93)
==2174==    by 0x80BDE83: PetscStrallocpy (str.c:79)
==2174==    by 0x808368E: PetscFListGetPathAndFunction (reg.c:24)
==2174==    by 0x80841CC: PetscFListAdd (reg.c:201)
==2174==    by 0x81843D0: DARegister (dareg.c:104)
==2174==    by 0x818476B: DARegisterAll (daregall.c:32)
==2174==    by 0x855F08B: DMInitializePackage (dlregisdm.c:80)
==2174==    by 0x815F1D1: DACreate (dacreate.c:173)
==2174==    by 0x81558E2: DACreate2d (da2.c:1837)
==2174==    by 0x804BE2A: main (ex19.c:107)
==2174==
==2175== Invalid read of size 8
==2175==    at 0x43197A0: __strlen_sse2 (strlen.S:99)
==2175==    by 0x8099209: PetscOptionsAtoi (options.c:70)
==2175==    by 0x80A13CE: PetscOptionsGetInt (options.c:1138)
==2175==    by 0x80B5AA5: PetscOptionsInt (aoptions.c:473)
==2175==    by 0x815E8BC: DASetFromOptions (dacreate.c:109)
==2175==    by 0x8155C96: DACreate2d (da2.c:1847)
==2175==    by 0x804BE2A: main (ex19.c:107)
==2175==  Address 0x4433c70 is 0 bytes inside a block of size 3 alloc'd
==2175==    at 0x4023BF3: malloc (vg_replace_malloc.c:195)
==2175==    by 0x809F6EB: PetscOptionsSetValue (options.c:833)
==2175==    by 0x809DD95: PetscOptionsInsert (options.c:588)
==2175==    by 0x80AF4BD: PetscInitialize (pinit.c:629)
==2175==    by 0x804BA0C: main (ex19.c:96)
==2175==
[... analogous reports from PetscStrallocpy/PetscFListAdd during MatRegisterAll and MatCreate inside DAGetInterpolation and DMMGSetDM snipped ...]
==2174== Invalid read of size 8
==2174==    at 0x43BEC7D: __strcmp_ssse3 (strcmp-ssse3.S:1021)
==2174==    by 0x80842B8: PetscFListAdd (reg.c:223)
==2174==    by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227)
==2174==    by 0x8097550:
PetscObjectComposeFunction (inherit.c:340) ==2174== by 0x834D30B: MatCreate_MPIAIJ (mpiaij.c:5096) ==2174== by 0x829D1E0: MatSetType (matreg.c:65) ==2174== by 0x86EE543: MatCreate_AIJ (aijtype.c:35) ==2174== by 0x829D1E0: MatSetType (matreg.c:65) ==2174== by 0x8171C4C: DAGetInterpolation_2D_Q1 (dainterp.c:310) ==2174== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==2174== by 0x8197E93: DMGetInterpolation (dm.c:144) ==2174== by 0x81BD897: DMMGSetDM (damg.c:250) ==2174== Address 0x456b7c8 is 24 bytes inside a block of size 28 alloc'd ==2174== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==2174== by 0x80BDF14: PetscStrallocpy (str.c:80) ==2174== by 0x8084689: PetscFListAdd (reg.c:237) ==2174== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==2174== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==2174== by 0x834D296: MatCreate_MPIAIJ (mpiaij.c:5093) ==2174== by 0x829D1E0: MatSetType (matreg.c:65) ==2174== by 0x86EE543: MatCreate_AIJ (aijtype.c:35) ==2174== by 0x829D1E0: MatSetType (matreg.c:65) ==2174== by 0x8171C4C: DAGetInterpolation_2D_Q1 (dainterp.c:310) ==2174== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==2174== ==2175== Invalid read of size 8 ==2175== at 0x43BE415: __strcmp_ssse3 (strcmp-ssse3.S:225) ==2175== by 0x8085399: PetscFListFind (reg.c:375) ==2175== by 0x80968C7: PetscObjectQueryFunction_Petsc (inherit.c:238) ==2175== by 0x80978B5: PetscObjectQueryFunction (inherit.c:376) ==2175== by 0x833E82B: MatMPIAIJSetPreallocation (mpiaij.c:3486) ==2175== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) ==2175== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==2175== by 0x8197E93: DMGetInterpolation (dm.c:144) ==2175== by 0x81BD897: DMMGSetDM (damg.c:250) ==2175== by 0x804BEA0: main (ex19.c:108) ==2175== Address 0x44db848 is 24 bytes inside a block of size 28 alloc'd ==2175== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==2175== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==2175== by 0x80BDF14: PetscStrallocpy (str.c:80) ==2175== by 0x8084689: PetscFListAdd (reg.c:237) ==2175== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==2175== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==2175== by 0x834D296: MatCreate_MPIAIJ (mpiaij.c:5093) ==2175== by 0x829D1E0: MatSetType (matreg.c:65) ==2175== by 0x86EE543: MatCreate_AIJ (aijtype.c:35) ==2175== by 0x829D1E0: MatSetType (matreg.c:65) ==2175== by 0x8171C4C: DAGetInterpolation_2D_Q1 (dainterp.c:310) ==2175== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==2175== ==2175== Invalid read of size 8 ==2175== at 0x43BE421: __strcmp_ssse3 (strcmp-ssse3.S:228) ==2175== by 0x8085399: PetscFListFind (reg.c:375) ==2175== by 0x80968C7: PetscObjectQueryFunction_Petsc (inherit.c:238) ==2175== by 0x80978B5: PetscObjectQueryFunction (inherit.c:376) ==2175== by 0x833E82B: MatMPIAIJSetPreallocation (mpiaij.c:3486) ==2175== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) ==2175== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==2175== by 0x8197E93: DMGetInterpolation (dm.c:144) ==2175== by 0x81BD897: DMMGSetDM (damg.c:250) ==2175== by 0x804BEA0: main (ex19.c:108) ==2175== Address 0x44dc7d8 is 24 bytes inside a block of size 28 alloc'd ==2175== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==2175== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==2175== by 0x80BDF14: PetscStrallocpy (str.c:80) ==2175== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) ==2175== by 0x8085186: PetscFListFind (reg.c:356) ==2175== by 
0x80968C7: PetscObjectQueryFunction_Petsc (inherit.c:238) ==2175== by 0x80978B5: PetscObjectQueryFunction (inherit.c:376) ==2175== by 0x833E82B: MatMPIAIJSetPreallocation (mpiaij.c:3486) ==2175== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) ==2175== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==2175== by 0x8197E93: DMGetInterpolation (dm.c:144) ==2175== by 0x81BD897: DMMGSetDM (damg.c:250) ==2175== ==2175== Invalid read of size 8 ==2175== at 0x43BE421: __strcmp_ssse3 (strcmp-ssse3.S:228) ==2175== by 0x8085406: PetscFListFind (reg.c:376) ==2175== by 0x80968C7: PetscObjectQueryFunction_Petsc (inherit.c:238) ==2175== by 0x80978B5: PetscObjectQueryFunction (inherit.c:376) ==2175== by 0x833E82B: MatMPIAIJSetPreallocation (mpiaij.c:3486) ==2175== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) ==2175== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==2175== by 0x8197E93: DMGetInterpolation (dm.c:144) ==2175== by 0x81BD897: DMMGSetDM (damg.c:250) ==2175== by 0x804BEA0: main (ex19.c:108) ==2175== Address 0x44dc7d8 is 24 bytes inside a block of size 28 alloc'd ==2175== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==2175== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==2175== by 0x80BDF14: PetscStrallocpy (str.c:80) ==2175== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) ==2175== by 0x8085186: PetscFListFind (reg.c:356) ==2175== by 0x80968C7: PetscObjectQueryFunction_Petsc (inherit.c:238) ==2175== by 0x80978B5: PetscObjectQueryFunction (inherit.c:376) ==2175== by 0x833E82B: MatMPIAIJSetPreallocation (mpiaij.c:3486) ==2175== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) ==2175== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==2175== by 0x8197E93: DMGetInterpolation (dm.c:144) ==2175== by 0x81BD897: DMMGSetDM (damg.c:250) ==2175== ==2174== Invalid read of size 8 ==2174== at 0x43BE415: __strcmp_ssse3 (strcmp-ssse3.S:225) ==2174== by 0x8085399: PetscFListFind (reg.c:375) ==2174== by 0x80968C7: PetscObjectQueryFunction_Petsc (inherit.c:238) ==2174== by 0x80978B5: PetscObjectQueryFunction (inherit.c:376) ==2174== by 0x833E82B: MatMPIAIJSetPreallocation (mpiaij.c:3486) ==2174== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) ==2174== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==2174== by 0x8197E93: DMGetInterpolation (dm.c:144) ==2174== by 0x81BD897: DMMGSetDM (damg.c:250) ==2174== by 0x804BEA0: main (ex19.c:108) ==2174== Address 0x456b7c8 is 24 bytes inside a block of size 28 alloc'd ==2174== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==2174== by 0x80BDF14: PetscStrallocpy (str.c:80) ==2174== by 0x8084689: PetscFListAdd (reg.c:237) ==2174== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==2174== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==2174== by 0x834D296: MatCreate_MPIAIJ (mpiaij.c:5093) ==2174== by 0x829D1E0: MatSetType (matreg.c:65) ==2174== by 0x86EE543: MatCreate_AIJ (aijtype.c:35) ==2174== by 0x829D1E0: MatSetType (matreg.c:65) ==2174== by 0x8171C4C: DAGetInterpolation_2D_Q1 (dainterp.c:310) ==2174== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==2174== ==2174== Invalid read of size 8 ==2174== at 0x43BE421: __strcmp_ssse3 (strcmp-ssse3.S:228) ==2174== by 0x8085399: PetscFListFind (reg.c:375) ==2174== by 0x80968C7: PetscObjectQueryFunction_Petsc (inherit.c:238) ==2174== by 0x80978B5: PetscObjectQueryFunction (inherit.c:376) ==2174== by 0x833E82B: MatMPIAIJSetPreallocation (mpiaij.c:3486) ==2174== by 0x8171D37: DAGetInterpolation_2D_Q1 
(dainterp.c:312) ==2174== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==2174== by 0x8197E93: DMGetInterpolation (dm.c:144) ==2174== by 0x81BD897: DMMGSetDM (damg.c:250) ==2174== by 0x804BEA0: main (ex19.c:108) ==2174== Address 0x456c758 is 24 bytes inside a block of size 28 alloc'd ==2174== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==2174== by 0x80BDF14: PetscStrallocpy (str.c:80) ==2174== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) ==2174== by 0x8085186: PetscFListFind (reg.c:356) ==2174== by 0x80968C7: PetscObjectQueryFunction_Petsc (inherit.c:238) ==2174== by 0x80978B5: PetscObjectQueryFunction (inherit.c:376) ==2174== by 0x833E82B: MatMPIAIJSetPreallocation (mpiaij.c:3486) ==2174== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) ==2174== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==2174== by 0x8197E93: DMGetInterpolation (dm.c:144) ==2174== by 0x81BD897: DMMGSetDM (damg.c:250) ==2174== ==2174== Invalid read of size 8 ==2174== at 0x43BE421: __strcmp_ssse3 (strcmp-ssse3.S:228) ==2174== by 0x8085406: PetscFListFind (reg.c:376) ==2174== by 0x80968C7: PetscObjectQueryFunction_Petsc (inherit.c:238) ==2174== by 0x80978B5: PetscObjectQueryFunction (inherit.c:376) ==2174== by 0x833E82B: MatMPIAIJSetPreallocation (mpiaij.c:3486) ==2174== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) ==2174== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==2174== by 0x8197E93: DMGetInterpolation (dm.c:144) ==2174== by 0x81BD897: DMMGSetDM (damg.c:250) ==2174== by 0x804BEA0: main (ex19.c:108) ==2174== Address 0x456c758 is 24 bytes inside a block of size 28 alloc'd ==2174== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==2174== by 0x80BDF14: PetscStrallocpy (str.c:80) ==2174== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) ==2174== by 0x8085186: PetscFListFind (reg.c:356) ==2174== by 0x80968C7: PetscObjectQueryFunction_Petsc (inherit.c:238) ==2174== by 0x80978B5: PetscObjectQueryFunction (inherit.c:376) ==2174== by 0x833E82B: MatMPIAIJSetPreallocation (mpiaij.c:3486) ==2174== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) ==2174== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==2174== by 0x8197E93: DMGetInterpolation (dm.c:144) ==2174== by 0x81BD897: DMMGSetDM (damg.c:250) ==2174== ==2174== Invalid read of size 8 ==2174== at 0x43BE32D: __strcmp_ssse3 (strcmp-ssse3.S:144) ==2174== by 0x80842B8: PetscFListAdd (reg.c:223) ==2174== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==2174== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==2174== by 0x82BE3A6: MatCreate_SeqAIJ (aij.c:3363) ==2174== by 0x829D1E0: MatSetType (matreg.c:65) ==2174== by 0x83393A9: MatMPIAIJSetPreallocation_MPIAIJ (mpiaij.c:2789) ==2174== by 0x833E8AA: MatMPIAIJSetPreallocation (mpiaij.c:3488) ==2174== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) ==2174== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==2174== by 0x8197E93: DMGetInterpolation (dm.c:144) ==2174== by 0x81BD897: DMMGSetDM (damg.c:250) ==2174== Address 0x456d3a0 is 16 bytes inside a block of size 21 alloc'd ==2174== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==2174== by 0x80BDF14: PetscStrallocpy (str.c:80) ==2174== by 0x8084162: PetscFListAdd (reg.c:200) ==2174== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==2174== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==2174== by 0x82BE331: 
MatCreate_SeqAIJ (aij.c:3360) ==2174== by 0x829D1E0: MatSetType (matreg.c:65) ==2174== by 0x83393A9: MatMPIAIJSetPreallocation_MPIAIJ (mpiaij.c:2789) ==2174== by 0x833E8AA: MatMPIAIJSetPreallocation (mpiaij.c:3488) ==2174== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) ==2174== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==2174== ==2175== Invalid read of size 8 ==2175== at 0x43BE32D: __strcmp_ssse3 (strcmp-ssse3.S:144) ==2175== by 0x80842B8: PetscFListAdd (reg.c:223) ==2175== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==2175== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==2175== by 0x82BE3A6: MatCreate_SeqAIJ (aij.c:3363) ==2175== by 0x829D1E0: MatSetType (matreg.c:65) ==2175== by 0x83393A9: MatMPIAIJSetPreallocation_MPIAIJ (mpiaij.c:2789) ==2175== by 0x833E8AA: MatMPIAIJSetPreallocation (mpiaij.c:3488) ==2175== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) ==2175== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==2175== by 0x8197E93: DMGetInterpolation (dm.c:144) ==2175== by 0x81BD897: DMMGSetDM (damg.c:250) ==2175== Address 0x44dd420 is 16 bytes inside a block of size 21 alloc'd ==2175== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==2175== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==2175== by 0x80BDF14: PetscStrallocpy (str.c:80) ==2175== by 0x8084162: PetscFListAdd (reg.c:200) ==2175== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==2175== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==2175== by 0x82BE331: MatCreate_SeqAIJ (aij.c:3360) ==2175== by 0x829D1E0: MatSetType (matreg.c:65) ==2175== by 0x83393A9: MatMPIAIJSetPreallocation_MPIAIJ (mpiaij.c:2789) ==2175== by 0x833E8AA: MatMPIAIJSetPreallocation (mpiaij.c:3488) ==2175== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) ==2175== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==2175== ==2174== Conditional jump or move depends on uninitialised value(s) ==2174== at 0x80BDEDB: PetscStrallocpy (str.c:80) ==2175== Conditional jump or move depends on uninitialised value(s) ==2174== by 0x80A6193: PetscObjectSetOptionsPrefix (prefix.c:37) ==2174== by 0x80A6564: PetscObjectAppendOptionsPrefix (prefix.c:70) ==2174== by 0x87169FB: PCAppendOptionsPrefix (precon.c:1242) ==2174== by 0x861C3B2: KSPAppendOptionsPrefix (itcl.c:121) ==2174== by 0x85A5056: PCMGSetLevels (mg.c:195) ==2174== by 0x81BE9AD: DMMGSetUpLevel (damg.c:379) ==2174== by 0x81C5787: DMMGSetSNES (damgsnes.c:648) ==2174== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==2174== by 0x804C4FF: main (ex19.c:140) ==2174== ==2175== at 0x80BDEDB: PetscStrallocpy (str.c:80) ==2175== by 0x80A6193: PetscObjectSetOptionsPrefix (prefix.c:37) ==2175== by 0x80A6564: PetscObjectAppendOptionsPrefix (prefix.c:70) ==2175== by 0x87169FB: PCAppendOptionsPrefix (precon.c:1242) ==2175== by 0x861C3B2: KSPAppendOptionsPrefix (itcl.c:121) ==2175== by 0x85A5056: PCMGSetLevels (mg.c:195) ==2175== by 0x81BE9AD: DMMGSetUpLevel (damg.c:379) ==2175== by 0x81C5787: DMMGSetSNES (damgsnes.c:648) ==2175== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==2175== by 0x804C4FF: main (ex19.c:140) ==2175== ==2174== Invalid read of size 8 ==2174== at 0x43BE421: __strcmp_ssse3 (strcmp-ssse3.S:228) ==2174== by 0x80842B8: PetscFListAdd (reg.c:223) ==2174== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==2174== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==2174== by 0x8668D6B: KSPDestroy_GMRES (gmres.c:302) ==2174== by 0x8675E35: KSPDestroy_FGMRES (fgmres.c:341) ==2174== by 
0x8635CA0: KSPSetType (itcreate.c:569) ==2174== by 0x81C5A98: DMMGSetSNES (damgsnes.c:668) ==2174== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==2174== by 0x804C4FF: main (ex19.c:140) ==2174== Address 0x462f078 is 24 bytes inside a block of size 31 alloc'd ==2174== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==2174== by 0x80BDF14: PetscStrallocpy (str.c:80) ==2174== by 0x8084689: PetscFListAdd (reg.c:237) ==2174== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==2174== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==2174== by 0x86784F2: KSPCreate_FGMRES (fgmres.c:753) ==2174== by 0x8635D92: KSPSetType (itcreate.c:576) ==2174== by 0x81BE7AA: DMMGSetUpLevel (damg.c:372) ==2174== by 0x81C5787: DMMGSetSNES (damgsnes.c:648) ==2174== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==2174== by 0x804C4FF: main (ex19.c:140) ==2174== ==2174== Invalid read of size 8 ==2174== at 0x43BE320: __strcmp_ssse3 (strcmp-ssse3.S:141) ==2174== by 0x8085399: PetscFListFind (reg.c:375) ==2174== by 0x858986E: PCSetType (pcset.c:66) ==2174== by 0x81C5BCA: DMMGSetSNES (damgsnes.c:672) ==2174== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==2174== by 0x804C4FF: main (ex19.c:140) ==2174== Address 0x45a0778 is 8 bytes inside a block of size 10 alloc'd ==2174== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==2174== by 0x80BDF14: PetscStrallocpy (str.c:80) ==2175== Invalid read of size 8 ==2175== at 0x43BE421: __strcmp_ssse3 (strcmp-ssse3.S:228) ==2175== by 0x80842B8: PetscFListAdd (reg.c:223) ==2175== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==2175== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==2175== by 0x8668D6B: KSPDestroy_GMRES (gmres.c:302) ==2175== by 0x8675E35: KSPDestroy_FGMRES (fgmres.c:341) ==2175== by 0x8635CA0: KSPSetType (itcreate.c:569) ==2175== by 0x81C5A98: DMMGSetSNES (damgsnes.c:668) ==2175== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==2175== by 0x804C4FF: main (ex19.c:140) ==2175== Address 0x459cf78 is 24 bytes inside a block of size 31 alloc'd ==2175== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==2175== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==2175== by 0x80BDF14: PetscStrallocpy (str.c:80) ==2175== by 0x8084689: PetscFListAdd (reg.c:237) ==2175== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==2175== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==2175== by 0x86784F2: KSPCreate_FGMRES (fgmres.c:753) ==2175== by 0x8635D92: KSPSetType (itcreate.c:576) ==2175== by 0x81BE7AA: DMMGSetUpLevel (damg.c:372) ==2175== by 0x81C5787: DMMGSetSNES (damgsnes.c:648) ==2175== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==2175== by 0x804C4FF: main (ex19.c:140) ==2175== ==2174== by 0x8084689: PetscFListAdd (reg.c:237) ==2174== by 0x8718EF5: PCRegister (precon.c:1537) ==2174== by 0x858B07B: PCRegisterAll (pcregis.c:95) ==2174== by 0x8642C37: PCInitializePackage (dlregisksp.c:60) ==2174== by 0x870E72D: PCCreate (precon.c:299) ==2174== by 0x862ADD8: KSPGetPC (itfunc.c:1251) ==2174== by 0x861BF6A: KSPSetOptionsPrefix (itcl.c:87) ==2174== by 0x81AD4A9: SNESSetOptionsPrefix (snes.c:2529) ==2174== by 0x81C52C6: DMMGSetSNES (damgsnes.c:612) ==2174== ==2174== Invalid read of size 8 ==2174== at 0x43BE324: __strcmp_ssse3 (strcmp-ssse3.S:142) ==2174== by 0x8085399: PetscFListFind (reg.c:375) ==2174== by 0x858986E: PCSetType (pcset.c:66) ==2174== by 0x81C5BCA: 
DMMGSetSNES (damgsnes.c:672) ==2174== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==2174== by 0x804C4FF: main (ex19.c:140) ==2174== Address 0x4630188 is 8 bytes inside a block of size 10 alloc'd ==2174== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==2174== by 0x80BDF14: PetscStrallocpy (str.c:80) ==2174== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) ==2174== by 0x8085186: PetscFListFind (reg.c:356) ==2174== by 0x858986E: PCSetType (pcset.c:66) ==2174== by 0x81C5BCA: DMMGSetSNES (damgsnes.c:672) ==2174== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==2174== by 0x804C4FF: main (ex19.c:140) ==2174== ==2174== Invalid read of size 8 ==2174== at 0x43BE328: __strcmp_ssse3 (strcmp-ssse3.S:143) ==2174== by 0x8085399: PetscFListFind (reg.c:375) ==2174== by 0x858986E: PCSetType (pcset.c:66) ==2174== by 0x81C5BCA: DMMGSetSNES (damgsnes.c:672) ==2174== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==2174== by 0x804C4FF: main (ex19.c:140) ==2174== Address 0x45a0780 is 6 bytes after a block of size 10 alloc'd ==2174== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==2174== by 0x80BDF14: PetscStrallocpy (str.c:80) ==2174== by 0x8084689: PetscFListAdd (reg.c:237) ==2174== by 0x8718EF5: PCRegister (precon.c:1537) ==2174== by 0x858B07B: PCRegisterAll (pcregis.c:95) ==2174== by 0x8642C37: PCInitializePackage (dlregisksp.c:60) ==2174== by 0x870E72D: PCCreate (precon.c:299) ==2174== by 0x862ADD8: KSPGetPC (itfunc.c:1251) ==2174== by 0x861BF6A: KSPSetOptionsPrefix (itcl.c:87) ==2174== by 0x81AD4A9: SNESSetOptionsPrefix (snes.c:2529) ==2174== by 0x81C52C6: DMMGSetSNES (damgsnes.c:612) ==2174== ==2174== Invalid read of size 8 ==2174== at 0x43BE32D: __strcmp_ssse3 (strcmp-ssse3.S:144) ==2174== by 0x8085399: PetscFListFind (reg.c:375) ==2174== by 0x858986E: PCSetType (pcset.c:66) ==2174== by 0x81C5BCA: DMMGSetSNES (damgsnes.c:672) ==2174== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==2174== by 0x804C4FF: main (ex19.c:140) ==2174== Address 0x4630190 is 6 bytes after a block of size 10 alloc'd ==2174== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==2174==2175== Invalid read of size 8 ==2175== at 0x43BE320: __strcmp_ssse3 (strcmp-ssse3.S:141) ==2175== by 0x8085399: PetscFListFind (reg.c:375) ==2175== by 0x858986E: PCSetType (pcset.c:66) ==2175== by 0x81C5BCA: DMMGSetSNES (damgsnes.c:672) ==2175== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==2175== by 0x804C4FF: main (ex19.c:140) ==2175== Address 0x450e678 is 8 bytes inside a block of size 10 alloc'd ==2175== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==2175== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==2175== by 0x80BDF14: PetscStrallocpy (str.c:80) ==2175== by 0x8084689: PetscFListAdd (reg.c:237) ==2175== by 0x8718EF5: PCRegister (precon.c:1537) ==2175== by 0x858B07B: PCRegisterAll (pcregis.c:95) ==2175== by 0x8642C37: PCInitializePackage (dlregisksp.c:60) ==2175== by 0x870E72D: PCCreate (precon.c:299) ==2175== by 0x862ADD8: KSPGetPC (itfunc.c:1251) ==2175== by 0x861BF6A: KSPSetOptionsPrefix (itcl.c:87) ==2175== by 0x81AD4A9: SNESSetOpti== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==2174== by 0x80BDF14: PetscStrallocpy (str.c:80) ==2174== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) ==2174== by 0x8085186: PetscFListFind (reg.c:356) ==2174== by 0x858986E: PCSetType (pcset.c:66) ==2174== by 0x81C5BCA: DMMGSetSNES (damgsnes.c:672) ==2174== by 0x81C71AE: DMMGSetSNESLocal_Private 
(damgsnes.c:952) ==2174== by 0x804C4FF: main (ex19.c:140) ==2174== onsPrefix (snes.c:2529) ==2175== by 0x81C52C6: DMMGSetSNES (damgsnes.c:612) ==2175== ==2175== Invalid read of size 8 ==2175== at 0x43BE324: __strcmp_ssse3 (strcmp-ssse3.S:142) ==2175== by 0x8085399: PetscFListFind (reg.c:375) ==2175== by 0x858986E: PCSetType (pcset.c:66) ==2175== by 0x81C5BCA: DMMGSetSNES (damgsnes.c:672) ==2175== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==2175== by 0x804C4FF: main (ex19.c:140) ==2175== Address 0x459e088 is 8 bytes inside a block of size 10 alloc'd ==2175== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==2175== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==2175== by 0x80BDF14: PetscStrallocpy (str.c:80) ==2175== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) ==2175== by 0x8085186: PetscFListFind (reg.c:356) ==2175== by 0x858986E: PCSetType (pcset.c:66) ==2175== by 0x81C5BCA: DMMGSetSNES (damgsnes.c:672) ==2175== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==2175== by 0x804C4FF: main (ex19.c:140) ==2175== ==2175== Invalid read of size 8 ==2175== at 0x43BE328: __strcmp_ssse3 (strcmp-ssse3.S:143) ==2175== by 0x8085399: PetscFListFind (reg.c:375) ==2175== by 0x858986E: PCSetType (pcset.c:66) ==2175== by 0x81C5BCA: DMMGSetSNES (damgsnes.c:672) ==2175== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==2175== by 0x804C4FF: main (ex19.c:140) ==2175== Address 0x450e680 is 6 bytes after a block of size 10 alloc'd ==2175== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==2175== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==2175== by 0x80BDF14: PetscStrallocpy (str.c:80) ==2175== by 0x8084689: PetscFListAdd (reg.c:237) ==2175== by 0x8718EF5: PCRegister (precon.c:1537) ==2175== by 0x858B07B: PCRegisterAll (pcregis.c:95) ==2175== by 0x8642C37: PCInitializePackage (dlregisksp.c:60) ==2175== by 0x870E72D: PCCreate (precon.c:299) ==2175== by 0x862ADD8: KSPGetPC (itfunc.c:1251) ==2175== by 0x861BF6A: KSPSetOptionsPrefix (itcl.c:87) ==2175== by 0x81AD4A9: SNESSetOptionsPrefix (snes.c:2529) ==2175== by 0x81C52C6: DMMGSetSNES (damgsnes.c:612) ==2175== ==2175== Invalid read of size 8 ==2175== at 0x43BE32D: __strcmp_ssse3 (strcmp-ssse3.S:144) ==2175== by 0x8085399: PetscFListFind (reg.c:375) ==2175== by 0x858986E: PCSetType (pcset.c:66) ==2175== by 0x81C5BCA: DMMGSetSNES (damgsnes.c:672) ==2175== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==2175== by 0x804C4FF: main (ex19.c:140) ==2175== Address 0x459e090 is 6 bytes after a block of size 10 alloc'd ==2175== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==2175== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==2175== by 0x80BDF14: PetscStrallocpy (str.c:80) ==2175== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) ==2175== by 0x8085186: PetscFListFind (reg.c:356) ==2175== by 0x858986E: PCSetType (pcset.c:66) ==2175== by 0x81C5BCA: DMMGSetSNES (damgsnes.c:672) ==2175== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==2175== by 0x804C4FF: main (ex19.c:140) ==2175== ==2175== Invalid read of size 8 ==2175== at 0x43B674F: __memcpy_ssse3 (memcpy-ssse3.S:703) ==2175== by 0x87923CB: MPIR_Allgatherv (allgatherv.c:340) ==2175== by 0x879274D: PMPI_Allgatherv (allgatherv.c:997) ==2175== by 0x83631F0: MatFDColoringCreate_MPIAIJ (fdmpiaij.c:89) ==2175== by 0x854A377: MatFDColoringCreate (fdmatrix.c:385) ==2175== by 0x81C5E65: DMMGSetSNES (damgsnes.c:712) ==2175== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==2175== by 0x804C4FF: main (ex19.c:140) ==2175== Address 0x47cd128 is 0 bytes after a block 
of size 720 alloc'd ==2175== at 0x4023BF3: malloc (vg_replace_malloc.c:195) ==2175== by 0x8791B68: MPIR_Allgatherv (allgatherv.c:143) ==2175== by 0x879274D: PMPI_Allgatherv (allgatherv.c:997) ==2175== by 0x83631F0: MatFDColoringCreate_MPIAIJ (fdmpiaij.c:89) ==2175== by 0x854A377: MatFDColoringCreate (fdmatrix.c:385) ==2175== by 0x81C5E65: DMMGSetSNES (damgsnes.c:712) ==2175== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==2175== by 0x804C4FF: main (ex19.c:140) ==2175== ==2174== Invalid read of size 8 ==2174== at 0x43B677F: __memcpy_ssse3 (memcpy-ssse3.S:715) ==2174== by 0x8791BD2: MPIR_Allgatherv (allgatherv.c:160) ==2174== by 0x879274D: PMPI_Allgatherv (allgatherv.c:997) ==2174== by 0x83631F0: MatFDColoringCreate_MPIAIJ (fdmpiaij.c:89) ==2174== by 0x854A377: MatFDColoringCreate (fdmatrix.c:385) ==2174== by 0x81C5E65: DMMGSetSNES (damgsnes.c:712) ==2174== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==2174== by 0x804C4FF: main (ex19.c:140) ==2174== Address 0x4920208 is 0 bytes after a block of size 1,416 alloc'd ==2174== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==2174== by 0x8132724: ISCreateGeneral (general.c:342) ==2174== by 0x813BE92: ISColoringGetIS (iscoloring.c:161) ==2174== by 0x836235A: MatFDColoringCreate_MPIAIJ (fdmpiaij.c:30) ==2174== by 0x854A377: MatFDColoringCreate (fdmatrix.c:385) ==2174== by 0x81C5E65: DMMGSetSNES (damgsnes.c:712) ==2174== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==2174== by 0x804C4FF: main (ex19.c:140) ==2174== lid velocity = 0.000287274, prandtl # = 1, grashof # = 1 ==2174== Invalid read of size 8 ==2174== at 0x43197A0: __strlen_sse2 (strlen.S:99) ==2175== Invalid read of size 8 ==2174== by 0x80BDE83: PetscStrallocpy (str.c:79) ==2174== by 0x80B2778: PetscOptionsBegin_Private (aoptions.c:38) ==2174== by 0x858A3C6: PCSetFromOptions (pcset.c:170) ==2174== by 0x861D3B6: KSPSetFromOptions (itcl.c:323) ==2174== by 0x85A8052: PCSetUp_MG (mg.c:490) ==2174== by 0x8714039: PCSetUp (precon.c:795) ==2174== by 0x8621C54: KSPSetUp (itfunc.c:237) ==2174== by 0x8622C90: KSPSolve (itfunc.c:353) ==2174== by 0x81B0295: SNES_KSPSolve (snes.c:2944) ==2174== by 0x86A2C12: SNESSolve_LS (ls.c:191) ==2174== by 0x81AB5EC: SNESSolve (snes.c:2255) ==2174== Address 0x4858f38 is 8 bytes inside a block of size 11 alloc'd ==2174== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==2174== by 0x80BDF14: PetscStrallocpy (str.c:80) ==2174== by 0x80A6193: PetscObjectSetOptionsPrefix (prefix.c:37) ==2174== by 0x80A6564: PetscObjectAppendOptionsPrefix (prefix.c:70) ==2174== by 0x87169FB: PCAppendOptionsPrefix (precon.c:1242) ==2174== by 0x861C3B2: KSPAppendOptionsPrefix (itcl.c:121) ==2174== by 0x85A4D63: PCMGSetLevels (mg.c:180) ==2174== by 0x81BE9AD: DMMGSetUpLevel (damg.c:379) ==2174== by 0x81C5787: DMMGSetSNES (damgsnes.c:648) ==2174== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==2174== by 0x804C4FF: main (ex19.c:140) ==2174== ==2175== at 0x43197A0: __strlen_sse2 (strlen.S:99) ==2175== by 0x80BDE83: PetscStrallocpy (str.c:79) ==2175== by 0x80B2778: PetscOptionsBegin_Private (aoptions.c:38) ==2175== by 0x858A3C6: PCSetFromOptions (pcset.c:170) ==2175== by 0x861D3B6: KSPSetFromOptions (itcl.c:323) ==2175== by 0x85A8052: PCSetUp_MG (mg.c:490) ==2175== by 0x8714039: PCSetUp (precon.c:795) ==2175== by 0x8621C54: KSPSetUp (itfunc.c:237) ==2175== by 0x8622C90: KSPSolve (itfunc.c:353) ==2175== by 0x81B0295: SNES_KSPSolve (snes.c:2944) 
==2175== by 0x86A2C12: SNESSolve_LS (ls.c:191) ==2175== by 0x81AB5EC: SNESSolve (snes.c:2255) ==2175== Address 0x4797eb8 is 8 bytes inside a block of size 11 alloc'd ==2175== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==2175== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==2175== by 0x80BDF14: PetscStrallocpy (str.c:80) ==2175== by 0x80A6193: PetscObjectSetOptionsPrefix (prefix.c:37) ==2175== by 0x80A6564: PetscObjectAppendOptionsPrefix (prefix.c:70) ==2175== by 0x87169FB: PCAppendOptionsPrefix (precon.c:1242) ==2175== by 0x861C3B2: KSPAppendOptionsPrefix (itcl.c:121) ==2175== by 0x85A4D63: PCMGSetLevels (mg.c:180) ==2175== by 0x81BE9AD: DMMGSetUpLevel (damg.c:379) ==2175== by 0x81C5787: DMMGSetSNES (damgsnes.c:648) ==2175== by 0x81C71AE: DMMGSetSNESLocal_Private (dam==2174== Invalid read of size 8 ==2174== at 0x43BE32D: __strcmp_ssse3 (strcmp-ssse3.S:144) ==2174== by 0x8085406: PetscFListFind (reg.c:376) ==2174== by 0x80968C7: PetscObjectQueryFunction_Petsc (inherit.c:238) ==2174== by 0x80978B5: PetscObjectQueryFunction (inherit.c:376) ==2174== by 0x870C60A: PCGetDefaultType_Private (precon.c:25) ==2174== by 0x858A499: PCSetFromOptions (pcset.c:172) ==2174== by 0x861D3B6: KSPSetFromOptions (itcl.c:323) ==2174== by 0x85A8052: PCSetUp_MG (mg.c:490) ==2174== by 0x8714039: PCSetUp (precon.c:795) ==2174== by 0x8621C54: KSPSetUp (itfunc.c:237) ==2174== by 0x8622C90: KSPSolve (itfunc.c:353) ==2174== by 0x81B0295: SNES_KSPSolve (snes.c:2944) ==2174== Address 0x4cfa520 is 16 bytes inside a block of size 22 alloc'd ==2174== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==2174== by 0x80BDF14: PetscStrallocpy (str.c:80) ==2174== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24)gsnes.c:952) ==2175== by 0x804C4FF: main (ex19.c:140) ==2175== ==2174== by 0x8085186: PetscFListFind (reg.c:356) ==2174== by 0x80968C7: PetscObjectQueryFunction_Petsc (inherit.c:238) ==2174== by 0x80978B5: PetscObjectQueryFunction (inherit.c:376) ==2174== by 0x870C60A: PCGetDefaultType_Private (precon.c:25) ==2174== by 0x858A499: PCSetFromOptions (pcset.c:172) ==2174== by 0x861D3B6: KSPSetFromOptions (itcl.c:323) ==2174== by 0x85A8052: PCSetUp_MG (mg.c:490) ==2174== by 0x8714039: PCSetUp (precon.c:795) ==2174== ==2175== Invalid read of size 8 ==2175== at 0x43BE32D: __strcmp_ssse3 (strcmp-ssse3.S:144) ==2175== by 0x8085406: PetscFListFind (reg.c:376) ==2175== by 0x80968C7: PetscObjectQueryFunction_Petsc (inherit.c:238) ==2175== by 0x80978B5: PetscObjectQueryFunction (inherit.c:376) ==2175== by 0x870C60A: PCGetDefaultType_Private (precon.c:25) ==2175== by 0x858A499: PCSetFromOptions (pcset.c:172) ==2175== by 0x861D3B6: KSPSetFromOptions (itcl.c:323) ==2175== by 0x85A8052: PCSetUp_MG (mg.c:490) ==2175== by 0x8714039: PCSetUp (precon.c:795) ==2175== by 0x8621C54: KSPSetUp (itfunc.c:237) ==2175== by 0x8622C90: KSPSolve (itfunc.c:353) ==2175== by 0x81B0295: SNES_KSPSolve (snes.c:2944) ==2175== Address 0x4c39110 is 16 bytes inside a block of size 22 alloc'd ==2175== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==2175== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==2175== by 0x80BDF14: PetscStrallocpy (str.c:80) ==2175== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) ==2175== by 0x8085186: PetscFListFind (reg.c:356) ==2175== by 0x80968C7: PetscObjectQueryFunction_Petsc (inherit.c:238) ==2175== by 0x80978B5: PetscObjectQueryFunction (inherit.c:376) ==2175== by 0x870C60A: PCGetDefaultType_Private (precon.c:25) ==2175== by 0x858A499: PCSetFromOptions (pcset.c:172) 
==2175== by 0x861D3B6: KSPSetFromOptions (itcl.c:323) ==2175== by 0x85A8052: PCSetUp_MG (mg.c:490) ==2175== by 0x8714039: PCSetUp (precon.c:795) ==2175== ==2175== Invalid read of size 8 ==2175== at 0x43197A0: __strlen_sse2 (strlen.S:99) ==2175== by 0x80A671F: PetscObjectAppendOptionsPrefix (prefix.c:76) ==2175== by 0x87169FB: PCAppendOptionsPrefix (precon.c:1242) ==2174== Invalid read of size 8 ==2174== at 0x43197A0: __strlen_sse2 (strlen.S:99) ==2174== by 0x80A671F: PetscObjectAppendOptionsPrefix (prefix.c:76) ==2174== by 0x87169FB: PCAppendOptionsPrefix (precon.c:1242) ==2174== by 0x861C3B2: KSPAppendOptionsPrefix (itcl.c:121) ==2174== by 0x859F3D9: PCSetUp_BJacobi_Singleblock (bjacobi.c:905) ==2174== by 0x85992B8: PCSetUp_BJacobi (bjacobi.c:180) ==2174== by 0x8714039: PCSetUp (precon.c:795) ==2174== by 0x8621C54: KSPSetUp (itfunc.c:237) ==2174== by 0x85A8E40: PCSetUp_MG (mg.c:556) ==2174== by 0x8714039: PCSetUp (precon.c:795) ==2174== by 0x8621C54: KSPSetUp (itfunc.c:237) ==2174== by 0x8622C90: KSPSolve (itfunc.c:353) ==2174== Address 0x4d567c8 is 8 bytes inside a block of size 13 alloc'd ==2174== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==2174== by 0x80BDF14: PetscStrallocpy (str.c:80) ==2174== by 0x80A6193: PetscObjectSetOptionsPrefix (prefix.c:37) ==2174== by 0x87166FA: PCSetOptionsPrefix (precon.c:1209) ==2174== by 0x861BFD3: KSPSetOptionsPrefix (itcl.c:88) ==2174== by 0x859F375: PCSetUp_BJacobi_Singleblock (bjacobi.c:904) ==2174== by 0x85992B8: PCSetUp_BJacobi (bjacobi.c:180) ==2174== by 0x8714039: PCSetUp (precon.c:795) ==2174== by 0x8621C54: KSPSetUp (itfunc.c:237) ==2174== by 0x85A8E40: PCSetUp_MG (mg.c:556) ==2174== by 0x8714039: PCSetUp (precon.c:795) ==2174== ==2175== by 0x861C3B2: KSPAppendOptionsPrefix (itcl.c:121) ==2175== by 0x859F3D9: PCSetUp_BJacobi_Singleblock (bjacobi.c:905) ==2175== by 0x85992B8: PCSetUp_BJacobi (bjacobi.c:180) ==2175== by 0x8714039: PCSetUp (precon.c:795) ==2175== by 0x8621C54: KSPSetUp (itfunc.c:237) ==2175== by 0x85A8E40: PCSetUp_MG (mg.c:556) ==2175== by 0x8714039: PCSetUp (precon.c:795) ==2175== by 0x8621C54: KSPSetUp (itfunc.c:237) ==2175== by 0x8622C90: KSPSolve (itfunc.c:353) ==2175== Address 0x4c42198 is 8 bytes inside a block of size 13 alloc'd ==2175== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==2175== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==2175== by 0x80BDF14: PetscStrallocpy (str.c:80) ==2175== by 0x80A6193: PetscObjectSetOptionsPrefix (prefix.c:37) ==2175== by 0x87166FA: PCSetOptionsPrefix (precon.c:1209) ==2175== by 0x861BFD3: KSPSetOptionsPrefix (itcl.c:88) ==2175== by 0x859F375: PCSetUp_BJacobi_Singleblock (bjacobi.c:904) ==2175== by 0x85992B8: PCSetUp_BJacobi (bjacobi.c:180) ==2175== by 0x8714039: PCSetUp (precon.c:795) ==2175== by 0x8621C54: KSPSetUp (itfunc.c:237) ==2175== by 0x85A8E40: PCSetUp_MG (mg.c:556) ==2175== by 0x8714039: PCSetUp (precon.c:795) ==2175== ==2174== Invalid read of size 8 ==2174== at 0x43197AF: __strlen_sse2 (strlen.S:106) ==2174== by 0x80BDE83: PetscStrallocpy (str.c:79) ==2174== by 0x80B2778: PetscOptionsBegin_Private (aoptions.c:38) ==2174== by 0x858A3C6: PCSetFromOptions (pcset.c:170) ==2174== by 0x861D3B6: KSPSetFromOptions (itcl.c:323) ==2174== by 0x859F978: PCSetUp_BJacobi_Singleblock (bjacobi.c:944) ==2174== by 0x85992B8: PCSetUp_BJacobi (bjacobi.c:180) ==2174== by 0x8714039: PCSetUp (precon.c:795) ==2174== by 0x8621C54: KSPSetUp (itfunc.c:237) ==2174== by 0x85A8E40: PCSetUp_MG (mg.c:556) ==2174== by 0x8714039: PCSetUp 
(precon.c:795) ==2174== by 0x8621C54: KSPSetUp (itfunc.c:237) ==2174== Address 0x4d568d0 is 16 bytes inside a block of size 17 alloc'd ==2174== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==2174== by 0x80A67BF: PetscObjectAppendOptionsPrefix (prefix.c:77) ==2174== by 0x87169FB: PCAppendOptionsPrefix (precon.c:1242) ==2174== by 0x861C3B2: KSPAppendOptionsPrefix (itcl.c:121) ==2174== by 0x859F3D9: PCSetUp_BJacobi_Singleblock (bjacobi.c:905) ==2174== by 0x85992B8: PCSetUp_BJacobi (bjacobi.c:180) ==2174== by 0x8714039: PCSetUp (precon.c:795) ==2174== by 0x8621C54: KSPSetUp (itfunc.c:237) ==2174== by 0x85A8E40: PCSetUp_MG (mg.c:556) ==2174== by 0x8714039: PCSetUp (precon.c:795) ==2174== by 0x8621C54: KSPSetUp (itfunc.c:237) ==2174== ==2175== Invalid read of size 8 ==2175== at 0x43197AF: __strlen_sse2 (strlen.S:106) ==2175== by 0x80BDE83: PetscStrallocpy (str.c:79) ==2175== by 0x80B2778: PetscOptionsBegin_Private (aoptions.c:38) ==2175== by 0x858A3C6: PCSetFromOptions (pcset.c:170) ==2175== by 0x861D3B6: KSPSetFromOptions (itcl.c:323) ==2175== by 0x859F978: PCSetUp_BJacobi_Singleblock (bjacobi.c:944) ==2175== by 0x85992B8: PCSetUp_BJacobi (bjacobi.c:180) ==2175== by 0x8714039: PCSetUp (precon.c:795) ==2175== by 0x8621C54: KSPSetUp (itfunc.c:237) ==2175== by 0x85A8E40: PCSetUp_MG (mg.c:556) ==2175== by 0x8714039: PCSetUp (precon.c:795) ==2175== by 0x8621C54: KSPSetUp (itfunc.c:237) ==2175== Address 0x4c422a0 is 16 bytes inside a block of size 17 alloc'd ==2175== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==2175== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==2175== by 0x80A67BF: PetscObjectAppendOptionsPrefix (prefix.c:77) ==2175== by 0x87169FB: PCAppendOptionsPrefix (precon.c:1242) ==2175== by 0x861C3B2: KSPAppendOptionsPrefix (itcl.c:121) ==2175== by 0x859F3D9: PCSetUp_BJacobi_Singleblock (bjacobi.c:905) ==2175== by 0x85992B8: PCSetUp_BJacobi (bjacobi.c:180) ==2175== by 0x8714039: PCSetUp (precon.c:795) ==2175== by 0x8621C54: KSPSetUp (itfunc.c:237) ==2175== by 0x85A8E40: PCSetUp_MG (mg.c:556) ==2175== by 0x8714039: PCSetUp (precon.c:795) ==2175== by 0x8621C54: KSPSetUp (itfunc.c:237) ==2175== ==2175== Invalid read of size 8 ==2175== at 0x43BF3A5: __strcmp_ssse3 (strcmp-ssse3.S:1687) ==2175== by 0x80842B8: PetscFListAdd (reg.c:223) ==2175== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==2175== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==2175== by 0x85E8F9A: PCCreate_ILU (ilu.c:379) ==2175== by 0x8589A8B: PCSetType (pcset.c:78) ==2175== by 0x858A64C: PCSetFromOptions (pcset.c:181) ==2175== by 0x861D3B6: KSPSetFromOptions (itcl.c:323) ==2175== by 0x859F978: PCSetUp_BJacobi_Singleblock (bjacobi.c:944) ==2175== by 0x85992B8: PCSetUp_BJacobi (bjacobi.c:180) ==2175== by 0x8714039: PCSetUp (precon.c:795) ==2175== by 0x8621C54: KSPSetUp (itfunc.c:237) ==2175== Address 0x4c94a28 is 24 bytes inside a block of size 27 alloc'd ==2175== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==2175== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==2175== by 0x80BDF14: PetscStrallocpy (str.c:80) ==2175== by 0x8084689: PetscFListAdd (reg.c:237) ==2175== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==2175== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==2175== by 0x85E8F25: PCCreate_ILU (ilu.c:377) ==2175== by 0x8589A8B: PCSetType (pcset.c:78) ==2175== by 0x858A64C: PCSetFromOptions (pcset.c:181) ==2175== by 0x861D3B6: KSPSetFromOptions (itcl.c:323) ==2175== by 0x859F978: PCSetUp_BJacobi_Singleblock 
(bjacobi.c:944) ==2175== by 0x85992B8: PCSetUp_BJacobi (bjacobi.c:180) ==2175== ==2174== Invalid read of size 8 ==2174== at 0x43BF3A5: __strcmp_ssse3 (strcmp-ssse3.S:1687) ==2174== by 0x80842B8: PetscFListAdd (reg.c:223) ==2174== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==2174== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==2174== by 0x85E8F9A: PCCreate_ILU (ilu.c:379) ==2174== by 0x8589A8B: PCSetType (pcset.c:78) ==2174== by 0x858A64C: PCSetFromOptions (pcset.c:181) ==2174== by 0x861D3B6: KSPSetFromOptions (itcl.c:323) ==2174== by 0x859F978: PCSetUp_BJacobi_Singleblock (bjacobi.c:944) ==2174== by 0x85992B8: PCSetUp_BJacobi (bjacobi.c:180) ==2174== by 0x8714039: PCSetUp (precon.c:795) ==2174== by 0x8621C54: KSPSetUp (itfunc.c:237) ==2174== Address 0x4d58a28 is 24 bytes inside a block of size 27 alloc'd ==2174== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==2174== by 0x80BDF14: PetscStrallocpy (str.c:80) ==2174== by 0x8084689: PetscFListAdd (reg.c:237) ==2174== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==2174== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==2174== by 0x85E8F25: PCCreate_ILU (ilu.c:377) ==2174== by 0x8589A8B: PCSetType (pcset.c:78) ==2174== by 0x858A64C: PCSetFromOptions (pcset.c:181) ==2174== by 0x861D3B6: KSPSetFromOptions (itcl.c:323) ==2174== by 0x859F978: PCSetUp_BJacobi_Singleblock (bjacobi.c:944) ==2174== by 0x85992B8: PCSetUp_BJacobi (bjacobi.c:180) ==2174== ==2175== Invalid read of size 8 ==2175== at 0x43BE324: __strcmp_ssse3 (strcmp-ssse3.S:142) ==2175== by 0x8093FC2: PetscTypeCompare (destroy.c:254) ==2175== by 0x85A956A: PCSetUp_MG (mg.c:585) ==2175== by 0x8714039: PCSetUp (precon.c:795) ==2175== by 0x8621C54: KSPSetUp (itfunc.c:237) ==2175== by 0x8622C90: KSPSolve (itfunc.c:353) ==2175== by 0x81B0295: SNES_KSPSolve (snes.c:2944) ==2175== by 0x86A2C12: SNESSolve_LS (ls.c:191) ==2175== by 0x81AB5EC: SNESSolve (snes.c:2255) ==2175== by 0x81C4919: DMMGSolveSNES (damgsnes.c:510) ==2175== by 0x81BDF6C: DMMGSolve (damg.c:313) ==2175== by 0x804C9A7: main (ex19.c:155) ==2175== Address 0x4798858 is 8 bytes inside a block of size 10 alloc'd ==2175== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==2175== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==2175== by 0x80BDF14: PetscStrallocpy (str.c:80) ==2175== by 0x80902DB: PetscObjectChangeTypeName (pname.c:114) ==2175== by 0x8589AEE: PCSetType (pcset.c:79) ==2175== by 0x85A4F3C: PCMGSetLevels (mg.c:187) ==2175== by 0x81BE9AD: DMMGSetUpLevel (damg.c:379) ==2175== by 0x81C5787: DMMGSetSNES (damgsnes.c:648) ==2175== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==2175== by 0x804C4FF: main (ex19.c:140) ==2175== ==2175== Invalid read of size 8 ==2175== at 0x43BE32D: __strcmp_ssse3 (strcmp-ssse3.S:144) ==2175== by 0x8093FC2: PetscTypeCompare (destroy.c:254) ==2175== by 0x85A956A: PCSetUp_MG (mg.c:585) ==2175== by 0x8714039: PCSetUp (precon.c:795) ==2175== by 0x8621C54: KSPSetUp (itfunc.c:237) ==2175== by 0x8622C90: KSPSolve (itfunc.c:353) ==2175== by 0x81B0295: SNES_KSPSolve (snes.c:2944) ==2175== by 0x86A2C12: SNESSolve_LS (ls.c:191) ==2175== by 0x81AB5EC: SNESSolve (snes.c:2255) ==2175== by 0x81C4919: DMMGSolveSNES (damgsnes.c:510) ==2175== by 0x81BDF6C: DMMGSolve (damg.c:313) ==2175== by 0x804C9A7: main (ex19.c:155) ==2175== Address 0x4798860 is 6 bytes after a block of size 10 alloc'd ==2175== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==2175== by 0x808B1A0: PetscMallocAlign (mal.c:30) 
==2175== by 0x80BDF14: PetscStrallocpy (str.c:80) ==2175== by 0x80902DB: PetscObjectChangeTypeName (pname.c:114) ==2175== by 0x8589AEE: PCSetType (pcset.c:79) ==2175== by 0x85A4F3C: PCMGSetLevels (mg.c:187) ==2175== by 0x81BE9AD: DMMGSetUpLevel (damg.c:379) ==2175== by 0x81C5787: DMMGSetSNES (damgsnes.c:648) ==2175== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==2175== by 0x804C4FF: main (ex19.c:140) ==2175== ==2174== Invalid read of size 8 ==2174== at 0x43BE324: __strcmp_ssse3 (strcmp-ssse3.S:142) ==2174== by 0x8093FC2: PetscTypeCompare (destroy.c:254) ==2174== by 0x85A956A: PCSetUp_MG (mg.c:585) ==2174== by 0x8714039: PCSetUp (precon.c:795) ==2174== by 0x8621C54: KSPSetUp (itfunc.c:237) ==2174== by 0x8622C90: KSPSolve (itfunc.c:353) ==2174== by 0x81B0295: SNES_KSPSolve (snes.c:2944) ==2174== by 0x86A2C12: SNESSolve_LS (ls.c:191) ==2174== by 0x81AB5EC: SNESSolve (snes.c:2255) ==2174== by 0x81C4919: DMMGSolveSNES (damgsnes.c:510) ==2174== by 0x81BDF6C: DMMGSolve (damg.c:313) ==2174== by 0x804C9A7: main (ex19.c:155) ==2174== Address 0x48598d8 is 8 bytes inside a block of size 10 alloc'd ==2174== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==2174== by 0x80BDF14: PetscStrallocpy (str.c:80) ==2174== by 0x80902DB: PetscObjectChangeTypeName (pname.c:114) ==2174== by 0x8589AEE: PCSetType (pcset.c:79) ==2174== by 0x85A4F3C: PCMGSetLevels (mg.c:187) ==2174== by 0x81BE9AD: DMMGSetUpLevel (damg.c:379) ==2174== by 0x81C5787: DMMGSetSNES (damgsnes.c:648) ==2174== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==2174== by 0x804C4FF: main (ex19.c:140) ==2174== ==2174== Invalid read of size 8 ==2174== at 0x43BE32D: __strcmp_ssse3 (strcmp-ssse3.S:144) ==2174== by 0x8093FC2: PetscTypeCompare (destroy.c:254) ==2174== by 0x85A956A: PCSetUp_MG (mg.c:585) ==2174== by 0x8714039: PCSetUp (precon.c:795) ==2174== by 0x8621C54: KSPSetUp (itfunc.c:237) ==2174== by 0x8622C90: KSPSolve (itfunc.c:353) ==2174== by 0x81B0295: SNES_KSPSolve (snes.c:2944) ==2174== by 0x86A2C12: SNESSolve_LS (ls.c:191) ==2174== by 0x81AB5EC: SNESSolve (snes.c:2255) ==2174== by 0x81C4919: DMMGSolveSNES (damgsnes.c:510) ==2174== by 0x81BDF6C: DMMGSolve (damg.c:313) ==2174== by 0x804C9A7: main (ex19.c:155) ==2174== Address 0x48598e0 is 6 bytes after a block of size 10 alloc'd ==2174== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==2174== by 0x80BDF14: PetscStrallocpy (str.c:80) ==2174== by 0x80902DB: PetscObjectChangeTypeName (pname.c:114) ==2174== by 0x8589AEE: PCSetType (pcset.c:79) ==2174== by 0x85A4F3C: PCMGSetLevels (mg.c:187) ==2174== by 0x81BE9AD: DMMGSetUpLevel (damg.c:379) ==2174== by 0x81C5787: DMMGSetSNES (damgsnes.c:648) ==2174== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==2174== by 0x804C4FF: main (ex19.c:140) ==2174== ==2175== Conditional jump or move depends on uninitialised value(s) ==2175== at 0x43BE348: __strcmp_ssse3 (strcmp-ssse3.S:150) ==2175== by 0x80842B8: PetscFListAdd (reg.c:223) ==2175== by 0x8086A53: PetscFListDuplicate (reg.c:596) ==2175== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) ==2175== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) ==2175== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) ==2175== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) ==2175== by 0x85E7785: PCSetUp_ILU (ilu.c:204) ==2175== by 0x8714039: PCSetUp (precon.c:795) ==2175== by 0x8621C54: KSPSetUp (itfunc.c:237) ==2175== by 0x859D9CE: 
PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) ==2175== by 0x8714602: PCSetUpOnBlocks (precon.c:828) ==2175==

[... both processes (2174 and 2175) keep emitting reports of the following two forms, differing only in the offset inside __strcmp_ssse3; one representative instance of each ...]

==2175== Conditional jump or move depends on uninitialised value(s)
==2175==    at 0x43BF5B7: __strcmp_ssse3 (strcmp-ssse3.S:1887)
==2175==    by 0x80842B8: PetscFListAdd (reg.c:223)
==2175==    by 0x8086A53: PetscFListDuplicate (reg.c:596)
==2175==    by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511)
==2175==    by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630)
==2175==    by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731)
==2175==    by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464)
==2175==    by 0x85E7785: PCSetUp_ILU (ilu.c:204)
==2175==    by 0x8714039: PCSetUp (precon.c:795)
==2175==    by 0x8621C54: KSPSetUp (itfunc.c:237)
==2175==    by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753)
==2175==    by 0x8714602: PCSetUpOnBlocks (precon.c:828)
==2175==
==2175== Invalid read of size 8
==2175==    at 0x43BF10D: __strcmp_ssse3 (strcmp-ssse3.S:1446)
==2175==    by 0x80842B8: PetscFListAdd (reg.c:223)
==2175==    by 0x8086A53: PetscFListDuplicate (reg.c:596)
==2175==    by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511)
==2175==    by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630)
==2175==    by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731)
==2175==    by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464)
==2175==    by 0x85E7785: PCSetUp_ILU (ilu.c:204)
==2175==    by 0x8714039: PCSetUp (precon.c:795)
==2175==    by 0x8621C54: KSPSetUp (itfunc.c:237)
==2175==    by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753)
==2175==    by 0x8714602: PCSetUpOnBlocks (precon.c:828)
==2175==  Address 0x501a448 is 24 bytes inside a block of size 30 alloc'd
==2175==    at 0x4022E01: memalign (vg_replace_malloc.c:532)
==2175==    by 0x808B1A0: PetscMallocAlign (mal.c:30)
==2175==    by 0x80BDF14: PetscStrallocpy (str.c:80)
==2175==    by 0x8084689: PetscFListAdd (reg.c:237)
==2175==    by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227)
==2175==    by 0x8097550: PetscObjectComposeFunction (inherit.c:340)
==2175==    by 0x82BE3A6: MatCreate_SeqAIJ (aij.c:3363)
==2175==    by 0x829D1E0: MatSetType (matreg.c:65)
==2175==    by 0x82C3982: MatGetFactor_seqaij_petsc (aijfact.c:118)
==2175==    by 0x8276893: MatGetFactor (matrix.c:3649)
==2175==    by 0x85E7687: PCSetUp_ILU (ilu.c:202)
==2175==    by 0x8714039: PCSetUp (precon.c:795)
==2175==

[... similar Invalid read reports via __strlen_sse2 (PetscStrallocpy, str.c:79) and __strcmp_ssse3, each landing a few bytes inside or just past a small string block allocated through PetscStrallocpy ...]

Number of Newton iterations = 2

[... the same kinds of reports then repeat with stacks through SNESDestroy_LS, MatDestroy_SeqAIJ, MatDestroy_MPIAIJ and PCDestroy_MG_Private, plus Invalid reads in __memcpy_ssse3 inside MPIR_Allgatherv called from MatFDColoringCreate_MPIAIJ (fdmpiaij.c:89), reading just past blocks allocated by ISCreateGeneral and by MPIR_Allgatherv itself ...]

lid velocity = 0.000287274, prandtl # = 1, grashof # = 1
Number of Newton iterations = 2

==2175== Conditional jump or move depends on uninitialised value(s)
==2175==    at 0x80A0A4A: PetscOptionsFindPair_Private (options.c:989)
==2175==    by 0x80A3E53: PetscOptionsGetString (options.c:1693)
==2175==    by 0x80B0A16: PetscFinalize (pinit.c:829)
==2175==    by 0x804CCA7: main (ex19.c:181)
==2175==

[... the same report from process 2174, then a last group of __strcmp_ssse3 reports from both processes with stacks through PMI_Finalize (simple_pmi.c:398), MPIDI_PG_Finalize, MPID_Finalize, PMPI_Finalize and PetscFinalize (pinit.c:973) ...]

==2174==
==2174== HEAP SUMMARY:
==2174==     in use at exit: 160 bytes in 11 blocks
==2174==   total heap usage: 60,266 allocs, 60,255 frees, 51,015,236 bytes allocated
==2174==
==2174== LEAK SUMMARY:
==2174==    definitely lost: 40 bytes in 1 blocks
==2174==    indirectly lost: 120 bytes in 10 blocks
==2174==      possibly lost: 0 bytes in 0 blocks
==2174==    still reachable: 0 bytes in 0 blocks
==2174==         suppressed: 0 bytes in 0 blocks
==2174== Rerun with --leak-check=full to see details of leaked memory
==2174==
==2174== For counts of detected and suppressed errors, rerun with: -v
==2174== Use --track-origins=yes to see where uninitialised values come from
==2174== ERROR SUMMARY: 15690 errors from 164 contexts (suppressed: 0 from 0)
==2175==
==2175== HEAP SUMMARY:
==2175==     in use at exit: 160 bytes in 11 blocks
==2175==   total heap usage: 59,069 allocs, 59,058 frees, 49,630,900 bytes allocated
==2175==
==2175== LEAK SUMMARY:
==2175==    definitely lost: 40 bytes in 1 blocks
==2175==    indirectly lost: 120 bytes in 10 blocks
==2175==      possibly lost: 0 bytes in 0 blocks
==2175==    still reachable: 0 bytes in 0 blocks
==2175==         suppressed: 0 bytes in 0 blocks
==2175== Rerun with --leak-check=full to see details of leaked memory
==2175==
==2175== For counts of detected and suppressed errors, rerun with: -v
==2175== Use --track-origins=yes to see where uninitialised values come from
==2175== ERROR SUMMARY: 15664 errors from 162 contexts (suppressed: 0 from 0)

What is going on here? Shall I ignore those errors?

Thanks a lot!
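The summary lines above already name the two valgrind switches that make a log like this more tractable. A minimal sketch of the rerun they suggest, reusing the mpiexec and ex19 invocation from this thread; the --log-file option is an extra assumption here, added so that each process writes its own log instead of interleaving with the other:

~/soft/petsc-3.1-p4/externalpackages/mpich2-1.0.8/bin/mpiexec -np 2 \
    valgrind --tool=memcheck --track-origins=yes --leak-check=full \
    --log-file=valgrind.%p.log \
    ./ex19.exe -malloc off -da_grid_x 30 -da_grid_y 30

With --track-origins=yes each "uninitialised value" report also shows where the value was created, and --leak-check=full expands the LEAK SUMMARY, which makes it easier to separate system-library noise from problems in one's own code.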
Rebecca Xuefei YUAN
Department of Applied Physics and Applied Mathematics
Columbia University
Tel:917-399-8032
www.columbia.edu/~xy2102

From bsmith at mcs.anl.gov Mon Sep 6 19:59:07 2010
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Mon, 6 Sep 2010 19:59:07 -0500
Subject: [petsc-users] valgrind error comes out when upgrade from ubuntu 8.04 LTS to 10.04
In-Reply-To: <20100906203101.9i11ikcdkos4gss4@cubmail.cc.columbia.edu>
References: <20100906203101.9i11ikcdkos4gss4@cubmail.cc.columbia.edu>
Message-ID: <196CED0F-3AD9-487C-AC88-FD94624407ED@mcs.anl.gov>

Looks like ubuntu 10.04 is a terrible release without proper testing. All those are problems with the OS and system libraries. Unfortunately it makes it impossible to find errors in PETSc since it gives all those meaningless errors.

Barry

On Sep 6, 2010, at 7:31 PM, Rebecca Xuefei Yuan wrote:

> Dear all, > > I upgraded my laptop from ubuntu 8.04 LTS to 10.04; after the upgrade, I reinstalled PETSc, but there are tons of valgrind errors coming out even though the code is unchanged. Then I tried > > ~/soft/petsc-3.1-p4/src/snes/examples/tutorials/ex19.c > > with the command: > > ~/soft/petsc-3.1-p4/externalpackages/mpich2-1.0.8/bin/mpiexec -np 2 valgrind --tool=memcheck ./ex19.exe -malloc off -da_grid_x 30 -da_grid_y 30 > > as instructed in the FAQ. However, the errors are very long (I hope it is all right to post the full valgrind log here...), as > > rebecca at YuanWork:~/linux/code/twoway/twoway_brandnew/trunk/set_a$ ~/soft/petsc-3.1-p4/externalpackages/mpich2-1.0.8/bin/mpiexec -np 2 valgrind --tool=memcheck ./ex19.exe -malloc off -da_grid_x 30 -da_grid_y 30 > ==2174== Memcheck, a memory error detector > ==2175== Memcheck, a memory error detector > ==2175== Copyright (C) 2002-2009, and GNU GPL'd, by Julian Seward et al. > ==2175== Using Valgrind-3.5.0 and LibVEX; rerun with -h for copyright info > ==2175== Command: ./ex19.exe -malloc off -da_grid_x 30 -da_grid_y 30 > ==2175== > ==2174== Copyright (C) 2002-2009, and GNU GPL'd, by Julian Seward et al. > ==2174== Using Valgrind-3.5.0 and LibVEX; rerun with -h for copyright info > ==2174== Command: ./ex19.exe -malloc off -da_grid_x 30 -da_grid_y 30 > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x400B217: _dl_relocate_object (do-rel.h:104) > ==2174== by 0x40031D0: dl_main (rtld.c:2229) > ==2174== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) > ==2174== by 0x4000C6C: _dl_start (rtld.c:333) > ==2174== by 0x4000856: ??? (in /lib/ld-2.11.1.so) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x400AF0F: _dl_relocate_object (do-rel.h:117) > ==2174== by 0x40031D0: dl_main (rtld.c:2229) > ==2174== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) > ==2174== by 0x4000C6C: _dl_start (rtld.c:333) > ==2174== by 0x4000856: ??? (in /lib/ld-2.11.1.so) > ==2174== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x400B217: _dl_relocate_object (do-rel.h:104) > ==2175== by 0x40031D0: dl_main (rtld.c:2229) > ==2175== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) > ==2175== by 0x4000C6C: _dl_start (rtld.c:333) > ==2175== by 0x4000856: ??? (in /lib/ld-2.11.1.so) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x400AF0F: _dl_relocate_object (do-rel.h:117) > ==2175== by 0x40031D0: dl_main (rtld.c:2229) > ==2175== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) > ==2175== by 0x4000C6C: _dl_start (rtld.c:333) > ==2175== by 0x4000856: ???
(in /lib/ld-2.11.1.so) > ==2175== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x400B27A: _dl_relocate_object (do-rel.h:127) > ==2174== by 0x40031D0: dl_main (rtld.c:2229) > ==2174== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) > ==2174== by 0x4000C6C: _dl_start (rtld.c:333) > ==2174== by 0x4000856: ??? (in /lib/ld-2.11.1.so) > ==2174== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x400B27A: _dl_relocate_object (do-rel.h:127) > ==2175== by 0x40031D0: dl_main (rtld.c:2229) > ==2175== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) > ==2175== by 0x4000C6C: _dl_start (rtld.c:333) > ==2175== by 0x4000856: ??? (in /lib/ld-2.11.1.so) > ==2175== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x400A5DE: _dl_relocate_object (do-rel.h:65) > ==2174== by 0x40030FE: dl_main (rtld.c:2292) > ==2174== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) > ==2174== by 0x4000C6C: _dl_start (rtld.c:333) > ==2174== by 0x4000856: ??? (in /lib/ld-2.11.1.so) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x400A5E6: _dl_relocate_object (do-rel.h:68) > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x400A5DE: _dl_relocate_object (do-rel.h:65) > ==2175== by 0x40030FE: dl_main (rtld.c:2292) > ==2175== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) > ==2175== by 0x4000C6C: _dl_start (rtld.c:333) > ==2175== by 0x4000856: ??? (in /lib/ld-2.11.1.so) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x400A5E6: _dl_relocate_object (do-rel.h:68) > ==2175== by 0x40030FE: dl_main (rtld.c:2292) > ==2175== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) > ==2175== by 0x4000C6C: _dl_start (rtld.c:333) > ==2175== by 0x4000856: ??? (in /lib/ld-2.11.1.so) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x400AF0F: _dl_relocate_object (do-rel.h:117) > ==2175== by 0x40030FE: dl_main (rtld.c:2292) > ==2175== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) > ==2175== by 0x4000C6C: _dl_start (rtld.c:333) > ==2175== by 0x4000856: ??? (in /lib/ld-2.11.1.so) > ==2175== > ==2174== by 0x40030FE: dl_main (rtld.c:2292) > ==2174== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) > ==2174== by 0x4000C6C: _dl_start (rtld.c:333) > ==2174== by 0x4000856: ??? (in /lib/ld-2.11.1.so) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x400AF0F: _dl_relocate_object (do-rel.h:117) > ==2174== by 0x40030FE: dl_main (rtld.c:2292) > ==2174== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) > ==2174== by 0x4000C6C: _dl_start (rtld.c:333) > ==2174== by 0x4000856: ??? 
(in /lib/ld-2.11.1.so) > ==2174== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x87E2BD2: T.206 (simple_pmi.c:985) > ==2175== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) > ==2175== by 0x87C742A: MPID_Init (mpid_init.c:331) > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x43197CB: __strlen_sse2 (strlen.S:116) > ==2175== by 0x87E2BD2: T.206 (simple_pmi.c:985) > ==2175== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) > ==2175== by 0x87C742A: MPID_Init (mpid_init.c:331) > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x87F559E: PMIU_writeline (simple_pmiutil.c:180) > ==2175== by 0x87E2BD2: T.206 (simple_pmi.c:985) > ==2175== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) > ==2175== by 0x87C742A: MPID_Init (mpid_init.c:331) > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Use of uninitialised value of size 4 > ==2175== at 0x87F55A4: PMIU_writeline (simple_pmiutil.c:184) > ==2175== by 0x87E2BD2: T.206 (simple_pmi.c:985) > ==2175== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) > ==2175== by 0x87C742A: MPID_Init (mpid_init.c:331) > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Syscall param write(count) contains uninitialised byte(s) > ==2175== at 0x41B1EB3: __write_nocancel (syscall-template.S:82) > ==2175== by 0x87E2BD2: T.206 (simple_pmi.c:985) > ==2175== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) > ==2175== by 0x87C742A: MPID_Init (mpid_init.c:331) > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x87F566C: PMIU_writeline (simple_pmiutil.c:197) > ==2175== by 0x87E2BD2: T.206 (simple_pmi.c:985) > ==2175== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) > ==2175== by 0x87C742A: MPID_Init (mpid_init.c:331) > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2174== by 0x87E2BD2: T.206 (simple_pmi.c:985) > ==2174== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) > ==2174== by 0x87C742A: MPID_Init (mpid_init.c:331) > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump 
or move depends on uninitialised value(s) > ==2174== at 0x43197CB: __strlen_sse2 (strlen.S:116) > ==2174== by 0x87E2BD2: T.206 (simple_pmi.c:985) > ==2174== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) > ==2174== by 0x87C742A: MPID_Init (mpid_init.c:331) > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x87F559E: PMIU_writeline (simple_pmiutil.c:180) > ==2174== by 0x87E2BD2: T.206 (simple_pmi.c:985) > ==2174== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) > ==2174== by 0x87C742A: MPID_Init (mpid_init.c:331) > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Use of uninitialised value of size 4 > ==2174== at 0x87F55A4: PMIU_writeline (simple_pmiutil.c:184) > ==2174== by 0x87E2BD2: T.206 (simple_pmi.c:985) > ==2174== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) > ==2174== by 0x87C742A: MPID_Init (mpid_init.c:331) > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Syscall param write(count) contains uninitialised byte(s) > ==2174== at 0x41B1EB3: __write_nocancel (syscall-template.S:82) > ==2174== by 0x87E2BD2: T.206 (simple_pmi.c:985) > ==2174== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) > ==2174== by 0x87C742A: MPID_Init (mpid_init.c:331) > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x87F566C: PMIU_writeline (simple_pmiutil.c:197) > ==2174== by 0x87E2BD2: T.206 (simple_pmi.c:985) > ==2174== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) > ==2174== by 0x87C742A: MPID_Init (mpid_init.c:331) > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x43BE348: __strcmp_ssse3 (strcmp-ssse3.S:150) > ==2175== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) > ==2175== by 0x87C74C7: MPID_Init (mpid_init.c:381) > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x43BF5B7: __strcmp_ssse3 (strcmp-ssse3.S:1887) > ==2175== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) > ==2175== by 0x87C74C7: MPID_Init (mpid_init.c:381) > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x43BF5C0: __strcmp_ssse3 (strcmp-ssse3.S:1890) > ==2175== by 
0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) > ==2175== by 0x87C74C7: MPID_Init (mpid_init.c:381) > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x43BF5C5: __strcmp_ssse3 (strcmp-ssse3.S:1893) > ==2175== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) > ==2175== by 0x87C74C7: MPID_Init (mpid_init.c:381) > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x43BF5CA: __strcmp_ssse3 (strcmp-ssse3.S:1896) > ==2175== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) > ==2175== by 0x87C74C7: MPID_Init (mpid_init.c:381) > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x43BE348: __strcmp_ssse3 (strcmp-ssse3.S:150) > ==2175== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) > ==2175== by 0x87CABAB: MPIDI_PG_InitConnKVS (mpidi_pg.c:753) > ==2175== by 0x87C7526: MPID_Init (mpid_init.c:417) > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x43BF5B7: __strcmp_ssse3 (strcmp-ssse3.S:1887) > ==2175== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) > ==2175== by 0x87CABAB: MPIDI_PG_InitConnKVS (mpidi_pg.c:753) > ==2175== by 0x87C7526: MPID_Init (mpid_init.c:417) > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x43BF5C0: __strcmp_ssse3 (strcmp-ssse3.S:1890) > ==2175== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) > ==2175== by 0x87CABAB: MPIDI_PG_InitConnKVS (mpidi_pg.c:753) > ==2175== by 0x87C7526: MPID_Init (mpid_init.c:417) > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x43BF5C5: __strcmp_ssse3 (strcmp-ssse3.S:1893) > ==2175== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) > ==2175== by 0x87CABAB: MPIDI_PG_InitConnKVS (mpidi_pg.c:753) > ==2175== by 0x87C7526: MPID_Init (mpid_init.c:417) > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x43BF5CA: __strcmp_ssse3 (strcmp-ssse3.S:1896) > ==2175== by 0x87E2678: PMI_KVS_Get_my_name 
(simple_pmi.c:435) > ==2175== by 0x87CABAB: MPIDI_PG_InitConnKVS (mpidi_pg.c:753) > ==2175== by 0x87C7526: MPID_Init (mpid_init.c:417) > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x43BE348: __strcmp_ssse3 (strcmp-ssse3.S:150) > ==2174== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) > ==2174== by 0x87C74C7: MPID_Init (mpid_init.c:381) > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x43BF5B7: __strcmp_ssse3 (strcmp-ssse3.S:1887) > ==2174== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) > ==2174== by 0x87C74C7: MPID_Init (mpid_init.c:381) > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x43BF5C0: __strcmp_ssse3 (strcmp-ssse3.S:1890) > ==2174== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) > ==2174== by 0x87C74C7: MPID_Init (mpid_init.c:381) > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x43BF5C5: __strcmp_ssse3 (strcmp-ssse3.S:1893) > ==2174== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) > ==2174== by 0x87C74C7: MPID_Init (mpid_init.c:381) > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x43BF5CA: __strcmp_ssse3 (strcmp-ssse3.S:1896) > ==2174== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) > ==2174== by 0x87C74C7: MPID_Init (mpid_init.c:381) > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x43BE348: __strcmp_ssse3 (strcmp-ssse3.S:150) > ==2174== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) > ==2174== by 0x87CABAB: MPIDI_PG_InitConnKVS (mpidi_pg.c:753) > ==2174== by 0x87C7526: MPID_Init (mpid_init.c:417) > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x43BF5B7: __strcmp_ssse3 (strcmp-ssse3.S:1887) > ==2174== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) > ==2174== by 0x87CABAB: MPIDI_PG_InitConnKVS (mpidi_pg.c:753) > ==2174== by 0x87C7526: MPID_Init (mpid_init.c:417) > ==2174== by 0x879F65C: 
MPIR_Init_thread (initthread.c:288) > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x43BF5C0: __strcmp_ssse3 (strcmp-ssse3.S:1890) > ==2174== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) > ==2174== by 0x87CABAB: MPIDI_PG_InitConnKVS (mpidi_pg.c:753) > ==2174== by 0x87C7526: MPID_Init (mpid_init.c:417) > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x43BF5C5: __strcmp_ssse3 (strcmp-ssse3.S:1893) > ==2174== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) > ==2174== by 0x87CABAB: MPIDI_PG_InitConnKVS (mpidi_pg.c:753) > ==2174== by 0x87C7526: MPID_Init (mpid_init.c:417) > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x43BF5CA: __strcmp_ssse3 (strcmp-ssse3.S:1896) > ==2174== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) > ==2174== by 0x87CABAB: MPIDI_PG_InitConnKVS (mpidi_pg.c:753) > ==2174== by 0x87C7526: MPID_Init (mpid_init.c:417) > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x4319863: __GI_strlen (strlen.S:138) > ==2175== by 0x438AD43: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:191) > ==2175== by 0x438A685: gethostbyname (getXXbyYY.c:117) > ==2175== by 0x87CC54F: MPIDU_CH3U_GetSockInterfaceAddr (ch3u_getinterfaces.c:130) > ==2175== by 0x87CBC56: MPIDI_CH3U_Get_business_card_sock (ch3u_connect_sock.c:428) > ==2175== by 0x87F413B: MPIDI_CH3U_Init_sock (ch3u_init_sock.c:79) > ==2175== by 0x87EE511: MPIDI_CH3_Init (ch3_init.c:43) > ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x431986D: __GI_strlen (strlen.S:144) > ==2175== by 0x438AD43: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:191) > ==2175== by 0x438A685: gethostbyname (getXXbyYY.c:117) > ==2175== by 0x87CC54F: MPIDU_CH3U_GetSockInterfaceAddr (ch3u_getinterfaces.c:130) > ==2175== by 0x87CBC56: MPIDI_CH3U_Get_business_card_sock (ch3u_connect_sock.c:428) > ==2175== by 0x87F413B: MPIDI_CH3U_Init_sock (ch3u_init_sock.c:79) > ==2175== by 0x87EE511: MPIDI_CH3_Init (ch3_init.c:43) > ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x4319863: __GI_strlen (strlen.S:138) > ==2174== by 
0x438AD43: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:191) > ==2174== by 0x438A685: gethostbyname (getXXbyYY.c:117) > ==2174== by 0x87CC54F: MPIDU_CH3U_GetSockInterfaceAddr (ch3u_getinterfaces.c:130) > ==2174== by 0x87CBC56: MPIDI_CH3U_Get_business_card_sock (ch3u_connect_sock.c:428) > ==2174== by 0x87F413B: MPIDI_CH3U_Init_sock (ch3u_init_sock.c:79) > ==2174== by 0x87EE511: MPIDI_CH3_Init (ch3_init.c:43) > ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x431986D: __GI_strlen (strlen.S:144) > ==2174== by 0x438AD43: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:191) > ==2174== by 0x438A685: gethostbyname (getXXbyYY.c:117) > ==2174== by 0x87CC54F: MPIDU_CH3U_GetSockInterfaceAddr (ch3u_getinterfaces.c:130) > ==2174== by 0x87CBC56: MPIDI_CH3U_Get_business_card_sock (ch3u_connect_sock.c:428) > ==2174== by 0x87F413B: MPIDI_CH3U_Init_sock (ch3u_init_sock.c:79) > ==2174== by 0x87EE511: MPIDI_CH3_Init (ch3_init.c:43) > ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2175== Invalid read of size 4 > ==2175== at 0x431983B: __GI_strlen (strlen.S:115) > ==2175== by 0x43843CE: __nss_lookup (nsswitch.c:191) > ==2175== by 0x438596E: __nss_hosts_lookup2 (XXX-lookup.c:76) > ==2175== by 0x438AF0F: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:200) > ==2175== by 0x438A685: gethostbyname (getXXbyYY.c:117) > ==2175== by 0x87CC54F: MPIDU_CH3U_GetSockInterfaceAddr (ch3u_getinterfaces.c:130) > ==2175== by 0x87CBC56: MPIDI_CH3U_Get_business_card_sock (ch3u_connect_sock.c:428) > ==2175== by 0x87F413B: MPIDI_CH3U_Init_sock (ch3u_init_sock.c:79) > ==2175== by 0x87EE511: MPIDI_CH3_Init (ch3_init.c:43) > ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > ==2175== Address 0x44308f8 is 40 bytes inside a block of size 42 alloc'd > ==2175== at 0x4023BF3: malloc (vg_replace_malloc.c:195) > ==2175== by 0x4384583: nss_parse_service_list (nsswitch.c:622) > ==2175== by 0x4384E71: __nss_database_lookup (nsswitch.c:775) > ==2175== by 0x43859AD: __nss_hosts_lookup2 (XXX-lookup.c:71) > ==2175== by 0x438AF0F: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:200) > ==2175== by 0x438A685: gethostbyname (getXXbyYY.c:117) > ==2175== by 0x87CC54F: MPIDU_CH3U_GetSockInterfaceAddr (ch3u_getinterfaces.c:130) > ==2175== by 0x87CBC56: MPIDI_CH3U_Get_business_card_sock (ch3u_connect_sock.c:428) > ==2175== by 0x87F413B: MPIDI_CH3U_Init_sock (ch3u_init_sock.c:79) > ==2175== by 0x87EE511: MPIDI_CH3_Init (ch3_init.c:43) > ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2175== > ==2174== Invalid read of size 4 > ==2174== at 0x431983B: __GI_strlen (strlen.S:115) > ==2174== by 0x43843CE: __nss_lookup (nsswitch.c:191) > ==2174== by 0x438596E: __nss_hosts_lookup2 (XXX-lookup.c:76) > ==2174== by 0x438AF0F: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:200) > ==2174== by 0x438A685: gethostbyname (getXXbyYY.c:117) > ==2174== by 0x87CC54F: MPIDU_CH3U_GetSockInterfaceAddr 
(ch3u_getinterfaces.c:130) > ==2174== by 0x87CBC56: MPIDI_CH3U_Get_business_card_sock (ch3u_connect_sock.c:428) > ==2174== by 0x87F413B: MPIDI_CH3U_Init_sock (ch3u_init_sock.c:79) > ==2174== by 0x87EE511: MPIDI_CH3_Init (ch3_init.c:43) > ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > ==2174== Address 0x44308f8 is 40 bytes inside a block of size 42 alloc'd > ==2174== at 0x4023BF3: malloc (vg_replace_malloc.c:195) > ==2174== by 0x4384583: nss_parse_service_list (nsswitch.c:622) > ==2174== by 0x4384E71: __nss_database_lookup (nsswitch.c:775) > ==2174== by 0x43859AD: __nss_hosts_lookup2 (XXX-lookup.c:71) > ==2174== by 0x438AF0F: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:200) > ==2174== by 0x438A685: gethostbyname (getXXbyYY.c:117) > ==2174== by 0x87CC54F: MPIDU_CH3U_GetSockInterfaceAddr (ch3u_getinterfaces.c:130) > ==2174== by 0x87CBC56: MPIDI_CH3U_Get_business_card_sock (ch3u_connect_sock.c:428) > ==2174== by 0x87F413B: MPIDI_CH3U_Init_sock (ch3u_init_sock.c:79) > ==2174== by 0x87EE511: MPIDI_CH3_Init (ch3_init.c:43) > ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x400B217: _dl_relocate_object (do-rel.h:104) > ==2174== by 0x4011D15: dl_open_worker (dl-open.c:367) > ==2174== by 0x400D875: _dl_catch_error (dl-error.c:178) > ==2174== by 0x4011675: _dl_open (dl-open.c:583) > ==2174== by 0x43AA4A1: do_dlopen (dl-libc.c:86) > ==2174== by 0x400D875: _dl_catch_error (dl-error.c:178) > ==2174== by 0x43AA5A0: dlerror_run (dl-libc.c:47) > ==2174== by 0x43AA6BA: __libc_dlopen_mode (dl-libc.c:160) > ==2174== by 0x43842E4: __nss_lookup_function (nsswitch.c:405) > ==2174== by 0x43843CE: __nss_lookup (nsswitch.c:191) > ==2174== by 0x438596E: __nss_hosts_lookup2 (XXX-lookup.c:76) > ==2174== by 0x438AF0F: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:200) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x400AF0F: _dl_relocate_object (do-rel.h:117) > ==2174== by 0x4011D15: dl_open_worker (dl-open.c:367) > ==2174== by 0x400D875: _dl_catch_error (dl-error.c:178) > ==2174== by 0x4011675: _dl_open (dl-open.c:583) > ==2174== by 0x43AA4A1: do_dlopen (dl-libc.c:86) > ==2174== by 0x400D875: _dl_catch_error (dl-error.c:178) > ==2174== by 0x43AA5A0: dlerror_run (dl-libc.c:47) > ==2174== by 0x43AA6BA: __libc_dlopen_mode (dl-libc.c:160) > ==2174== by 0x43842E4: __nss_lookup_function (nsswitch.c:405) > ==2174== by 0x43843CE: __nss_lookup (nsswitch.c:191) > ==2174== by 0x438596E: __nss_hosts_lookup2 (XXX-lookup.c:76) > ==2174== by 0x438AF0F: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:200) > ==2174== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x400B217: _dl_relocate_object (do-rel.h:104) > ==2175== by 0x4011D15: dl_open_worker (dl-open.c:367) > ==2175== by 0x400D875: _dl_catch_error (dl-error.c:178) > ==2175== by 0x4011675: _dl_open (dl-open.c:583) > ==2175== by 0x43AA4A1: do_dlopen (dl-libc.c:86) > ==2175== by 0x400D875: _dl_catch_error (dl-error.c:178) > ==2175== by 0x43AA5A0: dlerror_run (dl-libc.c:47) > ==2175== by 0x43AA6BA: __libc_dlopen_mode (dl-libc.c:160) > ==2175== by 0x43842E4: __nss_lookup_function (nsswitch.c:405) > ==2175== by 0x43843CE: __nss_lookup (nsswitch.c:191) > ==2175== by 0x438596E: __nss_hosts_lookup2 (XXX-lookup.c:76) > ==2175== by 0x438AF0F: 
gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:200) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x400AF0F: _dl_relocate_object (do-rel.h:117) > ==2175== by 0x4011D15: dl_open_worker (dl-open.c:367) > ==2175== by 0x400D875: _dl_catch_error (dl-error.c:178) > ==2175== by 0x4011675: _dl_open (dl-open.c:583) > ==2175== by 0x43AA4A1: do_dlopen (dl-libc.c:86) > ==2175== by 0x400D875: _dl_catch_error (dl-error.c:178) > ==2175== by 0x43AA5A0: dlerror_run (dl-libc.c:47) > ==2175== by 0x43AA6BA: __libc_dlopen_mode (dl-libc.c:160) > ==2175== by 0x43842E4: __nss_lookup_function (nsswitch.c:405) > ==2175== by 0x43843CE: __nss_lookup (nsswitch.c:191) > ==2175== by 0x438596E: __nss_hosts_lookup2 (XXX-lookup.c:76) > ==2175== by 0x438AF0F: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:200) > ==2175== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2174== by 0x87E19C7: GetResponse (simple_pmi.c:1049) > ==2174== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) > ==2174== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) > ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x43197CB: __strlen_sse2 (strlen.S:116) > ==2174== by 0x87E19C7: GetResponse (simple_pmi.c:1049) > ==2174== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) > ==2174== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) > ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x87F559E: PMIU_writeline (simple_pmiutil.c:180) > ==2174== by 0x87E19C7: GetResponse (simple_pmi.c:1049) > ==2174== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) > ==2174== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) > ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Use of uninitialised value of size 4 > ==2174== at 0x87F55A4: PMIU_writeline (simple_pmiutil.c:184) > ==2174== by 0x87E19C7: GetResponse (simple_pmi.c:1049) > ==2174== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) > ==2174== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) > ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Syscall param write(count) contains uninitialised byte(s) > ==2174== at 0x41B1EB3: __write_nocancel (syscall-template.S:82) > ==2174== by 0x87E19C7: GetResponse (simple_pmi.c:1049) > ==2174== by 0x87E2846: 
PMI_KVS_Put (simple_pmi.c:543) > ==2174== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) > ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x87F566C: PMIU_writeline (simple_pmiutil.c:197) > ==2174== by 0x87E19C7: GetResponse (simple_pmi.c:1049) > ==2174== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) > ==2174== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) > ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x43BE348: __strcmp_ssse3 (strcmp-ssse3.S:150) > ==2174== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) > ==2174== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) > ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x43BF5B7: __strcmp_ssse3 (strcmp-ssse3.S:1887) > ==2174== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) > ==2174== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) > ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x43BF5C0: __strcmp_ssse3 (strcmp-ssse3.S:1890) > ==2174== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) > ==2174== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) > ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x43BF5C5: __strcmp_ssse3 (strcmp-ssse3.S:1893) > ==2174== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) > ==2174== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) > ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x43BF5CA: __strcmp_ssse3 (strcmp-ssse3.S:1896) > ==2174== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) > ==2174== 
by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) > ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x87E19C7: GetResponse (simple_pmi.c:1049) > ==2175== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) > ==2175== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) > ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x43197CB: __strlen_sse2 (strlen.S:116) > ==2175== by 0x87E19C7: GetResponse (simple_pmi.c:1049) > ==2175== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) > ==2175== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) > ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x87F559E: PMIU_writeline (simple_pmiutil.c:180) > ==2175== by 0x87E19C7: GetResponse (simple_pmi.c:1049) > ==2175== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) > ==2175== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) > ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Use of uninitialised value of size 4 > ==2175== at 0x87F55A4: PMIU_writeline (simple_pmiutil.c:184) > ==2175== by 0x87E19C7: GetResponse (simple_pmi.c:1049) > ==2175== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) > ==2175== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) > ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Syscall param write(count) contains uninitialised byte(s) > ==2175== at 0x41B1EB3: __write_nocancel (syscall-template.S:82) > ==2175== by 0x87E19C7: GetResponse (simple_pmi.c:1049) > ==2175== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) > ==2175== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) > ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on 
uninitialised value(s) > ==2175== at 0x87F566C: PMIU_writeline (simple_pmiutil.c:197) > ==2175== by 0x87E19C7: GetResponse (simple_pmi.c:1049) > ==2175== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) > ==2175== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) > ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x43BE348: __strcmp_ssse3 (strcmp-ssse3.S:150) > ==2175== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) > ==2175== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) > ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x43BF5B7: __strcmp_ssse3 (strcmp-ssse3.S:1887) > ==2175== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) > ==2175== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) > ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x43BF5C0: __strcmp_ssse3 (strcmp-ssse3.S:1890) > ==2175== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) > ==2175== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) > ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x43BF5C5: __strcmp_ssse3 (strcmp-ssse3.S:1893) > ==2175== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) > ==2175== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) > ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x43BF5CA: __strcmp_ssse3 (strcmp-ssse3.S:1896) > ==2175== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) > ==2175== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) > ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 
0x43BE348: __strcmp_ssse3 (strcmp-ssse3.S:150) > ==2174== by 0x87CAF44: MPIDI_PG_SetConnInfo (mpidi_pg.c:579) > ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x43BF5B7: __strcmp_ssse3 (strcmp-ssse3.S:1887) > ==2174== by 0x87CAF44: MPIDI_PG_SetConnInfo (mpidi_pg.c:579) > ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x43BF5C0: __strcmp_ssse3 (strcmp-ssse3.S:1890) > ==2174== by 0x87CAF44: MPIDI_PG_SetConnInfo (mpidi_pg.c:579) > ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x43BF5C5: __strcmp_ssse3 (strcmp-ssse3.S:1893) > ==2174== by 0x87CAF44: MPIDI_PG_SetConnInfo (mpidi_pg.c:579) > ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x43BF5CA: __strcmp_ssse3 (strcmp-ssse3.S:1896) > ==2174== by 0x87CAF44: MPIDI_PG_SetConnInfo (mpidi_pg.c:579) > ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x43BF5CF: __strcmp_ssse3 (strcmp-ssse3.S:1899) > ==2174== by 0x87CAF44: MPIDI_PG_SetConnInfo (mpidi_pg.c:579) > ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x43BE348: __strcmp_ssse3 (strcmp-ssse3.S:150) > ==2175== by 0x87CAF44: MPIDI_PG_SetConnInfo (mpidi_pg.c:579) > ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== 
> ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x43BF5B7: __strcmp_ssse3 (strcmp-ssse3.S:1887) > ==2175== by 0x87CAF44: MPIDI_PG_SetConnInfo (mpidi_pg.c:579) > ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x43BF5C0: __strcmp_ssse3 (strcmp-ssse3.S:1890) > ==2175== by 0x87CAF44: MPIDI_PG_SetConnInfo (mpidi_pg.c:579) > ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x43BF5C5: __strcmp_ssse3 (strcmp-ssse3.S:1893) > ==2175== by 0x87CAF44: MPIDI_PG_SetConnInfo (mpidi_pg.c:579) > ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x43BF5CA: __strcmp_ssse3 (strcmp-ssse3.S:1896) > ==2175== by 0x87CAF44: MPIDI_PG_SetConnInfo (mpidi_pg.c:579) > ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x43BF5CF: __strcmp_ssse3 (strcmp-ssse3.S:1899) > ==2175== by 0x87CAF44: MPIDI_PG_SetConnInfo (mpidi_pg.c:579) > ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > ==2175== by 0x80AEE12: PetscI==2174== Invalid read of size 4 > ==2174== at 0x431983B: __GI_strlen (strlen.S:115) > ==2174== by 0x43843CE: __nss_lookup (nsswitch.c:191) > ==2174== by 0x438564E: __nss_passwd_lookup2 (XXX-lookup.c:76) > ==2174== by 0x433D0DE: getpwuid_r@@GLIBC_2.1.2 (getXXbyYY_r.c:200) > ==2174== by 0x433C92E: getpwuid (getXXbyYY.c:117) > ==2174== by 0x80BC53A: PetscGetUserName (fuser.c:66) > ==2174== by 0x80822FF: PetscErrorPrintfInitialize (errtrace.c:68) > ==2174== by 0x80AEED1: PetscInitialize (pinit.c:576) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== Address 0x4430700 is 40 bytes inside a block of size 43 alloc'd > ==2174== at 0x4023BF3: malloc (vg_replace_malloc.c:195) > ==2174== by 0x4384583: nss_parse_service_list (nsswitch.c:622) > ==2174== by 0x4384E71: __nss_database_lookup (nsswitch.c:775) > ==2174== by 0x43859AD: __nss_hosts_lookup2 (XXX-lookup.c:71) > ==2174== by 0x438AF0F: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:200) > ==2174== by 0x438A685: gethostbyname (getXXbyYY.c:117) > ==2174== nitialize 
(pinit.c:561) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > by 0x87CC54F: MPIDU_CH3U_GetSockInterfaceAddr (ch3u_getinterfaces.c:130) > ==2174== by 0x87CBC56: MPIDI_CH3U_Get_business_card_sock (ch3u_connect_sock.c:428) > ==2174== by 0x87F413B: MPIDI_CH3U_Init_sock (ch3u_init_sock.c:79) > ==2174== by 0x87EE511: MPIDI_CH3_Init (ch3_init.c:43) > ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2174== > ==2175== Invalid read of size 4 > ==2175== at 0x431983B: __GI_strlen (strlen.S:115) > ==2175== by 0x43843CE: __nss_lookup (nsswitch.c:191) > ==2175== by 0x438564E: __nss_passwd_lookup2 (XXX-lookup.c:76) > ==2175== by 0x433D0DE: getpwuid_r@@GLIBC_2.1.2 (getXXbyYY_r.c:200) > ==2175== by 0x433C92E: getpwuid (getXXbyYY.c:117) > ==2175== by 0x80BC53A: PetscGetUserName (fuser.c:66) > ==2175== by 0x80822FF: PetscErrorPrintfInitialize (errtrace.c:68) > ==2175== by 0x80AEED1: PetscInitialize (pinit.c:576) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== Address 0x4430700 is 40 bytes inside a block of size 43 alloc'd > ==2175== at 0x4023BF3: malloc (vg_replace_malloc.c:195) > ==2175== by 0x4384583: nss_parse_service_list (nsswitch.c:622) > ==2175== by 0x4384E71: __nss_database_lookup (nsswitch.c:775) > ==2175== by 0x43859AD: __nss_hosts_lookup2 (XXX-lookup.c:71) > ==2175== by 0x438AF0F: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:200) > ==2175== by 0x438A685: gethostbyname (getXXbyYY.c:117) > ==2175== by 0x87CC54F: MPIDU_CH3U_GetSockInterfaceAddr (ch3u_getinterfaces.c:130) > ==2175== by 0x87CBC56: MPIDI_CH3U_Get_business_card_sock (ch3u_connect_sock.c:428) > ==2175== by 0x87F413B: MPIDI_CH3U_Init_sock (ch3u_init_sock.c:79) > ==2175== by 0x87EE511: MPIDI_CH3_Init (ch3_init.c:43) > ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2175== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2174== by 0x80885F9: PetscVSNPrintf (mprint.c:95) > ==2174== by 0x8088B22: PetscSNPrintf (mprint.c:228) > ==2174== by 0x8082456: PetscErrorPrintfInitialize (errtrace.c:71) > ==2174== by 0x80AEED1: PetscInitialize (pinit.c:576) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x43197CB: __strlen_sse2 (strlen.S:116) > ==2174== by 0x80885F9: PetscVSNPrintf (mprint.c:95) > ==2174== by 0x8088B22: PetscSNPrintf (mprint.c:228) > ==2174== by 0x8082456: PetscErrorPrintfInitialize (errtrace.c:71) > ==2174== by 0x80AEED1: PetscInitialize (pinit.c:576) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2174== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2174== by 0x80A1E0E: PetscOptionsGetTruth (options.c:1304) > ==2174== by 0x80824DE: PetscErrorPrintfInitialize (errtrace.c:73) > ==2174== by 0x80AEED1: PetscInitialize (pinit.c:576) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2174== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2174== by 0x80A1E0E: PetscOptionsGetTruth (options.c:1304) > ==2174== by 0x8082563: PetscErrorPrintfInitialize (errtrace.c:77) > ==2174== by 0x80AEED1: PetscInitialize (pinit.c:576) > ==2174== by 0x804BA0C: main 
(ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2174== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2174== by 0x80A1E0E: PetscOptionsGetTruth (options.c:1304) > ==2174== by 0x809CFD4: PetscOptionsInsert (options.c:516) > ==2174== by 0x80AF4BD: PetscInitialize (pinit.c:629) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x80885F9: PetscVSNPrintf (mprint.c:95) > ==2175== by 0x8088B22: PetscSNPrintf (mprint.c:228) > ==2175== by 0x8082456: PetscErrorPrintfInitialize (errtrace.c:71) > ==2175== by 0x80AEED1: PetscInitialize (pinit.c:576) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x43197CB: __strlen_sse2 (strlen.S:116) > ==2175== by 0x80885F9: PetscVSNPrintf (mprint.c:95) > ==2175== by 0x8088B22: PetscSNPrintf (mprint.c:228) > ==2175== by 0x8082456: PetscErrorPrintfInitialize (errtrace.c:71) > ==2175== by 0x80AEED1: PetscInitialize (pinit.c:576) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2174== by 0x87E19C7: GetResponse (simple_pmi.c:1049) > ==2174== by 0x87E420A: PMI_KVS_Get (simple_pmi.c:572) > ==2174== by 0x87CA9F9: getConnInfoKVS (mpidi_pg.c:622) > ==2174== by 0x87CC11A: MPIDI_CH3I_VC_post_sockconnect (ch3u_connect_sock.c:1097) > ==2174== by 0x87EEF6B: MPIDI_CH3_iStartMsg (ch3_istartmsg.c:177) > ==2174== by 0x87C6341: MPIDI_CH3_EagerContigShortSend (ch3u_eager.c:250) > ==2174== by 0x87C8AC6: MPID_Send (mpid_send.c:115) > ==2174== by 0x879376B: MPIC_Send (helper_fns.c:34) > ==2174== by 0x878BB6A: MPIR_Bcast (bcast.c:227) > ==2174== by 0x878C6A1: PMPI_Bcast (bcast.c:761) > ==2174== by 0x809C467: PetscOptionsInsertFile (options.c:436) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x43197CB: __strlen_sse2 (strlen.S:116) > ==2174== by 0x87E19C7: GetResponse (simple_pmi.c:1049) > ==2174== by 0x87E420A: PMI_KVS_Get (simple_pmi.c:572) > ==2174== by 0x87CA9F9: getConnInfoKVS (mpidi_pg.c:622) > ==2174== by 0x87CC11A: MPIDI_CH3I_VC_post_sockconnect (ch3u_connect_sock.c:1097) > ==2174== by 0x87EEF6B: MPIDI_CH3_iStartMsg (ch3_istartmsg.c:177) > ==2174== by 0x87C6341: MPIDI_CH3_EagerContigShortSend (ch3u_eager.c:250) > ==2174== by 0x87C8AC6: MPID_Send (mpid_send.c:115) > ==2174== by 0x879376B: MPIC_Send (helper_fns.c:34) > ==2174== by 0x878BB6A: MPIR_Bcast (bcast.c:227) > ==2174== by 0x878C6A1: PMPI_Bcast (bcast.c:761) > ==2174== by 0x809C467: PetscOptionsInsertFile (options.c:436) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x87F559E: PMIU_writeline (simple_pmiutil.c:180) > ==2174== by 0x87E19C7: GetResponse (simple_pmi.c:1049) > ==2174== by 0x87E420A: PMI_KVS_Get (simple_pmi.c:572) > ==2174== by 0x87CA9F9: getConnInfoKVS (mpidi_pg.c:622) > ==2174== by 0x87CC11A: MPIDI_CH3I_VC_post_sockconnect (ch3u_connect_sock.c:1097) > ==2174== by 0x87EEF6B: MPIDI_CH3_iStartMsg (ch3_istartmsg.c:177) > ==2174== by 0x87C6341: MPIDI_CH3_EagerContigShortSend (ch3u_eager.c:250) > ==2174== by 0x87C8AC6: MPID_Send (mpid_send.c:115) > ==2174== by 0x879376B: MPIC_Send (helper_fns.c:34) > ==2174== by 0x878BB6A: MPIR_Bcast (bcast.c:227) > 
==2174== by 0x878C6A1: PMPI_Bcast (bcast.c:761) > ==2174== by 0x809C467: PetscOptionsInsertFile (options.c:436) > ==2174== > ==2174== Use of uninitialised value of size 4 > ==2174== at 0x87F55A4: PMIU_writeline (simple_pmiutil.c:184) > ==2174== by 0x87E19C7: GetResponse (simple_pmi.c:1049) > ==2174== by 0x87E420A: PMI_KVS_Get (simple_pmi.c:572) > ==2174== by 0x87CA9F9: getConnInfoKVS (mpidi_pg.c:622) > ==2174== by 0x87CC11A: MPIDI_CH3I_VC_post_sockconnect (ch3u_connect_sock.c:1097) > ==2174== by 0x87EEF6B: MPIDI_CH3_iStartMsg (ch3_istartmsg.c:177) > ==2174== by 0x87C6341: MPIDI_CH3_EagerContigShortSend (ch3u_eager.c:250) > ==2174== by 0x87C8AC6: MPID_Send (mpid_send.c:115) > ==2174== by 0x879376B: MPIC_Send (helper_fns.c:34) > ==2174== by 0x878BB6A: MPIR_Bcast (bcast.c:227) > ==2174== by 0x878C6A1: PMPI_Bcast (bcast.c:761) > ==2174== by 0x809C467: PetscOptionsInsertFile (options.c:436) > ==2174== > ==2174== Syscall param write(count) contains uninitialised byte(s) > ==2174== at 0x41B1EB3: __write_nocancel (syscall-template.S:82) > ==2174== by 0x87E19C7: GetResponse (simple_pmi.c:1049) > ==2174== by 0x87E420A: PMI_KVS_Get (simple_pmi.c:572) > ==2174== by 0x87CA9F9: getConnInfoKVS (mpidi_pg.c:622) > ==2174== by 0x87CC11A: MPIDI_CH3I_VC_post_sockconnect (ch3u_connect_sock.c:1097) > ==2174== by 0x87EEF6B: MPIDI_CH3_iStartMsg (ch3_istartmsg.c:177) > ==2174== by 0x87C6341: MPIDI_CH3_EagerContigShortSend (ch3u_eager.c:250) > ==2174== by 0x87C8AC6: MPID_Send (mpid_send.c:115) > ==2174== by 0x879376B: MPIC_Send (helper_fns.c:34) > ==2174== by 0x878BB6A: MPIR_Bcast (bcast.c:227) > ==2174== by 0x878C6A1: PMPI_Bcast (bcast.c:761) > ==2174== by 0x809C467: PetscOptionsInsertFile (options.c:436) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x87F566C: PMIU_writeline (simple_pmiutil.c:197) > ==2174== by 0x87E19C7: GetResponse (simple_pmi.c:1049) > ==2174== by 0x87E420A: PMI_KVS_Get (simple_pmi.c:572) > ==2174== by 0x87CA9F9: getConnIn==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2175== by 0x80A1E0E: PetscOptionsGetTruth (options.c:1304) > ==2175== by 0x80824DE: PetscErrorPrintfInitialize (errtrace.c:73) > ==2175== by 0x80AEED1: PetscInitialize (pinit.c:576) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > foKVS (mpidi_pg.c:622) > ==2174== by 0x87CC11A: MPIDI_CH3I_VC_post_sockconnect (ch3u_connect_sock.c:1097) > ==2174== by 0x87EEF6B: MPIDI_CH3_iStartMsg (ch3_istartmsg.c:177) > ==2174== by 0x87C6341: MPIDI_CH3_EagerContigShortSend (ch3u_eager.c:250) > ==2174== by 0x87C8AC6: MPID_Send (mpid_send.c:115) > ==2174== by 0x879376B: MPIC_Send (helper_fns.c:34) > ==2174== by 0x878BB6A: MPIR_Bcast (bcast.c:227) > ==2174== by 0x878C6A1: PMPI_Bcast (bcast.c:761) > ==2174== by 0x809C467: PetscOptionsInsertFile (options.c:436) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x43BE348: __strcmp_ssse3 (strcmp-ssse3.S:150) > ==2174== by 0x87E420A: PMI_KVS_Get (simple_pmi.c:572) > ==2174== by 0x87CA9F9: getConnInfoKVS (mpidi_pg.c:622) > ==2174== by 0x87CC11A: MPIDI_CH3I_VC_post_sockconnect (ch3u_connect_sock.c:1097) > ==2174== by 0x87EEF6B: MPIDI_CH3_iStartMsg (ch3_istartmsg.c:177) > ==2174== by 0x87C6341: MPIDI_CH3_EagerContigShortSend (ch3u_eager.c:250) > ==2174== by 0x87C8AC6: MPID_Send (mpid_send.c:115) > ==2174== by 0x879376B: MPIC_Send (helper_fns.c:34) > ==2174== by 
0x878BB6A: MPIR_Bcast (bcast.c:227) > ==2174== by 0x878C6A1: PMPI_Bcast (bcast.c:761) > ==2174== by 0x809C467: PetscOptionsInsertFile (options.c:436) > ==2174== by 0x809D14A: PetscOptionsInsert (options.c:522) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x43BF5B7: __strcmp_ssse3 (strcmp-ssse3.S:1887) > ==2174== by 0x87E420A: PMI_KVS_Get (simple_pmi.c:572) > ==2174== by 0x87CA9F9: getConnInfoKVS (mpidi_pg.c:622) > ==2174== by 0x87CC11A: MPIDI_CH3I_VC_post_sockconnect (ch3u_connect_sock.c:1097) > ==2174== by 0x87EEF6B: MPIDI_CH3_iStartMsg (ch3_istartmsg.c:177) > ==2174== by 0x87C6341: MPIDI_CH3_EagerContigShortSend (ch3u_eager.c:250) > ==2174== by 0x87C8AC6: MPID_Send (mpid_send.c:115) > ==2174== by 0x879376B: MPIC_Send (helper_fns.c:34) > ==2174== by 0x878BB6A: MPIR_Bcast (bcast.c:227) > ==2174== by 0x878C6A1: PMPI_Bcast (bcast.c:761) > ==2174== by 0x809C467: PetscOptionsInsertFile (options.c:436) > ==2174== by 0x809D14A: PetscOptionsInsert (options.c:522) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x43BF5C0: __strcmp_ssse3 (strcmp-ssse3.S:1890) > ==2174== by 0x87E420A: PMI_KVS_Get (simple_pmi.c:572) > ==2174== by 0x87CA9F9: getConnInfoKVS (mpidi_pg.c:622) > ==2174== by 0x87CC11A: MPIDI_CH3I_VC_post_sockconnect (ch3u_connect_sock.c:1097) > ==2174== by 0x87EEF6B: MPIDI_CH3_iStartMsg (ch3_istartmsg.c:177) > ==2174== by 0x87C6341: MPIDI_CH3_EagerContigShortSend (ch3u_eager.c:250) > ==2174== by 0x87C8AC6: MPID_Send (mpid_send.c:115) > ==2174== by 0x879376B: MPIC_Send (helper_fns.c:34) > ==2174== by 0x878BB6A: MPIR_Bcast (bcast.c:227) > ==2174== by 0x878C6A1: PMPI_Bcast (bcast.c:761) > ==2174== by 0x809C467: PetscOptionsInsertFile (options.c:436) > ==2174== by 0x809D14A: PetscOptionsInsert (options.c:522) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x43BF5C5: __strcmp_ssse3 (strcmp-ssse3.S:1893) > ==2174== by 0x87E420A: PMI_KVS_Get (simple_pmi.c:572) > ==2174== by 0x87CA9F9: getConnInfoKVS (mpidi_pg.c:622) > ==2174== by 0x87CC11A: MPIDI_CH3I_VC_post_sockconnect (ch3u_connect_sock.c:1097) > ==2174== by 0x87EEF6B: MPIDI_CH3_iStartMsg (ch3_istartmsg.c:177) > ==2174== by 0x87C6341: MPIDI_CH3_EagerContigShortSend (ch3u_eager.c:250) > ==2174== by 0x87C8AC6: MPID_Send (mpid_send.c:115) > ==2174== by 0x879376B: MPIC_Send (helper_fns.c:34) > ==2174== by 0x878BB6A: MPIR_Bcast (bcast.c:227) > ==2174== by 0x878C6A1: PMPI_Bcast (bcast.c:761) > ==2174== by 0x809C467: PetscOptionsInsertFile (options.c:436) > ==2174== by 0x809D14A: PetscOptionsInsert (options.c:522) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x43BF5CA: __strcmp_ssse3 (strcmp-ssse3.S:1896) > ==2174== by 0x87E420A: PMI_KVS_Get (simple_pmi.c:572) > ==2174== by 0x87CA9F9: getConnInfoKVS (mpidi_pg.c:622) > ==2174== by 0x87CC11A: MPIDI_CH3I_VC_post_sockconnect (ch3u_connect_sock.c:1097) > ==2174== by 0x87EEF6B: MPIDI_CH3_iStartMsg (ch3_istartmsg.c:177) > ==2174== by 0x87C6341: MPIDI_CH3_EagerContigShortSend (ch3u_eager.c:250) > ==2174== by 0x87C8AC6: MPID_Send (mpid_send.c:115) > ==2174== by 0x879376B: MPIC_Send (helper_fns.c:34) > ==2174== by 0x878BB6A: MPIR_Bcast (bcast.c:227) > ==2174== by 0x878C6A1: PMPI_Bcast (bcast.c:761) > ==2174== by 0x809C467: PetscOptionsInsertFile (options.c:436) > ==2174== by 0x809D14A: PetscOptionsInsert (options.c:522) > ==2174== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 
0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2175== by 0x80A1E0E: PetscOptionsGetTruth (options.c:1304) > ==2175== by 0x8082563: PetscErrorPrintfInitialize (errtrace.c:77) > ==2175== by 0x80AEED1: PetscInitialize (pinit.c:576) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2175== by 0x80A1E0E: PetscOptionsGetTruth (options.c:1304) > ==2175== by 0x809CFD4: PetscOptionsInsert (options.c:516) > ==2175== by 0x80AF4BD: PetscInitialize (pinit.c:629) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2174== Invalid read of size 8 > ==2174== at 0x4319785: __strlen_sse2 (strlen.S:87) > ==2174== by 0x87BD379: MPIDI_CH3I_Progress_handle_sock_event (ch3_progress.c:776) > ==2174== by 0x87BD7BD: MPIDI_CH3I_Progress (ch3_progress.c:212) > ==2174== by 0x8793060: MPIC_Wait (helper_fns.c:269) > ==2174== by 0x879377E: MPIC_Send (helper_fns.c:38) > ==2174== by 0x878BB6A: MPIR_Bcast (bcast.c:227) > ==2174== by 0x878C6A1: PMPI_Bcast (bcast.c:761) > ==2174== by 0x809C467: PetscOptionsInsertFile (options.c:436) > ==2174== by 0x809D14A: PetscOptionsInsert (options.c:522) > ==2174== by 0x80AF4BD: PetscInitialize (pinit.c:629) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== Address 0x442e0d0 is 8 bytes before a block of size 257 alloc'd > ==2174== at 0x4023BF3: malloc (vg_replace_malloc.c:195) > ==2174== by 0x87C74AE: MPID_Init (mpid_init.c:373) > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x43197AD: __strlen_sse2 (strlen.S:104) > ==2174== by 0x87BD379: MPIDI_CH3I_Progress_handle_sock_event (ch3_progress.c:776) > ==2174== by 0x87BD7BD: MPIDI_CH3I_Progress (ch3_progress.c:212) > ==2174== by 0x8793060: MPIC_Wait (helper_fns.c:269) > ==2174== by 0x879377E: MPIC_Send (helper_fns.c:38) > ==2174== by 0x878BB6A: MPIR_Bcast (bcast.c:227) > ==2174== by 0x878C6A1: PMPI_Bcast (bcast.c:761) > ==2174== by 0x809C467: PetscOptionsInsertFile (options.c:436) > ==2174== by 0x809D14A: PetscOptionsInsert (options.c:522) > ==2174== by 0x80AF4BD: PetscInitialize (pinit.c:629) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Syscall param writev(vector) points to uninitialised byte(s) > ==2174== at 0x436BA61: writev (writev.c:51) > ==2174== by 0x87EB49A: MPIDU_Sock_wait (sock_wait.i:693) > ==2174== by 0x87BD7B0: MPIDI_CH3I_Progress (ch3_progress.c:187) > ==2174== by 0x8793060: MPIC_Wait (helper_fns.c:269) > ==2174== by 0x879377E: MPIC_Send (helper_fns.c:38) > ==2174== by 0x878BB6A: MPIR_Bcast (bcast.c:227) > ==2174== by 0x878C6A1: PMPI_Bcast (bcast.c:761) > ==2174== by 0x809C467: PetscOptionsInsertFile (options.c:436) > ==2174== by 0x809D14A: PetscOptionsInsert (options.c:522) > ==2174== by 0x80AF4BD: PetscInitialize (pinit.c:629) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== Address 0x4462754 is 68 bytes inside a block of size 72 alloc'd > ==2174== at 0x4023BF3: malloc (vg_replace_malloc.c:195) > ==2174== by 0x87CBE6C: MPIDI_CH3I_Connection_alloc (ch3u_connect_sock.c:160) > ==2174== by 0x87CBFAD: MPIDI_CH3I_Sock_connect (ch3u_connect_sock.c:1164) > ==2174== by 0x87CC132: 
MPIDI_CH3I_VC_post_sockconnect (ch3u_connect_sock.c:1102) > ==2174== by 0x87EEF6B: MPIDI_CH3_iStartMsg (ch3_istartmsg.c:177) > ==2174== by 0x87C6341: MPIDI_CH3_EagerContigShortSend (ch3u_eager.c:250) > ==2174== by 0x87C8AC6: MPID_Send (mpid_send.c:115) > ==2174== by 0x879376B: MPIC_Send (helper_fns.c:34) > ==2174== by 0x878BB6A: MPIR_Bcast (bcast.c:227) > ==2174== by 0x878C6A1: PMPI_Bcast (bcast.c:761) > ==2174== by 0x809C467: PetscOptionsInsertFile (options.c:436) > ==2174== by 0x809D14A: PetscOptionsInsert (options.c:522) > ==2174== > ==2174== Syscall param writev(vector[...]) points to uninitialised byte(s) > ==2174== at 0x436BA61: writev (writev.c:51) > ==2174== by 0x87EB49A: MPIDU_Sock_wait (sock_wait.i:693) > ==2174== by 0x87BD7B0: MPIDI_CH3I_Progress (ch3_progress.c:187) > ==2174== by 0x8793060: MPIC_Wait (helper_fns.c:269) > ==2174== by 0x879377E: MPIC_Send (helper_fns.c:38) > ==2174== by 0x878BB6A: MPIR_Bcast (bcast.c:227) > ==2174== by 0x878C6A1: PMPI_Bcast (bcast.c:761) > ==2174== by 0x809C467: PetscOptionsInsertFile (options.c:436) > ==2174== by 0x809D14A: PetscOptionsInsert (options.c:522) > ==2174== by 0x80AF4BD: PetscInitialize (pinit.c:629) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== Address 0x4462728 is 24 bytes inside a block of size 72 alloc'd > ==2174== at 0x4023BF3: malloc (vg_replace_malloc.c:195) > ==2174== by 0x87CBE6C: MPIDI_CH3I_Connection_alloc (ch3u_connect_sock.c:160) > ==2174== by 0x87CBFAD: MPIDI_CH3I_Sock_connect (ch3u_connect_sock.c:1164) > ==2174== by 0x87CC132: MPIDI_CH3I_VC_post_sockconnect (ch3u_connect_sock.c:1102) > ==2174== by 0x87EEF6B: MPIDI_CH3_iStartMsg (ch3_istartmsg.c:177) > ==2174== by 0x87C6341: MPIDI_CH3_EagerContigShortSend (ch3u_eager.c:250) > ==2174== by 0x87C8AC6: MPID_Send (mpid_send.c:115) > ==2174== by 0x879376B: MPIC_Send (helper_fns.c:34) > ==2174== by 0x878BB6A: MPIR_Bcast (bcast.c:227) > ==2174== by 0x878C6A1: PMPI_Bcast (bcast.c:761) > ==2174== by 0x809C467: PetscOptionsInsertFile (options.c:436) > ==2174== by 0x809D14A: PetscOptionsInsert (options.c:522) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x87EC050: MPIDU_Sock_wait (socki_util.i:543) > ==2174== by 0x87BD7B0: MPIDI_CH3I_Progress (ch3_progress.c:187) > ==2174== by 0x8793060: MPIC_Wait (helper_fns.c:269) > ==2174== by 0x879377E: MPIC_Send (helper_fns.c:38) > ==2174== by 0x878BB6A: MPIR_Bcast (bcast.c:227) > ==2174== by 0x878C6A1: PMPI_Bcast (bcast.c:761) > ==2174== by 0x809C467: PetscOptionsInsertFile (options.c:436) > ==2174== by 0x809D14A: PetscOptionsInsert (options.c:522) > ==2174== by 0x80AF4BD: PetscInitialize (pinit.c:629) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x43BE348: __strcmp_ssse3 (strcmp-ssse3.S:150) > ==2175== by 0x87CA206: MPIDI_PG_Find (mpidi_pg.c:341) > ==2175== by 0x87CB66D: MPIDI_CH3_Sockconn_handle_connopen_event (ch3u_connect_sock.c:883) > ==2175== by 0x87BD3FA: MPIDI_CH3I_Progress_handle_sock_event (ch3_progress.c:639) > ==2175== by 0x87BD7BD: MPIDI_CH3I_Progress (ch3_progress.c:212) > ==2175== by 0x8793060: MPIC_Wait (helper_fns.c:269) > ==2175== by 0x8793626: MPIC_Recv (helper_fns.c:74) > ==2175== by 0x878C049: MPIR_Bcast (bcast.c:195) > ==2175== by 0x878C6A1: PMPI_Bcast (bcast.c:761) > ==2175== by 0x809C467: PetscOptionsInsertFile (options.c:436) > ==2175== by 0x809D14A: PetscOptionsInsert (options.c:522) > ==2175== by 0x80AF4BD: PetscInitialize (pinit.c:629) > ==2175== > ==2175== Conditional 
jump or move depends on uninitialised value(s) > ==2175== at 0x43BF5B7: __strcmp_ssse3 (strcmp-ssse3.S:1887) > ==2175== by 0x87CA206: MPIDI_PG_Find (mpidi_pg.c:341) > ==2175== by 0x87CB66D: MPIDI_CH3_Sockconn_handle_connopen_event (ch3u_connect_sock.c:883) > ==2175== by 0x87BD3FA: MPIDI_CH3I_Progress_handle_sock_event (ch3_progress.c:639) > ==2175== by 0x87BD7BD: MPIDI_CH3I_Progress (ch3_progress.c:212) > ==2175== by 0x8793060: MPIC_Wait (helper_fns.c:269) > ==2175== by 0x8793626: MPIC_Recv (helper_fns.c:74) > ==2175== by 0x878C049: MPIR_Bcast (bcast.c:195) > ==2175== by 0x878C6A1: PMPI_Bcast (bcast.c:761) > ==2175== by 0x809C467: PetscOptionsInsertFile (options.c:436) > ==2175== by 0x809D14A: PetscOptionsInsert (options.c:522) > ==2175== by 0x80AF4BD: PetscInitialize (pinit.c:629) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x43BF5C0: __strcmp_ssse3 (strcmp-ssse3.S:1890) > ==2175== by 0x87CA206: MPIDI_PG_Find (mpidi_pg.c:341) > ==2175== by 0x87CB66D: MPIDI_CH3_Sockconn_handle_connopen_event (ch3u_connect_sock.c:883) > ==2175== by 0x87BD3FA: MPIDI_CH3I_Progress_handle_sock_event (ch3_progress.c:639) > ==2175== by 0x87BD7BD: MPIDI_CH3I_Progress (ch3_progress.c:212) > ==2175== by 0x8793060: MPIC_Wait (helper_fns.c:269) > ==2175== by 0x8793626: MPIC_Recv (helper_fns.c:74) > ==2175== by 0x878C049: MPIR_Bcast (bcast.c:195) > ==2175== by 0x878C6A1: PMPI_Bcast (bcast.c:761) > ==2175== by 0x809C467: PetscOptionsInsertFile (options.c:436) > ==2175== by 0x809D14A: PetscOptionsInsert (options.c:522) > ==2175== by 0x80AF4BD: PetscInitialize (pinit.c:629) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x43BF5C5: __strcmp_ssse3 (strcmp-ssse3.S:1893) > ==2175== by 0x87CA206: MPIDI_PG_Find (mpidi_pg.c:341) > ==2175== by 0x87CB66D: MPIDI_CH3_Sockconn_handle_connopen_event (ch3u_connect_sock.c:883) > ==2175== by 0x87BD3FA: MPIDI_CH3I_Progress_handle_sock_event (ch3_progress.c:639) > ==2175== by 0x87BD7BD: MPIDI_CH3I_Progress (ch3_progress.c:212) > ==2175== by 0x8793060: MPIC_Wait (helper_fns.c:269) > ==2175== by 0x8793626: MPIC_Recv (helper_fns.c:74) > ==2175== by 0x878C049: MPIR_Bcast (bcast.c:195) > ==2175== by 0x878C6A1: PMPI_Bcast (bcast.c:761) > ==2175== by 0x809C467: PetscOptionsInsertFile (options.c:436) > ==2175== by 0x809D14A: PetscOptionsInsert (options.c:522) > ==2175== by 0x80AF4BD: PetscInitialize (pinit.c:629) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x43BF5CA: __strcmp_ssse3 (strcmp-ssse3.S:1896) > ==2175== by 0x87CA206: MPIDI_PG_Find (mpidi_pg.c:341) > ==2175== by 0x87CB66D: MPIDI_CH3_Sockconn_handle_connopen_event (ch3u_connect_sock.c:883) > ==2175== by 0x87BD3FA: MPIDI_CH3I_Progress_handle_sock_event (ch3_progress.c:639) > ==2175== by 0x87BD7BD: MPIDI_CH3I_Progress (ch3_progress.c:212) > ==2175== by 0x8793060: MPIC_Wait (helper_fns.c:269) > ==2175== by 0x8793626: MPIC_Recv (helper_fns.c:74) > ==2175== by 0x878C049: MPIR_Bcast (bcast.c:195) > ==2175== by 0x878C6A1: PMPI_Bcast (bcast.c:761) > ==2175== by 0x809C467: PetscOptionsInsertFile (options.c:436) > ==2175== by 0x809D14A: PetscOptionsInsert (options.c:522) > ==2175== by 0x80AF4BD: PetscInitialize (pinit.c:629) > ==2175== > ==2174== Invalid read of size 8 > ==2174== at 0x43BE324: __strcmp_ssse3 (strcmp-ssse3.S:142) > ==2174== by 0x809F18B: PetscOptionsSetValue (options.c:803) > ==2174== by 0x809DD95: PetscOptionsInsert (options.c:588) > ==2174== by 0x80AF4BD: PetscInitialize 
(pinit.c:629) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== Address 0x44c0d88 is 8 bytes inside a block of size 10 alloc'd > ==2174== at 0x4023BF3: malloc (vg_replace_malloc.c:195) > ==2174== by 0x809F5F2: PetscOptionsSetValue (options.c:829) > ==2174== by 0x809DD95: PetscOptionsInsert (options.c:588) > ==2174== by 0x80AF4BD: PetscInitialize (pinit.c:629) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Invalid read of size 8 > ==2174== at 0x43BE32D: __strcmp_ssse3 (strcmp-ssse3.S:144) > ==2174== by 0x809F18B: PetscOptionsSetValue (options.c:803) > ==2174== by 0x809DD95: PetscOptionsInsert (options.c:588) > ==2174== by 0x80AF4BD: PetscInitialize (pinit.c:629) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== Address 0x44c0d90 is 6 bytes after a block of size 10 alloc'd > ==2174== at 0x4023BF3: malloc (vg_replace_malloc.c:195) > ==2174== by 0x809F5F2: PetscOptionsSetValue (options.c:829) > ==2174== by 0x809DD95: PetscOptionsInsert (options.c:588) > ==2174== by 0x80AF4BD: PetscInitialize (pinit.c:629) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2174== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2174== by 0x80A1074: PetscOptionsHasName (options.c:1092) > ==2174== by 0x80AF535: PetscInitialize (pinit.c:635) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2174== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2174== by 0x80A1E0E: PetscOptionsGetTruth (options.c:1304) > ==2174== by 0x80A8875: PetscOptionsCheckInitial_Private (init.c:242) > ==2174== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Invalid read of size 8 > ==2174== at 0x4319785: __strlen_sse2 (strlen.S:87) > ==2174== by 0x8099BCC: PetscOptionsAtol (options.c:152) > ==2174== by 0x80A1E9D: PetscOptionsGetTruth (options.c:1310) > ==2174== by 0x80A88F7: PetscOptionsCheckInitial_Private (init.c:244) > ==2174== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== Address 0x44c0d40 is 8 bytes before a block of size 4 alloc'd > ==2174== at 0x4023BF3: malloc (vg_replace_malloc.c:195) > ==2174== by 0x809F6EB: PetscOptionsSetValue (options.c:833) > ==2174== by 0x809DD95: PetscOptionsInsert (options.c:588) > ==2174== by 0x80AF4BD: PetscInitialize (pinit.c:629) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2174== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2174== by 0x80A1E0E: PetscOptionsGetTruth (options.c:1304) > ==2174== by 0x80A8A66: PetscOptionsCheckInitial_Private (init.c:257) > ==2174== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2174== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2174== by 0x80A1E0E: PetscOptionsGetTruth (options.c:1304) > ==2174== by 0x80A8BC3: PetscOptionsCheckInitial_Private (init.c:264) > ==2174== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on 
uninitialised value(s) > ==2174== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2174== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2174== by 0x80A1E0E: PetscOptionsGetTruth (options.c:1304) > ==2174== by 0x80A8C5B: PetscOptionsCheckInitial_Private (init.c:267) > ==2174== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2175== Invalid read of size 8 > ==2175== at 0x43BE324: __strcmp_ssse3 (strcmp-ssse3.S:142) > ==2175== by 0x809F18B: PetscOptionsSetValue (options.c:803) > ==2175== by 0x809DD95: PetscOptionsInsert (options.c:588) > ==2175== by 0x80AF4BD: PetscInitialize (pinit.c:629) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== Address 0x4433c38 is 8 bytes inside a block of size 10 alloc'd > ==2175== at 0x4023BF3: malloc (vg_replace_malloc.c:195) > ==2175== by 0x809F5F2: PetscOptionsSetValue (options.c:829) > ==2175== by 0x809DD95: PetscOptionsInsert (options.c:588) > ==2175== by 0x80AF4BD: PetscInitialize (pinit.c:629) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Invalid read of size 8 > ==2175== at 0x43BE32D: __strcmp_ssse3 (strcmp-ssse3.S:144) > ==2175== by 0x809F18B: PetscOptionsSetValue (options.c:803) > ==2175== by 0x809DD95: PetscOptionsInsert (options.c:588) > ==2175== by 0x80AF4BD: PetscInitialize (pinit.c:629) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== Address 0x4433c40 is 6 bytes after a block of size 10 alloc'd > ==2175== at 0x4023BF3: malloc (vg_replace_malloc.c:195) > ==2175== by 0x809F5F2: PetscOptionsSetValue (options.c:829) > ==2175== by 0x809DD95: PetscOptionsInsert (options.c:588) > ==2175== by 0x80AF4BD: PetscInitialize (pinit.c:629) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2175== by 0x80A1074: PetscOptionsHasName (options.c:1092) > ==2175== by 0x80AF535: PetscInitialize (pinit.c:635) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2174== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2174== by 0x80A3E53: PetscOptionsGetString (options.c:1693) > ==2174== by 0x80C395E: PetscSetDisplay (pdisplay.c:99) > ==2174== by 0x80A8D23: PetscOptionsCheckInitial_Private (init.c:276) > ==2174== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2175== by 0x80A1E0E: PetscOptionsGetTruth (options.c:1304) > ==2175== by 0x80A8875: PetscOptionsCheckInitial_Private (init.c:242) > ==2175== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Invalid read of size 8 > ==2175== at 0x4319785: __strlen_sse2 (strlen.S:87) > ==2175== by 0x8099BCC: PetscOptionsAtol (options.c:152) > ==2175== by 0x80A1E9D: PetscOptionsGetTruth (options.c:1310) > ==2175== by 0x80A88F7: PetscOptionsCheckInitial_Private (init.c:244) > ==2175== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== Address 0x4433bf0 is 8 bytes before a block of size 4 alloc'd > ==2175== at 0x4023BF3: malloc (vg_replace_malloc.c:195) > ==2175== by 
0x809F6EB: PetscOptionsSetValue (options.c:833) > ==2175== by 0x809DD95: PetscOptionsInsert (options.c:588) > ==2175== by 0x80AF4BD: PetscInitialize (pinit.c:629) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2174== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2174== by 0x80A1074: PetscOptionsHasName (options.c:1092) > ==2174== by 0x80A8D9B: PetscOptionsCheckInitial_Private (init.c:281) > ==2174== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x80A0A4A: PetscOptionsFindPair_Private (options.c:989) > ==2174== by 0x80A1074: PetscOptionsHasName (options.c:1092) > ==2174== by 0x80A8D9B: PetscOptionsCheckInitial_Private (init.c:281) > ==2174== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2174== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2174== by 0x80A1074: PetscOptionsHasName (options.c:1092) > ==2174== by 0x80A8E13: PetscOptionsCheckInitial_Private (init.c:282) > ==2174== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2174== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2174== by 0x80A1074: PetscOptionsHasName (options.c:1092) > ==2174== by 0x80A8E8B: PetscOptionsCheckInitial_Private (init.c:283) > ==2174== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2174== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2174== by 0x80A1E0E: PetscOptionsGetTruth (options.c:1304) > ==2174== by 0x80A94FC: PetscOptionsCheckInitial_Private (init.c:320) > ==2174== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2175== by 0x80A1E0E: PetscOptionsGetTruth (options.c:1304) > ==2175== by 0x80A8A66: PetscOptionsCheckInitial_Private (init.c:257) > ==2175== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2175== by 0x80A1E0E: PetscOptionsGetTruth (options.c:1304) > ==2175== by 0x80A8BC3: PetscOptionsCheckInitial_Private (init.c:264) > ==2175== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2175== by 0x80A1E0E: PetscOptionsGetTruth (options.c:1304) > ==2175== by 0x80A8C5B: PetscOptionsCheckInitial_Private (init.c:267) > ==2175== by 
0x80AF618: PetscInitialize (pinit.c:639) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2174== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2174== by 0x80A1E0E: PetscOptionsGetTruth (options.c:1304) > ==2174== by 0x80A95F6: PetscOptionsCheckInitial_Private (init.c:323) > ==2174== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2174== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2174== by 0x80A1E0E: PetscOptionsGetTruth (options.c:1304) > ==2174== by 0x80A96F8: PetscOptionsCheckInitial_Private (init.c:326) > ==2174== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2174== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2174== by 0x80A1E0E: PetscOptionsGetTruth (options.c:1304) > ==2174== by 0x80A97FA: PetscOptionsCheckInitial_Private (init.c:329) > ==2174== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2174== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2174== by 0x80A1E0E: PetscOptionsGetTruth (options.c:1304) > ==2174== by 0x80A98FE: PetscOptionsCheckInitial_Private (init.c:334) > ==2174== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2175== by 0x80A3E53: PetscOptionsGetString (options.c:1693) > ==2175== by 0x80C395E: PetscSetDisplay (pdisplay.c:99) > ==2175== by 0x80A8D23: PetscOptionsCheckInitial_Private (init.c:276) > ==2175== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2174== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2174== by 0x80A3E53: PetscOptionsGetString (options.c:1693) > ==2174== by 0x80A9A5C: PetscOptionsCheckInitial_Private (init.c:341) > ==2174== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2174== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2174== by 0x80A3E53: PetscOptionsGetString (options.c:1693) > ==2174== by 0x80A9CA7: PetscOptionsCheckInitial_Private (init.c:350) > ==2174== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2174== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2174== by 0x80A3E53: PetscOptionsGetString (options.c:1693) > ==2174== by 0x80A9D9D: PetscOptionsCheckInitial_Private (init.c:352) > ==2174== by 0x80AF618: 
PetscInitialize (pinit.c:639) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2174== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2174== by 0x80A3E53: PetscOptionsGetString (options.c:1693) > ==2174== by 0x80A9E24: PetscOptionsCheckInitial_Private (init.c:353) > ==2174== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2174== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2174== by 0x80A3E53: PetscOptionsGetString (options.c:1693) > ==2174== by 0x80AA655: PetscOptionsCheckInitial_Private (init.c:402) > ==2174== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2174== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2174== by 0x80A1074: PetscOptionsHasName (options.c:1092) > ==2174== by 0x80AA74F: PetscOptionsCheckInitial_Private (init.c:409) > ==2174== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2174== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2174== by 0x80A1074: PetscOptionsHasName (options.c:1092) > ==2174== by 0x80AA7C7: PetscOptionsCheckInitial_Private (init.c:410) > ==2174== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2174== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2174== by 0x80A1E0E: PetscOptionsGetTruth (options.c:1304) > ==2174== by 0x80AAABF: PetscOptionsCheckInitial_Private (init.c:439) > ==2174== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2174== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2174== by 0x80A3E53: PetscOptionsGetString (options.c:1693) > ==2174== by 0x80AAC26: PetscOptionsCheckInitial_Private (init.c:452) > ==2174== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2175== by 0x80A1074: PetscOptionsHasName (options.c:1092) > ==2175== by 0x80A8D9B: PetscOptionsCheckInitial_Private (init.c:281) > ==2175== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x80A0A4A: PetscOptionsFindPair_Private (options.c:989) > ==2175== by 0x80A1074: PetscOptionsHasName (options.c:1092) > ==2175== by 0x80A8D9B: PetscOptionsCheckInitial_Private (init.c:281) > ==2175== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends 
on uninitialised value(s) > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2175== by 0x80A1074: PetscOptionsHasName (options.c:1092) > ==2175== by 0x80A8E13: PetscOptionsCheckInitial_Private (init.c:282) > ==2175== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2175== by 0x80A1074: PetscOptionsHasName (options.c:1092) > ==2175== by 0x80A8E8B: PetscOptionsCheckInitial_Private (init.c:283) > ==2175== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2175== by 0x80A1E0E: PetscOptionsGetTruth (options.c:1304) > ==2175== by 0x80A94FC: PetscOptionsCheckInitial_Private (init.c:320) > ==2175== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2175== by 0x80A1E0E: PetscOptionsGetTruth (options.c:1304) > ==2175== by 0x80A95F6: PetscOptionsCheckInitial_Private (init.c:323) > ==2175== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2175== by 0x80A1E0E: PetscOptionsGetTruth (options.c:1304) > ==2175== by 0x80A96F8: PetscOptionsCheckInitial_Private (init.c:326) > ==2175== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2175== by 0x80A1E0E: PetscOptionsGetTruth (options.c:1304) > ==2175== by 0x80A97FA: PetscOptionsCheckInitial_Private (init.c:329) > ==2175== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2175== by 0x80A1E0E: PetscOptionsGetTruth (options.c:1304) > ==2175== by 0x80A98FE: PetscOptionsCheckInitial_Private (init.c:334) > ==2175== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2174== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2174== by 0x80A1E0E: PetscOptionsGetTruth (options.c:1304) > ==2174== by 0x80AADBF: PetscOptionsCheckInitial_Private (init.c:468) > ==2174== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x4319794: __strlen_sse2 
(strlen.S:93) > ==2174== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2174== by 0x80A1E0E: PetscOptionsGetTruth (options.c:1304) > ==2174== by 0x80AAE3F: PetscOptionsCheckInitial_Private (init.c:469) > ==2174== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x80A0A4A: PetscOptionsFindPair_Private (options.c:989) > ==2174== by 0x80A1E0E: PetscOptionsGetTruth (options.c:1304) > ==2174== by 0x80AAE3F: PetscOptionsCheckInitial_Private (init.c:469) > ==2174== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2174== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2174== by 0x80A1074: PetscOptionsHasName (options.c:1092) > ==2174== by 0x80AAEB7: PetscOptionsCheckInitial_Private (init.c:470) > ==2174== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2174== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2174== by 0x80A3E53: PetscOptionsGetString (options.c:1693) > ==2174== by 0x80AB021: PetscOptionsCheckInitial_Private (init.c:474) > ==2174== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2174== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2174== by 0x80A1E0E: PetscOptionsGetTruth (options.c:1304) > ==2174== by 0x80AB29D: PetscOptionsCheckInitial_Private (init.c:499) > ==2174== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2174== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2174== by 0x80A1074: PetscOptionsHasName (options.c:1092) > ==2174== by 0x80AB315: PetscOptionsCheckInitial_Private (init.c:504) > ==2174== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== > ==2174== More than 100 errors detected. Subsequent errors > ==2174== will still be recorded, but in less detail than before. 
> ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2175== by 0x80A3E53: PetscOptionsGetString (options.c:1693) > ==2175== by 0x80A9A5C: PetscOptionsCheckInitial_Private (init.c:341) > ==2175== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2175== by 0x80A3E53: PetscOptionsGetString (options.c:1693) > ==2175== by 0x80A9CA7: PetscOptionsCheckInitial_Private (init.c:350) > ==2175== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2175== by 0x80A3E53: PetscOptionsGetString (options.c:1693) > ==2175== by 0x80A9D9D: PetscOptionsCheckInitial_Private (init.c:352) > ==2175== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2175== by 0x80A3E53: PetscOptionsGetString (options.c:1693) > ==2175== by 0x80A9E24: PetscOptionsCheckInitial_Private (init.c:353) > ==2175== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2175== by 0x80A3E53: PetscOptionsGetString (options.c:1693) > ==2175== by 0x80AA655: PetscOptionsCheckInitial_Private (init.c:402) > ==2175== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2175== by 0x80A1074: PetscOptionsHasName (options.c:1092) > ==2175== by 0x80AA74F: PetscOptionsCheckInitial_Private (init.c:409) > ==2175== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2175== by 0x80A1074: PetscOptionsHasName (options.c:1092) > ==2175== by 0x80AA7C7: PetscOptionsCheckInitial_Private (init.c:410) > ==2175== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2175== by 0x80A1E0E: PetscOptionsGetTruth (options.c:1304) > ==2175== by 0x80AAABF: PetscOptionsCheckInitial_Private (init.c:439) > ==2175== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > 
==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2175== by 0x80A3E53: PetscOptionsGetString (options.c:1693) > ==2175== by 0x80AAC26: PetscOptionsCheckInitial_Private (init.c:452) > ==2175== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2175== by 0x80A1E0E: PetscOptionsGetTruth (options.c:1304) > ==2175== by 0x80AADBF: PetscOptionsCheckInitial_Private (init.c:468) > ==2175== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2175== by 0x80A1E0E: PetscOptionsGetTruth (options.c:1304) > ==2175== by 0x80AAE3F: PetscOptionsCheckInitial_Private (init.c:469) > ==2175== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x80A0A4A: PetscOptionsFindPair_Private (options.c:989) > ==2175== by 0x80A1E0E: PetscOptionsGetTruth (options.c:1304) > ==2175== by 0x80AAE3F: PetscOptionsCheckInitial_Private (init.c:469) > ==2175== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2175== by 0x80A1074: PetscOptionsHasName (options.c:1092) > ==2175== by 0x80AAEB7: PetscOptionsCheckInitial_Private (init.c:470) > ==2175== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2175== by 0x80A3E53: PetscOptionsGetString (options.c:1693) > ==2175== by 0x80AB021: PetscOptionsCheckInitial_Private (init.c:474) > ==2175== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2175== by 0x80A1E0E: PetscOptionsGetTruth (options.c:1304) > ==2175== by 0x80AB29D: PetscOptionsCheckInitial_Private (init.c:499) > ==2175== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2175== by 0x80A1074: PetscOptionsHasName (options.c:1092) > ==2175== by 0x80AB315: PetscOptionsCheckInitial_Private (init.c:504) > ==2175== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > 
==2175== by 0x80A27DF: PetscOptionsGetReal (options.c:1419) > ==2175== by 0x80AC610: PetscOptionsCheckInitial_Private (init.c:554) > ==2175== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2175== by 0x80A3E53: PetscOptionsGetString (options.c:1693) > ==2175== by 0x80AC70C: PetscOptionsCheckInitial_Private (init.c:559) > ==2175== by 0x80AF618: PetscInitialize (pinit.c:639) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2175== by 0x80A1074: PetscOptionsHasName (options.c:1092) > ==2175== by 0x80C52B1: PetscLogBegin_Private (plog.c:196) > ==2175== by 0x80AF677: PetscInitialize (pinit.c:643) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2175== by 0x80A1074: PetscOptionsHasName (options.c:1092) > ==2175== by 0x80C532E: PetscLogBegin_Private (plog.c:200) > ==2175== by 0x80AF677: PetscInitialize (pinit.c:643) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2175== by 0x80A420A: PetscOptionsGetStringArray (options.c:1756) > ==2175== by 0x808387F: PetscInitialize_DynamicLibraries (reg.c:80) > ==2175== by 0x80AF6D6: PetscInitialize (pinit.c:650) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2175== by 0x80A3E53: PetscOptionsGetString (options.c:1693) > ==2175== by 0x81DE8F5: PetscInitializePackage (dlregispetsc.c:58) > ==2175== by 0x8083A20: PetscInitialize_DynamicLibraries (reg.c:93) > ==2175== by 0x80AF6D6: PetscInitialize (pinit.c:650) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2175== by 0x80A3E53: PetscOptionsGetString (options.c:1693) > ==2175== by 0x81DEA77: PetscInitializePackage (dlregispetsc.c:66) > ==2175== by 0x8083A20: PetscInitialize_DynamicLibraries (reg.c:93) > ==2175== by 0x80AF6D6: PetscInitialize (pinit.c:650) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2175== by 0x80A420A: PetscOptionsGetStringArray (options.c:1756) > ==2175== by 0x8083AA5: PetscInitialize_DynamicLibraries (reg.c:117) > ==2175== by 0x80AF6D6: PetscInitialize (pinit.c:650) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2174== by 0x80A066B: 
PetscOptionsFindPair_Private (options.c:967) > ==2174== by 0x80A3E53: PetscOptionsGetString (options.c:1693) > ==2174== by 0x80B5ECA: PetscOptionsString (aoptions.c:522) > ==2174== by 0x80A4FDB: PetscOptionsSetFromOptions (options.c:1891) > ==2174== by 0x80AF975: PetscInitialize (pinit.c:659) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x40249EA: strncat (mc_replace_strmem.c:202) > ==2174== by 0x80BE6E2: PetscStrncat (str.c:205) > ==2174== by 0x80A06F1: PetscOptionsFindPair_Private (options.c:968) > ==2174== by 0x80A3E53: PetscOptionsGetString (options.c:1693) > ==2174== by 0x80B5ECA: PetscOptionsString (aoptions.c:522) > ==2174== by 0x80A4FDB: PetscOptionsSetFromOptions (options.c:1891) > ==2174== by 0x80AF975: PetscInitialize (pinit.c:659) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2175== by 0x80A1074: PetscOptionsHasName (options.c:1092) > ==2175== by 0x80AC9CD: PetscOptionsCheckInitial_Components (pinit.c:57) > ==2175== by 0x80AF916: PetscInitialize (pinit.c:657) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x4024A18: strncat (mc_replace_strmem.c:202) > ==2174== by 0x80BE6E2: PetscStrncat (str.c:205) > ==2174== by 0x80A06F1: PetscOptionsFindPair_Private (options.c:968) > ==2174== by 0x80A3E53: PetscOptionsGetString (options.c:1693) > ==2174== by 0x80B5ECA: PetscOptionsString (aoptions.c:522) > ==2174== by 0x80A4FDB: PetscOptionsSetFromOptions (options.c:1891) > ==2174== by 0x80AF975: PetscInitialize (pinit.c:659) > ==2174== by 0x804BA0C: main (ex19.c:96) > ==2174== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > ==2175== by 0x80A1074: PetscOptionsHasName (options.c:1092) > ==2175== by 0x80B290F: PetscOptionsBegin_Private (aoptions.c:44) > ==2175== by 0x80A4EED: PetscOptionsSetFromOptions (options.c:1890) > ==2175== by 0x80AF975: PetscInitialize (pinit.c:659) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > ==2175== by 0x80A066B: PetscOptionsFindPair_Private (options.c:967) > ==2175== by 0x80A3E53: PetscOptionsGetString (options.c:1693) > ==2175== by 0x80B5ECA: PetscOptionsString (aoptions.c:522) > ==2175== by 0x80A4FDB: PetscOptionsSetFromOptions (options.c:1891) > ==2175== by 0x80AF975: PetscInitialize (pinit.c:659) > ==2175== by 0x804BA0C: main (ex19.c:96) > ==2175== > ==2175== > ==2175== More than 100 errors detected. Subsequent errors > ==2175== will still be recorded, but in less detail than before. 
> ==2175== Conditional jump or move depends on uninitialised value(s)
> ==2175==    at 0x40249EA: strncat (mc_replace_strmem.c:202)
> ==2175==    by 0x80BE6E2: PetscStrncat (str.c:205)
> ==2175==    by 0x80A06F1: PetscOptionsFindPair_Private (options.c:968)
> ==2175==    by 0x80A3E53: PetscOptionsGetString (options.c:1693)
> ==2175==    by 0x80B5ECA: PetscOptionsString (aoptions.c:522)
> ==2175==    by 0x80A4FDB: PetscOptionsSetFromOptions (options.c:1891)
> ==2175==    by 0x80AF975: PetscInitialize (pinit.c:659)
> ==2175==    by 0x804BA0C: main (ex19.c:96)
> ==2175==
> ==2174== Conditional jump or move depends on uninitialised value(s)
> ==2174==    at 0x4319794: __strlen_sse2 (strlen.S:93)
> ==2174==    by 0x80BDE83: PetscStrallocpy (str.c:79)
> ==2174==    by 0x808368E: PetscFListGetPathAndFunction (reg.c:24)
> ==2174==    by 0x80841CC: PetscFListAdd (reg.c:201)
> ==2174==    by 0x81843D0: DARegister (dareg.c:104)
> ==2174==    by 0x818476B: DARegisterAll (daregall.c:32)
> ==2174==    by 0x855F08B: DMInitializePackage (dlregisdm.c:80)
> ==2174==    by 0x815F1D1: DACreate (dacreate.c:173)
> ==2174==    by 0x81558E2: DACreate2d (da2.c:1837)
> ==2174==    by 0x804BE2A: main (ex19.c:107)
> ==2174==
> ==2175== Invalid read of size 8
> ==2175==    at 0x43197A0: __strlen_sse2 (strlen.S:99)
> ==2175==    by 0x8099209: PetscOptionsAtoi (options.c:70)
> ==2175==    by 0x80A13CE: PetscOptionsGetInt (options.c:1138)
> ==2175==    by 0x80B5AA5: PetscOptionsInt (aoptions.c:473)
> ==2175==    by 0x815E8BC: DASetFromOptions (dacreate.c:109)
> ==2175==    by 0x8155C96: DACreate2d (da2.c:1847)
> ==2175==    by 0x804BE2A: main (ex19.c:107)
> ==2175==  Address 0x4433c70 is 0 bytes inside a block of size 3 alloc'd
> ==2175==    at 0x4023BF3: malloc (vg_replace_malloc.c:195)
> ==2175==    by 0x809F6EB: PetscOptionsSetValue (options.c:833)
> ==2175==    by 0x809DD95: PetscOptionsInsert (options.c:588)
> ==2175==    by 0x80AF4BD: PetscInitialize (pinit.c:629)
> ==2175==    by 0x804BA0C: main (ex19.c:96)
> ==2175==
> ==2175== Conditional jump or move depends on uninitialised value(s)
> ==2175==    at 0x43197BC: __strlen_sse2 (strlen.S:110)
> ==2175==    by 0x80BDE83: PetscStrallocpy (str.c:79)
> ==2175==    by 0x808368E: PetscFListGetPathAndFunction (reg.c:24)
> ==2175==    by 0x80846F3: PetscFListAdd (reg.c:238)
> ==2175==    by 0x829D9AC: MatRegister (matreg.c:139)
> ==2175==    by 0x86C6837: MatRegisterAll (matregis.c:85)
> ==2175==    by 0x82A0B71: MatInitializePackage (dlregismat.c:80)
> ==2175==    by 0x8542386: MatCreate (gcreate.c:72)
> ==2175==    by 0x8171B37: DAGetInterpolation_2D_Q1 (dainterp.c:308)
> ==2175==    by 0x8179179: DAGetInterpolation (dainterp.c:879)
> ==2175==    by 0x8197E93: DMGetInterpolation (dm.c:144)
> ==2175==    by 0x81BD897: DMMGSetDM (damg.c:250)
> ==2175==
> ==2175== Invalid read of size 8
> ==2175==    at 0x43BEC7D: __strcmp_ssse3 (strcmp-ssse3.S:1021)
> ==2175==    by 0x80842B8: PetscFListAdd (reg.c:223)
> ==2175==    by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227)
> ==2175==    by 0x8097550: PetscObjectComposeFunction (inherit.c:340)
> ==2175==    by 0x834D30B: MatCreate_MPIAIJ (mpiaij.c:5096)
> ==2175==    by 0x829D1E0: MatSetType (matreg.c:65)
> ==2175==    by 0x86EE543: MatCreate_AIJ (aijtype.c:35)
> ==2175==    by 0x829D1E0: MatSetType (matreg.c:65)
> ==2175==    by 0x8171C4C: DAGetInterpolation_2D_Q1 (dainterp.c:310)
> ==2175==    by 0x8179179: DAGetInterpolation (dainterp.c:879)
> ==2175==    by 0x8197E93: DMGetInterpolation (dm.c:144)
> ==2175==    by 0x81BD897: DMMGSetDM (damg.c:250)
> ==2175==  Address 0x44db848 is 24 bytes inside a block of size 28 alloc'd
> ==2175==    at 0x4022E01: memalign (vg_replace_malloc.c:532)
> ==2175==    by 0x808B1A0: PetscMallocAlign (mal.c:30)
> ==2175==    by 0x80BDF14: PetscStrallocpy (str.c:80)
> ==2175==    by 0x8084689: PetscFListAdd (reg.c:237)
> ==2175==    by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227)
> ==2175==    by 0x8097550: PetscObjectComposeFunction (inherit.c:340)
> ==2175==    by 0x834D296: MatCreate_MPIAIJ (mpiaij.c:5093)
> ==2175==    by 0x829D1E0: MatSetType (matreg.c:65)
> ==2175==    by 0x86EE543: MatCreate_AIJ (aijtype.c:35)
> ==2175==    by 0x829D1E0: MatSetType (matreg.c:65)
> ==2175==    by 0x8171C4C: DAGetInterpolation_2D_Q1 (dainterp.c:310)
> ==2175==    by 0x8179179: DAGetInterpolation (dainterp.c:879)
> ==2175==
> ==2174== Conditional jump or move depends on uninitialised value(s)
> ==2174==    at 0x80BDEDB: PetscStrallocpy (str.c:80)
> ==2174==    by 0x80A6193: PetscObjectSetOptionsPrefix (prefix.c:37)
> ==2174==    by 0x80A6564: PetscObjectAppendOptionsPrefix (prefix.c:70)
> ==2174==    by 0x87169FB: PCAppendOptionsPrefix (precon.c:1242)
> ==2174==    by 0x861C3B2: KSPAppendOptionsPrefix (itcl.c:121)
> ==2174==    by 0x85A5056: PCMGSetLevels (mg.c:195)
> ==2174==    by 0x81BE9AD: DMMGSetUpLevel (damg.c:379)
> ==2174==    by 0x81C5787: DMMGSetSNES (damgsnes.c:648)
> ==2174==    by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952)
> ==2174==    by 0x804C4FF: main (ex19.c:140)
> ==2174==
> ==2175== Invalid read of size 8
> ==2175==    at 0x43B674F: __memcpy_ssse3 (memcpy-ssse3.S:703)
> ==2175==    by 0x87923CB: MPIR_Allgatherv (allgatherv.c:340)
> ==2175==    by 0x879274D: PMPI_Allgatherv (allgatherv.c:997)
> ==2175==    by 0x83631F0: MatFDColoringCreate_MPIAIJ (fdmpiaij.c:89)
> ==2175==    by 0x854A377: MatFDColoringCreate (fdmatrix.c:385)
> ==2175==    by 0x81C5E65: DMMGSetSNES (damgsnes.c:712)
> ==2175==    by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952)
> ==2175==    by 0x804C4FF: main (ex19.c:140)
> ==2175==  Address 0x47cd128 is 0 bytes after a block of size 720 alloc'd
> ==2175==    at 0x4023BF3: malloc (vg_replace_malloc.c:195)
> ==2175==    by 0x8791B68: MPIR_Allgatherv (allgatherv.c:143)
> ==2175==    by 0x879274D: PMPI_Allgatherv (allgatherv.c:997)
> ==2175==    by 0x83631F0: MatFDColoringCreate_MPIAIJ (fdmpiaij.c:89)
> ==2175==    by 0x854A377: MatFDColoringCreate (fdmatrix.c:385)
> ==2175==    by 0x81C5E65: DMMGSetSNES (damgsnes.c:712)
> ==2175==    by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952)
> ==2175==    by 0x804C4FF: main (ex19.c:140)
> ==2175==
> lid velocity = 0.000287274, prandtl # = 1, grashof # = 1
> ==2175== Invalid read of size 8
> ==2175==    at 0x43BE324: __strcmp_ssse3 (strcmp-ssse3.S:142)
> ==2175==    by 0x8093FC2: PetscTypeCompare (destroy.c:254)
> ==2175==    by 0x85A956A: PCSetUp_MG (mg.c:585)
> ==2175==    by 0x8714039: PCSetUp (precon.c:795)
> ==2175==    by 0x8621C54: KSPSetUp (itfunc.c:237)
> ==2175==    by 0x8622C90: KSPSolve (itfunc.c:353)
> ==2175==    by 0x81B0295: SNES_KSPSolve (snes.c:2944)
> ==2175==    by 0x86A2C12: SNESSolve_LS (ls.c:191)
> ==2175==    by 0x81AB5EC: SNESSolve (snes.c:2255)
> ==2175==    by 0x81C4919: DMMGSolveSNES (damgsnes.c:510)
> ==2175==    by 0x81BDF6C: DMMGSolve (damg.c:313)
> ==2175==    by 0x804C9A7: main (ex19.c:155)
> ==2175==  Address 0x4798858 is 8 bytes inside a block of size 10 alloc'd
> ==2175==    at 0x4022E01: memalign (vg_replace_malloc.c:532)
> ==2175==    by 0x808B1A0: PetscMallocAlign (mal.c:30)
> ==2175==    by 0x80BDF14: PetscStrallocpy (str.c:80)
> ==2175==    by 0x80902DB: PetscObjectChangeTypeName (pname.c:114)
> ==2175==    by 0x8589AEE: PCSetType (pcset.c:79)
> ==2175==    by 0x85A4F3C: PCMGSetLevels (mg.c:187)
> ==2175==    by 0x81BE9AD: DMMGSetUpLevel (damg.c:379)
> ==2175==    by 0x81C5787: DMMGSetSNES (damgsnes.c:648)
> ==2175==    by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952)
> ==2175==    by 0x804C4FF: main (ex19.c:140)
> ==2175==
> ==2175== Conditional jump or move depends on uninitialised value(s)
> ==2175==    at 0x43BE348: __strcmp_ssse3 (strcmp-ssse3.S:150)
> ==2175==    by 0x80842B8: PetscFListAdd (reg.c:223)
> ==2175==    by 0x8086A53: PetscFListDuplicate (reg.c:596)
> ==2175==    by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511)
> ==2175==    by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630)
> ==2175==    by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731)
> ==2175==    by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464)
> ==2175==    by 0x85E7785: PCSetUp_ILU (ilu.c:204)
> ==2175==    by 0x8714039: PCSetUp (precon.c:795)
> ==2175==    by 0x8621C54: KSPSetUp (itfunc.c:237)
> ==2175==    by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753)
> ==2175==    by 0x8714602: PCSetUpOnBlocks (precon.c:828)
> ==2175==
> ==2174== Invalid read of size 8
> ==2174==    at 0x43BF10D: __strcmp_ssse3 (strcmp-ssse3.S:1446)
> ==2174==    by 0x80842B8: PetscFListAdd (reg.c:223)
> ==2174==    by 0x8086A53: PetscFListDuplicate (reg.c:596)
> ==2174==    by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511)
> ==2174==    by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630)
> ==2174==    by 
0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) > ==2174== by 0x82866FB: MatILUFactorSymb_BJacobi_Singleblock (bjacobi.c:753) > ==2175== by 0x8714602: PCSetUpOnBlocks (precon.c:828) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x43BF5CF: __strcmp_ssse3 (strcmp-ssse3.S:1899) > ==2175== by 0x80842B8: PetscFListAdd (reg.c:223) > ==2175== by 0x8086A53: PetscFListDuplicate (reg.c:596) > ==2175== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) > ==2175== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) > ==2175== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) > ==2175== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) > ==2175== by 0x85E7785: PCSetUp_ILU (ilu.c:204) > ==2175== by 0x8714039: PCSetUp (precon.c:795) > ==2175== by 0x8621C54: KSPSetUp (itfunc.c:237) > ==2175== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) > ==2175== by 0x8714602: PCSetUpOnBlocks (precon.c:828) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x43BF5D4: __strcmp_ssse3olic (matrix.c:5464) > ==2174== by 0x85E7785: PCSetUp_ILU (ilu.c:204) > ==2174== by 0x8714039: PCSetUp (precon.c:795) > ==2174== by 0x8621C54: KSPSetUp (itfunc.c:237) > ==2174== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) > ==2174== by 0x8714602: PCSetUpOnBlocks (precon.c:828) > ==2174== Address 0x5799948 is 24 bytes inside a block of size 30 alloc'd > ==2174== at 0x4022E01: memalign (vg_replace_malloc.c:532) > ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) > ==2174== by 0x80BDF14: PetscStrallocpy (str.c:80) > ==2174== by 0x8084689: PetscFListAdd (reg.c:237) > ==2174== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > ==2174== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > ==2174== by 0x82BE3A6: MatCreate_SeqAIJ (aij.c:3363) > ==2174== by 0x829D1E0: MatSetType (matreg.c:65) > ==2174== by 0x82C3982: MatGetFactor_seqaij_petsc (aijfact.c:118) > ==2174== by 0x8276893: MatGetFactor (matrix.c:3649) > ==2174== by 0x85E7687: PCSetUp_ILU (ilu.c:202) > = (strcmp-ssse3.S:1902) > ==2175== by 0x80842B8: PetscFListAdd (reg.c:223) > ==2175== by 0x8086A53: PetscFListDuplicate (reg.c:596) > ==2175== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) > ==2175== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) > ==2175== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) > ==2175== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) > ==2175== by 0x85E7785: PCSetUp_ILU (ilu.c:204) > ==2175== by 0x8714039: PCSetUp (precon.c:795) > ==2175== by 0x8621C54: KSPSetUp (itfunc.c:237) > ==2175== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) > ==2175== by 0x8714602: PCSetUpOnBlocks (precon.c:828) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x43BF5D9: __strcmp_ssse3 (strcmp-ssse3.S:1905) > ==2175== by 0x80842B8: PetscFListAdd (reg.c:223) > ==2175== by 0x8086A53: PetscFListDuplicate (reg.c:596) > ==2175== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) > ==2175== =2174== by 0x8714039: PCSetUp (precon.c:795) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x43BF12F: __strcmp_ssse3 (strcmp-ssse3.S:1456) > ==2174== by 0x80842B8: PetscFListAdd (reg.c:223) > ==2174== by 0x8086A53: PetscFListDuplicate (reg.c:596) > ==2174== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) > ==2174== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 
(aijfact.c:1630) > ==2174== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) > ==2174== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) > ==2174== by 0x85E7785: PCSetUp_ILU (ilu.c:204) > ==2174== by 0x8714039: PCSetUp (precon.c:795) > ==2174== by 0x8621C54: KSPSetUp (itfunc.c:237) > ==2174== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) > ==2174== by 0x8714602: PCSetUpOnBlocks (precon.c:828) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x43BF5C0: __strcmp_ssse3 (strcmp-ssse3.S:1890) > ==2174== by 0x80842Bby 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) > ==2175== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) > ==2175== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) > ==2175== by 0x85E7785: PCSetUp_ILU (ilu.c:204) > ==2175== by 0x8714039: PCSetUp (precon.c:795) > ==2175== by 0x8621C54: KSPSetUp (itfunc.c:237) > ==2175== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) > ==2175== by 0x8714602: PCSetUpOnBlocks (precon.c:828) > ==2175== > ==2175== Invalid read of size 8 > ==2175== at 0x43197BE: __strlen_sse2 (strlen.S:112) > ==2175== by 0x80BDE83: PetscStrallocpy (str.c:79) > ==2175== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) > ==2175== by 0x808432D: PetscFListAdd (reg.c:225) > ==2175== by 0x8086A53: PetscFListDuplicate (reg.c:596) > ==2175== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) > ==2175== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) > ==2175== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) > ==8: PetscFListAdd (reg.c:223) > ==2174== by 0x8086A53: PetscFListDuplicate (reg.c:596) > ==2174== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) > ==2174== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) > ==2174== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) > ==2174== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) > ==2174== by 0x85E7785: PCSetUp_ILU (ilu.c:204) > ==2174== by 0x8714039: PCSetUp (precon.c:795) > ==2174== by 0x8621C54: KSPSetUp (itfunc.c:237) > ==2174== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) > ==2174== by 0x8714602: PCSetUpOnBlocks (precon.c:828) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x43BF5C5: __strcmp_ssse3 (strcmp-ssse3.S:1893) > ==2174== by 0x80842B8: PetscFListAdd (reg.c:223) > ==2174== by 0x8086A53: PetscFListDuplicate (reg.c:596) > ==2174== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) > ==2174== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu02175== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) > ==2175== by 0x85E7785: PCSetUp_ILU (ilu.c:204) > ==2175== by 0x8714039: PCSetUp (precon.c:795) > ==2175== by 0x8621C54: KSPSetUp (itfunc.c:237) > ==2175== Address 0x45afcd0 is 32 bytes inside a block of size 35 alloc'd > ==2175== at 0x4022E01: memalign (vg_replace_malloc.c:532) > ==2175== by 0x808B1A0: PetscMallocAlign (mal.c:30) > ==2175== by 0x80BDF14: PetscStrallocpy (str.c:80) > ==2175== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) > ==2175== by 0x80846F3: PetscFListAdd (reg.c:238) > ==2175== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > ==2175== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > ==2175== by 0x82BE3A6: MatCreate_SeqAIJ (aij.c:3363) > ==2175== by 0x829D1E0: MatSetType (matreg.c:65) > ==2175== by 0x83393A9: MatMPIAIJSetPreallocation_MPIAIJ (mpiaij.c:2789) > ==2175== by 0x833E8AA: MatMPIAIJSetPreallocation 
(mpiaij.c:3488) > ==2175== by 0x818A96F: DAGetMatrix2d_MPIAIJ (fdda.c:779) > (aijfact.c:1630) > ==2174== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) > ==2174== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) > ==2174== by 0x85E7785: PCSetUp_ILU (ilu.c:204) > ==2174== by 0x8714039: PCSetUp (precon.c:795) > ==2174== by 0x8621C54: KSPSetUp (itfunc.c:237) > ==2174== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) > ==2174== by 0x8714602: PCSetUpOnBlocks (precon.c:828) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x43BF5CA: __strcmp_ssse3 (strcmp-ssse3.S:1896) > ==2174== by 0x80842B8: PetscFListAdd (reg.c:223) > ==2174== by 0x8086A53: PetscFListDuplicate (reg.c:596) > ==2174== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) > ==2174== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) > ==2174== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) > ==2174== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) > ==2174== by 0x85E7785: PCSetUp_ILU (ilu.c:204) > ==2174== ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x43BF5DE: __strcmp_ssse3 (strcmp-ssse3.S:1908) > ==2175== by 0x80842B8: PetscFListAdd (reg.c:223) > ==2175== by 0x8086A53: PetscFListDuplicate (reg.c:596) > ==2175== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) > ==2175== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) > ==2175== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) > ==2175== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) > ==2175== by 0x85E7785: PCSetUp_ILU (ilu.c:204) > ==2175== by 0x8714039: PCSetUp (precon.c:795) > ==2175== by 0x8621C54: KSPSetUp (itfunc.c:237) > ==2175== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) > ==2175== by 0x8714602: PCSetUpOnBlocks (precon.c:828) > ==2175== > ==2175== Invalid read of size 8 > ==2175== at 0x43BF165: __strcmp_ssse3 (strcmp-ssse3.S:1474) > ==2175== by 0x80842B8: PetscFListAdd (reg.c:223) > ==2175== by 0x8086A53: PetscFListDuplicate (reg.c:59 by 0x8714039: PCSetUp (precon.c:795) > ==2174== by 0x8621C54: KSPSetUp (itfunc.c:237) > ==2174== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) > ==2174== by 0x8714602: PCSetUpOnBlocks (precon.c:828) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x43BF5CF: __strcmp_ssse3 (strcmp-ssse3.S:1899) > ==2174== by 0x80842B8: PetscFListAdd (reg.c:223) > ==2174== by 0x8086A53: PetscFListDuplicate (reg.c:596) > ==2174== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) > ==2174== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) > ==2174== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) > ==2174== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) > ==2174== by 0x85E7785: PCSetUp_ILU (ilu.c:204) > ==2174== by 0x8714039: PCSetUp (precon.c:795) > ==2174== by 0x8621C54: KSPSetUp (itfunc.c:237) > ==2174== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) > ==2174== by 0x8714602: PCSetUpOnBlocks (pr6) > ==2175== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) > ==2175== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) > ==2175== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) > ==2175== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) > ==2175== by 0x85E7785: PCSetUp_ILU (ilu.c:204) > ==2175== by 0x8714039: PCSetUp (precon.c:795) > ==2175== by 0x8621C54: KSPSetUp (itfunc.c:237) > ==2175== by 
0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) > ==2175== by 0x8714602: PCSetUpOnBlocks (precon.c:828) > ==2175== Address 0x541be90 is 32 bytes inside a block of size 37 alloc'd > ==2175== at 0x4022E01: memalign (vg_replace_malloc.c:532) > ==2175== by 0x808B1A0: PetscMallocAlign (mal.c:30) > ==2175== by 0x80BDF14: PetscStrallocpy (str.c:80) > ==2175== by 0x8084689: PetscFListAdd (reg.c:237) > ==2175== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > ==2175== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > ==2175== by 0x82BEAecon.c:828) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x43BF5D4: __strcmp_ssse3 (strcmp-ssse3.S:1902) > ==2174== by 0x80842B8: PetscFListAdd (reg.c:223) > ==2174== by 0x8086A53: PetscFListDuplicate (reg.c:596) > ==2174== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) > ==2174== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) > ==2174== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) > ==2174== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) > ==2174== by 0x85E7785: PCSetUp_ILU (ilu.c:204) > ==2174== by 0x8714039: PCSetUp (precon.c:795) > ==2174== by 0x8621C54: KSPSetUp (itfunc.c:237) > ==2174== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) > ==2174== by 0x8714602: PCSetUpOnBlocks (precon.c:828) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x43BF5D9: __strcmp_ssse3 (strcmp-ssse3.S:1905) > ==2174== by 0x80842B8: PetscFListAdd (reg.c:223) > ==2174=33: MatCreate_SeqAIJ (aij.c:3408) > ==2175== by 0x829D1E0: MatSetType (matreg.c:65) > ==2175== by 0x82C3982: MatGetFactor_seqaij_petsc (aijfact.c:118) > ==2175== by 0x8276893: MatGetFactor (matrix.c:3649) > ==2175== by 0x85E7687: PCSetUp_ILU (ilu.c:202) > ==2175== by 0x8714039: PCSetUp (precon.c:795) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x43BF18F: __strcmp_ssse3 (strcmp-ssse3.S:1485) > ==2175== by 0x80842B8: PetscFListAdd (reg.c:223) > ==2175== by 0x8086A53: PetscFListDuplicate (reg.c:596) > ==2175== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) > ==2175== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) > ==2175== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) > ==2175== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) > ==2175== by 0x85E7785: PCSetUp_ILU (ilu.c:204) > ==2175== by 0x8714039: PCSetUp (precon.c:795) > ==2175== by 0x8621C54: KSPSetUp (itfunc.c:237) > ==2175== by 0x859D9CE: PCSetUpOnBlock= by 0x8086A53: PetscFListDuplicate (reg.c:596) > ==2174== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) > ==2174== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) > ==2174== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) > ==2174== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) > ==2174== by 0x85E7785: PCSetUp_ILU (ilu.c:204) > ==2174== by 0x8714039: PCSetUp (precon.c:795) > ==2174== by 0x8621C54: KSPSetUp (itfunc.c:237) > ==2174== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) > ==2174== by 0x8714602: PCSetUpOnBlocks (precon.c:828) > ==2174== > ==2174== Invalid read of size 8 > ==2174== at 0x43197BE: __strlen_sse2 (strlen.S:112) > ==2174== by 0x80BDE83: PetscStrallocpy (str.c:79) > ==2174== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) > ==2174== by 0x808432D: PetscFListAdd (reg.c:225) > ==2174== by 0x8086A53: PetscFListDuplicate (reg.c:596) > ==2174== by 0x82BFA68: 
MatDuplicateNoCreate_SeqAIJ (aij.c:3511) > ==2174== by 0x82s_BJacobi_Singleblock (bjacobi.c:753) > ==2175== by 0x8714602: PCSetUpOnBlocks (precon.c:828) > ==2175== > D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) > ==2174== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) > ==2174== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) > ==2174== by 0x85E7785: PCSetUp_ILU (ilu.c:204) > ==2174== by 0x8714039: PCSetUp (precon.c:795) > ==2174== by 0x8621C54: KSPSetUp (itfunc.c:237) > ==2174== Address 0x4642530 is 32 bytes inside a block of size 35 alloc'd > ==2174== at 0x4022E01: memalign (vg_replace_malloc.c:532) > ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) > ==2174== by 0x80BDF14: PetscStrallocpy (str.c:80) > ==2174== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) > ==2174== by 0x80846F3: PetscFListAdd (reg.c:238) > ==2174== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > ==2174== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > ==2174== by 0x82BE3A6: MatCreate_SeqAIJ (aij.c:3363) > ==2174== by 0x829D1E0: MatSetType (matreg.c:65) > ==2174== by 0x83393A9: MatMPIAIJSetPreallocation_MPIAIJ (mpiaij.c:2789) > ==2174== by 0x833E8AA: MatMPIAIJSetPreallocation (mpiaij.c:3488) > ==2174== by 0x818A96F: DAGetMatrix2d_MPIAIJ (fdda.c:779) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x43BF5DE: __strcmp_ssse3 (strcmp-ssse3.S:1908) > ==2174== by 0x80842B8: PetscFListAdd (reg.c:223) > ==2174== by 0x8086A53: PetscFListDuplicate (reg.c:596) > ==2174== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) > ==2174== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) > ==2174== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) > ==2174== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) > ==2174== by 0x85E7785: PCSetUp_ILU (ilu.c:204) > ==2174== by 0x8714039: PCSetUp (precon.c:795) > ==2174== by 0x8621C54: KSPSetUp (itfunc.c:237) > ==2174== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) > ==2174== by 0x8714602: PCSetUpOnBlocks (precon.c:828) > ==2174== > ==2174== Invalid read of size 8 > ==2174== at 0x43BF165: __strcmp_ssse3 (strcmp-ssse3.S:1474) > ==2174== by 0x80842B8: PetscFListAdd (reg.c:223) > ==2174== by 0x8086A53: PetscFListDuplicate (reg.c:596) > ==2174== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) > ==2174== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) > ==2174== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) > ==2174== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) > ==2174== by 0x85E7785: PCSetUp_ILU (ilu.c:204) > ==2174== by 0x8714039: PCSetUp (precon.c:795) > ==2174== by 0x8621C54: KSPSetUp (itfunc.c:237) > ==2174== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) > ==2174== by 0x8714602: PCSetUpOnBlocks (precon.c:828) > ==2174== Address 0x54138b0 is 32 bytes inside a block of size 37 alloc'd > ==2174== at 0x4022E01: memalign (vg_replace_malloc.c:532) > ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) > ==2174== by 0x80BDF14: PetscStrallocpy (str.c:80) > ==2174== by 0x8084689: PetscFListAdd (reg.c:237) > ==2174== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > ==2174== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > ==2174== by 0x82BEA33: MatCreate_SeqAIJ (aij.c:3408) > ==2174== by 0x829D1E0: MatSetType (matreg.c:65) > ==2174== by 0x82C3982: MatGetFactor_seqaij_petsc (aijfact.c:118) > ==2174== by 0x8276893: MatGetFactor (matrix.c:3649) > ==2174== by 
0x85E7687: PCSetUp_ILU (ilu.c:202) > ==2174== by 0x8714039: PCSetUp (precon.c:795) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x43BF18F: __strcmp_ssse3 (strcmp-ssse3.S:1485) > ==2174== by 0x80842B8: PetscFListAdd (reg.c:223) > ==2174== by 0x8086A53: PetscFListDuplicate (reg.c:596) > ==2174== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) > ==2174== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) > ==2174== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) > ==2174== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) > ==2174== by 0x85E7785: PCSetUp_ILU (ilu.c:204) > ==2174== by 0x8714039: PCSetUp (precon.c:795) > ==2174== by 0x8621C54: KSPSetUp (itfunc.c:237) > ==2174== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) > ==2174== by 0x8714602: PCSetUpOnBlocks (precon.c:828) > ==2174== > Number of Newton iterations = 2 > ==2175== Invalid read of size 8 > ==2175== at 0x43BEFE9: __strcmp_ssse3 (strcmp-ssse3.S:1339) > ==2175== by 0x80842B8: PetscFListAdd (reg.c:223) > ==2175== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > ==2175== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > ==2175== by 0x86A4028: SNESDestroy_LS (ls.c:322) > ==2175== by 0x81A578E: SNESDestroy (snes.c:1406) > ==2175== by 0x8093606: PetscObjectDestroy (destroy.c:172) > ==2175== by 0x81BCE39: DMMGDestroy (damg.c:179) > ==2175== by 0x804CBD4: main (ex19.c:174) > ==2175== Address 0x4b28898 is 24 bytes inside a block of size 28 alloc'd > ==2175== at 0x4022E01: memalign (vg_replace_malloc.c:532) > ==2175== by 0x808B1A0: PetscMallocAlign (mal.c:30) > ==2175== by 0x80BDF14: PetscStrallocpy (str.c:80) > ==2175== by 0x8084689: PetscFListAdd (reg.c:237) > ==2175== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > ==2175== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > ==2175== by 0x86AA093: SNESCreate_LS (ls.c:1199) > ==2175== by 0x81AC1EF: SNESSetType (snes.c:2353) > ==2175== by 0x819BDE2: SNESSetFromOptions (snes.c:306) > ==2175== by 0x81C69A8: DMMGSetFromOptions (damgsnes.c:818) > ==2175== by 0x804C56B: main (ex19.c:141) > ==2175== > ==2175== Invalid read of size 8 > ==2175== at 0x43BF015: __strcmp_ssse3 (strcmp-ssse3.S:1354) > ==2175== by 0x80842B8: PetscFListAdd (reg.c:223) > ==2175== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > ==2175== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > ==2175== by 0x86A4028: SNESDestroy_LS (ls.c:322) > ==2175== by 0x81A578E: SNESDestroy (snes.c:1406) > ==2175== by 0x8093606: PetscObjectDestroy (destroy.c:172) > ==2175== by 0x81BCE39: DMMGDestroy (damg.c:179) > ==2175== by 0x804CBD4: main (ex19.c:174) > ==2175== Address 0x4b28898 is 24 bytes inside a block of size 28 alloc'd > ==2175== at 0x4022E01: memalign (vg_replace_malloc.c:532) > ==2175== by 0x808B1A0: PetscMallocAlign (mal.c:30) > ==2175== by 0x80BDF14: PetscStrallocpy (str.c:80) > ==2175== by 0x8084689: PetscFListAdd (reg.c:237) > ==2175== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > ==2175== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > ==2175== by 0x86AA093: SNESCreate_LS (ls.c:1199) > ==2175== by 0x81AC1EF: SNESSetType (snes.c:2353) > ==2175== by 0x819BDE2: SNESSetFromOptions (snes.c:306) > ==2175== by 0x81C69A8: DMMGSetFromOptions (damgsnes.c:818) > ==2175== by 0x804C56B: main (ex19.c:141) > ==2175== > ==2175== Invalid read of size 8 > ==2175== at 0x43BF04A: __strcmp_ssse3 (strcmp-ssse3.S:1369) > ==2175== by 0x80842B8: 
PetscFListAdd (reg.c:223) > ==2175== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > ==2175== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > ==2175== by 0x86A4028: SNESDestroy_LS (ls.c:322) > ==2175== by 0x81A578E: SNESDestroy (snes.c:1406) > ==2175== by 0x8093606: PetscObjectDestroy (destroy.c:172) > ==2175== by 0x81BCE39: DMMGDestroy (damg.c:179) > ==2175== by 0x804CBD4: main (ex19.c:174) > ==2175== Address 0x4b288a0 is 4 bytes after a block of size 28 alloc'd > ==2175== at 0x4022E01: memalign (vg_replace_malloc.c:532) > ==2175== by 0x808B1A0: PetscMallocAlign (mal.c:30) > ==2175== by 0x80BDF14: PetscStrallocpy (str.c:80) > ==2175== by 0x8084689: PetscFListAdd (reg.c:237) > ==2175== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > ==2175== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > ==2175== by 0x86AA093: SNESCreate_LS (ls.c:1199) > ==2175== by 0x81AC1EF: SNESSetType (snes.c:2353) > ==2175== by 0x819BDE2: SNESSetFromOptions (snes.c:306) > ==2175== by 0x81C69A8: DMMGSetFromOptions (damgsnes.c:818) > ==2175== by 0x804C56B: main (ex19.c:141) > ==2175== > ==2174== Invalid read of size 8 > ==2174== at 0x43BEFE9: __strcmp_ssse3 (strcmp-ssse3.S:1339) > ==2174== by 0x80842B8: PetscFListAdd (reg.c:223) > ==2174== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > ==2174== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > ==2174== by 0x86A4028: SNESDestroy_LS (ls.c:322) > ==2174== by 0x81A578E: SNESDestroy (snes.c:1406) > ==2174== by 0x8093606: PetscObjectDestroy (destroy.c:172) > ==2174== by 0x81BCE39: DMMGDestroy (damg.c:179) > ==2174== by 0x804CBD4: main (ex19.c:174) > ==2174== Address 0x4be1e28 is 24 bytes inside a block of size 28 alloc'd > ==2174== at 0x4022E01: memalign (vg_replace_malloc.c:532) > ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) > ==2174== by 0x80BDF14: PetscStrallocpy (str.c:80) > ==2174== by 0x8084689: PetscFListAdd (reg.c:237) > ==2174== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > ==2174== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > ==2174== by 0x86AA093: SNESCreate_LS (ls.c:1199) > ==2174== by 0x81AC1EF: SNESSetType (snes.c:2353) > ==2174== by 0x819BDE2: SNESSetFromOptions (snes.c:306) > ==2174== by 0x81C69A8: DMMGSetFromOptions (damgsnes.c:818) > ==2174== by 0x804C56B: main (ex19.c:141) > ==2174== > ==2174== Invalid read of size 8 > ==2174== at 0x43BF015: __strcmp_ssse3 (strcmp-ssse3.S:1354) > ==2174== by 0x80842B8: PetscFListAdd (reg.c:223) > ==2174== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > ==2174== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > ==2174== by 0x86A4028: SNESDestroy_LS (ls.c:322) > ==2174== by 0x81A578E: SNESDestroy (snes.c:1406) > ==2174== by 0x8093606: PetscObjectDestroy (destroy.c:172) > ==2174== by 0x81BCE39: DMMGDestroy (damg.c:179) > ==2174== by 0x804CBD4: main (ex19.c:174) > ==2174== Address 0x4be1e28 is 24 bytes inside a block of size 28 alloc'd > ==2174== at 0x4022E01: memalign (vg_replace_malloc.c:532) > ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) > ==2174== by 0x80BDF14: PetscStrallocpy (str.c:80) > ==2174== by 0x8084689: PetscFListAdd (reg.c:237) > ==2174== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > ==2174== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > ==2174== by 0x86AA093: SNESCreate_LS (ls.c:1199) > ==2174== by 0x81AC1EF: SNESSetType (snes.c:2353) > ==2174== by 0x819BDE2: SNESSetFromOptions (snes.c:306) > ==2174== by 0x81C69A8: 
DMMGSetFromOptions (damgsnes.c:818) > ==2174== by 0x804C56B: main (ex19.c:141) > ==2174== > ==2174== Invalid read of size 8 > ==2174== at 0x43BF04A: __strcmp_ssse3 (strcmp-ssse3.S:1369) > ==2174== by 0x80842B8: PetscFListAdd (reg.c:223) > ==2174== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > ==2174== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > ==2174== by 0x86A4028: SNESDestroy_LS (ls.c:322) > ==2174== by 0x81A578E: SNESDestroy (snes.c:1406) > ==2174== by 0x8093606: PetscObjectDestroy (destroy.c:172) > ==2174== by 0x81BCE39: DMMGDestroy (damg.c:179) > ==2174== by 0x804CBD4: main (ex19.c:174) > ==2174== Address 0x4be1e30 is 4 bytes after a block of size 28 alloc'd > ==2174== at 0x4022E01: memalign (vg_replace_malloc.c:532) > ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) > ==2174== by 0x80BDF14: PetscStrallocpy (str.c:80) > ==2174== by 0x8084689: PetscFListAdd (reg.c:237) > ==2174== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > ==2174== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > ==2174== by 0x86AA093: SNESCreate_LS (ls.c:1199) > ==2174== by 0x81AC1EF: SNESSetType (snes.c:2353) > ==2174== by 0x819BDE2: SNESSetFromOptions (snes.c:306) > ==2174== by 0x81C69A8: DMMGSetFromOptions (damgsnes.c:818) > ==2174== by 0x804C56B: main (ex19.c:141) > ==2174== > ==2175== Invalid read of size 8 > ==2175== at 0x43BF469: __strcmp_ssse3 (strcmp-ssse3.S:1765) > ==2175== by 0x80842B8: PetscFListAdd (reg.c:223) > ==2175== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > ==2175== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > ==2175== by 0x830D1DF: MatDestroy_SeqAIJ_Inode (inode2.c:62) > ==2175== by 0x82AB1E1: MatDestroy_SeqAIJ (aij.c:810) > ==2175== by 0x82595D1: MatDestroy (matrix.c:876) > ==2175== by 0x8327843: MatDestroy_MPIAIJ (mpiaij.c:900) > ==2175== by 0x82595D1: MatDestroy (matrix.c:876) > ==2175== by 0x8470F9F: MatDestroy_MPIMAIJ (maij.c:152) > ==2175== by 0x82595D1: MatDestroy (matrix.c:876) > ==2175== by 0x85A560D: PCDestroy_MG_Private (mg.c:232) > ==2175== Address 0x44df298 is 24 bytes inside a block of size 26 alloc'd > ==2175== at 0x4022E01: memalign (vg_replace_malloc.c:532) > ==2175== by 0x808B1A0: PetscMallocAlign (mal.c:30) > ==2175== by 0x80BDF14: PetscStrallocpy (str.c:80) > ==2175== by 0x8084689: PetscFListAdd (reg.c:237) > ==2175== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > ==2175== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > ==2175== by 0x830D905: MatCreate_SeqAIJ_Inode (inode2.c:101) > ==2175== by 0x82BEB04: MatCreate_SeqAIJ (aij.c:3414) > ==2175== by 0x829D1E0: MatSetType (matreg.c:65) > ==2175== by 0x83393A9: MatMPIAIJSetPreallocation_MPIAIJ (mpiaij.c:2789) > ==2175== by 0x833E8AA: MatMPIAIJSetPreallocation (mpiaij.c:3488) > ==2175== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) > ==2175== > ==2175== Invalid read of size 8 > ==2175== at 0x43BF46D: __strcmp_ssse3 (strcmp-ssse3.S:1766) > ==2175== by 0x80842B8: PetscFListAdd (reg.c:223) > ==2175== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > ==2175== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > ==2175== by 0x82AB357: MatDestroy_SeqAIJ (aij.c:815) > ==2175== by 0x82595D1: MatDestroy (matrix.c:876) > ==2175== by 0x8327843: MatDestroy_MPIAIJ (mpiaij.c:900) > ==2175== by 0x82595D1: MatDestroy (matrix.c:876) > ==2175== by 0x8470F9F: MatDestroy_MPIMAIJ (maij.c:152) > ==2175== by 0x82595D1: MatDestroy (matrix.c:876) > ==2175== by 0x85A560D: PCDestroy_MG_Private 
(mg.c:232) > ==2175== by 0x85A5A49: PCDestroy_MG (mg.c:257) > ==2175== Address 0x44dd918 is 24 bytes inside a block of size 28 alloc'd > ==2175== at 0x4022E01: memalign (vg_replace_malloc.c:532) > ==2175== by 0x808B1A0: PetscMallocAlign (mal.c:30) > ==2175== by 0x80BDF14: PetscStrallocpy (str.c:80) > ==2175== by 0x8084689: PetscFListAdd (reg.c:237) > ==2175== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > ==2175== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > ==2175== by 0x82BE442: MatCreate_SeqAIJ (aij.c:3369) > ==2175== by 0x829D1E0: MatSetType (matreg.c:65) > ==2175== by 0x83393A9: MatMPIAIJSetPreallocation_MPIAIJ (mpiaij.c:2789) > ==2175== by 0x833E8AA: MatMPIAIJSetPreallocation (mpiaij.c:3488) > ==2175== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) > ==2175== by 0x8179179: DAGetInterpolation (dainterp.c:879) > ==2175== > ==2175== Invalid read of size 8 > ==2175== at 0x43BF22D: __strcmp_ssse3 (strcmp-ssse3.S:1553) > ==2175== by 0x80842B8: PetscFListAdd (reg.c:223) > ==2175== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > ==2175== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > ==2175== by 0x82AB527: MatDestroy_SeqAIJ (aij.c:819) > ==2175== by 0x82595D1: MatDestroy (matrix.c:876) > ==2175== by 0x8327843: MatDestroy_MPIAIJ (mpiaij.c:900) > ==2175== by 0x82595D1: MatDestroy (matrix.c:876) > ==2175== by 0x8470F9F: MatDestroy_MPIMAIJ (maij.c:152) > ==2175== by 0x82595D1: MatDestroy (matrix.c:876) > ==2175== by 0x85A560D: PCDestroy_MG_Private (mg.c:232) > ==2175== by 0x85A5A49: PCDestroy_MG (mg.c:257) > ==2175== Address 0x44ddfa8 is 24 bytes inside a block of size 28 alloc'd > ==2175== at 0x4022E01: memalign (vg_replace_malloc.c:532) > ==2175== by 0x808B1A0: PetscMallocAlign (mal.c:30) > ==2175== by 0x80BDF14: PetscStrallocpy (str.c:80) > ==2175== by 0x8084689: PetscFListAdd (reg.c:237) > ==2175== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > ==2175== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > ==2175== by 0x82BE616: MatCreate_SeqAIJ (aij.c:3381) > ==2175== by 0x829D1E0: MatSetType (matreg.c:65) > ==2175== by 0x83393A9: MatMPIAIJSetPreallocation_MPIAIJ (mpiaij.c:2789) > ==2175== by 0x833E8AA: MatMPIAIJSetPreallocation (mpiaij.c:3488) > ==2175== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) > ==2175== by 0x8179179: DAGetInterpolation (dainterp.c:879) > ==2175== > ==2175== Invalid read of size 8 > ==2175== at 0x43BEFED: __strcmp_ssse3 (strcmp-ssse3.S:1340) > ==2175== by 0x80842B8: PetscFListAdd (reg.c:223) > ==2175== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > ==2175== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > ==2175== by 0x8327F64: MatDestroy_MPIAIJ (mpiaij.c:919) > ==2175== by 0x82595D1: MatDestroy (matrix.c:876) > ==2175== by 0x8470F9F: MatDestroy_MPIMAIJ (maij.c:152) > ==2175== by 0x82595D1: MatDestroy (matrix.c:876) > ==2175== by 0x85A560D: PCDestroy_MG_Private (mg.c:232) > ==2175== by 0x85A5A49: PCDestroy_MG (mg.c:257) > ==2175== by 0x870CC31: PCDestroy (precon.c:83) > ==2175== by 0x8627601: KSPDestroy (itfunc.c:695) > ==2175== Address 0x44db848 is 24 bytes inside a block of size 28 alloc'd > ==2175== at 0x4022E01: memalign (vg_replace_malloc.c:532) > ==2175== by 0x808B1A0: PetscMallocAlign (mal.c:30) > ==2175== by 0x80BDF14: PetscStrallocpy (str.c:80) > ==2175== by 0x8084689: PetscFListAdd (reg.c:237) > ==2175== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > ==2175== by 0x8097550: PetscObjectComposeFunction 
(inherit.c:340) > ==2175== by 0x834D296: MatCreate_MPIAIJ (mpiaij.c:5093) > ==2175== by 0x829D1E0: MatSetType (matreg.c:65) > ==2175== by 0x86EE543: MatCreate_AIJ (aijtype.c:35) > ==2175== by 0x829D1E0: MatSetType (matreg.c:65) > ==2175== by 0x8171C4C: DAGetInterpolation_2D_Q1 (dainterp.c:310) > ==2175== by 0x8179179: DAGetInterpolation (dainterp.c:879) > ==2175== > ==2174== Invalid read of size 8 > ==2174== at 0x43BF469: __strcmp_ssse3 (strcmp-ssse3.S:1765) > ==2174== by 0x80842B8: PetscFListAdd (reg.c:223) > ==2174== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > ==2174== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > ==2174== by 0x830D1DF: MatDestroy_SeqAIJ_Inode (inode2.c:62) > ==2174== by 0x82AB1E1: MatDestroy_SeqAIJ (aij.c:810) > ==2174== by 0x82595D1: MatDestroy (matrix.c:876) > ==2174== by 0x8327843: MatDestroy_MPIAIJ (mpiaij.c:900) > ==2174== by 0x82595D1: MatDestroy (matrix.c:876) > ==2174== by 0x8470F9F: MatDestroy_MPIMAIJ (maij.c:152) > ==2174== by 0x82595D1: MatDestroy (matrix.c:876) > ==2174== by 0x85A560D: PCDestroy_MG_Private (mg.c:232) > ==2174== Address 0x456f218 is 24 bytes inside a block of size 26 alloc'd > ==2174== at 0x4022E01: memalign (vg_replace_malloc.c:532) > ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) > ==2174== by 0x80BDF14: PetscStrallocpy (str.c:80) > ==2174== by 0x8084689: PetscFListAdd (reg.c:237) > ==2174== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > ==2174== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > ==2174== by 0x830D905: MatCreate_SeqAIJ_Inode (inode2.c:101) > ==2174== by 0x82BEB04: MatCreate_SeqAIJ (aij.c:3414) > ==2174== by 0x829D1E0: MatSetType (matreg.c:65) > ==2174== by 0x83393A9: MatMPIAIJSetPreallocation_MPIAIJ (mpiaij.c:2789) > ==2174== by 0x833E8AA: MatMPIAIJSetPreallocation (mpiaij.c:3488) > ==2174== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) > ==2174== > ==2174== Invalid read of size 8 > ==2174== at 0x43BF46D: __strcmp_ssse3 (strcmp-ssse3.S:1766) > ==2174== by 0x80842B8: PetscFListAdd (reg.c:223) > ==2174== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > ==2174== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > ==2174== by 0x82AB357: MatDestroy_SeqAIJ (aij.c:815) > ==2174== by 0x82595D1: MatDestroy (matrix.c:876) > ==2174== by 0x8327843: MatDestroy_MPIAIJ (mpiaij.c:900) > ==2174== by 0x82595D1: MatDestroy (matrix.c:876) > ==2174== by 0x8470F9F: MatDestroy_MPIMAIJ (maij.c:152) > ==2174== by 0x82595D1: MatDestroy (matrix.c:876) > ==2174== by 0x85A560D: PCDestroy_MG_Private (mg.c:232) > ==2174== by 0x85A5A49: PCDestroy_MG (mg.c:257) > ==2174== Address 0x456d898 is 24 bytes inside a block of size 28 alloc'd > ==2174== at 0x4022E01: memalign (vg_replace_malloc.c:532) > ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) > ==2174== by 0x80BDF14: PetscStrallocpy (str.c:80) > ==2174== by 0x8084689: PetscFListAdd (reg.c:237) > ==2174== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > ==2174== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > ==2174== by 0x82BE442: MatCreate_SeqAIJ (aij.c:3369) > ==2174== by 0x829D1E0: MatSetType (matreg.c:65) > ==2174== by 0x83393A9: MatMPIAIJSetPreallocation_MPIAIJ (mpiaij.c:2789) > ==2174== by 0x833E8AA: MatMPIAIJSetPreallocation (mpiaij.c:3488) > ==2174== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) > ==2174== by 0x8179179: DAGetInterpolation (dainterp.c:879) > ==2174== > ==2174== Invalid read of size 8 > ==2174== at 0x43BF22D: __strcmp_ssse3 (strcmp-ssse3.S:1553) > 
==2174== by 0x80842B8: PetscFListAdd (reg.c:223) > ==2174== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > ==2174== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > ==2174== by 0x82AB527: MatDestroy_SeqAIJ (aij.c:819) > ==2174== by 0x82595D1: MatDestroy (matrix.c:876) > ==2174== by 0x8327843: MatDestroy_MPIAIJ (mpiaij.c:900) > ==2174== by 0x82595D1: MatDestroy (matrix.c:876) > ==2174== by 0x8470F9F: MatDestroy_MPIMAIJ (maij.c:152) > ==2174== by 0x82595D1: MatDestroy (matrix.c:876) > ==2174== by 0x85A560D: PCDestroy_MG_Private (mg.c:232) > ==2174== by 0x85A5A49: PCDestroy_MG (mg.c:257) > ==2174== Address 0x456df28 is 24 bytes inside a block of size 28 alloc'd > ==2174== at 0x4022E01: memalign (vg_replace_malloc.c:532) > ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) > ==2174== by 0x80BDF14: PetscStrallocpy (str.c:80) > ==2174== by 0x8084689: PetscFListAdd (reg.c:237) > ==2174== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > ==2174== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > ==2174== by 0x82BE616: MatCreate_SeqAIJ (aij.c:3381) > ==2174== by 0x829D1E0: MatSetType (matreg.c:65) > ==2174== by 0x83393A9: MatMPIAIJSetPreallocation_MPIAIJ (mpiaij.c:2789) > ==2174== by 0x833E8AA: MatMPIAIJSetPreallocation (mpiaij.c:3488) > ==2174== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) > ==2174== by 0x8179179: DAGetInterpolation (dainterp.c:879) > ==2174== > ==2174== Invalid read of size 8 > ==2174== at 0x43BEFED: __strcmp_ssse3 (strcmp-ssse3.S:1340) > ==2174== by 0x80842B8: PetscFListAdd (reg.c:223) > ==2174== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > ==2174== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > ==2174== by 0x8327F64: MatDestroy_MPIAIJ (mpiaij.c:919) > ==2174== by 0x82595D1: MatDestroy (matrix.c:876) > ==2174== by 0x8470F9F: MatDestroy_MPIMAIJ (maij.c:152) > ==2174== by 0x82595D1: MatDestroy (matrix.c:876) > ==2174== by 0x85A560D: PCDestroy_MG_Private (mg.c:232) > ==2174== by 0x85A5A49: PCDestroy_MG (mg.c:257) > ==2174== by 0x870CC31: PCDestroy (precon.c:83) > ==2174== by 0x8627601: KSPDestroy (itfunc.c:695) > ==2174== Address 0x456b7c8 is 24 bytes inside a block of size 28 alloc'd > ==2174== at 0x4022E01: memalign (vg_replace_malloc.c:532) > ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) > ==2174== by 0x80BDF14: PetscStrallocpy (str.c:80) > ==2174== by 0x8084689: PetscFListAdd (reg.c:237) > ==2174== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > ==2174== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > ==2174== by 0x834D296: MatCreate_MPIAIJ (mpiaij.c:5093) > ==2174== by 0x829D1E0: MatSetType (matreg.c:65) > ==2174== by 0x86EE543: MatCreate_AIJ (aijtype.c:35) > ==2174== by 0x829D1E0: MatSetType (matreg.c:65) > ==2174== by 0x8171C4C: DAGetInterpolation_2D_Q1 (dainterp.c:310) > ==2174== by 0x8179179: DAGetInterpolation (dainterp.c:879) > ==2174== > ==2175== Invalid read of size 8 > ==2175== at 0x43B674F: __memcpy_ssse3 (memcpy-ssse3.S:703) > ==2175== by 0x8791BD2: MPIR_Allgatherv (allgatherv.c:160) > ==2175== by 0x879274D: PMPI_Allgatherv (allgatherv.c:997) > ==2175== by 0x83631F0: MatFDColoringCreate_MPIAIJ (fdmpiaij.c:89) > ==2175== by 0x854A377: MatFDColoringCreate (fdmatrix.c:385) > ==2175== by 0x81C5E65: DMMGSetSNES (damgsnes.c:712) > ==2175== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) > ==2175== by 0x804C4FF: main (ex19.c:140) > ==2175== Address 0x4c37f38 is 0 bytes after a block of size 360 alloc'd > ==2175== at 0x4022E01: memalign 
(vg_replace_malloc.c:532) > ==2175== by 0x808B1A0: PetscMallocAlign (mal.c:30) > ==2175== by 0x8132724: ISCreateGeneral (general.c:342) > ==2175== by 0x813BE92: ISColoringGetIS (iscoloring.c:161) > ==2175== by 0x836235A: MatFDColoringCreate_MPIAIJ (fdmpiaij.c:30) > ==2175== by 0x854A377: MatFDColoringCreate (fdmatrix.c:385) > ==2175== by 0x81C5E65: DMMGSetSNES (damgsnes.c:712) > ==2175== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) > ==2175== by 0x804C4FF: main (ex19.c:140) > ==2175== > ==2174== Invalid read of size 8 > ==2174== at 0x43B674F: __memcpy_ssse3 (memcpy-ssse3.S:703) > ==2174== by 0x8791BD2: MPIR_Allgatherv (allgatherv.c:160) > ==2174== by 0x879274D: PMPI_Allgatherv (allgatherv.c:997) > ==2174== by 0x83631F0: MatFDColoringCreate_MPIAIJ (fdmpiaij.c:89) > ==2174== by 0x854A377: MatFDColoringCreate (fdmatrix.c:385) > ==2174== by 0x81C5E65: DMMGSetSNES (damgsnes.c:712) > ==2174== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) > ==2174== by 0x804C4FF: main (ex19.c:140) > ==2174== Address 0x4cb0c68 is 0 bytes after a block of size 360 alloc'd > ==2174== at 0x4022E01: memalign (vg_replace_malloc.c:532) > ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) > ==2174== by 0x8132724: ISCreateGeneral (general.c:342) > ==2174== by 0x813BE92: ISColoringGetIS (iscoloring.c:161) > ==2174== by 0x836235A: MatFDColoringCreate_MPIAIJ (fdmpiaij.c:30) > ==2174== by 0x854A377: MatFDColoringCreate (fdmatrix.c:385) > ==2174== by 0x81C5E65: DMMGSetSNES (damgsnes.c:712) > ==2174== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) > ==2174== by 0x804C4FF: main (ex19.c:140) > ==2174== > ==2174== Invalid read of size 8 > ==2174== at 0x43B674F: __memcpy_ssse3 (memcpy-ssse3.S:703) > ==2174== by 0x87923CB: MPIR_Allgatherv (allgatherv.c:340) > ==2174== by 0x879274D: PMPI_Allgatherv (allgatherv.c:997) > ==2174== by 0x83631F0: MatFDColoringCreate_MPIAIJ (fdmpiaij.c:89) > ==2174== by 0x854A377: MatFDColoringCreate (fdmatrix.c:385) > ==2174== by 0x81C5E65: DMMGSetSNES (damgsnes.c:712) > ==2174== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) > ==2174== by 0x804C4FF: main (ex19.c:140) > ==2174== Address 0x4461998 is 0 bytes after a block of size 720 alloc'd > ==2174== at 0x4023BF3: malloc (vg_replace_malloc.c:195) > ==2174== by 0x8791B68: MPIR_Allgatherv (allgatherv.c:143) > ==2174== by 0x879274D: PMPI_Allgatherv (allgatherv.c:997) > ==2174== by 0x83631F0: MatFDColoringCreate_MPIAIJ (fdmpiaij.c:89) > ==2174== by 0x854A377: MatFDColoringCreate (fdmatrix.c:385) > ==2174== by 0x81C5E65: DMMGSetSNES (damgsnes.c:712) > ==2174== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) > ==2174== by 0x804C4FF: main (ex19.c:140) > ==2174== > lid velocity = 0.000287274, prandtl # = 1, grashof # = 1 > Number of Newton iterations = 2 > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x80A0A4A: PetscOptionsFindPair_Private (options.c:989) > ==2175== by 0x80A3E53: PetscOptionsGetString (options.c:1693) > ==2175== by 0x80B0A16: PetscFinalize (pinit.c:829) > ==2175== by 0x804CCA7: main (ex19.c:181) > ==2175== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x80A0A4A: PetscOptionsFindPair_Private (options.c:989) > ==2174== by 0x80A3E53: PetscOptionsGetString (options.c:1693) > ==2174== by 0x80B0A16: PetscFinalize (pinit.c:829) > ==2174== by 0x804CCA7: main (ex19.c:181) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x43BE348: __strcmp_ssse3 (strcmp-ssse3.S:150) > ==2174== by 
0x87E2922: PMI_Finalize (simple_pmi.c:398) > ==2174== by 0x87CB316: MPIDI_PG_Finalize (mpidi_pg.c:92) > ==2174== by 0x87C728D: MPID_Finalize (mpid_finalize.c:141) > ==2174== by 0x879FA8C: PMPI_Finalize (finalize.c:158) > ==2174== by 0x80B2276: PetscFinalize (pinit.c:973) > ==2174== by 0x804CCA7: main (ex19.c:181) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x43BF5B7: __strcmp_ssse3 (strcmp-ssse3.S:1887) > ==2174== by 0x87E2922: PMI_Finalize (simple_pmi.c:398) > ==2174== by 0x87CB316: MPIDI_PG_Finalize (mpidi_pg.c:92) > ==2174== by 0x87C728D: MPID_Finalize (mpid_finalize.c:141) > ==2174== by 0x879FA8C: PMPI_Finalize (finalize.c:158) > ==2174== by 0x80B2276: PetscFinalize (pinit.c:973) > ==2174== by 0x804CCA7: main (ex19.c:181) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x43BF5C0: __strcmp_ssse3 (strcmp-ssse3.S:1890) > ==2174== by 0x87E2922: PMI_Finalize (simple_pmi.c:398) > ==2174== by 0x87CB316: MPIDI_PG_Finalize (mpidi_pg.c:92) > ==2174== by 0x87C728D: MPID_Finalize (mpid_finalize.c:141) > ==2174== by 0x879FA8C: PMPI_Finalize (finalize.c:158) > ==2174== by 0x80B2276: PetscFinalize (pinit.c:973) > ==2174== by 0x804CCA7: main (ex19.c:181) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x43BF5C5: __strcmp_ssse3 (strcmp-ssse3.S:1893) > ==2174== by 0x87E2922: PMI_Finalize (simple_pmi.c:398) > ==2174== by 0x87CB316: MPIDI_PG_Finalize (mpidi_pg.c:92) > ==2174== by 0x87C728D: MPID_Finalize (mpid_finalize.c:141) > ==2174== by 0x879FA8C: PMPI_Finalize (finalize.c:158) > ==2174== by 0x80B2276: PetscFinalize (pinit.c:973) > ==2174== by 0x804CCA7: main (ex19.c:181) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x43BF5CA: __strcmp_ssse3 (strcmp-ssse3.S:1896) > ==2174== by 0x87E2922: PMI_Finalize (simple_pmi.c:398) > ==2174== by 0x87CB316: MPIDI_PG_Finalize (mpidi_pg.c:92) > ==2174== by 0x87C728D: MPID_Finalize (mpid_finalize.c:141) > ==2174== by 0x879FA8C: PMPI_Finalize (finalize.c:158) > ==2174== by 0x80B2276: PetscFinalize (pinit.c:973) > ==2174== by 0x804CCA7: main (ex19.c:181) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x43BF5CF: __strcmp_ssse3 (strcmp-ssse3.S:1899) > ==2174== by 0x87E2922: PMI_Finalize (simple_pmi.c:398) > ==2174== by 0x87CB316: MPIDI_PG_Finalize (mpidi_pg.c:92) > ==2174== by 0x87C728D: MPID_Finalize (mpid_finalize.c:141) > ==2174== by 0x879FA8C: PMPI_Finalize (finalize.c:158) > ==2174== by 0x80B2276: PetscFinalize (pinit.c:973) > ==2174== by 0x804CCA7: main (ex19.c:181) > ==2174== > ==2174== Conditional jump or move depends on uninitialised value(s) > ==2174== at 0x43BF5D4: __strcmp_ssse3 (strcmp-ssse3.S:1902) > ==2174== by 0x87E2922: PMI_Finalize (simple_pmi.c:398) > ==2174== by 0x87CB316: MPIDI_PG_Finalize (mpidi_pg.c:92) > ==2174== by 0x87C728D: MPID_Finalize (mpid_finalize.c:141) > ==2174== by 0x879FA8C: PMPI_Finalize (finalize.c:158) > ==2174== by 0x80B2276: PetscFinalize (pinit.c:973) > ==2174== by 0x804CCA7: main (ex19.c:181) > ==2174== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x43BE348: __strcmp_ssse3 (strcmp-ssse3.S:150) > ==2175== by 0x87E2922: PMI_Finalize (simple_pmi.c:398) > ==2175== by 0x87CB316: MPIDI_PG_Finalize (mpidi_pg.c:92) > ==2175== by 0x87C728D: MPID_Finalize (mpid_finalize.c:141) > ==2175== by 0x879FA8C: PMPI_Finalize (finalize.c:158) > ==2175== by 0x80B2276: 
PetscFinalize (pinit.c:973) > ==2175== by 0x804CCA7: main (ex19.c:181) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x43BF5B7: __strcmp_ssse3 (strcmp-ssse3.S:1887) > ==2175== by 0x87E2922: PMI_Finalize (simple_pmi.c:398) > ==2175== by 0x87CB316: MPIDI_PG_Finalize (mpidi_pg.c:92) > ==2175== by 0x87C728D: MPID_Finalize (mpid_finalize.c:141) > ==2175== by 0x879FA8C: PMPI_Finalize (finalize.c:158) > ==2175== by 0x80B2276: PetscFinalize (pinit.c:973) > ==2175== by 0x804CCA7: main (ex19.c:181) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x43BF5C0: __strcmp_ssse3 (strcmp-ssse3.S:1890) > ==2175== by 0x87E2922: PMI_Finalize (simple_pmi.c:398) > ==2175== by 0x87CB316: MPIDI_PG_Finalize (mpidi_pg.c:92) > ==2175== by 0x87C728D: MPID_Finalize (mpid_finalize.c:141) > ==2175== by 0x879FA8C: PMPI_Finalize (finalize.c:158) > ==2175== by 0x80B2276: PetscFinalize (pinit.c:973) > ==2175== by 0x804CCA7: main (ex19.c:181) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x43BF5C5: __strcmp_ssse3 (strcmp-ssse3.S:1893) > ==2175== by 0x87E2922: PMI_Finalize (simple_pmi.c:398) > ==2175== by 0x87CB316: MPIDI_PG_Finalize (mpidi_pg.c:92) > ==2175== by 0x87C728D: MPID_Finalize (mpid_finalize.c:141) > ==2175== by 0x879FA8C: PMPI_Finalize (finalize.c:158) > ==2175== by 0x80B2276: PetscFinalize (pinit.c:973) > ==2175== by 0x804CCA7: main (ex19.c:181) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x43BF5CA: __strcmp_ssse3 (strcmp-ssse3.S:1896) > ==2175== by 0x87E2922: PMI_Finalize (simple_pmi.c:398) > ==2175== by 0x87CB316: MPIDI_PG_Finalize (mpidi_pg.c:92) > ==2175== by 0x87C728D: MPID_Finalize (mpid_finalize.c:141) > ==2175== by 0x879FA8C: PMPI_Finalize (finalize.c:158) > ==2175== by 0x80B2276: PetscFinalize (pinit.c:973) > ==2175== by 0x804CCA7: main (ex19.c:181) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x43BF5CF: __strcmp_ssse3 (strcmp-ssse3.S:1899) > ==2175== by 0x87E2922: PMI_Finalize (simple_pmi.c:398) > ==2175== by 0x87CB316: MPIDI_PG_Finalize (mpidi_pg.c:92) > ==2175== by 0x87C728D: MPID_Finalize (mpid_finalize.c:141) > ==2175== by 0x879FA8C: PMPI_Finalize (finalize.c:158) > ==2175== by 0x80B2276: PetscFinalize (pinit.c:973) > ==2175== by 0x804CCA7: main (ex19.c:181) > ==2175== > ==2175== Conditional jump or move depends on uninitialised value(s) > ==2175== at 0x43BF5D4: __strcmp_ssse3 (strcmp-ssse3.S:1902) > ==2175== by 0x87E2922: PMI_Finalize (simple_pmi.c:398) > ==2175== by 0x87CB316: MPIDI_PG_Finalize (mpidi_pg.c:92) > ==2175== by 0x87C728D: MPID_Finalize (mpid_finalize.c:141) > ==2175== by 0x879FA8C: PMPI_Finalize (finalize.c:158) > ==2175== by 0x80B2276: PetscFinalize (pinit.c:973) > ==2175== by 0x804CCA7: main (ex19.c:181) > ==2175== > ==2174== > ==2174== HEAP SUMMARY: > ==2174== in use at exit: 160 bytes in 11 blocks > ==2174== total heap usage: 60,266 allocs, 60,255 frees, 51,015,236 bytes allocated > ==2174== > ==2174== LEAK SUMMARY: > ==2174== definitely lost: 40 bytes in 1 blocks > ==2174== indirectly lost: 120 bytes in 10 blocks > ==2174== possibly lost: 0 bytes in 0 blocks > ==2174== still reachable: 0 bytes in 0 blocks > ==2174== suppressed: 0 bytes in 0 blocks > ==2174== Rerun with --leak-check=full to see details of leaked memory > ==2174== > ==2174== For counts of detected and suppressed errors, rerun with: -v > ==2174== Use --track-origins=yes to see 
where uninitialised values come from
> ==2174== ERROR SUMMARY: 15690 errors from 164 contexts (suppressed: 0 from 0)
> ==2175==
> ==2175== HEAP SUMMARY:
> ==2175== in use at exit: 160 bytes in 11 blocks
> ==2175== total heap usage: 59,069 allocs, 59,058 frees, 49,630,900 bytes allocated
> ==2175==
> ==2175== LEAK SUMMARY:
> ==2175== definitely lost: 40 bytes in 1 blocks
> ==2175== indirectly lost: 120 bytes in 10 blocks
> ==2175== possibly lost: 0 bytes in 0 blocks
> ==2175== still reachable: 0 bytes in 0 blocks
> ==2175== suppressed: 0 bytes in 0 blocks
> ==2175== Rerun with --leak-check=full to see details of leaked memory
> ==2175==
> ==2175== For counts of detected and suppressed errors, rerun with: -v
> ==2175== Use --track-origins=yes to see where uninitialised values come from
> ==2175== ERROR SUMMARY: 15664 errors from 162 contexts (suppressed: 0 from 0)
>
>
> What is going on here? Shall I ignore those errors?
>
> Thanks a lot!
>
> Rebecca Xuefei YUAN
> Department of Applied Physics and Applied Mathematics
> Columbia University
> Tel:917-399-8032
> www.columbia.edu/~xy2102
>

From balay at mcs.anl.gov Mon Sep 6 20:43:06 2010
From: balay at mcs.anl.gov (Satish Balay)
Date: Mon, 6 Sep 2010 20:43:06 -0500 (CDT)
Subject: [petsc-users] valgrind error comes out when upgrade from ubuntu 8.04 LTS to 10.04
In-Reply-To: <196CED0F-3AD9-487C-AC88-FD94624407ED@mcs.anl.gov>
References: <20100906203101.9i11ikcdkos4gss4@cubmail.cc.columbia.edu> <196CED0F-3AD9-487C-AC88-FD94624407ED@mcs.anl.gov>
Message-ID:

I can't reproduce this. How did you build PETSc?

Satish

--------------
[configured with: ./configure CC=gcc FC=gfortran --download-mpich=1]
-------------
petsc:/sandbox/balay/petsc-3.1-p4/src/snes/examples/tutorials>cat /etc/issue
Ubuntu 10.04.1 LTS \n \l

petsc:/sandbox/balay/petsc-3.1-p4/src/snes/examples/tutorials>/sandbox/balay/petsc-3.1-p4/linux-gnu-c-debug/bin/mpiexec -n 2 valgrind -q --tool=memcheck ./ex19 -malloc off -da_grid_x 30 -da_grid_y 30
lid velocity = 0.000287274, prandtl # = 1, grashof # = 1
Number of Newton iterations = 2
lid velocity = 0.000287274, prandtl # = 1, grashof # = 1
Number of Newton iterations = 2
petsc:/sandbox/balay/petsc-3.1-p4/src/snes/examples/tutorials>

On Mon, 6 Sep 2010, Barry Smith wrote:

>
>    Looks like Ubuntu 10.04 is a terrible release without proper testing. All those are problems with the OS and system libraries. Unfortunately it makes it impossible to find errors in PETSc, since it gives all those meaningless errors.
>
>    Barry
>
> On Sep 6, 2010, at 7:31 PM, Rebecca Xuefei Yuan wrote:
>
> > Dear all,
> >
> > I upgraded my laptop from Ubuntu 8.04 LTS to 10.04. After the upgrade I reinstalled PETSc, but tons of valgrind errors now come out even though the code is unchanged. Then I tried
> >
> > ~/soft/petsc-3.1-p4/src/snes/examples/tutorials/ex19.c
> >
> > with the command:
> >
> > ~/soft/petsc-3.1-p4/externalpackages/mpich2-1.0.8/bin/mpiexec -np 2 valgrind --tool=memcheck ./ex19.exe -malloc off -da_grid_x 30 -da_grid_y 30
> >
> > as instructed in the FAQ. However, the errors are very long (I hope it is all right to post the full valgrind log here...), as follows:
> >
> > rebecca at YuanWork:~/linux/code/twoway/twoway_brandnew/trunk/set_a$ ~/soft/petsc-3.1-p4/externalpackages/mpich2-1.0.8/bin/mpiexec -np 2 valgrind --tool=memcheck ./ex19.exe -malloc off -da_grid_x 30 -da_grid_y 30
> > ==2174== Memcheck, a memory error detector
> > ==2175== Memcheck, a memory error detector
> > ==2175== Copyright (C) 2002-2009, and GNU GPL'd, by Julian Seward et al.
> > ==2175== Using Valgrind-3.5.0 and LibVEX; rerun with -h for copyright info > > ==2175== Command: ./ex19.exe -malloc off -da_grid_x 30 -da_grid_y 30 > > ==2175== > > ==2174== Copyright (C) 2002-2009, and GNU GPL'd, by Julian Seward et al. > > ==2174== Using Valgrind-3.5.0 and LibVEX; rerun with -h for copyright info > > ==2174== Command: ./ex19.exe -malloc off -da_grid_x 30 -da_grid_y 30 > > ==2174== > > ==2174== Conditional jump or move depends on uninitialised value(s) > > ==2174== at 0x400B217: _dl_relocate_object (do-rel.h:104) > > ==2174== by 0x40031D0: dl_main (rtld.c:2229) > > ==2174== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) > > ==2174== by 0x4000C6C: _dl_start (rtld.c:333) > > ==2174== by 0x4000856: ??? (in /lib/ld-2.11.1.so) > > ==2174== > > ==2174== Conditional jump or move depends on uninitialised value(s) > > ==2174== at 0x400AF0F: _dl_relocate_object (do-rel.h:117) > > ==2174== by 0x40031D0: dl_main (rtld.c:2229) > > ==2174== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) > > ==2174== by 0x4000C6C: _dl_start (rtld.c:333) > > ==2174== by 0x4000856: ??? (in /lib/ld-2.11.1.so) > > ==2174== > > ==2175== Conditional jump or move depends on uninitialised value(s) > > ==2175== at 0x400B217: _dl_relocate_object (do-rel.h:104) > > ==2175== by 0x40031D0: dl_main (rtld.c:2229) > > ==2175== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) > > ==2175== by 0x4000C6C: _dl_start (rtld.c:333) > > ==2175== by 0x4000856: ??? (in /lib/ld-2.11.1.so) > > ==2175== > > ==2175== Conditional jump or move depends on uninitialised value(s) > > ==2175== at 0x400AF0F: _dl_relocate_object (do-rel.h:117) > > ==2175== by 0x40031D0: dl_main (rtld.c:2229) > > ==2175== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) > > ==2175== by 0x4000C6C: _dl_start (rtld.c:333) > > ==2175== by 0x4000856: ??? (in /lib/ld-2.11.1.so) > > ==2175== > > ==2174== Conditional jump or move depends on uninitialised value(s) > > ==2174== at 0x400B27A: _dl_relocate_object (do-rel.h:127) > > ==2174== by 0x40031D0: dl_main (rtld.c:2229) > > ==2174== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) > > ==2174== by 0x4000C6C: _dl_start (rtld.c:333) > > ==2174== by 0x4000856: ??? (in /lib/ld-2.11.1.so) > > ==2174== > > ==2175== Conditional jump or move depends on uninitialised value(s) > > ==2175== at 0x400B27A: _dl_relocate_object (do-rel.h:127) > > ==2175== by 0x40031D0: dl_main (rtld.c:2229) > > ==2175== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) > > ==2175== by 0x4000C6C: _dl_start (rtld.c:333) > > ==2175== by 0x4000856: ??? (in /lib/ld-2.11.1.so) > > ==2175== > > ==2174== Conditional jump or move depends on uninitialised value(s) > > ==2174== at 0x400A5DE: _dl_relocate_object (do-rel.h:65) > > ==2174== by 0x40030FE: dl_main (rtld.c:2292) > > ==2174== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) > > ==2174== by 0x4000C6C: _dl_start (rtld.c:333) > > ==2174== by 0x4000856: ??? (in /lib/ld-2.11.1.so) > > ==2174== > > ==2174== Conditional jump or move depends on uninitialised value(s) > > ==2174== at 0x400A5E6: _dl_relocate_object (do-rel.h:68) > > ==2175== Conditional jump or move depends on uninitialised value(s) > > ==2175== at 0x400A5DE: _dl_relocate_object (do-rel.h:65) > > ==2175== by 0x40030FE: dl_main (rtld.c:2292) > > ==2175== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) > > ==2175== by 0x4000C6C: _dl_start (rtld.c:333) > > ==2175== by 0x4000856: ??? 
(in /lib/ld-2.11.1.so) > > ==2175== > > ==2175== Conditional jump or move depends on uninitialised value(s) > > ==2175== at 0x400A5E6: _dl_relocate_object (do-rel.h:68) > > ==2175== by 0x40030FE: dl_main (rtld.c:2292) > > ==2175== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) > > ==2175== by 0x4000C6C: _dl_start (rtld.c:333) > > ==2175== by 0x4000856: ??? (in /lib/ld-2.11.1.so) > > ==2175== > > ==2175== Conditional jump or move depends on uninitialised value(s) > > ==2175== at 0x400AF0F: _dl_relocate_object (do-rel.h:117) > > ==2175== by 0x40030FE: dl_main (rtld.c:2292) > > ==2175== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) > > ==2175== by 0x4000C6C: _dl_start (rtld.c:333) > > ==2175== by 0x4000856: ??? (in /lib/ld-2.11.1.so) > > ==2175== > > ==2174== by 0x40030FE: dl_main (rtld.c:2292) > > ==2174== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) > > ==2174== by 0x4000C6C: _dl_start (rtld.c:333) > > ==2174== by 0x4000856: ??? (in /lib/ld-2.11.1.so) > > ==2174== > > ==2174== Conditional jump or move depends on uninitialised value(s) > > ==2174== at 0x400AF0F: _dl_relocate_object (do-rel.h:117) > > ==2174== by 0x40030FE: dl_main (rtld.c:2292) > > ==2174== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) > > ==2174== by 0x4000C6C: _dl_start (rtld.c:333) > > ==2174== by 0x4000856: ??? (in /lib/ld-2.11.1.so) > > ==2174== > > ==2175== Conditional jump or move depends on uninitialised value(s) > > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > > ==2175== by 0x87E2BD2: T.206 (simple_pmi.c:985) > > ==2175== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) > > ==2175== by 0x87C742A: MPID_Init (mpid_init.c:331) > > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2175== by 0x804BA0C: main (ex19.c:96) > > ==2175== > > ==2175== Conditional jump or move depends on uninitialised value(s) > > ==2175== at 0x43197CB: __strlen_sse2 (strlen.S:116) > > ==2175== by 0x87E2BD2: T.206 (simple_pmi.c:985) > > ==2175== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) > > ==2175== by 0x87C742A: MPID_Init (mpid_init.c:331) > > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2175== by 0x804BA0C: main (ex19.c:96) > > ==2175== > > ==2175== Conditional jump or move depends on uninitialised value(s) > > ==2175== at 0x87F559E: PMIU_writeline (simple_pmiutil.c:180) > > ==2175== by 0x87E2BD2: T.206 (simple_pmi.c:985) > > ==2175== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) > > ==2175== by 0x87C742A: MPID_Init (mpid_init.c:331) > > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2175== by 0x804BA0C: main (ex19.c:96) > > ==2175== > > ==2175== Use of uninitialised value of size 4 > > ==2175== at 0x87F55A4: PMIU_writeline (simple_pmiutil.c:184) > > ==2175== by 0x87E2BD2: T.206 (simple_pmi.c:985) > > ==2175== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) > > ==2175== by 0x87C742A: MPID_Init (mpid_init.c:331) > > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2175== by 0x804BA0C: main (ex19.c:96) > > ==2175== > > ==2175== Syscall param write(count) contains uninitialised byte(s) > > ==2175== at 0x41B1EB3: __write_nocancel (syscall-template.S:82) > > 
==2175== by 0x87E2BD2: T.206 (simple_pmi.c:985) > > ==2175== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) > > ==2175== by 0x87C742A: MPID_Init (mpid_init.c:331) > > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2175== by 0x804BA0C: main (ex19.c:96) > > ==2175== > > ==2175== Conditional jump or move depends on uninitialised value(s) > > ==2175== at 0x87F566C: PMIU_writeline (simple_pmiutil.c:197) > > ==2175== by 0x87E2BD2: T.206 (simple_pmi.c:985) > > ==2175== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) > > ==2175== by 0x87C742A: MPID_Init (mpid_init.c:331) > > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2175== by 0x804BA0C: main (ex19.c:96) > > ==2175== > > ==2174== Conditional jump or move depends on uninitialised value(s) > > ==2174== at 0x4319794: __strlen_sse2 (strlen.S:93) > > ==2174== by 0x87E2BD2: T.206 (simple_pmi.c:985) > > ==2174== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) > > ==2174== by 0x87C742A: MPID_Init (mpid_init.c:331) > > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2174== by 0x804BA0C: main (ex19.c:96) > > ==2174== > > ==2174== Conditional jump or move depends on uninitialised value(s) > > ==2174== at 0x43197CB: __strlen_sse2 (strlen.S:116) > > ==2174== by 0x87E2BD2: T.206 (simple_pmi.c:985) > > ==2174== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) > > ==2174== by 0x87C742A: MPID_Init (mpid_init.c:331) > > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2174== by 0x804BA0C: main (ex19.c:96) > > ==2174== > > ==2174== Conditional jump or move depends on uninitialised value(s) > > ==2174== at 0x87F559E: PMIU_writeline (simple_pmiutil.c:180) > > ==2174== by 0x87E2BD2: T.206 (simple_pmi.c:985) > > ==2174== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) > > ==2174== by 0x87C742A: MPID_Init (mpid_init.c:331) > > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2174== by 0x804BA0C: main (ex19.c:96) > > ==2174== > > ==2174== Use of uninitialised value of size 4 > > ==2174== at 0x87F55A4: PMIU_writeline (simple_pmiutil.c:184) > > ==2174== by 0x87E2BD2: T.206 (simple_pmi.c:985) > > ==2174== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) > > ==2174== by 0x87C742A: MPID_Init (mpid_init.c:331) > > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2174== by 0x804BA0C: main (ex19.c:96) > > ==2174== > > ==2174== Syscall param write(count) contains uninitialised byte(s) > > ==2174== at 0x41B1EB3: __write_nocancel (syscall-template.S:82) > > ==2174== by 0x87E2BD2: T.206 (simple_pmi.c:985) > > ==2174== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) > > ==2174== by 0x87C742A: MPID_Init (mpid_init.c:331) > > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2174== by 0x804BA0C: main (ex19.c:96) > > ==2174== > > ==2174== Conditional jump or move depends on 
uninitialised value(s) > > ==2174== at 0x87F566C: PMIU_writeline (simple_pmiutil.c:197) > > ==2174== by 0x87E2BD2: T.206 (simple_pmi.c:985) > > ==2174== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) > > ==2174== by 0x87C742A: MPID_Init (mpid_init.c:331) > > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2174== by 0x804BA0C: main (ex19.c:96) > > ==2174== > > ==2175== Conditional jump or move depends on uninitialised value(s) > > ==2175== at 0x43BE348: __strcmp_ssse3 (strcmp-ssse3.S:150) > > ==2175== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) > > ==2175== by 0x87C74C7: MPID_Init (mpid_init.c:381) > > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2175== by 0x804BA0C: main (ex19.c:96) > > ==2175== > > ==2175== Conditional jump or move depends on uninitialised value(s) > > ==2175== at 0x43BF5B7: __strcmp_ssse3 (strcmp-ssse3.S:1887) > > ==2175== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) > > ==2175== by 0x87C74C7: MPID_Init (mpid_init.c:381) > > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2175== by 0x804BA0C: main (ex19.c:96) > > ==2175== > > ==2175== Conditional jump or move depends on uninitialised value(s) > > ==2175== at 0x43BF5C0: __strcmp_ssse3 (strcmp-ssse3.S:1890) > > ==2175== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) > > ==2175== by 0x87C74C7: MPID_Init (mpid_init.c:381) > > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2175== by 0x804BA0C: main (ex19.c:96) > > ==2175== > > ==2175== Conditional jump or move depends on uninitialised value(s) > > ==2175== at 0x43BF5C5: __strcmp_ssse3 (strcmp-ssse3.S:1893) > > ==2175== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) > > ==2175== by 0x87C74C7: MPID_Init (mpid_init.c:381) > > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2175== by 0x804BA0C: main (ex19.c:96) > > ==2175== > > ==2175== Conditional jump or move depends on uninitialised value(s) > > ==2175== at 0x43BF5CA: __strcmp_ssse3 (strcmp-ssse3.S:1896) > > ==2175== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) > > ==2175== by 0x87C74C7: MPID_Init (mpid_init.c:381) > > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2175== by 0x804BA0C: main (ex19.c:96) > > ==2175== > > ==2175== Conditional jump or move depends on uninitialised value(s) > > ==2175== at 0x43BE348: __strcmp_ssse3 (strcmp-ssse3.S:150) > > ==2175== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) > > ==2175== by 0x87CABAB: MPIDI_PG_InitConnKVS (mpidi_pg.c:753) > > ==2175== by 0x87C7526: MPID_Init (mpid_init.c:417) > > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2175== by 0x804BA0C: main (ex19.c:96) > > ==2175== > > ==2175== Conditional jump or move depends on uninitialised value(s) > > ==2175== at 0x43BF5B7: 
__strcmp_ssse3 (strcmp-ssse3.S:1887) > > ==2175== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) > > ==2175== by 0x87CABAB: MPIDI_PG_InitConnKVS (mpidi_pg.c:753) > > ==2175== by 0x87C7526: MPID_Init (mpid_init.c:417) > > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2175== by 0x804BA0C: main (ex19.c:96) > > ==2175== > > ==2175== Conditional jump or move depends on uninitialised value(s) > > ==2175== at 0x43BF5C0: __strcmp_ssse3 (strcmp-ssse3.S:1890) > > ==2175== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) > > ==2175== by 0x87CABAB: MPIDI_PG_InitConnKVS (mpidi_pg.c:753) > > ==2175== by 0x87C7526: MPID_Init (mpid_init.c:417) > > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2175== by 0x804BA0C: main (ex19.c:96) > > ==2175== > > ==2175== Conditional jump or move depends on uninitialised value(s) > > ==2175== at 0x43BF5C5: __strcmp_ssse3 (strcmp-ssse3.S:1893) > > ==2175== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) > > ==2175== by 0x87CABAB: MPIDI_PG_InitConnKVS (mpidi_pg.c:753) > > ==2175== by 0x87C7526: MPID_Init (mpid_init.c:417) > > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2175== by 0x804BA0C: main (ex19.c:96) > > ==2175== > > ==2175== Conditional jump or move depends on uninitialised value(s) > > ==2175== at 0x43BF5CA: __strcmp_ssse3 (strcmp-ssse3.S:1896) > > ==2175== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) > > ==2175== by 0x87CABAB: MPIDI_PG_InitConnKVS (mpidi_pg.c:753) > > ==2175== by 0x87C7526: MPID_Init (mpid_init.c:417) > > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2175== by 0x804BA0C: main (ex19.c:96) > > ==2175== > > ==2174== Conditional jump or move depends on uninitialised value(s) > > ==2174== at 0x43BE348: __strcmp_ssse3 (strcmp-ssse3.S:150) > > ==2174== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) > > ==2174== by 0x87C74C7: MPID_Init (mpid_init.c:381) > > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2174== by 0x804BA0C: main (ex19.c:96) > > ==2174== > > ==2174== Conditional jump or move depends on uninitialised value(s) > > ==2174== at 0x43BF5B7: __strcmp_ssse3 (strcmp-ssse3.S:1887) > > ==2174== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) > > ==2174== by 0x87C74C7: MPID_Init (mpid_init.c:381) > > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2174== by 0x804BA0C: main (ex19.c:96) > > ==2174== > > ==2174== Conditional jump or move depends on uninitialised value(s) > > ==2174== at 0x43BF5C0: __strcmp_ssse3 (strcmp-ssse3.S:1890) > > ==2174== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) > > ==2174== by 0x87C74C7: MPID_Init (mpid_init.c:381) > > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2174== by 0x804BA0C: main (ex19.c:96) > > ==2174== > > 
==2174== Conditional jump or move depends on uninitialised value(s) > > ==2174== at 0x43BF5C5: __strcmp_ssse3 (strcmp-ssse3.S:1893) > > ==2174== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) > > ==2174== by 0x87C74C7: MPID_Init (mpid_init.c:381) > > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2174== by 0x804BA0C: main (ex19.c:96) > > ==2174== > > ==2174== Conditional jump or move depends on uninitialised value(s) > > ==2174== at 0x43BF5CA: __strcmp_ssse3 (strcmp-ssse3.S:1896) > > ==2174== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) > > ==2174== by 0x87C74C7: MPID_Init (mpid_init.c:381) > > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2174== by 0x804BA0C: main (ex19.c:96) > > ==2174== > > ==2174== Conditional jump or move depends on uninitialised value(s) > > ==2174== at 0x43BE348: __strcmp_ssse3 (strcmp-ssse3.S:150) > > ==2174== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) > > ==2174== by 0x87CABAB: MPIDI_PG_InitConnKVS (mpidi_pg.c:753) > > ==2174== by 0x87C7526: MPID_Init (mpid_init.c:417) > > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2174== by 0x804BA0C: main (ex19.c:96) > > ==2174== > > ==2174== Conditional jump or move depends on uninitialised value(s) > > ==2174== at 0x43BF5B7: __strcmp_ssse3 (strcmp-ssse3.S:1887) > > ==2174== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) > > ==2174== by 0x87CABAB: MPIDI_PG_InitConnKVS (mpidi_pg.c:753) > > ==2174== by 0x87C7526: MPID_Init (mpid_init.c:417) > > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2174== by 0x804BA0C: main (ex19.c:96) > > ==2174== > > ==2174== Conditional jump or move depends on uninitialised value(s) > > ==2174== at 0x43BF5C0: __strcmp_ssse3 (strcmp-ssse3.S:1890) > > ==2174== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) > > ==2174== by 0x87CABAB: MPIDI_PG_InitConnKVS (mpidi_pg.c:753) > > ==2174== by 0x87C7526: MPID_Init (mpid_init.c:417) > > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2174== by 0x804BA0C: main (ex19.c:96) > > ==2174== > > ==2174== Conditional jump or move depends on uninitialised value(s) > > ==2174== at 0x43BF5C5: __strcmp_ssse3 (strcmp-ssse3.S:1893) > > ==2174== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) > > ==2174== by 0x87CABAB: MPIDI_PG_InitConnKVS (mpidi_pg.c:753) > > ==2174== by 0x87C7526: MPID_Init (mpid_init.c:417) > > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2174== by 0x804BA0C: main (ex19.c:96) > > ==2174== > > ==2174== Conditional jump or move depends on uninitialised value(s) > > ==2174== at 0x43BF5CA: __strcmp_ssse3 (strcmp-ssse3.S:1896) > > ==2174== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) > > ==2174== by 0x87CABAB: MPIDI_PG_InitConnKVS (mpidi_pg.c:753) > > ==2174== by 0x87C7526: MPID_Init (mpid_init.c:417) > > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > 
==2174== by 0x879F15A: PMPI_Init (init.c:106) > > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2174== by 0x804BA0C: main (ex19.c:96) > > ==2174== > > ==2175== Conditional jump or move depends on uninitialised value(s) > > ==2175== at 0x4319863: __GI_strlen (strlen.S:138) > > ==2175== by 0x438AD43: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:191) > > ==2175== by 0x438A685: gethostbyname (getXXbyYY.c:117) > > ==2175== by 0x87CC54F: MPIDU_CH3U_GetSockInterfaceAddr (ch3u_getinterfaces.c:130) > > ==2175== by 0x87CBC56: MPIDI_CH3U_Get_business_card_sock (ch3u_connect_sock.c:428) > > ==2175== by 0x87F413B: MPIDI_CH3U_Init_sock (ch3u_init_sock.c:79) > > ==2175== by 0x87EE511: MPIDI_CH3_Init (ch3_init.c:43) > > ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) > > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2175== by 0x804BA0C: main (ex19.c:96) > > ==2175== > > ==2175== Conditional jump or move depends on uninitialised value(s) > > ==2175== at 0x431986D: __GI_strlen (strlen.S:144) > > ==2175== by 0x438AD43: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:191) > > ==2175== by 0x438A685: gethostbyname (getXXbyYY.c:117) > > ==2175== by 0x87CC54F: MPIDU_CH3U_GetSockInterfaceAddr (ch3u_getinterfaces.c:130) > > ==2175== by 0x87CBC56: MPIDI_CH3U_Get_business_card_sock (ch3u_connect_sock.c:428) > > ==2175== by 0x87F413B: MPIDI_CH3U_Init_sock (ch3u_init_sock.c:79) > > ==2175== by 0x87EE511: MPIDI_CH3_Init (ch3_init.c:43) > > ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) > > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2175== by 0x804BA0C: main (ex19.c:96) > > ==2175== > > ==2174== Conditional jump or move depends on uninitialised value(s) > > ==2174== at 0x4319863: __GI_strlen (strlen.S:138) > > ==2174== by 0x438AD43: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:191) > > ==2174== by 0x438A685: gethostbyname (getXXbyYY.c:117) > > ==2174== by 0x87CC54F: MPIDU_CH3U_GetSockInterfaceAddr (ch3u_getinterfaces.c:130) > > ==2174== by 0x87CBC56: MPIDI_CH3U_Get_business_card_sock (ch3u_connect_sock.c:428) > > ==2174== by 0x87F413B: MPIDI_CH3U_Init_sock (ch3u_init_sock.c:79) > > ==2174== by 0x87EE511: MPIDI_CH3_Init (ch3_init.c:43) > > ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) > > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2174== by 0x804BA0C: main (ex19.c:96) > > ==2174== > > ==2174== Conditional jump or move depends on uninitialised value(s) > > ==2174== at 0x431986D: __GI_strlen (strlen.S:144) > > ==2174== by 0x438AD43: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:191) > > ==2174== by 0x438A685: gethostbyname (getXXbyYY.c:117) > > ==2174== by 0x87CC54F: MPIDU_CH3U_GetSockInterfaceAddr (ch3u_getinterfaces.c:130) > > ==2174== by 0x87CBC56: MPIDI_CH3U_Get_business_card_sock (ch3u_connect_sock.c:428) > > ==2174== by 0x87F413B: MPIDI_CH3U_Init_sock (ch3u_init_sock.c:79) > > ==2174== by 0x87EE511: MPIDI_CH3_Init (ch3_init.c:43) > > ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) > > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2174== by 0x804BA0C: main (ex19.c:96) > > ==2174== > > ==2175== Invalid 
read of size 4 > > ==2175== at 0x431983B: __GI_strlen (strlen.S:115) > > ==2175== by 0x43843CE: __nss_lookup (nsswitch.c:191) > > ==2175== by 0x438596E: __nss_hosts_lookup2 (XXX-lookup.c:76) > > ==2175== by 0x438AF0F: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:200) > > ==2175== by 0x438A685: gethostbyname (getXXbyYY.c:117) > > ==2175== by 0x87CC54F: MPIDU_CH3U_GetSockInterfaceAddr (ch3u_getinterfaces.c:130) > > ==2175== by 0x87CBC56: MPIDI_CH3U_Get_business_card_sock (ch3u_connect_sock.c:428) > > ==2175== by 0x87F413B: MPIDI_CH3U_Init_sock (ch3u_init_sock.c:79) > > ==2175== by 0x87EE511: MPIDI_CH3_Init (ch3_init.c:43) > > ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) > > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > > ==2175== Address 0x44308f8 is 40 bytes inside a block of size 42 alloc'd > > ==2175== at 0x4023BF3: malloc (vg_replace_malloc.c:195) > > ==2175== by 0x4384583: nss_parse_service_list (nsswitch.c:622) > > ==2175== by 0x4384E71: __nss_database_lookup (nsswitch.c:775) > > ==2175== by 0x43859AD: __nss_hosts_lookup2 (XXX-lookup.c:71) > > ==2175== by 0x438AF0F: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:200) > > ==2175== by 0x438A685: gethostbyname (getXXbyYY.c:117) > > ==2175== by 0x87CC54F: MPIDU_CH3U_GetSockInterfaceAddr (ch3u_getinterfaces.c:130) > > ==2175== by 0x87CBC56: MPIDI_CH3U_Get_business_card_sock (ch3u_connect_sock.c:428) > > ==2175== by 0x87F413B: MPIDI_CH3U_Init_sock (ch3u_init_sock.c:79) > > ==2175== by 0x87EE511: MPIDI_CH3_Init (ch3_init.c:43) > > ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) > > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2175== > > ==2174== Invalid read of size 4 > > ==2174== at 0x431983B: __GI_strlen (strlen.S:115) > > ==2174== by 0x43843CE: __nss_lookup (nsswitch.c:191) > > ==2174== by 0x438596E: __nss_hosts_lookup2 (XXX-lookup.c:76) > > ==2174== by 0x438AF0F: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:200) > > ==2174== by 0x438A685: gethostbyname (getXXbyYY.c:117) > > ==2174== by 0x87CC54F: MPIDU_CH3U_GetSockInterfaceAddr (ch3u_getinterfaces.c:130) > > ==2174== by 0x87CBC56: MPIDI_CH3U_Get_business_card_sock (ch3u_connect_sock.c:428) > > ==2174== by 0x87F413B: MPIDI_CH3U_Init_sock (ch3u_init_sock.c:79) > > ==2174== by 0x87EE511: MPIDI_CH3_Init (ch3_init.c:43) > > ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) > > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > > ==2174== Address 0x44308f8 is 40 bytes inside a block of size 42 alloc'd > > ==2174== at 0x4023BF3: malloc (vg_replace_malloc.c:195) > > ==2174== by 0x4384583: nss_parse_service_list (nsswitch.c:622) > > ==2174== by 0x4384E71: __nss_database_lookup (nsswitch.c:775) > > ==2174== by 0x43859AD: __nss_hosts_lookup2 (XXX-lookup.c:71) > > ==2174== by 0x438AF0F: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:200) > > ==2174== by 0x438A685: gethostbyname (getXXbyYY.c:117) > > ==2174== by 0x87CC54F: MPIDU_CH3U_GetSockInterfaceAddr (ch3u_getinterfaces.c:130) > > ==2174== by 0x87CBC56: MPIDI_CH3U_Get_business_card_sock (ch3u_connect_sock.c:428) > > ==2174== by 0x87F413B: MPIDI_CH3U_Init_sock (ch3u_init_sock.c:79) > > ==2174== by 0x87EE511: MPIDI_CH3_Init (ch3_init.c:43) > > ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) > > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2174== > > ==2174== Conditional jump or move depends on uninitialised value(s) > > ==2174== at 0x400B217: _dl_relocate_object (do-rel.h:104) > > 
==2174== by 0x4011D15: dl_open_worker (dl-open.c:367) > > ==2174== by 0x400D875: _dl_catch_error (dl-error.c:178) > > ==2174== by 0x4011675: _dl_open (dl-open.c:583) > > ==2174== by 0x43AA4A1: do_dlopen (dl-libc.c:86) > > ==2174== by 0x400D875: _dl_catch_error (dl-error.c:178) > > ==2174== by 0x43AA5A0: dlerror_run (dl-libc.c:47) > > ==2174== by 0x43AA6BA: __libc_dlopen_mode (dl-libc.c:160) > > ==2174== by 0x43842E4: __nss_lookup_function (nsswitch.c:405) > > ==2174== by 0x43843CE: __nss_lookup (nsswitch.c:191) > > ==2174== by 0x438596E: __nss_hosts_lookup2 (XXX-lookup.c:76) > > ==2174== by 0x438AF0F: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:200) > > ==2174== > > ==2174== Conditional jump or move depends on uninitialised value(s) > > ==2174== at 0x400AF0F: _dl_relocate_object (do-rel.h:117) > > ==2174== by 0x4011D15: dl_open_worker (dl-open.c:367) > > ==2174== by 0x400D875: _dl_catch_error (dl-error.c:178) > > ==2174== by 0x4011675: _dl_open (dl-open.c:583) > > ==2174== by 0x43AA4A1: do_dlopen (dl-libc.c:86) > > ==2174== by 0x400D875: _dl_catch_error (dl-error.c:178) > > ==2174== by 0x43AA5A0: dlerror_run (dl-libc.c:47) > > ==2174== by 0x43AA6BA: __libc_dlopen_mode (dl-libc.c:160) > > ==2174== by 0x43842E4: __nss_lookup_function (nsswitch.c:405) > > ==2174== by 0x43843CE: __nss_lookup (nsswitch.c:191) > > ==2174== by 0x438596E: __nss_hosts_lookup2 (XXX-lookup.c:76) > > ==2174== by 0x438AF0F: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:200) > > ==2174== > > ==2175== Conditional jump or move depends on uninitialised value(s) > > ==2175== at 0x400B217: _dl_relocate_object (do-rel.h:104) > > ==2175== by 0x4011D15: dl_open_worker (dl-open.c:367) > > ==2175== by 0x400D875: _dl_catch_error (dl-error.c:178) > > ==2175== by 0x4011675: _dl_open (dl-open.c:583) > > ==2175== by 0x43AA4A1: do_dlopen (dl-libc.c:86) > > ==2175== by 0x400D875: _dl_catch_error (dl-error.c:178) > > ==2175== by 0x43AA5A0: dlerror_run (dl-libc.c:47) > > ==2175== by 0x43AA6BA: __libc_dlopen_mode (dl-libc.c:160) > > ==2175== by 0x43842E4: __nss_lookup_function (nsswitch.c:405) > > ==2175== by 0x43843CE: __nss_lookup (nsswitch.c:191) > > ==2175== by 0x438596E: __nss_hosts_lookup2 (XXX-lookup.c:76) > > ==2175== by 0x438AF0F: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:200) > > ==2175== > > ==2175== Conditional jump or move depends on uninitialised value(s) > > ==2175== at 0x400AF0F: _dl_relocate_object (do-rel.h:117) > > ==2175== by 0x4011D15: dl_open_worker (dl-open.c:367) > > ==2175== by 0x400D875: _dl_catch_error (dl-error.c:178) > > ==2175== by 0x4011675: _dl_open (dl-open.c:583) > > ==2175== by 0x43AA4A1: do_dlopen (dl-libc.c:86) > > ==2175== by 0x400D875: _dl_catch_error (dl-error.c:178) > > ==2175== by 0x43AA5A0: dlerror_run (dl-libc.c:47) > > ==2175== by 0x43AA6BA: __libc_dlopen_mode (dl-libc.c:160) > > ==2175== by 0x43842E4: __nss_lookup_function (nsswitch.c:405) > > ==2175== by 0x43843CE: __nss_lookup (nsswitch.c:191) > > ==2175== by 0x438596E: __nss_hosts_lookup2 (XXX-lookup.c:76) > > ==2175== by 0x438AF0F: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:200) > > ==2175== > > ==2174== Conditional jump or move depends on uninitialised value(s) > > ==2174== at 0x4319794: __strlen_sse2 (strlen.S:93) > > ==2174== by 0x87E19C7: GetResponse (simple_pmi.c:1049) > > ==2174== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) > > ==2174== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) > > ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > > ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) > > ==2174== by 0x879F65C: 
MPIR_Init_thread (initthread.c:288) > > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2174== by 0x804BA0C: main (ex19.c:96) > > ==2174== > > ==2174== Conditional jump or move depends on uninitialised value(s) > > ==2174== at 0x43197CB: __strlen_sse2 (strlen.S:116) > > ==2174== by 0x87E19C7: GetResponse (simple_pmi.c:1049) > > ==2174== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) > > ==2174== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) > > ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > > ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) > > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2174== by 0x804BA0C: main (ex19.c:96) > > ==2174== > > ==2174== Conditional jump or move depends on uninitialised value(s) > > ==2174== at 0x87F559E: PMIU_writeline (simple_pmiutil.c:180) > > ==2174== by 0x87E19C7: GetResponse (simple_pmi.c:1049) > > ==2174== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) > > ==2174== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) > > ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > > ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) > > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2174== by 0x804BA0C: main (ex19.c:96) > > ==2174== > > ==2174== Use of uninitialised value of size 4 > > ==2174== at 0x87F55A4: PMIU_writeline (simple_pmiutil.c:184) > > ==2174== by 0x87E19C7: GetResponse (simple_pmi.c:1049) > > ==2174== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) > > ==2174== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) > > ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > > ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) > > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2174== by 0x804BA0C: main (ex19.c:96) > > ==2174== > > ==2174== Syscall param write(count) contains uninitialised byte(s) > > ==2174== at 0x41B1EB3: __write_nocancel (syscall-template.S:82) > > ==2174== by 0x87E19C7: GetResponse (simple_pmi.c:1049) > > ==2174== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) > > ==2174== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) > > ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > > ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) > > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2174== by 0x804BA0C: main (ex19.c:96) > > ==2174== > > ==2174== Conditional jump or move depends on uninitialised value(s) > > ==2174== at 0x87F566C: PMIU_writeline (simple_pmiutil.c:197) > > ==2174== by 0x87E19C7: GetResponse (simple_pmi.c:1049) > > ==2174== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) > > ==2174== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) > > ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > > ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) > > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2174== by 0x804BA0C: main (ex19.c:96) > > ==2174== > > ==2174== Conditional jump or move depends on uninitialised value(s) > > ==2174== at 
0x43BE348: __strcmp_ssse3 (strcmp-ssse3.S:150) > > ==2174== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) > > ==2174== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) > > ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > > ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) > > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2174== by 0x804BA0C: main (ex19.c:96) > > ==2174== > > ==2174== Conditional jump or move depends on uninitialised value(s) > > ==2174== at 0x43BF5B7: __strcmp_ssse3 (strcmp-ssse3.S:1887) > > ==2174== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) > > ==2174== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) > > ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > > ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) > > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2174== by 0x804BA0C: main (ex19.c:96) > > ==2174== > > ==2174== Conditional jump or move depends on uninitialised value(s) > > ==2174== at 0x43BF5C0: __strcmp_ssse3 (strcmp-ssse3.S:1890) > > ==2174== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) > > ==2174== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) > > ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > > ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) > > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2174== by 0x804BA0C: main (ex19.c:96) > > ==2174== > > ==2174== Conditional jump or move depends on uninitialised value(s) > > ==2174== at 0x43BF5C5: __strcmp_ssse3 (strcmp-ssse3.S:1893) > > ==2174== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) > > ==2174== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) > > ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > > ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) > > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2174== by 0x804BA0C: main (ex19.c:96) > > ==2174== > > ==2174== Conditional jump or move depends on uninitialised value(s) > > ==2174== at 0x43BF5CA: __strcmp_ssse3 (strcmp-ssse3.S:1896) > > ==2174== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) > > ==2174== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) > > ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > > ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) > > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2174== by 0x804BA0C: main (ex19.c:96) > > ==2174== > > ==2175== Conditional jump or move depends on uninitialised value(s) > > ==2175== at 0x4319794: __strlen_sse2 (strlen.S:93) > > ==2175== by 0x87E19C7: GetResponse (simple_pmi.c:1049) > > ==2175== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) > > ==2175== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) > > ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > > ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) > > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2175== by 0x804BA0C: main (ex19.c:96) > > ==2175== > > 
==2175== Conditional jump or move depends on uninitialised value(s) > > ==2175== at 0x43197CB: __strlen_sse2 (strlen.S:116) > > ==2175== by 0x87E19C7: GetResponse (simple_pmi.c:1049) > > ==2175== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) > > ==2175== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) > > ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > > ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) > > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2175== by 0x804BA0C: main (ex19.c:96) > > ==2175== > > ==2175== Conditional jump or move depends on uninitialised value(s) > > ==2175== at 0x87F559E: PMIU_writeline (simple_pmiutil.c:180) > > ==2175== by 0x87E19C7: GetResponse (simple_pmi.c:1049) > > ==2175== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) > > ==2175== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) > > ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > > ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) > > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2175== by 0x804BA0C: main (ex19.c:96) > > ==2175== > > ==2175== Use of uninitialised value of size 4 > > ==2175== at 0x87F55A4: PMIU_writeline (simple_pmiutil.c:184) > > ==2175== by 0x87E19C7: GetResponse (simple_pmi.c:1049) > > ==2175== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) > > ==2175== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) > > ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > > ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) > > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2175== by 0x804BA0C: main (ex19.c:96) > > ==2175== > > ==2175== Syscall param write(count) contains uninitialised byte(s) > > ==2175== at 0x41B1EB3: __write_nocancel (syscall-template.S:82) > > ==2175== by 0x87E19C7: GetResponse (simple_pmi.c:1049) > > ==2175== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) > > ==2175== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) > > ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > > ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) > > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2175== by 0x804BA0C: main (ex19.c:96) > > ==2175== > > ==2175== Conditional jump or move depends on uninitialised value(s) > > ==2175== at 0x87F566C: PMIU_writeline (simple_pmiutil.c:197) > > ==2175== by 0x87E19C7: GetResponse (simple_pmi.c:1049) > > ==2175== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) > > ==2175== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) > > ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > > ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) > > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2175== by 0x804BA0C: main (ex19.c:96) > > ==2175== > > ==2175== Conditional jump or move depends on uninitialised value(s) > > ==2175== at 0x43BE348: __strcmp_ssse3 (strcmp-ssse3.S:150) > > ==2175== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) > > ==2175== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) > > ==2175== by 0x87EE522: 
MPIDI_CH3_Init (ch3_init.c:48) > > ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) > > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2175== by 0x804BA0C: main (ex19.c:96) > > ==2175== > > ==2175== Conditional jump or move depends on uninitialised value(s) > > ==2175== at 0x43BF5B7: __strcmp_ssse3 (strcmp-ssse3.S:1887) > > ==2175== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) > > ==2175== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) > > ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > > ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) > > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2175== by 0x804BA0C: main (ex19.c:96) > > ==2175== > > ==2175== Conditional jump or move depends on uninitialised value(s) > > ==2175== at 0x43BF5C0: __strcmp_ssse3 (strcmp-ssse3.S:1890) > > ==2175== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) > > ==2175== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) > > ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > > ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) > > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2175== by 0x804BA0C: main (ex19.c:96) > > ==2175== > > ==2175== Conditional jump or move depends on uninitialised value(s) > > ==2175== at 0x43BF5C5: __strcmp_ssse3 (strcmp-ssse3.S:1893) > > ==2175== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) > > ==2175== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) > > ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > > ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) > > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2175== by 0x804BA0C: main (ex19.c:96) > > ==2175== > > ==2175== Conditional jump or move depends on uninitialised value(s) > > ==2175== at 0x43BF5CA: __strcmp_ssse3 (strcmp-ssse3.S:1896) > > ==2175== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) > > ==2175== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) > > ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > > ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) > > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2175== by 0x804BA0C: main (ex19.c:96) > > ==2175== > > ==2174== Conditional jump or move depends on uninitialised value(s) > > ==2174== at 0x43BE348: __strcmp_ssse3 (strcmp-ssse3.S:150) > > ==2174== by 0x87CAF44: MPIDI_PG_SetConnInfo (mpidi_pg.c:579) > > ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > > ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) > > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2174== by 0x804BA0C: main (ex19.c:96) > > ==2174== > > ==2174== Conditional jump or move depends on uninitialised value(s) > > ==2174== at 0x43BF5B7: __strcmp_ssse3 (strcmp-ssse3.S:1887) > > ==2174== by 0x87CAF44: MPIDI_PG_SetConnInfo (mpidi_pg.c:579) > > ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > > ==2174== by 0x87C7554: MPID_Init 
(mpid_init.c:92) > > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2174== by 0x804BA0C: main (ex19.c:96) > > ==2174== > > ==2174== Conditional jump or move depends on uninitialised value(s) > > ==2174== at 0x43BF5C0: __strcmp_ssse3 (strcmp-ssse3.S:1890) > > ==2174== by 0x87CAF44: MPIDI_PG_SetConnInfo (mpidi_pg.c:579) > > ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > > ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) > > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2174== by 0x804BA0C: main (ex19.c:96) > > ==2174== > > ==2174== Conditional jump or move depends on uninitialised value(s) > > ==2174== at 0x43BF5C5: __strcmp_ssse3 (strcmp-ssse3.S:1893) > > ==2174== by 0x87CAF44: MPIDI_PG_SetConnInfo (mpidi_pg.c:579) > > ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > > ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) > > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2174== by 0x804BA0C: main (ex19.c:96) > > ==2174== > > ==2174== Conditional jump or move depends on uninitialised value(s) > > ==2174== at 0x43BF5CA: __strcmp_ssse3 (strcmp-ssse3.S:1896) > > ==2174== by 0x87CAF44: MPIDI_PG_SetConnInfo (mpidi_pg.c:579) > > ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > > ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) > > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2174== by 0x804BA0C: main (ex19.c:96) > > ==2174== > > ==2174== Conditional jump or move depends on uninitialised value(s) > > ==2174== at 0x43BF5CF: __strcmp_ssse3 (strcmp-ssse3.S:1899) > > ==2174== by 0x87CAF44: MPIDI_PG_SetConnInfo (mpidi_pg.c:579) > > ==2174== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > > ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) > > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2174== by 0x879F15A: PMPI_Init (init.c:106) > > ==2174== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2174== by 0x804BA0C: main (ex19.c:96) > > ==2174== > > ==2175== Conditional jump or move depends on uninitialised value(s) > > ==2175== at 0x43BE348: __strcmp_ssse3 (strcmp-ssse3.S:150) > > ==2175== by 0x87CAF44: MPIDI_PG_SetConnInfo (mpidi_pg.c:579) > > ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > > ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) > > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2175== by 0x804BA0C: main (ex19.c:96) > > ==2175== > > ==2175== Conditional jump or move depends on uninitialised value(s) > > ==2175== at 0x43BF5B7: __strcmp_ssse3 (strcmp-ssse3.S:1887) > > ==2175== by 0x87CAF44: MPIDI_PG_SetConnInfo (mpidi_pg.c:579) > > ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > > ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) > > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2175== by 0x804BA0C: main (ex19.c:96) > > ==2175== > > ==2175== Conditional jump or move depends on uninitialised value(s) 
> > ==2175== at 0x43BF5C0: __strcmp_ssse3 (strcmp-ssse3.S:1890) > > ==2175== by 0x87CAF44: MPIDI_PG_SetConnInfo (mpidi_pg.c:579) > > ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > > ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) > > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2175== by 0x804BA0C: main (ex19.c:96) > > ==2175== > > ==2175== Conditional jump or move depends on uninitialised value(s) > > ==2175== at 0x43BF5C5: __strcmp_ssse3 (strcmp-ssse3.S:1893) > > ==2175== by 0x87CAF44: MPIDI_PG_SetConnInfo (mpidi_pg.c:579) > > ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > > ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) > > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2175== by 0x804BA0C: main (ex19.c:96) > > ==2175== > > ==2175== Conditional jump or move depends on uninitialised value(s) > > ==2175== at 0x43BF5CA: __strcmp_ssse3 (strcmp-ssse3.S:1896) > > ==2175== by 0x87CAF44: MPIDI_PG_SetConnInfo (mpidi_pg.c:579) > > ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > > ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) > > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > > ==2175== by 0x80AEE12: PetscInitialize (pinit.c:561) > > ==2175== by 0x804BA0C: main (ex19.c:96) > > ==2175== > > ==2175== Conditional jump or move depends on uninitialised value(s) > > ==2175== at 0x43BF5CF: __strcmp_ssse3 (strcmp-ssse3.S:1899) > > ==2175== by 0x87CAF44: MPIDI_PG_SetConnInfo (mpidi_pg.c:579) > > ==2175== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) > > ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) > > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2175== by 0x879F15A: PMPI_Init (init.c:106) > > ==2175== by 0x80AEE12: PetscI==2174== Invalid read of size 4 > > ==2174== at 0x431983B: __GI_strlen (strlen.S:115) > > ==2174== by 0x43843CE: __nss_lookup (nsswitch.c:191) > > ==2174== by 0x438564E: __nss_passwd_lookup2 (XXX-lookup.c:76) > > ==2174== by 0x433D0DE: getpwuid_r@@GLIBC_2.1.2 (getXXbyYY_r.c:200) > > ==2174== by 0x433C92E: getpwuid (getXXbyYY.c:117) > > ==2174== by 0x80BC53A: PetscGetUserName (fuser.c:66) > > ==2174== by 0x80822FF: PetscErrorPrintfInitialize (errtrace.c:68) > > ==2174== by 0x80AEED1: PetscInitialize (pinit.c:576) > > ==2174== by 0x804BA0C: main (ex19.c:96) > > ==2174== Address 0x4430700 is 40 bytes inside a block of size 43 alloc'd > > ==2174== at 0x4023BF3: malloc (vg_replace_malloc.c:195) > > ==2174== by 0x4384583: nss_parse_service_list (nsswitch.c:622) > > ==2174== by 0x4384E71: __nss_database_lookup (nsswitch.c:775) > > ==2174== by 0x43859AD: __nss_hosts_lookup2 (XXX-lookup.c:71) > > ==2174== by 0x438AF0F: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:200) > > ==2174== by 0x438A685: gethostbyname (getXXbyYY.c:117) > > ==2174== nitialize (pinit.c:561) > > ==2175== by 0x804BA0C: main (ex19.c:96) > > ==2175== > > by 0x87CC54F: MPIDU_CH3U_GetSockInterfaceAddr (ch3u_getinterfaces.c:130) > > ==2174== by 0x87CBC56: MPIDI_CH3U_Get_business_card_sock (ch3u_connect_sock.c:428) > > ==2174== by 0x87F413B: MPIDI_CH3U_Init_sock (ch3u_init_sock.c:79) > > ==2174== by 0x87EE511: MPIDI_CH3_Init (ch3_init.c:43) > > ==2174== by 0x87C7554: MPID_Init (mpid_init.c:92) > > ==2174== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2174== 
> > ==2175== Invalid read of size 4 > > ==2175== at 0x431983B: __GI_strlen (strlen.S:115) > > ==2175== by 0x43843CE: __nss_lookup (nsswitch.c:191) > > ==2175== by 0x438564E: __nss_passwd_lookup2 (XXX-lookup.c:76) > > ==2175== by 0x433D0DE: getpwuid_r@@GLIBC_2.1.2 (getXXbyYY_r.c:200) > > ==2175== by 0x433C92E: getpwuid (getXXbyYY.c:117) > > ==2175== by 0x80BC53A: PetscGetUserName (fuser.c:66) > > ==2175== by 0x80822FF: PetscErrorPrintfInitialize (errtrace.c:68) > > ==2175== by 0x80AEED1: PetscInitialize (pinit.c:576) > > ==2175== by 0x804BA0C: main (ex19.c:96) > > ==2175== Address 0x4430700 is 40 bytes inside a block of size 43 alloc'd > > ==2175== at 0x4023BF3: malloc (vg_replace_malloc.c:195) > > ==2175== by 0x4384583: nss_parse_service_list (nsswitch.c:622) > > ==2175== by 0x4384E71: __nss_database_lookup (nsswitch.c:775) > > ==2175== by 0x43859AD: __nss_hosts_lookup2 (XXX-lookup.c:71) > > ==2175== by 0x438AF0F: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:200) > > ==2175== by 0x438A685: gethostbyname (getXXbyYY.c:117) > > ==2175== by 0x87CC54F: MPIDU_CH3U_GetSockInterfaceAddr (ch3u_getinterfaces.c:130) > > ==2175== by 0x87CBC56: MPIDI_CH3U_Get_business_card_sock (ch3u_connect_sock.c:428) > > ==2175== by 0x87F413B: MPIDI_CH3U_Init_sock (ch3u_init_sock.c:79) > > ==2175== by 0x87EE511: MPIDI_CH3_Init (ch3_init.c:43) > > ==2175== by 0x87C7554: MPID_Init (mpid_init.c:92) > > ==2175== by 0x879F65C: MPIR_Init_thread (initthread.c:288) > > ==2175== > > ==2174== Conditional jump or move depends on uninitialised value(s) > > ==2174== at 0x4319794: __strlen_sse2 (strlen.S:93) > > ==2174== by 0x80885F9: PetscVSNPrintf (mprint.c:95) > > ==2174== by 0x8088B22: PetscSNPrintf (mprint.c:228) > > ==2174== by 0x8082456: PetscErrorPrintfInitialize (errtrace.c:71) > > ==2174== by 0x80AEED1: PetscInitialize (pinit.c:576) > > ==2174== by 0x804BA0C: main (ex19.c:96) > > ==2174== > > ==2174== Conditional jump or move depends on uninitialised value(s) > > ==2174== at 0x43197CB: __strlen_sse2 (strlen.S:116) > > ==2174== by 0x80885F9: PetscVSNPrintf (mprint.c:95) > > ==2174== by 0x8088B22: PetscSNPrintf (mprint.c:228) > > ==2174== by 0x8082456: PetscErrorPrintfInitialize (errtrace.c:71) > > ==2174== by 0x80AEED1: PetscInitialize (pinit.c:576) > > ==2174== by 0x804BA0C: main (ex19.c:96) > > ==2174== > > ==2174== Conditional jump or move depends on uninitialised value(s) > > ==2174== at 0x4319794: __strlen_sse2 (strlen.S:93) > > ==2174== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > > ==2174== by 0x80A1E0E: PetscOptionsGetTruth (options.c:1304) > > ==2174== by 0x80824DE: PetscErrorPrintfInitialize (errtrace.c:73) > > ==2174== by 0x80AEED1: PetscInitialize (pinit.c:576) > > ==2174== by 0x804BA0C: main (ex19.c:96) > > ==2174== > > ==2174== Conditional jump or move depends on uninitialised value(s) > > ==2174== at 0x4319794: __strlen_sse2 (strlen.S:93) > > ==2174== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > > ==2174== by 0x80A1E0E: PetscOptionsGetTruth (options.c:1304) > > ==2174== by 0x8082563: PetscErrorPrintfInitialize (errtrace.c:77) > > ==2174== by 0x80AEED1: PetscInitialize (pinit.c:576) > > ==2174== by 0x804BA0C: main (ex19.c:96) > > ==2174== > > ==2174== Conditional jump or move depends on uninitialised value(s) > > ==2174== at 0x4319794: __strlen_sse2 (strlen.S:93) > > ==2174== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) > > ==2174== by 0x80A1E0E: PetscOptionsGetTruth (options.c:1304) > > ==2174== by 0x809CFD4: PetscOptionsInsert (options.c:516) > > ==2174== by 
0x80AF4BD: PetscInitialize (pinit.c:629)
> > ==2174==    by 0x804BA0C: main (ex19.c:96)
> > ==2174==
> > ==2175== Conditional jump or move depends on uninitialised value(s)
> > ==2175==    at 0x4319794: __strlen_sse2 (strlen.S:93)
> > ==2175==    by 0x80885F9: PetscVSNPrintf (mprint.c:95)
> > ==2175==    by 0x8088B22: PetscSNPrintf (mprint.c:228)
> > ==2175==    by 0x8082456: PetscErrorPrintfInitialize (errtrace.c:71)
> > ==2175==    by 0x80AEED1: PetscInitialize (pinit.c:576)
> > ==2175==    by 0x804BA0C: main (ex19.c:96)
> > ==2175==
> > [... a second, otherwise identical report at __strlen_sse2 (strlen.S:116) snipped ...]
> >
> > ==2174== Conditional jump or move depends on uninitialised value(s)
> > ==2174==    at 0x4319794: __strlen_sse2 (strlen.S:93)
> > ==2174==    by 0x87E19C7: GetResponse (simple_pmi.c:1049)
> > ==2174==    by 0x87E420A: PMI_KVS_Get (simple_pmi.c:572)
> > ==2174==    by 0x87CA9F9: getConnInfoKVS (mpidi_pg.c:622)
> > ==2174==    by 0x87CC11A: MPIDI_CH3I_VC_post_sockconnect (ch3u_connect_sock.c:1097)
> > ==2174==    by 0x87EEF6B: MPIDI_CH3_iStartMsg (ch3_istartmsg.c:177)
> > ==2174==    by 0x87C6341: MPIDI_CH3_EagerContigShortSend (ch3u_eager.c:250)
> > ==2174==    by 0x87C8AC6: MPID_Send (mpid_send.c:115)
> > ==2174==    by 0x879376B: MPIC_Send (helper_fns.c:34)
> > ==2174==    by 0x878BB6A: MPIR_Bcast (bcast.c:227)
> > ==2174==    by 0x878C6A1: PMPI_Bcast (bcast.c:761)
> > ==2174==    by 0x809C467: PetscOptionsInsertFile (options.c:436)
> > ==2174==
> > [... further reports with the same PMI/socket stack snipped: "Use of uninitialised
> >     value of size 4" in PMIU_writeline (simple_pmiutil.c:184), "Syscall param
> >     write(count) contains uninitialised byte(s)" in __write_nocancel, "Syscall param
> >     writev(vector) points to uninitialised byte(s)" in MPIDU_Sock_wait (sock_wait.i:693),
> >     and several __strcmp_ssse3 reports below PMI_KVS_Get and MPIDI_PG_Find, on both
> >     processes ...]
> >
> > ==2174== Invalid read of size 8
> > ==2174==    at 0x4319785: __strlen_sse2 (strlen.S:87)
> > ==2174==    by 0x87BD379: MPIDI_CH3I_Progress_handle_sock_event (ch3_progress.c:776)
> > ==2174==    by 0x87BD7BD: MPIDI_CH3I_Progress (ch3_progress.c:212)
> > ==2174==    by 0x8793060: MPIC_Wait (helper_fns.c:269)
> > ==2174==    by 0x879377E: MPIC_Send (helper_fns.c:38)
> > ==2174==    by 0x878BB6A: MPIR_Bcast (bcast.c:227)
> > ==2174==    by 0x878C6A1: PMPI_Bcast (bcast.c:761)
> > ==2174==    by 0x809C467: PetscOptionsInsertFile (options.c:436)
> > ==2174==    by 0x809D14A: PetscOptionsInsert (options.c:522)
> > ==2174==    by 0x80AF4BD: PetscInitialize (pinit.c:629)
> > ==2174==    by 0x804BA0C: main (ex19.c:96)
> > ==2174==  Address 0x442e0d0 is 8 bytes before a block of size 257 alloc'd
> > ==2174==    at 0x4023BF3: malloc (vg_replace_malloc.c:195)
> > ==2174==    by 0x87C74AE: MPID_Init (mpid_init.c:373)
> > ==2174==    by 0x879F65C: MPIR_Init_thread (initthread.c:288)
> > ==2174==    by 0x879F15A: PMPI_Init (init.c:106)
> > ==2174==    by 0x80AEE12: PetscInitialize (pinit.c:561)
> > ==2174==    by 0x804BA0C: main (ex19.c:96)
> > ==2174==
> > ==2174== Invalid read of size 8
> > ==2174==    at 0x43BE324: __strcmp_ssse3 (strcmp-ssse3.S:142)
> > ==2174==    by 0x809F18B: PetscOptionsSetValue (options.c:803)
> > ==2174==    by 0x809DD95: PetscOptionsInsert (options.c:588)
> > ==2174==    by 0x80AF4BD: PetscInitialize (pinit.c:629)
> > ==2174==    by 0x804BA0C: main (ex19.c:96)
> > ==2174==  Address 0x44c0d88 is 8 bytes inside a block of size 10 alloc'd
> > ==2174==    at 0x4023BF3: malloc (vg_replace_malloc.c:195)
> > ==2174==    by 0x809F5F2: PetscOptionsSetValue (options.c:829)
> > ==2174==    by 0x809DD95: PetscOptionsInsert (options.c:588)
> > ==2174==    by 0x80AF4BD: PetscInitialize (pinit.c:629)
> > ==2174==    by 0x804BA0C: main (ex19.c:96)
> > ==2174==
> > ==2174== Conditional jump or move depends on uninitialised value(s)
> > ==2174==    at 0x4319794: __strlen_sse2 (strlen.S:93)
> > ==2174==    by 0x80A0906: PetscOptionsFindPair_Private (options.c:987)
> > ==2174==    by 0x80A1E0E: PetscOptionsGetTruth (options.c:1304)
> > ==2174==    by 0x80A8875: PetscOptionsCheckInitial_Private (init.c:242)
> > ==2174==    by 0x80AF618: PetscInitialize (pinit.c:639)
> > ==2174==    by 0x804BA0C: main (ex19.c:96)
> > ==2174==
> > [... this report repeats, on both processes, for essentially every option query made
> >     during start-up: PetscOptionsHasName/GetTruth/GetString/GetReal/GetInt/
> >     GetStringArray called from PetscOptionsCheckInitial_Private, PetscSetDisplay,
> >     PetscLogBegin_Private, PetscInitializePackage, PetscInitialize_DynamicLibraries
> >     and PetscOptionsSetFromOptions, plus similar reports from PetscOptionsAtol
> >     (options.c:152), PetscStrncat (str.c:205) and PetscStrallocpy (str.c:79-80) ...]
> >
> > ==2174== More than 100 errors detected.  Subsequent errors
> > ==2174== will still be recorded, but in less detail than before.
> > ==2175== More than 100 errors detected.  Subsequent errors
> > ==2175== will still be recorded, but in less detail than before.
> >
> > ==2174== Conditional jump or move depends on uninitialised value(s)
> > ==2174==    at 0x4319794: __strlen_sse2 (strlen.S:93)
> > ==2174==    by 0x80BDE83: PetscStrallocpy (str.c:79)
> > ==2174==    by 0x808368E: PetscFListGetPathAndFunction (reg.c:24)
> > ==2174==    by 0x80841CC: PetscFListAdd (reg.c:201)
> > ==2174==    by 0x81843D0: DARegister (dareg.c:104)
> > ==2174==    by 0x818476B: DARegisterAll (daregall.c:32)
> > ==2174==    by 0x855F08B: DMInitializePackage (dlregisdm.c:80)
> > ==2174==    by 0x815F1D1: DACreate (dacreate.c:173)
> > ==2174==    by 0x81558E2: DACreate2d (da2.c:1837)
> > ==2174==    by 0x804BE2A: main (ex19.c:107)
> > ==2174==
> > ==2175== Invalid read of size 8
> > ==2175==    at 0x43197A0: __strlen_sse2 (strlen.S:99)
> > ==2175==    by 0x8099209: PetscOptionsAtoi (options.c:70)
> > ==2175==    by 0x80A13CE: PetscOptionsGetInt (options.c:1138)
> > ==2175==    by 0x80B5AA5: PetscOptionsInt (aoptions.c:473)
> > ==2175==    by 0x815E8BC: DASetFromOptions (dacreate.c:109)
> > ==2175==    by 0x8155C96: DACreate2d (da2.c:1847)
> > ==2175==    by 0x804BE2A: main (ex19.c:107)
> > ==2175==  Address 0x4433c70 is 0 bytes inside a block of size 3 alloc'd
> > ==2175==    at 0x4023BF3: malloc (vg_replace_malloc.c:195)
> > ==2175==    by 0x809F6EB: PetscOptionsSetValue (options.c:833)
> > ==2175==    by 0x809DD95: PetscOptionsInsert (options.c:588)
> > ==2175==    by 0x80AF4BD: PetscInitialize (pinit.c:629)
> > ==2175==    by 0x804BA0C: main (ex19.c:96)
> > ==2175==
> > ==2175== Invalid read of size 8
> > ==2175==    at 0x4319785: __strlen_sse2 (strlen.S:87)
> > ==2175==    by 0x8099209: PetscOptionsAtoi (options.c:70)
> > ==2175==    by 0x80A13CE: PetscOptionsGetInt (options.c:1138)
> > ==2175==    by 0x80B5AA5: PetscOptionsInt (aoptions.c:473)
> > ==2175==    by 0x815E96A: DASetFromOptions (dacreate.c:114)
> > ==2175==    by 0x8155C96: DACreate2d (da2.c:1847)
> > ==2175==    by 
0x804BE2A: main (ex19.c:107) > > ==2175== Address 0x4433ce0 is 8 bytes before a block of size 3 alloc'd > > ==2175== at 0x4023BF3: malloc (vg_replace_malloc.c:195) > > ==2175== by 0x809F6EB: PetscOptionsSetValue (options.c:833) > > ==2175== by 0x809DD95: PetscOptionsInsert (options.c:588) > > ==2175== by 0x80AF4BD: PetscInitialize (pinit.c:629) > > ==2175== by 0x804BA0C: main (ex19.c:96) > > ==2175== > > ==2174== Invalid read of size 8 > > ==2174== at 0x4319785: __strlen_sse2 (strlen.S:87) > > ==2174== by 0x8099209: PetscOptionsAtoi (options.c:70) > > ==2174== by 0x80A13CE: PetscOptionsGetInt (options.c:1138) > > ==2174== by 0x80B5AA5: PetscOptionsInt (aoptions.c:473) > > ==2174== by 0x815E96A: DASetFromOptions (dacreate.c:114) > > ==2174== by 0x8155C96: DACreate2d (da2.c:1847) > > ==2174== by 0x804BE2A: main (ex19.c:107) > > ==2174== Address 0x44c0e30 is 8 bytes before a block of size 3 alloc'd > > ==2174== at 0x4023BF3: malloc (vg_replace_malloc.c:195) > > ==2174== by 0x809F6EB: PetscOptionsSetValue (options.c:833) > > ==2174== by 0x809DD95: PetscOptionsInsert (options.c:588) > > ==2174== by 0x80AF4BD: PetscInitialize (pinit.c:629) > > ==2174== by 0x804BA0C: main (ex19.c:96) > > ==2174== > > ==2175== Conditional jump or move depends on uninitialised value(s) > > ==2175== at 0x43197BC: __strlen_sse2 (strlen.S:110) > > ==2175== by 0x80BDE83: PetscStrallocpy (str.c:79) > > ==2175== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) > > ==2175== by 0x80846F3: PetscFListAdd (reg.c:238) > > ==2175== by 0x829D9AC: MatRegister (matreg.c:139) > > ==2175== by 0x86C6837: MatRegisterAll (matregis.c:85) > > ==2175== by 0x82A0B71: MatInitializePackage (dlregismat.c:80) > > ==2175== by 0x8542386: MatCreate (gcreate.c:72) > > ==2175== by 0x8171B37: DAGetInterpolation_2D_Q1 (dainterp.c:308) > > ==2175== by 0x8179179: DAGetInterpolation (dainterp.c:879) > > ==2175== by 0x8197E93: DMGetInterpolation (dm.c:144) > > ==2175== by 0x81BD897: DMMGSetDM (damg.c:250) > > ==2175== > > ==2174== Conditional jump or move depends on uninitialised value(s) > > ==2174== at 0x43197BC: __strlen_sse2 (strlen.S:110) > > ==2174== by 0x80BDE83: PetscStrallocpy (str.c:79) > > ==2174== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) > > ==2174== by 0x80846F3: PetscFListAdd (reg.c:238) > > ==2174== by 0x829D9AC: MatRegister (matreg.c:139) > > ==2174== by 0x86C6837: MatRegisterAll (matregis.c:85) > > ==2174== by 0x82A0B71: MatInitializePackage (dlregismat.c:80) > > ==2174== by 0x8542386: MatCreate (gcreate.c:72) > > ==2174== by 0x8171B37: DAGetInterpolation_2D_Q1 (dainterp.c:308) > > ==2174== by 0x8179179: DAGetInterpolation (dainterp.c:879) > > ==2174== by 0x8197E93: DMGetInterpolation (dm.c:144) > > ==2174== by 0x81BD897: DMMGSetDM (damg.c:250) > > ==2174== > > ==2175== Invalid read of size 8 > > ==2175== at 0x43BEC7D: __strcmp_ssse3 (strcmp-ssse3.S:1021) > > ==2175== by 0x80842B8: PetscFListAdd (reg.c:223) > > ==2175== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > > ==2175== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > > ==2175== by 0x834D30B: MatCreate_MPIAIJ (mpiaij.c:5096) > > ==2175== by 0x829D1E0: MatSetType (matreg.c:65) > > ==2175== by 0x86EE543: MatCreate_AIJ (aijtype.c:35) > > ==2175== by 0x829D1E0: MatSetType (matreg.c:65) > > ==2175== by 0x8171C4C: DAGetInterpolation_2D_Q1 (dainterp.c:310) > > ==2175== by 0x8179179: DAGetInterpolation (dainterp.c:879) > > ==2175== by 0x8197E93: DMGetInterpolation (dm.c:144) > > ==2175== by 0x81BD897: DMMGSetDM (damg.c:250) > > ==2175== 
Address 0x44db848 is 24 bytes inside a block of size 28 alloc'd > > ==2175== at 0x4022E01: memalign (vg_replace_malloc.c:532) > > ==2175== by 0x808B1A0: PetscMallocAlign (mal.c:30) > > ==2175== by 0x80BDF14: PetscStrallocpy (str.c:80) > > ==2175== by 0x8084689: PetscFListAdd (reg.c:237) > > ==2175== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > > ==2175== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > > ==2175== by 0x834D296: MatCreate_MPIAIJ (mpiaij.c:5093) > > ==2175== by 0x829D1E0: MatSetType (matreg.c:65) > > ==2175== by 0x86EE543: MatCreate_AIJ (aijtype.c:35) > > ==2175== by 0x829D1E0: MatSetType (matreg.c:65) > > ==2175== by 0x8171C4C: DAGetInterpolation_2D_Q1 (dainterp.c:310) > > ==2175== by 0x8179179: DAGetInterpolation (dainterp.c:879) > > ==2175== > > ==2174== Invalid read of size 8 > > ==2174== at 0x43BEC7D: __strcmp_ssse3 (strcmp-ssse3.S:1021) > > ==2174== by 0x80842B8: PetscFListAdd (reg.c:223) > > ==2174== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > > ==2174== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > > ==2174== by 0x834D30B: MatCreate_MPIAIJ (mpiaij.c:5096) > > ==2174== by 0x829D1E0: MatSetType (matreg.c:65) > > ==2174== by 0x86EE543: MatCreate_AIJ (aijtype.c:35) > > ==2174== by 0x829D1E0: MatSetType (matreg.c:65) > > ==2174== by 0x8171C4C: DAGetInterpolation_2D_Q1 (dainterp.c:310) > > ==2174== by 0x8179179: DAGetInterpolation (dainterp.c:879) > > ==2174== by 0x8197E93: DMGetInterpolation (dm.c:144) > > ==2174== by 0x81BD897: DMMGSetDM (damg.c:250) > > ==2174== Address 0x456b7c8 is 24 bytes inside a block of size 28 alloc'd > > ==2174== at 0x4022E01: memalign (vg_replace_malloc.c:532) > > ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) > > ==2174== by 0x80BDF14: PetscStrallocpy (str.c:80) > > ==2174== by 0x8084689: PetscFListAdd (reg.c:237) > > ==2174== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > > ==2174== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > > ==2174== by 0x834D296: MatCreate_MPIAIJ (mpiaij.c:5093) > > ==2174== by 0x829D1E0: MatSetType (matreg.c:65) > > ==2174== by 0x86EE543: MatCreate_AIJ (aijtype.c:35) > > ==2174== by 0x829D1E0: MatSetType (matreg.c:65) > > ==2174== by 0x8171C4C: DAGetInterpolation_2D_Q1 (dainterp.c:310) > > ==2174== by 0x8179179: DAGetInterpolation (dainterp.c:879) > > ==2174== > > ==2175== Invalid read of size 8 > > ==2175== at 0x43BE415: __strcmp_ssse3 (strcmp-ssse3.S:225) > > ==2175== by 0x8085399: PetscFListFind (reg.c:375) > > ==2175== by 0x80968C7: PetscObjectQueryFunction_Petsc (inherit.c:238) > > ==2175== by 0x80978B5: PetscObjectQueryFunction (inherit.c:376) > > ==2175== by 0x833E82B: MatMPIAIJSetPreallocation (mpiaij.c:3486) > > ==2175== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) > > ==2175== by 0x8179179: DAGetInterpolation (dainterp.c:879) > > ==2175== by 0x8197E93: DMGetInterpolation (dm.c:144) > > ==2175== by 0x81BD897: DMMGSetDM (damg.c:250) > > ==2175== by 0x804BEA0: main (ex19.c:108) > > ==2175== Address 0x44db848 is 24 bytes inside a block of size 28 alloc'd > > ==2175== at 0x4022E01: memalign (vg_replace_malloc.c:532) > > ==2175== by 0x808B1A0: PetscMallocAlign (mal.c:30) > > ==2175== by 0x80BDF14: PetscStrallocpy (str.c:80) > > ==2175== by 0x8084689: PetscFListAdd (reg.c:237) > > ==2175== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > > ==2175== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > > ==2175== by 0x834D296: MatCreate_MPIAIJ (mpiaij.c:5093) > > ==2175== by 0x829D1E0: 
MatSetType (matreg.c:65) > > ==2175== by 0x86EE543: MatCreate_AIJ (aijtype.c:35) > > ==2175== by 0x829D1E0: MatSetType (matreg.c:65) > > ==2175== by 0x8171C4C: DAGetInterpolation_2D_Q1 (dainterp.c:310) > > ==2175== by 0x8179179: DAGetInterpolation (dainterp.c:879) > > ==2175== > > ==2175== Invalid read of size 8 > > ==2175== at 0x43BE421: __strcmp_ssse3 (strcmp-ssse3.S:228) > > ==2175== by 0x8085399: PetscFListFind (reg.c:375) > > ==2175== by 0x80968C7: PetscObjectQueryFunction_Petsc (inherit.c:238) > > ==2175== by 0x80978B5: PetscObjectQueryFunction (inherit.c:376) > > ==2175== by 0x833E82B: MatMPIAIJSetPreallocation (mpiaij.c:3486) > > ==2175== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) > > ==2175== by 0x8179179: DAGetInterpolation (dainterp.c:879) > > ==2175== by 0x8197E93: DMGetInterpolation (dm.c:144) > > ==2175== by 0x81BD897: DMMGSetDM (damg.c:250) > > ==2175== by 0x804BEA0: main (ex19.c:108) > > ==2175== Address 0x44dc7d8 is 24 bytes inside a block of size 28 alloc'd > > ==2175== at 0x4022E01: memalign (vg_replace_malloc.c:532) > > ==2175== by 0x808B1A0: PetscMallocAlign (mal.c:30) > > ==2175== by 0x80BDF14: PetscStrallocpy (str.c:80) > > ==2175== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) > > ==2175== by 0x8085186: PetscFListFind (reg.c:356) > > ==2175== by 0x80968C7: PetscObjectQueryFunction_Petsc (inherit.c:238) > > ==2175== by 0x80978B5: PetscObjectQueryFunction (inherit.c:376) > > ==2175== by 0x833E82B: MatMPIAIJSetPreallocation (mpiaij.c:3486) > > ==2175== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) > > ==2175== by 0x8179179: DAGetInterpolation (dainterp.c:879) > > ==2175== by 0x8197E93: DMGetInterpolation (dm.c:144) > > ==2175== by 0x81BD897: DMMGSetDM (damg.c:250) > > ==2175== > > ==2175== Invalid read of size 8 > > ==2175== at 0x43BE421: __strcmp_ssse3 (strcmp-ssse3.S:228) > > ==2175== by 0x8085406: PetscFListFind (reg.c:376) > > ==2175== by 0x80968C7: PetscObjectQueryFunction_Petsc (inherit.c:238) > > ==2175== by 0x80978B5: PetscObjectQueryFunction (inherit.c:376) > > ==2175== by 0x833E82B: MatMPIAIJSetPreallocation (mpiaij.c:3486) > > ==2175== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) > > ==2175== by 0x8179179: DAGetInterpolation (dainterp.c:879) > > ==2175== by 0x8197E93: DMGetInterpolation (dm.c:144) > > ==2175== by 0x81BD897: DMMGSetDM (damg.c:250) > > ==2175== by 0x804BEA0: main (ex19.c:108) > > ==2175== Address 0x44dc7d8 is 24 bytes inside a block of size 28 alloc'd > > ==2175== at 0x4022E01: memalign (vg_replace_malloc.c:532) > > ==2175== by 0x808B1A0: PetscMallocAlign (mal.c:30) > > ==2175== by 0x80BDF14: PetscStrallocpy (str.c:80) > > ==2175== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) > > ==2175== by 0x8085186: PetscFListFind (reg.c:356) > > ==2175== by 0x80968C7: PetscObjectQueryFunction_Petsc (inherit.c:238) > > ==2175== by 0x80978B5: PetscObjectQueryFunction (inherit.c:376) > > ==2175== by 0x833E82B: MatMPIAIJSetPreallocation (mpiaij.c:3486) > > ==2175== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) > > ==2175== by 0x8179179: DAGetInterpolation (dainterp.c:879) > > ==2175== by 0x8197E93: DMGetInterpolation (dm.c:144) > > ==2175== by 0x81BD897: DMMGSetDM (damg.c:250) > > ==2175== > > ==2174== Invalid read of size 8 > > ==2174== at 0x43BE415: __strcmp_ssse3 (strcmp-ssse3.S:225) > > ==2174== by 0x8085399: PetscFListFind (reg.c:375) > > ==2174== by 0x80968C7: PetscObjectQueryFunction_Petsc (inherit.c:238) > > ==2174== by 0x80978B5: PetscObjectQueryFunction (inherit.c:376) > > ==2174== by 
0x833E82B: MatMPIAIJSetPreallocation (mpiaij.c:3486) > > ==2174== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) > > ==2174== by 0x8179179: DAGetInterpolation (dainterp.c:879) > > ==2174== by 0x8197E93: DMGetInterpolation (dm.c:144) > > ==2174== by 0x81BD897: DMMGSetDM (damg.c:250) > > ==2174== by 0x804BEA0: main (ex19.c:108) > > ==2174== Address 0x456b7c8 is 24 bytes inside a block of size 28 alloc'd > > ==2174== at 0x4022E01: memalign (vg_replace_malloc.c:532) > > ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) > > ==2174== by 0x80BDF14: PetscStrallocpy (str.c:80) > > ==2174== by 0x8084689: PetscFListAdd (reg.c:237) > > ==2174== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > > ==2174== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > > ==2174== by 0x834D296: MatCreate_MPIAIJ (mpiaij.c:5093) > > ==2174== by 0x829D1E0: MatSetType (matreg.c:65) > > ==2174== by 0x86EE543: MatCreate_AIJ (aijtype.c:35) > > ==2174== by 0x829D1E0: MatSetType (matreg.c:65) > > ==2174== by 0x8171C4C: DAGetInterpolation_2D_Q1 (dainterp.c:310) > > ==2174== by 0x8179179: DAGetInterpolation (dainterp.c:879) > > ==2174== > > ==2174== Invalid read of size 8 > > ==2174== at 0x43BE421: __strcmp_ssse3 (strcmp-ssse3.S:228) > > ==2174== by 0x8085399: PetscFListFind (reg.c:375) > > ==2174== by 0x80968C7: PetscObjectQueryFunction_Petsc (inherit.c:238) > > ==2174== by 0x80978B5: PetscObjectQueryFunction (inherit.c:376) > > ==2174== by 0x833E82B: MatMPIAIJSetPreallocation (mpiaij.c:3486) > > ==2174== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) > > ==2174== by 0x8179179: DAGetInterpolation (dainterp.c:879) > > ==2174== by 0x8197E93: DMGetInterpolation (dm.c:144) > > ==2174== by 0x81BD897: DMMGSetDM (damg.c:250) > > ==2174== by 0x804BEA0: main (ex19.c:108) > > ==2174== Address 0x456c758 is 24 bytes inside a block of size 28 alloc'd > > ==2174== at 0x4022E01: memalign (vg_replace_malloc.c:532) > > ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) > > ==2174== by 0x80BDF14: PetscStrallocpy (str.c:80) > > ==2174== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) > > ==2174== by 0x8085186: PetscFListFind (reg.c:356) > > ==2174== by 0x80968C7: PetscObjectQueryFunction_Petsc (inherit.c:238) > > ==2174== by 0x80978B5: PetscObjectQueryFunction (inherit.c:376) > > ==2174== by 0x833E82B: MatMPIAIJSetPreallocation (mpiaij.c:3486) > > ==2174== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) > > ==2174== by 0x8179179: DAGetInterpolation (dainterp.c:879) > > ==2174== by 0x8197E93: DMGetInterpolation (dm.c:144) > > ==2174== by 0x81BD897: DMMGSetDM (damg.c:250) > > ==2174== > > ==2174== Invalid read of size 8 > > ==2174== at 0x43BE421: __strcmp_ssse3 (strcmp-ssse3.S:228) > > ==2174== by 0x8085406: PetscFListFind (reg.c:376) > > ==2174== by 0x80968C7: PetscObjectQueryFunction_Petsc (inherit.c:238) > > ==2174== by 0x80978B5: PetscObjectQueryFunction (inherit.c:376) > > ==2174== by 0x833E82B: MatMPIAIJSetPreallocation (mpiaij.c:3486) > > ==2174== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) > > ==2174== by 0x8179179: DAGetInterpolation (dainterp.c:879) > > ==2174== by 0x8197E93: DMGetInterpolation (dm.c:144) > > ==2174== by 0x81BD897: DMMGSetDM (damg.c:250) > > ==2174== by 0x804BEA0: main (ex19.c:108) > > ==2174== Address 0x456c758 is 24 bytes inside a block of size 28 alloc'd > > ==2174== at 0x4022E01: memalign (vg_replace_malloc.c:532) > > ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) > > ==2174== by 0x80BDF14: PetscStrallocpy (str.c:80) > > ==2174== by 0x808368E: 
PetscFListGetPathAndFunction (reg.c:24) > > ==2174== by 0x8085186: PetscFListFind (reg.c:356) > > ==2174== by 0x80968C7: PetscObjectQueryFunction_Petsc (inherit.c:238) > > ==2174== by 0x80978B5: PetscObjectQueryFunction (inherit.c:376) > > ==2174== by 0x833E82B: MatMPIAIJSetPreallocation (mpiaij.c:3486) > > ==2174== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) > > ==2174== by 0x8179179: DAGetInterpolation (dainterp.c:879) > > ==2174== by 0x8197E93: DMGetInterpolation (dm.c:144) > > ==2174== by 0x81BD897: DMMGSetDM (damg.c:250) > > ==2174== > > ==2174== Invalid read of size 8 > > ==2174== at 0x43BE32D: __strcmp_ssse3 (strcmp-ssse3.S:144) > > ==2174== by 0x80842B8: PetscFListAdd (reg.c:223) > > ==2174== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > > ==2174== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > > ==2174== by 0x82BE3A6: MatCreate_SeqAIJ (aij.c:3363) > > ==2174== by 0x829D1E0: MatSetType (matreg.c:65) > > ==2174== by 0x83393A9: MatMPIAIJSetPreallocation_MPIAIJ (mpiaij.c:2789) > > ==2174== by 0x833E8AA: MatMPIAIJSetPreallocation (mpiaij.c:3488) > > ==2174== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) > > ==2174== by 0x8179179: DAGetInterpolation (dainterp.c:879) > > ==2174== by 0x8197E93: DMGetInterpolation (dm.c:144) > > ==2174== by 0x81BD897: DMMGSetDM (damg.c:250) > > ==2174== Address 0x456d3a0 is 16 bytes inside a block of size 21 alloc'd > > ==2174== at 0x4022E01: memalign (vg_replace_malloc.c:532) > > ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) > > ==2174== by 0x80BDF14: PetscStrallocpy (str.c:80) > > ==2174== by 0x8084162: PetscFListAdd (reg.c:200) > > ==2174== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > > ==2174== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > > ==2174== by 0x82BE331: MatCreate_SeqAIJ (aij.c:3360) > > ==2174== by 0x829D1E0: MatSetType (matreg.c:65) > > ==2174== by 0x83393A9: MatMPIAIJSetPreallocation_MPIAIJ (mpiaij.c:2789) > > ==2174== by 0x833E8AA: MatMPIAIJSetPreallocation (mpiaij.c:3488) > > ==2174== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) > > ==2174== by 0x8179179: DAGetInterpolation (dainterp.c:879) > > ==2174== > > ==2175== Invalid read of size 8 > > ==2175== at 0x43BE32D: __strcmp_ssse3 (strcmp-ssse3.S:144) > > ==2175== by 0x80842B8: PetscFListAdd (reg.c:223) > > ==2175== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > > ==2175== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > > ==2175== by 0x82BE3A6: MatCreate_SeqAIJ (aij.c:3363) > > ==2175== by 0x829D1E0: MatSetType (matreg.c:65) > > ==2175== by 0x83393A9: MatMPIAIJSetPreallocation_MPIAIJ (mpiaij.c:2789) > > ==2175== by 0x833E8AA: MatMPIAIJSetPreallocation (mpiaij.c:3488) > > ==2175== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) > > ==2175== by 0x8179179: DAGetInterpolation (dainterp.c:879) > > ==2175== by 0x8197E93: DMGetInterpolation (dm.c:144) > > ==2175== by 0x81BD897: DMMGSetDM (damg.c:250) > > ==2175== Address 0x44dd420 is 16 bytes inside a block of size 21 alloc'd > > ==2175== at 0x4022E01: memalign (vg_replace_malloc.c:532) > > ==2175== by 0x808B1A0: PetscMallocAlign (mal.c:30) > > ==2175== by 0x80BDF14: PetscStrallocpy (str.c:80) > > ==2175== by 0x8084162: PetscFListAdd (reg.c:200) > > ==2175== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > > ==2175== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > > ==2175== by 0x82BE331: MatCreate_SeqAIJ (aij.c:3360) > > ==2175== by 0x829D1E0: MatSetType (matreg.c:65) > > ==2175== 
by 0x83393A9: MatMPIAIJSetPreallocation_MPIAIJ (mpiaij.c:2789) > > ==2175== by 0x833E8AA: MatMPIAIJSetPreallocation (mpiaij.c:3488) > > ==2175== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) > > ==2175== by 0x8179179: DAGetInterpolation (dainterp.c:879) > > ==2175== > > ==2174== Conditional jump or move depends on uninitialised value(s) > > ==2174== at 0x80BDEDB: PetscStrallocpy (str.c:80) > > ==2175== Conditional jump or move depends on uninitialised value(s) > > ==2174== by 0x80A6193: PetscObjectSetOptionsPrefix (prefix.c:37) > > ==2174== by 0x80A6564: PetscObjectAppendOptionsPrefix (prefix.c:70) > > ==2174== by 0x87169FB: PCAppendOptionsPrefix (precon.c:1242) > > ==2174== by 0x861C3B2: KSPAppendOptionsPrefix (itcl.c:121) > > ==2174== by 0x85A5056: PCMGSetLevels (mg.c:195) > > ==2174== by 0x81BE9AD: DMMGSetUpLevel (damg.c:379) > > ==2174== by 0x81C5787: DMMGSetSNES (damgsnes.c:648) > > ==2174== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) > > ==2174== by 0x804C4FF: main (ex19.c:140) > > ==2174== > > ==2175== at 0x80BDEDB: PetscStrallocpy (str.c:80) > > ==2175== by 0x80A6193: PetscObjectSetOptionsPrefix (prefix.c:37) > > ==2175== by 0x80A6564: PetscObjectAppendOptionsPrefix (prefix.c:70) > > ==2175== by 0x87169FB: PCAppendOptionsPrefix (precon.c:1242) > > ==2175== by 0x861C3B2: KSPAppendOptionsPrefix (itcl.c:121) > > ==2175== by 0x85A5056: PCMGSetLevels (mg.c:195) > > ==2175== by 0x81BE9AD: DMMGSetUpLevel (damg.c:379) > > ==2175== by 0x81C5787: DMMGSetSNES (damgsnes.c:648) > > ==2175== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) > > ==2175== by 0x804C4FF: main (ex19.c:140) > > ==2175== > > ==2174== Invalid read of size 8 > > ==2174== at 0x43BE421: __strcmp_ssse3 (strcmp-ssse3.S:228) > > ==2174== by 0x80842B8: PetscFListAdd (reg.c:223) > > ==2174== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > > ==2174== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > > ==2174== by 0x8668D6B: KSPDestroy_GMRES (gmres.c:302) > > ==2174== by 0x8675E35: KSPDestroy_FGMRES (fgmres.c:341) > > ==2174== by 0x8635CA0: KSPSetType (itcreate.c:569) > > ==2174== by 0x81C5A98: DMMGSetSNES (damgsnes.c:668) > > ==2174== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) > > ==2174== by 0x804C4FF: main (ex19.c:140) > > ==2174== Address 0x462f078 is 24 bytes inside a block of size 31 alloc'd > > ==2174== at 0x4022E01: memalign (vg_replace_malloc.c:532) > > ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) > > ==2174== by 0x80BDF14: PetscStrallocpy (str.c:80) > > ==2174== by 0x8084689: PetscFListAdd (reg.c:237) > > ==2174== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > > ==2174== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > > ==2174== by 0x86784F2: KSPCreate_FGMRES (fgmres.c:753) > > ==2174== by 0x8635D92: KSPSetType (itcreate.c:576) > > ==2174== by 0x81BE7AA: DMMGSetUpLevel (damg.c:372) > > ==2174== by 0x81C5787: DMMGSetSNES (damgsnes.c:648) > > ==2174== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) > > ==2174== by 0x804C4FF: main (ex19.c:140) > > ==2174== > > ==2174== Invalid read of size 8 > > ==2174== at 0x43BE320: __strcmp_ssse3 (strcmp-ssse3.S:141) > > ==2174== by 0x8085399: PetscFListFind (reg.c:375) > > ==2174== by 0x858986E: PCSetType (pcset.c:66) > > ==2174== by 0x81C5BCA: DMMGSetSNES (damgsnes.c:672) > > ==2174== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) > > ==2174== by 0x804C4FF: main (ex19.c:140) > > ==2174== Address 0x45a0778 is 8 bytes inside a block of size 10 alloc'd > > ==2174== at 
0x4022E01: memalign (vg_replace_malloc.c:532) > > ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) > > ==2174== by 0x80BDF14: PetscStrallocpy (str.c:80) > > ==2175== Invalid read of size 8 > > ==2175== at 0x43BE421: __strcmp_ssse3 (strcmp-ssse3.S:228) > > ==2175== by 0x80842B8: PetscFListAdd (reg.c:223) > > ==2175== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > > ==2175== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > > ==2175== by 0x8668D6B: KSPDestroy_GMRES (gmres.c:302) > > ==2175== by 0x8675E35: KSPDestroy_FGMRES (fgmres.c:341) > > ==2175== by 0x8635CA0: KSPSetType (itcreate.c:569) > > ==2175== by 0x81C5A98: DMMGSetSNES (damgsnes.c:668) > > ==2175== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) > > ==2175== by 0x804C4FF: main (ex19.c:140) > > ==2175== Address 0x459cf78 is 24 bytes inside a block of size 31 alloc'd > > ==2175== at 0x4022E01: memalign (vg_replace_malloc.c:532) > > ==2175== by 0x808B1A0: PetscMallocAlign (mal.c:30) > > ==2175== by 0x80BDF14: PetscStrallocpy (str.c:80) > > ==2175== by 0x8084689: PetscFListAdd (reg.c:237) > > ==2175== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > > ==2175== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > > ==2175== by 0x86784F2: KSPCreate_FGMRES (fgmres.c:753) > > ==2175== by 0x8635D92: KSPSetType (itcreate.c:576) > > ==2175== by 0x81BE7AA: DMMGSetUpLevel (damg.c:372) > > ==2175== by 0x81C5787: DMMGSetSNES (damgsnes.c:648) > > ==2175== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) > > ==2175== by 0x804C4FF: main (ex19.c:140) > > ==2175== > > ==2174== by 0x8084689: PetscFListAdd (reg.c:237) > > ==2174== by 0x8718EF5: PCRegister (precon.c:1537) > > ==2174== by 0x858B07B: PCRegisterAll (pcregis.c:95) > > ==2174== by 0x8642C37: PCInitializePackage (dlregisksp.c:60) > > ==2174== by 0x870E72D: PCCreate (precon.c:299) > > ==2174== by 0x862ADD8: KSPGetPC (itfunc.c:1251) > > ==2174== by 0x861BF6A: KSPSetOptionsPrefix (itcl.c:87) > > ==2174== by 0x81AD4A9: SNESSetOptionsPrefix (snes.c:2529) > > ==2174== by 0x81C52C6: DMMGSetSNES (damgsnes.c:612) > > ==2174== > > ==2174== Invalid read of size 8 > > ==2174== at 0x43BE324: __strcmp_ssse3 (strcmp-ssse3.S:142) > > ==2174== by 0x8085399: PetscFListFind (reg.c:375) > > ==2174== by 0x858986E: PCSetType (pcset.c:66) > > ==2174== by 0x81C5BCA: DMMGSetSNES (damgsnes.c:672) > > ==2174== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) > > ==2174== by 0x804C4FF: main (ex19.c:140) > > ==2174== Address 0x4630188 is 8 bytes inside a block of size 10 alloc'd > > ==2174== at 0x4022E01: memalign (vg_replace_malloc.c:532) > > ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) > > ==2174== by 0x80BDF14: PetscStrallocpy (str.c:80) > > ==2174== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) > > ==2174== by 0x8085186: PetscFListFind (reg.c:356) > > ==2174== by 0x858986E: PCSetType (pcset.c:66) > > ==2174== by 0x81C5BCA: DMMGSetSNES (damgsnes.c:672) > > ==2174== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) > > ==2174== by 0x804C4FF: main (ex19.c:140) > > ==2174== > > ==2174== Invalid read of size 8 > > ==2174== at 0x43BE328: __strcmp_ssse3 (strcmp-ssse3.S:143) > > ==2174== by 0x8085399: PetscFListFind (reg.c:375) > > ==2174== by 0x858986E: PCSetType (pcset.c:66) > > ==2174== by 0x81C5BCA: DMMGSetSNES (damgsnes.c:672) > > ==2174== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) > > ==2174== by 0x804C4FF: main (ex19.c:140) > > ==2174== Address 0x45a0780 is 6 bytes after a block of size 10 alloc'd > > ==2174== at 
0x4022E01: memalign (vg_replace_malloc.c:532) > > ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) > > ==2174== by 0x80BDF14: PetscStrallocpy (str.c:80) > > ==2174== by 0x8084689: PetscFListAdd (reg.c:237) > > ==2174== by 0x8718EF5: PCRegister (precon.c:1537) > > ==2174== by 0x858B07B: PCRegisterAll (pcregis.c:95) > > ==2174== by 0x8642C37: PCInitializePackage (dlregisksp.c:60) > > ==2174== by 0x870E72D: PCCreate (precon.c:299) > > ==2174== by 0x862ADD8: KSPGetPC (itfunc.c:1251) > > ==2174== by 0x861BF6A: KSPSetOptionsPrefix (itcl.c:87) > > ==2174== by 0x81AD4A9: SNESSetOptionsPrefix (snes.c:2529) > > ==2174== by 0x81C52C6: DMMGSetSNES (damgsnes.c:612) > > ==2174== > > ==2174== Invalid read of size 8 > > ==2174== at 0x43BE32D: __strcmp_ssse3 (strcmp-ssse3.S:144) > > ==2174== by 0x8085399: PetscFListFind (reg.c:375) > > ==2174== by 0x858986E: PCSetType (pcset.c:66) > > ==2174== by 0x81C5BCA: DMMGSetSNES (damgsnes.c:672) > > ==2174== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) > > ==2174== by 0x804C4FF: main (ex19.c:140) > > ==2174== Address 0x4630190 is 6 bytes after a block of size 10 alloc'd > > ==2174== at 0x4022E01: memalign (vg_replace_malloc.c:532) > > ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) > > ==2174== by 0x80BDF14: PetscStrallocpy (str.c:80) > > ==2174== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) > > ==2174== by 0x8085186: PetscFListFind (reg.c:356) > > ==2174== by 0x858986E: PCSetType (pcset.c:66) > > ==2174== by 0x81C5BCA: DMMGSetSNES (damgsnes.c:672) > > ==2174== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) > > ==2174== by 0x804C4FF: main (ex19.c:140) > > ==2174== > > ==2175== Invalid read of size 8 > > ==2175== at 0x43BE320: __strcmp_ssse3 (strcmp-ssse3.S:141) > > ==2175== by 0x8085399: PetscFListFind (reg.c:375) > > ==2175== by 0x858986E: PCSetType (pcset.c:66) > > ==2175== by 0x81C5BCA: DMMGSetSNES (damgsnes.c:672) > > ==2175== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) > > ==2175== by 0x804C4FF: main (ex19.c:140) > > ==2175== Address 0x450e678 is 8 bytes inside a block of size 10 alloc'd > > ==2175== at 0x4022E01: memalign (vg_replace_malloc.c:532) > > ==2175== by 0x808B1A0: PetscMallocAlign (mal.c:30) > > ==2175== by 0x80BDF14: PetscStrallocpy (str.c:80) > > ==2175== by 0x8084689: PetscFListAdd (reg.c:237) > > ==2175== by 0x8718EF5: PCRegister (precon.c:1537) > > ==2175== by 0x858B07B: PCRegisterAll (pcregis.c:95) > > ==2175== by 0x8642C37: PCInitializePackage (dlregisksp.c:60) > > ==2175== by 0x870E72D: PCCreate (precon.c:299) > > ==2175== by 0x862ADD8: KSPGetPC (itfunc.c:1251) > > ==2175== by 0x861BF6A: KSPSetOptionsPrefix (itcl.c:87) > > ==2175== by 0x81AD4A9: SNESSetOptionsPrefix (snes.c:2529) > > ==2175== by 0x81C52C6: DMMGSetSNES (damgsnes.c:612) > > ==2175== > > ==2175== Invalid read of size 8 > > ==2175== at 0x43BE324: __strcmp_ssse3 (strcmp-ssse3.S:142) > > ==2175== by 0x8085399: PetscFListFind (reg.c:375) > > ==2175== by 0x858986E: PCSetType (pcset.c:66) > > ==2175== by 0x81C5BCA: DMMGSetSNES (damgsnes.c:672) > > ==2175== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) > > ==2175== by 0x804C4FF: main (ex19.c:140) > > ==2175== Address 0x459e088 is 8 bytes inside a block of size 10 alloc'd > > ==2175== at 0x4022E01: memalign (vg_replace_malloc.c:532) > > ==2175== by 0x808B1A0: PetscMallocAlign (mal.c:30) > > ==2175== by 0x80BDF14: PetscStrallocpy (str.c:80) > > ==2175== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) > > ==2175== by 0x8085186: PetscFListFind (reg.c:356) > > ==2175== by 
0x858986E: PCSetType (pcset.c:66) > > ==2175== by 0x81C5BCA: DMMGSetSNES (damgsnes.c:672) > > ==2175== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) > > ==2175== by 0x804C4FF: main (ex19.c:140) > > ==2175== > > ==2175== Invalid read of size 8 > > ==2175== at 0x43BE328: __strcmp_ssse3 (strcmp-ssse3.S:143) > > ==2175== by 0x8085399: PetscFListFind (reg.c:375) > > ==2175== by 0x858986E: PCSetType (pcset.c:66) > > ==2175== by 0x81C5BCA: DMMGSetSNES (damgsnes.c:672) > > ==2175== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) > > ==2175== by 0x804C4FF: main (ex19.c:140) > > ==2175== Address 0x450e680 is 6 bytes after a block of size 10 alloc'd > > ==2175== at 0x4022E01: memalign (vg_replace_malloc.c:532) > > ==2175== by 0x808B1A0: PetscMallocAlign (mal.c:30) > > ==2175== by 0x80BDF14: PetscStrallocpy (str.c:80) > > ==2175== by 0x8084689: PetscFListAdd (reg.c:237) > > ==2175== by 0x8718EF5: PCRegister (precon.c:1537) > > ==2175== by 0x858B07B: PCRegisterAll (pcregis.c:95) > > ==2175== by 0x8642C37: PCInitializePackage (dlregisksp.c:60) > > ==2175== by 0x870E72D: PCCreate (precon.c:299) > > ==2175== by 0x862ADD8: KSPGetPC (itfunc.c:1251) > > ==2175== by 0x861BF6A: KSPSetOptionsPrefix (itcl.c:87) > > ==2175== by 0x81AD4A9: SNESSetOptionsPrefix (snes.c:2529) > > ==2175== by 0x81C52C6: DMMGSetSNES (damgsnes.c:612) > > ==2175== > > ==2175== Invalid read of size 8 > > ==2175== at 0x43BE32D: __strcmp_ssse3 (strcmp-ssse3.S:144) > > ==2175== by 0x8085399: PetscFListFind (reg.c:375) > > ==2175== by 0x858986E: PCSetType (pcset.c:66) > > ==2175== by 0x81C5BCA: DMMGSetSNES (damgsnes.c:672) > > ==2175== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) > > ==2175== by 0x804C4FF: main (ex19.c:140) > > ==2175== Address 0x459e090 is 6 bytes after a block of size 10 alloc'd > > ==2175== at 0x4022E01: memalign (vg_replace_malloc.c:532) > > ==2175== by 0x808B1A0: PetscMallocAlign (mal.c:30) > > ==2175== by 0x80BDF14: PetscStrallocpy (str.c:80) > > ==2175== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) > > ==2175== by 0x8085186: PetscFListFind (reg.c:356) > > ==2175== by 0x858986E: PCSetType (pcset.c:66) > > ==2175== by 0x81C5BCA: DMMGSetSNES (damgsnes.c:672) > > ==2175== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) > > ==2175== by 0x804C4FF: main (ex19.c:140) > > ==2175== > > ==2175== Invalid read of size 8 > > ==2175== at 0x43B674F: __memcpy_ssse3 (memcpy-ssse3.S:703) > > ==2175== by 0x87923CB: MPIR_Allgatherv (allgatherv.c:340) > > ==2175== by 0x879274D: PMPI_Allgatherv (allgatherv.c:997) > > ==2175== by 0x83631F0: MatFDColoringCreate_MPIAIJ (fdmpiaij.c:89) > > ==2175== by 0x854A377: MatFDColoringCreate (fdmatrix.c:385) > > ==2175== by 0x81C5E65: DMMGSetSNES (damgsnes.c:712) > > ==2175== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) > > ==2175== by 0x804C4FF: main (ex19.c:140) > > ==2175== Address 0x47cd128 is 0 bytes after a block of size 720 alloc'd > > ==2175== at 0x4023BF3: malloc (vg_replace_malloc.c:195) > > ==2175== by 0x8791B68: MPIR_Allgatherv (allgatherv.c:143) > > ==2175== by 0x879274D: PMPI_Allgatherv (allgatherv.c:997) > > ==2175== by 0x83631F0: MatFDColoringCreate_MPIAIJ (fdmpiaij.c:89) > > ==2175== by 0x854A377: MatFDColoringCreate (fdmatrix.c:385) > > ==2175== by 0x81C5E65: DMMGSetSNES (damgsnes.c:712) > > ==2175== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) > > ==2175== by 0x804C4FF: main (ex19.c:140) > > ==2175== > > ==2174== Invalid read of size 8 > > ==2174== at 0x43B677F: __memcpy_ssse3 (memcpy-ssse3.S:715) > > ==2174== by 
0x8791BD2: MPIR_Allgatherv (allgatherv.c:160) > > ==2174== by 0x879274D: PMPI_Allgatherv (allgatherv.c:997) > > ==2174== by 0x83631F0: MatFDColoringCreate_MPIAIJ (fdmpiaij.c:89) > > ==2174== by 0x854A377: MatFDColoringCreate (fdmatrix.c:385) > > ==2174== by 0x81C5E65: DMMGSetSNES (damgsnes.c:712) > > ==2174== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) > > ==2174== by 0x804C4FF: main (ex19.c:140) > > ==2174== Address 0x4920208 is 0 bytes after a block of size 1,416 alloc'd > > ==2174== at 0x4022E01: memalign (vg_replace_malloc.c:532) > > ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) > > ==2174== by 0x8132724: ISCreateGeneral (general.c:342) > > ==2174== by 0x813BE92: ISColoringGetIS (iscoloring.c:161) > > ==2174== by 0x836235A: MatFDColoringCreate_MPIAIJ (fdmpiaij.c:30) > > ==2174== by 0x854A377: MatFDColoringCreate (fdmatrix.c:385) > > ==2174== by 0x81C5E65: DMMGSetSNES (damgsnes.c:712) > > ==2174== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) > > ==2174== by 0x804C4FF: main (ex19.c:140) > > ==2174== > > lid velocity = 0.000287274, prandtl # = 1, grashof # = 1 > > ==2174== Invalid read of size 8 > > ==2174== at 0x43197A0: __strlen_sse2 (strlen.S:99) > > ==2175== Invalid read of size 8 > > ==2174== by 0x80BDE83: PetscStrallocpy (str.c:79) > > ==2174== by 0x80B2778: PetscOptionsBegin_Private (aoptions.c:38) > > ==2174== by 0x858A3C6: PCSetFromOptions (pcset.c:170) > > ==2174== by 0x861D3B6: KSPSetFromOptions (itcl.c:323) > > ==2174== by 0x85A8052: PCSetUp_MG (mg.c:490) > > ==2174== by 0x8714039: PCSetUp (precon.c:795) > > ==2174== by 0x8621C54: KSPSetUp (itfunc.c:237) > > ==2174== by 0x8622C90: KSPSolve (itfunc.c:353) > > ==2174== by 0x81B0295: SNES_KSPSolve (snes.c:2944) > > ==2174== by 0x86A2C12: SNESSolve_LS (ls.c:191) > > ==2174== by 0x81AB5EC: SNESSolve (snes.c:2255) > > ==2174== Address 0x4858f38 is 8 bytes inside a block of size 11 alloc'd > > ==2174== at 0x4022E01: memalign (vg_replace_malloc.c:532) > > ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) > > ==2174== by 0x80BDF14: PetscStrallocpy (str.c:80) > > ==2174== by 0x80A6193: PetscObjectSetOptionsPrefix (prefix.c:37) > > ==2174== by 0x80A6564: PetscObjectAppendOptionsPrefix (prefix.c:70) > > ==2174== by 0x87169FB: PCAppendOptionsPrefix (precon.c:1242) > > ==2174== by 0x861C3B2: KSPAppendOptionsPrefix (itcl.c:121) > > ==2174== by 0x85A4D63: PCMGSetLevels (mg.c:180) > > ==2174== by 0x81BE9AD: DMMGSetUpLevel (damg.c:379) > > ==2174== by 0x81C5787: DMMGSetSNES (damgsnes.c:648) > > ==2174== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) > > ==2174== by 0x804C4FF: main (ex19.c:140) > > ==2174== > > ==2175== at 0x43197A0: __strlen_sse2 (strlen.S:99) > > ==2175== by 0x80BDE83: PetscStrallocpy (str.c:79) > > ==2175== by 0x80B2778: PetscOptionsBegin_Private (aoptions.c:38) > > ==2175== by 0x858A3C6: PCSetFromOptions (pcset.c:170) > > ==2175== by 0x861D3B6: KSPSetFromOptions (itcl.c:323) > > ==2175== by 0x85A8052: PCSetUp_MG (mg.c:490) > > ==2175== by 0x8714039: PCSetUp (precon.c:795) > > ==2175== by 0x8621C54: KSPSetUp (itfunc.c:237) > > ==2175== by 0x8622C90: KSPSolve (itfunc.c:353) > > ==2175== by 0x81B0295: SNES_KSPSolve (snes.c:2944) > > ==2175== by 0x86A2C12: SNESSolve_LS (ls.c:191) > > ==2175== by 0x81AB5EC: SNESSolve (snes.c:2255) > > ==2175== Address 0x4797eb8 is 8 bytes inside a block of size 11 alloc'd > > ==2175== at 0x4022E01: memalign (vg_replace_malloc.c:532) > > ==2175== by 0x808B1A0: PetscMallocAlign (mal.c:30) > > ==2175== by 0x80BDF14: PetscStrallocpy (str.c:80) > > ==2175== by 
0x80A6193: PetscObjectSetOptionsPrefix (prefix.c:37) > > ==2175== by 0x80A6564: PetscObjectAppendOptionsPrefix (prefix.c:70) > > ==2175== by 0x87169FB: PCAppendOptionsPrefix (precon.c:1242) > > ==2175== by 0x861C3B2: KSPAppendOptionsPrefix (itcl.c:121) > > ==2175== by 0x85A4D63: PCMGSetLevels (mg.c:180) > > ==2175== by 0x81BE9AD: DMMGSetUpLevel (damg.c:379) > > ==2175== by 0x81C5787: DMMGSetSNES (damgsnes.c:648) > > ==2175== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) > > ==2175== by 0x804C4FF: main (ex19.c:140) > > ==2175== > > > > ==2174== Invalid read of size 8 > > ==2174== at 0x43BE32D: __strcmp_ssse3 (strcmp-ssse3.S:144) > > ==2174== by 0x8085406: PetscFListFind (reg.c:376) > > ==2174== by 0x80968C7: PetscObjectQueryFunction_Petsc (inherit.c:238) > > ==2174== by 0x80978B5: PetscObjectQueryFunction (inherit.c:376) > > ==2174== by 0x870C60A: PCGetDefaultType_Private (precon.c:25) > > ==2174== by 0x858A499: PCSetFromOptions (pcset.c:172) > > ==2174== by 0x861D3B6: KSPSetFromOptions (itcl.c:323) > > ==2174== by 0x85A8052: PCSetUp_MG (mg.c:490) > > ==2174== by 0x8714039: PCSetUp (precon.c:795) > > ==2174== by 0x8621C54: KSPSetUp (itfunc.c:237) > > ==2174== by 0x8622C90: KSPSolve (itfunc.c:353) > > ==2174== by 0x81B0295: SNES_KSPSolve (snes.c:2944) > > ==2174== Address 0x4cfa520 is 16 bytes inside a block of size 22 alloc'd > > ==2174== at 0x4022E01: memalign (vg_replace_malloc.c:532) > > ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) > > ==2174== by 0x80BDF14: PetscStrallocpy (str.c:80) > > ==2174== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) > > ==2174== by 0x8085186: PetscFListFind (reg.c:356) > > ==2174== by 0x80968C7: PetscObjectQueryFunction_Petsc (inherit.c:238) > > ==2174== by 0x80978B5: PetscObjectQueryFunction (inherit.c:376) > > ==2174== by 0x870C60A: PCGetDefaultType_Private (precon.c:25) > > ==2174== by 0x858A499: PCSetFromOptions (pcset.c:172) > > ==2174== by 0x861D3B6: KSPSetFromOptions (itcl.c:323) > > ==2174== by 0x85A8052: PCSetUp_MG (mg.c:490) > > ==2174== by 0x8714039: PCSetUp (precon.c:795) > > ==2174== > > ==2175== Invalid read of size 8 > > ==2175== at 0x43BE32D: __strcmp_ssse3 (strcmp-ssse3.S:144) > > ==2175== by 0x8085406: PetscFListFind (reg.c:376) > > ==2175== by 0x80968C7: PetscObjectQueryFunction_Petsc (inherit.c:238) > > ==2175== by 0x80978B5: PetscObjectQueryFunction (inherit.c:376) > > ==2175== by 0x870C60A: PCGetDefaultType_Private (precon.c:25) > > ==2175== by 0x858A499: PCSetFromOptions (pcset.c:172) > > ==2175== by 0x861D3B6: KSPSetFromOptions (itcl.c:323) > > ==2175== by 0x85A8052: PCSetUp_MG (mg.c:490) > > ==2175== by 0x8714039: PCSetUp (precon.c:795) > > ==2175== by 0x8621C54: KSPSetUp (itfunc.c:237) > > ==2175== by 0x8622C90: KSPSolve (itfunc.c:353) > > ==2175== by 0x81B0295: SNES_KSPSolve (snes.c:2944) > > ==2175== Address 0x4c39110 is 16 bytes inside a block of size 22 alloc'd > > ==2175== at 0x4022E01: memalign (vg_replace_malloc.c:532) > > ==2175== by 0x808B1A0: PetscMallocAlign (mal.c:30) > > ==2175== by 0x80BDF14: PetscStrallocpy (str.c:80) > > ==2175== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) > > ==2175== by 0x8085186: PetscFListFind (reg.c:356) > > ==2175== by 0x80968C7: PetscObjectQueryFunction_Petsc (inherit.c:238) > > ==2175== by 0x80978B5: PetscObjectQueryFunction (inherit.c:376) > > ==2175== by 0x870C60A: PCGetDefaultType_Private (precon.c:25) > > ==2175== by 0x858A499: PCSetFromOptions (pcset.c:172) > > ==2175== by 0x861D3B6: KSPSetFromOptions (itcl.c:323) > > ==2175== by 0x85A8052: PCSetUp_MG (mg.c:490) 
> > ==2175== by 0x8714039: PCSetUp (precon.c:795) > > ==2175== > > ==2175== Invalid read of size 8 > > ==2175== at 0x43197A0: __strlen_sse2 (strlen.S:99) > > ==2175== by 0x80A671F: PetscObjectAppendOptionsPrefix (prefix.c:76) > > ==2175== by 0x87169FB: PCAppendOptionsPrefix (precon.c:1242) > > ==2174== Invalid read of size 8 > > ==2174== at 0x43197A0: __strlen_sse2 (strlen.S:99) > > ==2174== by 0x80A671F: PetscObjectAppendOptionsPrefix (prefix.c:76) > > ==2174== by 0x87169FB: PCAppendOptionsPrefix (precon.c:1242) > > ==2174== by 0x861C3B2: KSPAppendOptionsPrefix (itcl.c:121) > > ==2174== by 0x859F3D9: PCSetUp_BJacobi_Singleblock (bjacobi.c:905) > > ==2174== by 0x85992B8: PCSetUp_BJacobi (bjacobi.c:180) > > ==2174== by 0x8714039: PCSetUp (precon.c:795) > > ==2174== by 0x8621C54: KSPSetUp (itfunc.c:237) > > ==2174== by 0x85A8E40: PCSetUp_MG (mg.c:556) > > ==2174== by 0x8714039: PCSetUp (precon.c:795) > > ==2174== by 0x8621C54: KSPSetUp (itfunc.c:237) > > ==2174== by 0x8622C90: KSPSolve (itfunc.c:353) > > ==2174== Address 0x4d567c8 is 8 bytes inside a block of size 13 alloc'd > > ==2174== at 0x4022E01: memalign (vg_replace_malloc.c:532) > > ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) > > ==2174== by 0x80BDF14: PetscStrallocpy (str.c:80) > > ==2174== by 0x80A6193: PetscObjectSetOptionsPrefix (prefix.c:37) > > ==2174== by 0x87166FA: PCSetOptionsPrefix (precon.c:1209) > > ==2174== by 0x861BFD3: KSPSetOptionsPrefix (itcl.c:88) > > ==2174== by 0x859F375: PCSetUp_BJacobi_Singleblock (bjacobi.c:904) > > ==2174== by 0x85992B8: PCSetUp_BJacobi (bjacobi.c:180) > > ==2174== by 0x8714039: PCSetUp (precon.c:795) > > ==2174== by 0x8621C54: KSPSetUp (itfunc.c:237) > > ==2174== by 0x85A8E40: PCSetUp_MG (mg.c:556) > > ==2174== by 0x8714039: PCSetUp (precon.c:795) > > ==2174== > > ==2175== by 0x861C3B2: KSPAppendOptionsPrefix (itcl.c:121) > > ==2175== by 0x859F3D9: PCSetUp_BJacobi_Singleblock (bjacobi.c:905) > > ==2175== by 0x85992B8: PCSetUp_BJacobi (bjacobi.c:180) > > ==2175== by 0x8714039: PCSetUp (precon.c:795) > > ==2175== by 0x8621C54: KSPSetUp (itfunc.c:237) > > ==2175== by 0x85A8E40: PCSetUp_MG (mg.c:556) > > ==2175== by 0x8714039: PCSetUp (precon.c:795) > > ==2175== by 0x8621C54: KSPSetUp (itfunc.c:237) > > ==2175== by 0x8622C90: KSPSolve (itfunc.c:353) > > ==2175== Address 0x4c42198 is 8 bytes inside a block of size 13 alloc'd > > ==2175== at 0x4022E01: memalign (vg_replace_malloc.c:532) > > ==2175== by 0x808B1A0: PetscMallocAlign (mal.c:30) > > ==2175== by 0x80BDF14: PetscStrallocpy (str.c:80) > > ==2175== by 0x80A6193: PetscObjectSetOptionsPrefix (prefix.c:37) > > ==2175== by 0x87166FA: PCSetOptionsPrefix (precon.c:1209) > > ==2175== by 0x861BFD3: KSPSetOptionsPrefix (itcl.c:88) > > ==2175== by 0x859F375: PCSetUp_BJacobi_Singleblock (bjacobi.c:904) > > ==2175== by 0x85992B8: PCSetUp_BJacobi (bjacobi.c:180) > > ==2175== by 0x8714039: PCSetUp (precon.c:795) > > ==2175== by 0x8621C54: KSPSetUp (itfunc.c:237) > > ==2175== by 0x85A8E40: PCSetUp_MG (mg.c:556) > > ==2175== by 0x8714039: PCSetUp (precon.c:795) > > ==2175== > > ==2174== Invalid read of size 8 > > ==2174== at 0x43197AF: __strlen_sse2 (strlen.S:106) > > ==2174== by 0x80BDE83: PetscStrallocpy (str.c:79) > > ==2174== by 0x80B2778: PetscOptionsBegin_Private (aoptions.c:38) > > ==2174== by 0x858A3C6: PCSetFromOptions (pcset.c:170) > > ==2174== by 0x861D3B6: KSPSetFromOptions (itcl.c:323) > > ==2174== by 0x859F978: PCSetUp_BJacobi_Singleblock (bjacobi.c:944) > > ==2174== by 0x85992B8: PCSetUp_BJacobi (bjacobi.c:180) > > ==2174== by 
0x8714039: PCSetUp (precon.c:795) > > ==2174== by 0x8621C54: KSPSetUp (itfunc.c:237) > > ==2174== by 0x85A8E40: PCSetUp_MG (mg.c:556) > > ==2174== by 0x8714039: PCSetUp (precon.c:795) > > ==2174== by 0x8621C54: KSPSetUp (itfunc.c:237) > > ==2174== Address 0x4d568d0 is 16 bytes inside a block of size 17 alloc'd > > ==2174== at 0x4022E01: memalign (vg_replace_malloc.c:532) > > ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) > > ==2174== by 0x80A67BF: PetscObjectAppendOptionsPrefix (prefix.c:77) > > ==2174== by 0x87169FB: PCAppendOptionsPrefix (precon.c:1242) > > ==2174== by 0x861C3B2: KSPAppendOptionsPrefix (itcl.c:121) > > ==2174== by 0x859F3D9: PCSetUp_BJacobi_Singleblock (bjacobi.c:905) > > ==2174== by 0x85992B8: PCSetUp_BJacobi (bjacobi.c:180) > > ==2174== by 0x8714039: PCSetUp (precon.c:795) > > ==2174== by 0x8621C54: KSPSetUp (itfunc.c:237) > > ==2174== by 0x85A8E40: PCSetUp_MG (mg.c:556) > > ==2174== by 0x8714039: PCSetUp (precon.c:795) > > ==2174== by 0x8621C54: KSPSetUp (itfunc.c:237) > > ==2174== > > ==2175== Invalid read of size 8 > > ==2175== at 0x43197AF: __strlen_sse2 (strlen.S:106) > > ==2175== by 0x80BDE83: PetscStrallocpy (str.c:79) > > ==2175== by 0x80B2778: PetscOptionsBegin_Private (aoptions.c:38) > > ==2175== by 0x858A3C6: PCSetFromOptions (pcset.c:170) > > ==2175== by 0x861D3B6: KSPSetFromOptions (itcl.c:323) > > ==2175== by 0x859F978: PCSetUp_BJacobi_Singleblock (bjacobi.c:944) > > ==2175== by 0x85992B8: PCSetUp_BJacobi (bjacobi.c:180) > > ==2175== by 0x8714039: PCSetUp (precon.c:795) > > ==2175== by 0x8621C54: KSPSetUp (itfunc.c:237) > > ==2175== by 0x85A8E40: PCSetUp_MG (mg.c:556) > > ==2175== by 0x8714039: PCSetUp (precon.c:795) > > ==2175== by 0x8621C54: KSPSetUp (itfunc.c:237) > > ==2175== Address 0x4c422a0 is 16 bytes inside a block of size 17 alloc'd > > ==2175== at 0x4022E01: memalign (vg_replace_malloc.c:532) > > ==2175== by 0x808B1A0: PetscMallocAlign (mal.c:30) > > ==2175== by 0x80A67BF: PetscObjectAppendOptionsPrefix (prefix.c:77) > > ==2175== by 0x87169FB: PCAppendOptionsPrefix (precon.c:1242) > > ==2175== by 0x861C3B2: KSPAppendOptionsPrefix (itcl.c:121) > > ==2175== by 0x859F3D9: PCSetUp_BJacobi_Singleblock (bjacobi.c:905) > > ==2175== by 0x85992B8: PCSetUp_BJacobi (bjacobi.c:180) > > ==2175== by 0x8714039: PCSetUp (precon.c:795) > > ==2175== by 0x8621C54: KSPSetUp (itfunc.c:237) > > ==2175== by 0x85A8E40: PCSetUp_MG (mg.c:556) > > ==2175== by 0x8714039: PCSetUp (precon.c:795) > > ==2175== by 0x8621C54: KSPSetUp (itfunc.c:237) > > ==2175== > > ==2175== Invalid read of size 8 > > ==2175== at 0x43BF3A5: __strcmp_ssse3 (strcmp-ssse3.S:1687) > > ==2175== by 0x80842B8: PetscFListAdd (reg.c:223) > > ==2175== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > > ==2175== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > > ==2175== by 0x85E8F9A: PCCreate_ILU (ilu.c:379) > > ==2175== by 0x8589A8B: PCSetType (pcset.c:78) > > ==2175== by 0x858A64C: PCSetFromOptions (pcset.c:181) > > ==2175== by 0x861D3B6: KSPSetFromOptions (itcl.c:323) > > ==2175== by 0x859F978: PCSetUp_BJacobi_Singleblock (bjacobi.c:944) > > ==2175== by 0x85992B8: PCSetUp_BJacobi (bjacobi.c:180) > > ==2175== by 0x8714039: PCSetUp (precon.c:795) > > ==2175== by 0x8621C54: KSPSetUp (itfunc.c:237) > > ==2175== Address 0x4c94a28 is 24 bytes inside a block of size 27 alloc'd > > ==2175== at 0x4022E01: memalign (vg_replace_malloc.c:532) > > ==2175== by 0x808B1A0: PetscMallocAlign (mal.c:30) > > ==2175== by 0x80BDF14: PetscStrallocpy (str.c:80) > > ==2175== by 0x8084689: 
PetscFListAdd (reg.c:237) > > ==2175== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > > ==2175== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > > ==2175== by 0x85E8F25: PCCreate_ILU (ilu.c:377) > > ==2175== by 0x8589A8B: PCSetType (pcset.c:78) > > ==2175== by 0x858A64C: PCSetFromOptions (pcset.c:181) > > ==2175== by 0x861D3B6: KSPSetFromOptions (itcl.c:323) > > ==2175== by 0x859F978: PCSetUp_BJacobi_Singleblock (bjacobi.c:944) > > ==2175== by 0x85992B8: PCSetUp_BJacobi (bjacobi.c:180) > > ==2175== > > ==2174== Invalid read of size 8 > > ==2174== at 0x43BF3A5: __strcmp_ssse3 (strcmp-ssse3.S:1687) > > ==2174== by 0x80842B8: PetscFListAdd (reg.c:223) > > ==2174== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > > ==2174== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > > ==2174== by 0x85E8F9A: PCCreate_ILU (ilu.c:379) > > ==2174== by 0x8589A8B: PCSetType (pcset.c:78) > > ==2174== by 0x858A64C: PCSetFromOptions (pcset.c:181) > > ==2174== by 0x861D3B6: KSPSetFromOptions (itcl.c:323) > > ==2174== by 0x859F978: PCSetUp_BJacobi_Singleblock (bjacobi.c:944) > > ==2174== by 0x85992B8: PCSetUp_BJacobi (bjacobi.c:180) > > ==2174== by 0x8714039: PCSetUp (precon.c:795) > > ==2174== by 0x8621C54: KSPSetUp (itfunc.c:237) > > ==2174== Address 0x4d58a28 is 24 bytes inside a block of size 27 alloc'd > > ==2174== at 0x4022E01: memalign (vg_replace_malloc.c:532) > > ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) > > ==2174== by 0x80BDF14: PetscStrallocpy (str.c:80) > > ==2174== by 0x8084689: PetscFListAdd (reg.c:237) > > ==2174== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) > > ==2174== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) > > ==2174== by 0x85E8F25: PCCreate_ILU (ilu.c:377) > > ==2174== by 0x8589A8B: PCSetType (pcset.c:78) > > ==2174== by 0x858A64C: PCSetFromOptions (pcset.c:181) > > ==2174== by 0x861D3B6: KSPSetFromOptions (itcl.c:323) > > ==2174== by 0x859F978: PCSetUp_BJacobi_Singleblock (bjacobi.c:944) > > ==2174== by 0x85992B8: PCSetUp_BJacobi (bjacobi.c:180) > > ==2174== > > ==2175== Invalid read of size 8 > > ==2175== at 0x43BE324: __strcmp_ssse3 (strcmp-ssse3.S:142) > > ==2175== by 0x8093FC2: PetscTypeCompare (destroy.c:254) > > ==2175== by 0x85A956A: PCSetUp_MG (mg.c:585) > > ==2175== by 0x8714039: PCSetUp (precon.c:795) > > ==2175== by 0x8621C54: KSPSetUp (itfunc.c:237) > > ==2175== by 0x8622C90: KSPSolve (itfunc.c:353) > > ==2175== by 0x81B0295: SNES_KSPSolve (snes.c:2944) > > ==2175== by 0x86A2C12: SNESSolve_LS (ls.c:191) > > ==2175== by 0x81AB5EC: SNESSolve (snes.c:2255) > > ==2175== by 0x81C4919: DMMGSolveSNES (damgsnes.c:510) > > ==2175== by 0x81BDF6C: DMMGSolve (damg.c:313) > > ==2175== by 0x804C9A7: main (ex19.c:155) > > ==2175== Address 0x4798858 is 8 bytes inside a block of size 10 alloc'd > > ==2175== at 0x4022E01: memalign (vg_replace_malloc.c:532) > > ==2175== by 0x808B1A0: PetscMallocAlign (mal.c:30) > > ==2175== by 0x80BDF14: PetscStrallocpy (str.c:80) > > ==2175== by 0x80902DB: PetscObjectChangeTypeName (pname.c:114) > > ==2175== by 0x8589AEE: PCSetType (pcset.c:79) > > ==2175== by 0x85A4F3C: PCMGSetLevels (mg.c:187) > > ==2175== by 0x81BE9AD: DMMGSetUpLevel (damg.c:379) > > ==2175== by 0x81C5787: DMMGSetSNES (damgsnes.c:648) > > ==2175== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) > > ==2175== by 0x804C4FF: main (ex19.c:140) > > ==2175== > > ==2175== Invalid read of size 8 > > ==2175== at 0x43BE32D: __strcmp_ssse3 (strcmp-ssse3.S:144) > > ==2175== by 0x8093FC2: 
PetscTypeCompare (destroy.c:254) > > ==2175== by 0x85A956A: PCSetUp_MG (mg.c:585) > > ==2175== by 0x8714039: PCSetUp (precon.c:795) > > ==2175== by 0x8621C54: KSPSetUp (itfunc.c:237) > > ==2175== by 0x8622C90: KSPSolve (itfunc.c:353) > > ==2175== by 0x81B0295: SNES_KSPSolve (snes.c:2944) > > ==2175== by 0x86A2C12: SNESSolve_LS (ls.c:191) > > ==2175== by 0x81AB5EC: SNESSolve (snes.c:2255) > > ==2175== by 0x81C4919: DMMGSolveSNES (damgsnes.c:510) > > ==2175== by 0x81BDF6C: DMMGSolve (damg.c:313) > > ==2175== by 0x804C9A7: main (ex19.c:155) > > ==2175== Address 0x4798860 is 6 bytes after a block of size 10 alloc'd > > ==2175== at 0x4022E01: memalign (vg_replace_malloc.c:532) > > ==2175== by 0x808B1A0: PetscMallocAlign (mal.c:30) > > ==2175== by 0x80BDF14: PetscStrallocpy (str.c:80) > > ==2175== by 0x80902DB: PetscObjectChangeTypeName (pname.c:114) > > ==2175== by 0x8589AEE: PCSetType (pcset.c:79) > > ==2175== by 0x85A4F3C: PCMGSetLevels (mg.c:187) > > ==2175== by 0x81BE9AD: DMMGSetUpLevel (damg.c:379) > > ==2175== by 0x81C5787: DMMGSetSNES (damgsnes.c:648) > > ==2175== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) > > ==2175== by 0x804C4FF: main (ex19.c:140) > > ==2175== > > ==2174== Invalid read of size 8 > > ==2174== at 0x43BE324: __strcmp_ssse3 (strcmp-ssse3.S:142) > > ==2174== by 0x8093FC2: PetscTypeCompare (destroy.c:254) > > ==2174== by 0x85A956A: PCSetUp_MG (mg.c:585) > > ==2174== by 0x8714039: PCSetUp (precon.c:795) > > ==2174== by 0x8621C54: KSPSetUp (itfunc.c:237) > > ==2174== by 0x8622C90: KSPSolve (itfunc.c:353) > > ==2174== by 0x81B0295: SNES_KSPSolve (snes.c:2944) > > ==2174== by 0x86A2C12: SNESSolve_LS (ls.c:191) > > ==2174== by 0x81AB5EC: SNESSolve (snes.c:2255) > > ==2174== by 0x81C4919: DMMGSolveSNES (damgsnes.c:510) > > ==2174== by 0x81BDF6C: DMMGSolve (damg.c:313) > > ==2174== by 0x804C9A7: main (ex19.c:155) > > ==2174== Address 0x48598d8 is 8 bytes inside a block of size 10 alloc'd > > ==2174== at 0x4022E01: memalign (vg_replace_malloc.c:532) > > ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) > > ==2174== by 0x80BDF14: PetscStrallocpy (str.c:80) > > ==2174== by 0x80902DB: PetscObjectChangeTypeName (pname.c:114) > > ==2174== by 0x8589AEE: PCSetType (pcset.c:79) > > ==2174== by 0x85A4F3C: PCMGSetLevels (mg.c:187) > > ==2174== by 0x81BE9AD: DMMGSetUpLevel (damg.c:379) > > ==2174== by 0x81C5787: DMMGSetSNES (damgsnes.c:648) > > ==2174== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) > > ==2174== by 0x804C4FF: main (ex19.c:140) > > ==2174== > > ==2174== Invalid read of size 8 > > ==2174== at 0x43BE32D: __strcmp_ssse3 (strcmp-ssse3.S:144) > > ==2174== by 0x8093FC2: PetscTypeCompare (destroy.c:254) > > ==2174== by 0x85A956A: PCSetUp_MG (mg.c:585) > > ==2174== by 0x8714039: PCSetUp (precon.c:795) > > ==2174== by 0x8621C54: KSPSetUp (itfunc.c:237) > > ==2174== by 0x8622C90: KSPSolve (itfunc.c:353) > > ==2174== by 0x81B0295: SNES_KSPSolve (snes.c:2944) > > ==2174== by 0x86A2C12: SNESSolve_LS (ls.c:191) > > ==2174== by 0x81AB5EC: SNESSolve (snes.c:2255) > > ==2174== by 0x81C4919: DMMGSolveSNES (damgsnes.c:510) > > ==2174== by 0x81BDF6C: DMMGSolve (damg.c:313) > > ==2174== by 0x804C9A7: main (ex19.c:155) > > ==2174== Address 0x48598e0 is 6 bytes after a block of size 10 alloc'd > > ==2174== at 0x4022E01: memalign (vg_replace_malloc.c:532) > > ==2174== by 0x808B1A0: PetscMallocAlign (mal.c:30) > > ==2174== by 0x80BDF14: PetscStrallocpy (str.c:80) > > ==2174== by 0x80902DB: PetscObjectChangeTypeName (pname.c:114) > > ==2174== by 0x8589AEE: PCSetType (pcset.c:79) 
> > ==2174==    by 0x85A4F3C: PCMGSetLevels (mg.c:187)
> > ==2174==    by 0x81BE9AD: DMMGSetUpLevel (damg.c:379)
> > ==2174==    by 0x81C5787: DMMGSetSNES (damgsnes.c:648)
> > ==2174==    by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952)
> > ==2174==    by 0x804C4FF: main (ex19.c:140)
> > ==2174==
> > ==2175== Conditional jump or move depends on uninitialised value(s)
> > ==2175==    at 0x43BE348: __strcmp_ssse3 (strcmp-ssse3.S:150)
> > ==2175==    by 0x80842B8: PetscFListAdd (reg.c:223)
> > ==2175==    by 0x8086A53: PetscFListDuplicate (reg.c:596)
> > ==2175==    by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511)
> > ==2175==    by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630)
> > ==2175==    by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731)
> > ==2175==    by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464)
> > ==2175==    by 0x85E7785: PCSetUp_ILU (ilu.c:204)
> > ==2175==    by 0x8714039: PCSetUp (precon.c:795)
> > ==2175==    by 0x8621C54: KSPSetUp (itfunc.c:237)
> > ==2175==    by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753)
> > ==2175==    by 0x8714602: PCSetUpOnBlocks (precon.c:828)
> > ==2175==
> > [... many more "Conditional jump or move depends on uninitialised value(s)" reports with this same call stack, differing only in the __strcmp_ssse3 offset, from both processes 2174 and 2175 ...]
> >
> > ==2175== Invalid read of size 8
> > ==2175==    at 0x43BF10D: __strcmp_ssse3 (strcmp-ssse3.S:1446)
> > ==2175==    by 0x80842B8: PetscFListAdd (reg.c:223)
> > ==2175==    by 0x8086A53: PetscFListDuplicate (reg.c:596)
> > ==2175==    by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511)
> > ==2175==    by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630)
> > ==2175==    by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731)
> > ==2175==    by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464)
> > ==2175==    by 0x85E7785: PCSetUp_ILU (ilu.c:204)
> > ==2175==    by 0x8714039: PCSetUp (precon.c:795)
> > ==2175==    by 0x8621C54: KSPSetUp (itfunc.c:237)
> > ==2175==    by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753)
> > ==2175==    by 0x8714602: PCSetUpOnBlocks (precon.c:828)
> > ==2175==  Address 0x501a448 is 24 bytes inside a block of size 30 alloc'd
> > ==2175==    at 0x4022E01: memalign (vg_replace_malloc.c:532)
> > ==2175==    by 0x808B1A0: PetscMallocAlign (mal.c:30)
> > ==2175==    by 0x80BDF14: PetscStrallocpy (str.c:80)
> > ==2175==    by 0x8084689: PetscFListAdd (reg.c:237)
> > ==2175==    by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227)
> > ==2175==    by 0x8097550: PetscObjectComposeFunction (inherit.c:340)
> > ==2175==    by 0x82BE3A6: MatCreate_SeqAIJ (aij.c:3363)
> > ==2175==    by 0x829D1E0: MatSetType (matreg.c:65)
> > ==2175==    by 0x82C3982: MatGetFactor_seqaij_petsc (aijfact.c:118)
> > ==2175==    by 0x8276893: MatGetFactor (matrix.c:3649)
> > ==2175==    by 0x85E7687: PCSetUp_ILU (ilu.c:202)
> > ==2175==    by 0x8714039: PCSetUp (precon.c:795)
> > ==2175==
> > [... similar "Invalid read of size 8" reports on both processes from __strlen_sse2 (strlen.S:112) and __strcmp_ssse3 inside PetscFListAdd, each reading a few bytes past a string allocated by PetscStrallocpy, during PCSetUp ...]
> >
> > Number of Newton iterations = 2
> >
> > [... the same class of "Invalid read of size 8" reports from PetscFListAdd on both processes during teardown: via SNESDestroy_LS (ls.c:322), SNESDestroy (snes.c:1406), DMMGDestroy (damg.c:179), and via MatDestroy_SeqAIJ (aij.c:810-819), MatDestroy_MPIAIJ (mpiaij.c:900, 919), MatDestroy_MPIMAIJ (maij.c:152), PCDestroy_MG (mg.c:232, 257), KSPDestroy (itfunc.c:695) ...]
> >
> > ==2175== Invalid read of size 8
> > ==2175==    at 0x43B674F: __memcpy_ssse3 (memcpy-ssse3.S:703)
> > ==2175==    by 0x8791BD2: MPIR_Allgatherv (allgatherv.c:160)
> > ==2175==    by 0x879274D: PMPI_Allgatherv (allgatherv.c:997)
> > ==2175==    by 0x83631F0: MatFDColoringCreate_MPIAIJ (fdmpiaij.c:89)
> > ==2175==    by 0x854A377: MatFDColoringCreate (fdmatrix.c:385)
> > ==2175==    by 0x81C5E65: DMMGSetSNES (damgsnes.c:712)
> > ==2175==    by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952)
> > ==2175==    by 0x804C4FF: main (ex19.c:140)
> > ==2175==  Address 0x4c37f38 is 0 bytes after a block of size 360 alloc'd
> > ==2175==    at 0x4022E01: memalign (vg_replace_malloc.c:532)
> > ==2175==    by 0x808B1A0: PetscMallocAlign (mal.c:30)
> > ==2175==    by 0x8132724: ISCreateGeneral (general.c:342)
> > ==2175==    by 0x813BE92: ISColoringGetIS (iscoloring.c:161)
> > ==2175==    by 0x836235A: MatFDColoringCreate_MPIAIJ (fdmpiaij.c:30)
> > ==2175==    by 0x854A377: MatFDColoringCreate (fdmatrix.c:385)
> > ==2175==    by 0x81C5E65: DMMGSetSNES (damgsnes.c:712)
> > ==2175==    by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952)
> > ==2175==    by 0x804C4FF: main (ex19.c:140)
> > ==2175==
> > [... process 2174 reports the same overrun, plus an analogous read 0 bytes after a block of size 720 malloc'd inside MPIR_Allgatherv (allgatherv.c:143) ...]
> >
> > lid velocity = 0.000287274, prandtl # = 1, grashof # = 1
> > Number of Newton iterations = 2
> >
> > ==2175== Conditional jump or move depends on uninitialised value(s)
> > ==2175==    at 0x80A0A4A: PetscOptionsFindPair_Private (options.c:989)
> > ==2175==    by 0x80A3E53: PetscOptionsGetString (options.c:1693)
> > ==2175==    by 0x80B0A16: PetscFinalize (pinit.c:829)
> > ==2175==    by 0x804CCA7: main (ex19.c:181)
> > ==2175==
> > ==2174== Conditional jump or move depends on uninitialised value(s)
> > ==2174==    at 0x43BE348: __strcmp_ssse3 (strcmp-ssse3.S:150)
> > ==2174==    by 0x87E2922: PMI_Finalize (simple_pmi.c:398)
> > ==2174==    by 0x87CB316: MPIDI_PG_Finalize (mpidi_pg.c:92)
> > ==2174==    by 0x87C728D: MPID_Finalize (mpid_finalize.c:141)
> > ==2174==    by 0x879FA8C: PMPI_Finalize (finalize.c:158)
> > ==2174==    by 0x80B2276: PetscFinalize (pinit.c:973)
> > ==2174==    by 0x804CCA7: main (ex19.c:181)
> > ==2174==
> > [... the __strcmp_ssse3 reports inside PMI_Finalize repeat at several offsets on both processes ...]
uninitialised value(s) > > ==2175== at 0x43BF5D4: __strcmp_ssse3 (strcmp-ssse3.S:1902) > > ==2175== by 0x87E2922: PMI_Finalize (simple_pmi.c:398) > > ==2175== by 0x87CB316: MPIDI_PG_Finalize (mpidi_pg.c:92) > > ==2175== by 0x87C728D: MPID_Finalize (mpid_finalize.c:141) > > ==2175== by 0x879FA8C: PMPI_Finalize (finalize.c:158) > > ==2175== by 0x80B2276: PetscFinalize (pinit.c:973) > > ==2175== by 0x804CCA7: main (ex19.c:181) > > ==2175== > > ==2174== > > ==2174== HEAP SUMMARY: > > ==2174== in use at exit: 160 bytes in 11 blocks > > ==2174== total heap usage: 60,266 allocs, 60,255 frees, 51,015,236 bytes allocated > > ==2174== > > ==2174== LEAK SUMMARY: > > ==2174== definitely lost: 40 bytes in 1 blocks > > ==2174== indirectly lost: 120 bytes in 10 blocks > > ==2174== possibly lost: 0 bytes in 0 blocks > > ==2174== still reachable: 0 bytes in 0 blocks > > ==2174== suppressed: 0 bytes in 0 blocks > > ==2174== Rerun with --leak-check=full to see details of leaked memory > > ==2174== > > ==2174== For counts of detected and suppressed errors, rerun with: -v > > ==2174== Use --track-origins=yes to see where uninitialised values come from > > ==2174== ERROR SUMMARY: 15690 errors from 164 contexts (suppressed: 0 from 0) > > ==2175== > > ==2175== HEAP SUMMARY: > > ==2175== in use at exit: 160 bytes in 11 blocks > > ==2175== total heap usage: 59,069 allocs, 59,058 frees, 49,630,900 bytes allocated > > ==2175== > > ==2175== LEAK SUMMARY: > > ==2175== definitely lost: 40 bytes in 1 blocks > > ==2175== indirectly lost: 120 bytes in 10 blocks > > ==2175== possibly lost: 0 bytes in 0 blocks > > ==2175== still reachable: 0 bytes in 0 blocks > > ==2175== suppressed: 0 bytes in 0 blocks > > ==2175== Rerun with --leak-check=full to see details of leaked memory > > ==2175== > > ==2175== For counts of detected and suppressed errors, rerun with: -v > > ==2175== Use --track-origins=yes to see where uninitialised values come from > > ==2175== ERROR SUMMARY: 15664 errors from 162 contexts (suppressed: 0 from 0) > > > > > > What is going on here? Shall I ignore those errors? > > > > Thanks a lot! > > > > Rebecca Xuefei YUAN > > Department of Applied Physics and Applied Mathematics > > Columbia University > > Tel:917-399-8032 > > www.columbia.edu/~xy2102 > > > > From lvankampenhout at gmail.com Tue Sep 7 05:03:34 2010 From: lvankampenhout at gmail.com (Leo van Kampenhout) Date: Tue, 7 Sep 2010 12:03:34 +0200 Subject: [petsc-users] [Fortran] checking "ierr" return codes Message-ID: >From the Petsc-manual: The user should check the return codes for all PETSc routines (and possibly user-defined routines as well) with ierr = PetscRoutine(...);CHKERRQ(PetscErrorCode ierr); Likewise, all memory allocations should be checked with ierr = PetscMalloc(n*sizeof(double),&ptr);CHKERRQ(ierr); If this procedure is followed throughout all of the user?s libraries and codes, any error will by default generate a clean traceback of the location of the error. Note that the macro __FUNCT__ is used to keep track of routine names during error tracebacks. Users need not worry about this macro in their application codes; however, users can take advantage of this feature if desired by setting this macro before each user-defined routine that may call SETERRQ(), CHKERRQ(). A simple example of usage is given below. #undef FUNCT #define FUNCT ?MyRoutine1? int MyRoutine1() { /* code here */ return 0; } My question is, what is the correct way do this in Fortran? 
I have seen the following variant in an example (ex35f90), but I couldn't get it to work myself. Also, it's tedious to copy this macro every time you start a new project.

! Error handler forces traceback of where error occurred
subroutine HE2(ierr,line)
use mex35f90
use mex35f90interfaces

call PetscError(ierr,line,0,'',ierr)
return
end
#define CHKR(n) if(n .ne. 0)then;call HE2(n,__LINE__);return;endif

Thanks in advance.

Leo van Kampenhout
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
From knepley at gmail.com Tue Sep 7 06:51:12 2010
From: knepley at gmail.com (Matthew Knepley)
Date: Tue, 7 Sep 2010 13:51:12 +0200
Subject: [petsc-users] [Fortran] checking "ierr" return codes
In-Reply-To: 
References: 
Message-ID: 

You can use CHKERRQ(ierr)

  Matt

On Tue, Sep 7, 2010 at 12:03 PM, Leo van Kampenhout < lvankampenhout at gmail.com> wrote:
> From the Petsc-manual: > > The user should check the return codes for all PETSc routines (and possibly > user-defined routines as well) > with > ierr = PetscRoutine(...);CHKERRQ(PetscErrorCode ierr); > Likewise, all memory allocations should be checked with > ierr = PetscMalloc(n*sizeof(double),&ptr);CHKERRQ(ierr); > If this procedure is followed throughout all of the user's libraries and > codes, any error will by default generate > a clean traceback of the location of the error. > Note that the macro __FUNCT__ is used to keep track of routine names > during error tracebacks. Users > need not worry about this macro in their application codes; however, users > can take advantage of this feature > if desired by setting this macro before each user-defined routine that may > call SETERRQ(), CHKERRQ(). > A simple example of usage is given below. > #undef FUNCT > #define FUNCT "MyRoutine1" > int MyRoutine1() { > /* code here */ > return 0; > } > > My question is, what is the correct way do this in Fortran? I have seen > the following variant in an example (ex35f90), but I couldn't get it to work > myself. Also, it's tedious to copy this macro every time you start a new > project. > > ! Error handler forces traceback of where error occurred > subroutine HE2(ierr,line) > use mex35f90 > use mex35f90interfaces > > call PetscError(ierr,line,0,'',ierr) > return > end > #define CHKR(n) if(n .ne. 0)then;call HE2(n,__LINE__);return;endif > > Thanks in advance. > > Leo van Kampenhout >
-- 
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
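A minimal sketch of this pattern (not from the original thread), assuming a preprocessed Fortran source file (.F or .F90, so the C preprocessor expands the CHKERRQ macro), the petsc-3.1 "finclude" header layout, and placeholder routine and variable names:

#include "finclude/petscsys.h"
#include "finclude/petscvec.h"
      subroutine MyRoutine1(n,ierr)
      implicit none
      PetscInt       n
      PetscErrorCode ierr
      Vec            x

!     Every PETSc call takes ierr as its final argument; following each
!     call with CHKERRQ(ierr) prints a traceback and returns early when
!     that call fails.
      call VecCreate(PETSC_COMM_WORLD,x,ierr)
      CHKERRQ(ierr)
      call VecSetSizes(x,PETSC_DECIDE,n,ierr)
      CHKERRQ(ierr)
      call VecDestroy(x,ierr)
      CHKERRQ(ierr)
      return
      end

Since CHKERRQ is meant to trigger a return from the enclosing routine on error, the sketch wraps the calls in a subroutine rather than in the main program; with this, a hand-rolled handler like the HE2/CHKR macro above should not be needed.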
From petsc-maint at mcs.anl.gov Tue Sep 7 06:54:53 2010
From: petsc-maint at mcs.anl.gov (Matthew Knepley)
Date: Tue, 7 Sep 2010 13:54:53 +0200
Subject: [petsc-users] [petsc-maint #52322] Regarding PETSC using for OOFEM parallel and SLEPc library installation
In-Reply-To: 
References: <000001cb4e6b$25fe6f90$71fb4eb0$@in>
Message-ID: 

The first problem is that we changed the structure slightly in the latest release. Instead of

  include ${PETSC_DIR}/conf/base

you use

  include ${PETSC_DIR}/conf/variables
  include ${PETSC_DIR}/conf/rules

  Thanks,

     Matt

On Tue, Sep 7, 2010 at 11:37 AM, shraddha wrote:
> Dear Sir/Madam, > > I have installed petsc-3.1-p4 > > > > petsc-3.1-p4]$ time ./configure 2>&1 | tee LOG-config-2 > > > > Compilers: > > C Compiler: mpicc -Wall -Wwrite-strings -Wno-strict-aliasing -g3 > > Fortran Compiler: mpif90 -Wall -Wno-unused-variable -g > > Linkers: > > Static linker: /usr/bin/ar cr > > MPI: > > Includes: -I/opt/mpich2/gnu/include > > X11: > > Includes: > > Library: -lX11 > > BLAS/LAPACK: -llapack -lblas > > PETSc: > > PETSC_ARCH: linux-gnu-c-debug > > PETSC_DIR: /home/shraddha/PETSC/petsc-31p4/petsc-3.1-p4 > > Clanguage: C > > Memory alignment: 16 > > Scalar type: real > > Precision: double > > shared libraries: disabled > > dynamic libraries: disabled > > > xxx========================================================================= > xxx > > Configure stage complete. Now build PETSc libraries with: > > make PETSC_DIR=/home/shraddha/PETSC/petsc-31p4/petsc-3.1-p4 > PETSC_ARCH=linux-gnu-c-debug all > > > xxx========================================================================= > xxx > > > > MAKE: > > > > make PETSC_DIR=/home/shraddha/PETSC/petsc-31p4/petsc-3.1-p4 > PETSC_ARCH=linux-gnu-c-debug all > > > > [shraddha at Dell petsc-3.1-p4]$ ls linux-gnu-c-debug/ > > bin conf include lib > > > > [shraddha at Dell petsc-3.1-p4]$ ls linux-gnu-c-debug/bin/ > > > > [shraddha at Dell petsc-3.1-p4]$ ls linux-gnu-c-debug/lib/ > > libpetsc.a > > [shraddha at Dell petsc-3.1-p4]$ ls linux-gnu-c-debug/include/ > > petscaodef.mod petscda.mod petsckspdef.mod petscmeshdef.mod > petscpcdef.mod petscsys.mod > > petscao.mod petscdef.mod petscksp.mod petscmesh.mod > petscpc.mod petsctsdef.mod > > petscconf.h petscfix.h petscmachineinfo.h petscmgdef.mod > petscsnesdef.mod petscts.mod > > petscconfiginfo.h petscisdef.mod petscmatdef.mod petscmg.mod > petscsnes.mod petscvecdef.mod > > petscdadef.mod petscis.mod petscmat.mod petsc.mod > petscsysdef.mod petscvec.mod > > > > I have set PETSC_ARCH =linux-gnu or linux-gnu-c-debug in bashrc file. > > > > Now I wanted to install OOFEM parallel module. > > > > #export OOFEM_DIR=/home/shraddha/ONAMA/mechanical/OOFEM/oofem-1.8 > > export OOFEM_DIR=/home/shraddha/ONAMA/mechanical/OOFEM/oofem-1.9 > > > > > > ./configure --prefix=/home/shraddha/ONAMA/mechanical/OOFEM/oofem-1.9 > --enable-poofem --with-MPIDIR=/opt/mpich2/gnu --enable-petsc > --with-PETSCDIR=/home/shraddha/PETSC/petsc-31p4/petsc-3.1-p4 > --with-PARMETISDIR=/home/shraddha/PARMETIS/gnu/ParMetis-3.1.1 > PETSC_ARCH=linux-gnu-c-debug CXX=mpicxx > > > > make > > > > makefile:10: /home/shraddha/PETSC/petsc-31p4/petsc-3.1-p4/conf/base: No such > file or directory > > makefile:63: generalbc.d: No such file or directory > > makefile:63: boundary.d: No such file or directory > > makefile:63: crosssection.d: No such file or directory > > makefile:63: dictionr.d: No such file or directory > > makefile:63: dof.d: No such file or directory > > > > I am not having petsc-3.1-p4/conf/base file. > > What should I do in this case???Please guide me.
> > > > Another kind of error I am getting when I am trying to install SLEPc > library: > > > > slepc-3.1-p2]$ ./configure --help > > SLEPc Configure Help > > > ---------------------------------------------------------------------------- > ---- > > --prefix= : Specifiy location to install SLEPc > (e.g., /usr/local) > > ARPACK: > > --with-arpack : Indicate if you wish to test for ARPACK > (PARPACK) > > --with-arpack-dir= : Indicate the directory for ARPACK > libraries > > --with-arpack-flags= : Indicate comma-separated flags for > linking ARPACK > > BLZPACK: > > --with-blzpack : Indicate if you wish to test for > BLZPACK > > --with-blzpack-dir= : Indicate the directory for BLZPACK > libraries > > --with-blzpack-flags= : Indicate comma-separated flags for > linking BLZPACK > > TRLAN: > > --with-trlan : Indicate if you wish to test for TRLAN > > --with-trlan-dir= : Indicate the directory for TRLAN > libraries > > --with-trlan-flags= : Indicate comma-separated flags for > linking TRLAN > > PRIMME: > > --with-primme : Indicate if you wish to test for PRIMME > > --with-primme-dir= : Indicate the directory for PRIMME > libraries > > --with-primme-flags= : Indicate comma-separated flags for > linking PRIMME > > slepc4py: > > --download-slepc4py : Download and install slepc4py in SLEPc > directory > > > > slepc-3.1-p2]$ ./configure > --prefix=/home/shraddha/ONAMA/mechanical/OOFEM/SLEPc/slepc-3.1-p2 > > > > Checking environment... > > ERROR: SLEPc cannot be configured for non-source installation if PETSc is > not configured in the same way. > > > > Please Guide me regarding this. > > > > Regards, > > Shraddha Desai > > > > > -- > This message has been scanned for viruses and > dangerous content by MailScanner, and is > believed to be clean. >
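A minimal sketch of a user makefile picking up the new pair of include files (PETSC_DIR and PETSC_ARCH are assumed to be set in the environment; the ex1 target and source names are only placeholders, and recipe lines must start with a tab):

# user makefile sketch for petsc-3.1; PETSC_DIR and PETSC_ARCH come from the environment
include ${PETSC_DIR}/conf/variables
include ${PETSC_DIR}/conf/rules

ex1: ex1.o chkopts
	-${CLINKER} -o ex1 ex1.o ${PETSC_LIB}
	${RM} ex1.o

OOFEM's own makefiles still refer to ${PETSC_DIR}/conf/base (hence the "No such file or directory" error quoted above), so they would presumably need the analogous update or a PETSc release that still ships conf/base.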
-- 
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
From jroman at dsic.upv.es Tue Sep 7 07:45:06 2010
From: jroman at dsic.upv.es (Jose E. Roman)
Date: Tue, 7 Sep 2010 14:45:06 +0200
Subject: [petsc-users] [petsc-maint #52322] Regarding PETSC using for OOFEM parallel and SLEPc library installation
In-Reply-To: 
References: <000001cb4e6b$25fe6f90$71fb4eb0$@in>
Message-ID: <06BF962E-5AB0-4DA0-8B37-EED56CF56810@dsic.upv.es>

> slepc-3.1-p2]$ ./configure --prefix=/home/shraddha/ONAMA/mechanical/OOFEM/SLEPc/slepc-3.1-p2 > > > Checking environment... > > ERROR: SLEPc cannot be configured for non-source installation if PETSc is not configured in the same way. > > > Please Guide me regarding this. >

If you want to use --prefix in SLEPc then you also have to use --prefix in PETSc (not necessarily in the same directory). See section 1.2.4 of the SLEPc Users Manual.

Jose

From xy2102 at columbia.edu Tue Sep 7 10:44:45 2010
From: xy2102 at columbia.edu (Rebecca Xuefei Yuan)
Date: Tue, 07 Sep 2010 11:44:45 -0400
Subject: [petsc-users] valgrind error comes out when upgrade from ubuntu 8.04 LTS to 10.04
In-Reply-To: 
References: <20100906203101.9i11ikcdkos4gss4@cubmail.cc.columbia.edu> <196CED0F-3AD9-487C-AC88-FD94624407ED@mcs.anl.gov> <20100906214634.fvvfvn63sos8gsgs@cubmail.cc.columbia.edu>
Message-ID: <20100907114445.yr96rmlf4s4kscog@cubmail.cc.columbia.edu>

Dear Satish,

I reinstalled PETSc by "rm -rf petsc-3.1-p4" and downloaded "petsc-lite-3.1-p4.tar.gz" to start over. The commands are

./config/configure.py --with-cc=gcc --with-fc=gfortran --download-f-blas-lapack=1 --download-mpich=1
make PETSC_DIR=/home/rebecca/soft/petsc-3.1-p4 PETSC_ARCH=linux-gnu-c-debug all
make PETSC_DIR=/home/rebecca/soft/petsc-3.1-p4 PETSC_ARCH=linux-gnu-c-debug test

When I check the valgrind version

rebecca at YuanWork:~/soft$ valgrind --version
valgrind-3.5.0

When I check at http://valgrind.org/ the current release is 3.5.0. How could you get 3.6?

The errors still come out from the valgrind check with

~/soft/petsc-3.1-p4/externalpackages/mpich2-1.0.8/bin/mpiexec -np 2 valgrind --tool=memcheck ./ex19.exe -malloc off -da_grid_x 30 -da_grid_y 30

I attached the error here:

--------------------------------------------------------------

rebecca at YuanWork:~/linux/code/twoway/twoway_brandnew/trunk/set_a$ ~/soft/petsc-3.1-p4/externalpackages/mpich2-1.0.8/bin/mpiexec -np 2 valgrind --tool=memcheck ./ex19.exe -malloc off -da_grid_x 30 -da_grid_y 30
==1368== Memcheck, a memory error detector
==1368== Copyright (C) 2002-2009, and GNU GPL'd, by Julian Seward et al.
==1368== Using Valgrind-3.5.0 and LibVEX; rerun with -h for copyright info
==1368== Command: ./ex19.exe -malloc off -da_grid_x 30 -da_grid_y 30
==1368==
==1369== Memcheck, a memory error detector
==1369== Copyright (C) 2002-2009, and GNU GPL'd, by Julian Seward et al.
==1369== Using Valgrind-3.5.0 and LibVEX; rerun with -h for copyright info ==1369== Command: ./ex19.exe -malloc off -da_grid_x 30 -da_grid_y 30 ==1369== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x400B217: _dl_relocate_object (do-rel.h:104) ==1368== by 0x40031D0: dl_main (rtld.c:2229) ==1368== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) ==1368== by 0x4000C6C: _dl_start (rtld.c:333) ==1368== by 0x4000856: ??? (in /lib/ld-2.11.1.so) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x400AF0F: _dl_relocate_object (do-rel.h:117) ==1368== by 0x40031D0: dl_main (rtld.c:2229) ==1368== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) ==1368== by 0x4000C6C: _dl_start (rtld.c:333) ==1368== by 0x4000856: ??? (in /lib/ld-2.11.1.so) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x400B27A: _dl_relocate_object (do-rel.h:127) ==1368== by 0x40031D0: dl_main (rtld.c:2229) ==1368== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) ==1368== by 0x4000C6C: _dl_start (rtld.c:333) ==1368== by 0x4000856: ??? (in /lib/ld-2.11.1.so) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x400A5DE: _dl_relocate_object (do-rel.h:65) ==1368== by 0x40030FE: dl_main (rtld.c:2292) ==1368== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) ==1368== by 0x4000C6C: _dl_start (rtld.c:333) ==1368== by 0x4000856: ??? (in /lib/ld-2.11.1.so) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x400A5E6: _dl_relocate_object (do-rel.h:68) ==1368== by 0x40030FE: dl_main (rtld.c:2292) ==1368== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) ==1368== by 0x4000C6C: _dl_start (rtld.c:333) ==1368== by 0x4000856: ??? (in /lib/ld-2.11.1.so) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x400AF0F: _dl_relocate_object (do-rel.h:117) ==1368== by 0x40030FE: dl_main (rtld.c:2292) ==1368== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) ==1368== by 0x4000C6C: _dl_start (rtld.c:333) ==1368== by 0x4000856: ??? 
(in /lib/ld-2.11.1.so) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x4319794: __strlen_sse2 (strlen.S:93) ==1368== by 0x87E2BD2: T.206 (simple_pmi.c:985) ==1368== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) ==1368== by 0x87C742A: MPID_Init (mpid_init.c:331) ==1368== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==1368== by 0x879F15A: PMPI_Init (init.c:106) ==1368== by 0x80AEE12: PetscInitialize (pinit.c:561) ==1368== by 0x804BA0C: main (ex19.c:96) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x43197CB: __strlen_sse2 (strlen.S:116) ==1368== by 0x87E2BD2: T.206 (simple_pmi.c:985) ==1368== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) ==1368== by 0x87C742A: MPID_Init (mpid_init.c:331) ==1368== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==1368== by 0x879F15A: PMPI_Init (init.c:106) ==1368== by 0x80AEE12: PetscInitialize (pinit.c:561) ==1368== by 0x804BA0C: main (ex19.c:96) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x87F559E: PMIU_writeline (simple_pmiutil.c:180) ==1368== by 0x87E2BD2: T.206 (simple_pmi.c:985) ==1368== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) ==1368== by 0x87C742A: MPID_Init (mpid_init.c:331) ==1368== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==1368== by 0x879F15A: PMPI_Init (init.c:106) ==1368== by 0x80AEE12: PetscInitialize (pinit.c:561) ==1368== by 0x804BA0C: main (ex19.c:96) ==1368== ==1368== Use of uninitialised value of size 4 ==1368== at 0x87F55A4: PMIU_writeline (simple_pmiutil.c:184) ==1368== by 0x87E2BD2: T.206 (simple_pmi.c:985) ==1368== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) ==1368== by 0x87C742A: MPID_Init (mpid_init.c:331) ==1368== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==1368== by 0x879F15A: PMPI_Init (init.c:106) ==1368== by 0x80AEE12: PetscInitialize (pinit.c:561) ==1368== by 0x804BA0C: main (ex19.c:96) ==1368== ==1368== Syscall param write(count) contains uninitialised byte(s) ==1368== at 0x41B1EB3: __write_nocancel (syscall-template.S:82) ==1368== by 0x87E2BD2: T.206 (simple_pmi.c:985) ==1368== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) ==1368== by 0x87C742A: MPID_Init (mpid_init.c:331) ==1368== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==1368== by 0x879F15A: PMPI_Init (init.c:106) ==1368== by 0x80AEE12: PetscInitialize (pinit.c:561) ==1368== by 0x804BA0C: main (ex19.c:96) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x87F566C: PMIU_writeline (simple_pmiutil.c:197) ==1368== by 0x87E2BD2: T.206 (simple_pmi.c:985) ==1368== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) ==1368== by 0x87C742A: MPID_Init (mpid_init.c:331) ==1368== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==1368== by 0x879F15A: PMPI_Init (init.c:106) ==1368== by 0x80AEE12: PetscInitialize (pinit.c:561) ==1368== by 0x804BA0C: main (ex19.c:96) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x43BE348: __strcmp_ssse3 (strcmp-ssse3.S:150) ==1368== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) ==1368== by 0x87C74C7: MPID_Init (mpid_init.c:381) ==1368== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==1368== by 0x879F15A: PMPI_Init (init.c:106) ==1368== by 0x80AEE12: PetscInitialize (pinit.c:561) ==1368== by 0x804BA0C: main (ex19.c:96) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x43BF5B7: __strcmp_ssse3 (strcmp-ssse3.S:1887) ==1368== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) ==1368== 
by 0x87C74C7: MPID_Init (mpid_init.c:381) ==1368== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==1368== by 0x879F15A: PMPI_Init (init.c:106) ==1368== by 0x80AEE12: PetscInitialize (pinit.c:561) ==1368== by 0x804BA0C: main (ex19.c:96) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x43BF5C0: __strcmp_ssse3 (strcmp-ssse3.S:1890) ==1368== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) ==1368== by 0x87C74C7: MPID_Init (mpid_init.c:381) ==1368== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==1368== by 0x879F15A: PMPI_Init (init.c:106) ==1368== by 0x80AEE12: PetscInitialize (pinit.c:561) ==1368== by 0x804BA0C: main (ex19.c:96) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x43BF5C5: __strcmp_ssse3 (strcmp-ssse3.S:1893) ==1368== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) ==1368== by 0x87C74C7: MPID_Init (mpid_init.c:381) ==1368== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==1368== by 0x879F15A: PMPI_Init (init.c:106) ==1368== by 0x80AEE12: PetscInitialize (pinit.c:561) ==1368== by 0x804BA0C: main (ex19.c:96) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x43BF5CA: __strcmp_ssse3 (strcmp-ssse3.S:1896) ==1368== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) ==1368== by 0x87C74C7: MPID_Init (mpid_init.c:381) ==1368== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==1368== by 0x879F15A: PMPI_Init (init.c:106) ==1368== by 0x80AEE12: PetscInitialize (pinit.c:561) ==1368== by 0x804BA0C: main (ex19.c:96) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x43BE348: __strcmp_ssse3 (strcmp-ssse3.S:150) ==1368== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) ==1368== by 0x87CABAB: MPIDI_PG_InitConnKVS (mpidi_pg.c:753) ==1368== by 0x87C7526: MPID_Init (mpid_init.c:417) ==1368== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==1368== by 0x879F15A: PMPI_Init (init.c:106) ==1368== by 0x80AEE12: PetscInitialize (pinit.c:561) ==1368== by 0x804BA0C: main (ex19.c:96) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x43BF5B7: __strcmp_ssse3 (strcmp-ssse3.S:1887) ==1368== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) ==1368== by 0x87CABAB: MPIDI_PG_InitConnKVS (mpidi_pg.c:753) ==1368== by 0x87C7526: MPID_Init (mpid_init.c:417) ==1368== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==1368== by 0x879F15A: PMPI_Init (init.c:106) ==1368== by 0x80AEE12: PetscInitialize (pinit.c:561) ==1368== by 0x804BA0C: main (ex19.c:96) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x43BF5C0: __strcmp_ssse3 (strcmp-ssse3.S:1890) ==1368== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) ==1368== by 0x87CABAB: MPIDI_PG_InitConnKVS (mpidi_pg.c:753) ==1368== by 0x87C7526: MPID_Init (mpid_init.c:417) ==1368== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==1368== by 0x879F15A: PMPI_Init (init.c:106) ==1368== by 0x80AEE12: PetscInitialize (pinit.c:561) ==1368== by 0x804BA0C: main (ex19.c:96) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x43BF5C5: __strcmp_ssse3 (strcmp-ssse3.S:1893) ==1368== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) ==1368== by 0x87CABAB: MPIDI_PG_InitConnKVS (mpidi_pg.c:753) ==1368== by 0x87C7526: MPID_Init (mpid_init.c:417) ==1368== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==1368== by 0x879F15A: PMPI_Init (init.c:106) ==1368== by 0x80AEE12: 
PetscInitialize (pinit.c:561) ==1368== by 0x804BA0C: main (ex19.c:96) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x43BF5CA: __strcmp_ssse3 (strcmp-ssse3.S:1896) ==1368== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) ==1368== by 0x87CABAB: MPIDI_PG_InitConnKVS (mpidi_pg.c:753) ==1368== by 0x87C7526: MPID_Init (mpid_init.c:417) ==1368== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==1368== by 0x879F15A: PMPI_Init (init.c:106) ==1368== by 0x80AEE12: PetscInitialize (pinit.c:561) ==1368== by 0x804BA0C: main (ex19.c:96) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x4319863: __GI_strlen (strlen.S:138) ==1368== by 0x438AD43: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:191) ==1368== by 0x438A685: gethostbyname (getXXbyYY.c:117) ==1368== by 0x87CC54F: MPIDU_CH3U_GetSockInterfaceAddr (ch3u_getinterfaces.c:130) ==1368== by 0x87CBC56: MPIDI_CH3U_Get_business_card_sock (ch3u_connect_sock.c:428) ==1368== by 0x87F413B: MPIDI_CH3U_Init_sock (ch3u_init_sock.c:79) ==1368== by 0x87EE511: MPIDI_CH3_Init (ch3_init.c:43) ==1368== by 0x87C7554: MPID_Init (mpid_init.c:92) ==1368== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==1368== by 0x879F15A: PMPI_Init (init.c:106) ==1368== by 0x80AEE12: PetscInitialize (pinit.c:561) ==1368== by 0x804BA0C: main (ex19.c:96) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x431986D: __GI_strlen (strlen.S:144) ==1368== by 0x438AD43: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:191) ==1368== by 0x438A685: gethostbyname (getXXbyYY.c:117) ==1368== by 0x87CC54F: MPIDU_CH3U_GetSockInterfaceAddr (ch3u_getinterfaces.c:130) ==1368== by 0x87CBC56: MPIDI_CH3U_Get_business_card_sock (ch3u_connect_sock.c:428) ==1368== by 0x87F413B: MPIDI_CH3U_Init_sock (ch3u_init_sock.c:79) ==1368== by 0x87EE511: MPIDI_CH3_Init (ch3_init.c:43) ==1368== by 0x87C7554: MPID_Init (mpid_init.c:92) ==1368== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==1368== by 0x879F15A: PMPI_Init (init.c:106) ==1368== by 0x80AEE12: PetscInitialize (pinit.c:561) ==1368== by 0x804BA0C: main (ex19.c:96) ==1368== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x400B217: _dl_relocate_object (do-rel.h:104) ==1369== by 0x40031D0: dl_main (rtld.c:2229) ==1369== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) ==1369== by 0x4000C6C: _dl_start (rtld.c:333) ==1369== by 0x4000856: ??? (in /lib/ld-2.11.1.so) ==1369== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x400AF0F: _dl_relocate_object (do-rel.h:117) ==1369== by 0x40031D0: dl_main (rtld.c:2229) ==1369== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) ==1369== by 0x4000C6C: _dl_start (rtld.c:333) ==1369== by 0x4000856: ??? 
(in /lib/ld-2.11.1.so) ==1369== ==1368== Invalid read of size 4 ==1368== at 0x431983B: __GI_strlen (strlen.S:115) ==1368== by 0x43843CE: __nss_lookup (nsswitch.c:191) ==1368== by 0x438596E: __nss_hosts_lookup2 (XXX-lookup.c:76) ==1368== by 0x438AF0F: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:200) ==1368== by 0x438A685: gethostbyname (getXXbyYY.c:117) ==1368== by 0x87CC54F: MPIDU_CH3U_GetSockInterfaceAddr (ch3u_getinterfaces.c:130) ==1368== by 0x87CBC56: MPIDI_CH3U_Get_business_card_sock (ch3u_connect_sock.c:428) ==1368== by 0x87F413B: MPIDI_CH3U_Init_sock (ch3u_init_sock.c:79) ==1368== by 0x87EE511: MPIDI_CH3_Init (ch3_init.c:43) ==1368== by 0x87C7554: MPID_Init (mpid_init.c:92) ==1368== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==1368== by 0x879F15A: PMPI_Init (init.c:106) ==1368== Address 0x44308f8 is 40 bytes inside a block of size 42 alloc'd ==1368== at 0x4023BF3: malloc (vg_replace_malloc.c:195) ==1368== by 0x4384583: nss_parse_service_list (nsswitch.c:622) ==1368== by 0x4384E71: __nss_database_lookup (nsswitch.c:775) ==1368== by 0x43859AD: __nss_hosts_lookup2 (XXX-lookup.c:71) ==1368== by 0x438AF0F: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:200) ==1368== by 0x438A685: gethostbyname (getXXbyYY.c:117) ==1368== by 0x87CC54F: MPIDU_CH3U_GetSockInterfaceAddr (ch3u_getinterfaces.c:130) ==1368== by 0x87CBC56: MPIDI_CH3U_Get_business_card_sock (ch3u_connect_sock.c:428) ==1368== by 0x87F413B: MPIDI_CH3U_Init_sock (ch3u_init_sock.c:79) ==1368== by 0x87EE511: MPIDI_CH3_Init (ch3_init.c:43) ==1368== by 0x87C7554: MPID_Init (mpid_init.c:92) ==1368== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==1368== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x400B27A: _dl_relocate_object (do-rel.h:127) ==1369== by 0x40031D0: dl_main (rtld.c:2229) ==1369== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) ==1369== by 0x4000C6C: _dl_start (rtld.c:333) ==1369== by 0x4000856: ??? 
(in /lib/ld-2.11.1.so) ==1369== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x400B217: _dl_relocate_object (do-rel.h:104) ==1368== by 0x4011D15: dl_open_worker (dl-open.c:367) ==1368== by 0x400D875: _dl_catch_error (dl-error.c:178) ==1368== by 0x4011675: _dl_open (dl-open.c:583) ==1368== by 0x43AA4A1: do_dlopen (dl-libc.c:86) ==1368== by 0x400D875: _dl_catch_error (dl-error.c:178) ==1368== by 0x43AA5A0: dlerror_run (dl-libc.c:47) ==1368== by 0x43AA6BA: __libc_dlopen_mode (dl-libc.c:160) ==1368== by 0x43842E4: __nss_lookup_function (nsswitch.c:405) ==1368== by 0x43843CE: __nss_lookup (nsswitch.c:191) ==1368== by 0x438596E: __nss_hosts_lookup2 (XXX-lookup.c:76) ==1368== by 0x438AF0F: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:200) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x400AF0F: _dl_relocate_object (do-rel.h:117) ==1368== by 0x4011D15: dl_open_worker (dl-open.c:367) ==1368== by 0x400D875: _dl_catch_error (dl-error.c:178) ==1368== by 0x4011675: _dl_open (dl-open.c:583) ==1368== by 0x43AA4A1: do_dlopen (dl-libc.c:86) ==1368== by 0x400D875: _dl_catch_error (dl-error.c:178) ==1368== by 0x43AA5A0: dlerror_run (dl-libc.c:47) ==1368== by 0x43AA6BA: __libc_dlopen_mode (dl-libc.c:160) ==1368== by 0x43842E4: __nss_lookup_function (nsswitch.c:405) ==1368== by 0x43843CE: __nss_lookup (nsswitch.c:191) ==1368== by 0x438596E: __nss_hosts_lookup2 (XXX-lookup.c:76) ==1368== by 0x438AF0F: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:200) ==1368== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x400A5DE: _dl_relocate_object (do-rel.h:65) ==1369== by 0x40030FE: dl_main (rtld.c:2292) ==1369== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) ==1369== by 0x4000C6C: _dl_start (rtld.c:333) ==1369== by 0x4000856: ??? (in /lib/ld-2.11.1.so) ==1369== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x400A5E6: _dl_relocate_object (do-rel.h:68) ==1369== by 0x40030FE: dl_main (rtld.c:2292) ==1369== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) ==1369== by 0x4000C6C: _dl_start (rtld.c:333) ==1369== by 0x4000856: ??? (in /lib/ld-2.11.1.so) ==1369== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x400AF0F: _dl_relocate_object (do-rel.h:117) ==1369== by 0x40030FE: dl_main (rtld.c:2292) ==1369== by 0x4014206: _dl_sysdep_start (dl-sysdep.c:243) ==1369== by 0x4000C6C: _dl_start (rtld.c:333) ==1369== by 0x4000856: ??? 
(in /lib/ld-2.11.1.so) ==1369== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x4319794: __strlen_sse2 (strlen.S:93) ==1368== by 0x87E19C7: GetResponse (simple_pmi.c:1049) ==1368== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) ==1368== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) ==1368== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) ==1368== by 0x87C7554: MPID_Init (mpid_init.c:92) ==1368== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==1368== by 0x879F15A: PMPI_Init (init.c:106) ==1368== by 0x80AEE12: PetscInitialize (pinit.c:561) ==1368== by 0x804BA0C: main (ex19.c:96) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x43197CB: __strlen_sse2 (strlen.S:116) ==1368== by 0x87E19C7: GetResponse (simple_pmi.c:1049) ==1368== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) ==1368== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) ==1368== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) ==1368== by 0x87C7554: MPID_Init (mpid_init.c:92) ==1368== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==1368== by 0x879F15A: PMPI_Init (init.c:106) ==1368== by 0x80AEE12: PetscInitialize (pinit.c:561) ==1368== by 0x804BA0C: main (ex19.c:96) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x87F559E: PMIU_writeline (simple_pmiutil.c:180) ==1368== by 0x87E19C7: GetResponse (simple_pmi.c:1049) ==1368== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) ==1368== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) ==1368== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) ==1368== by 0x87C7554: MPID_Init (mpid_init.c:92) ==1368== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==1368== by 0x879F15A: PMPI_Init (init.c:106) ==1368== by 0x80AEE12: PetscInitialize (pinit.c:561) ==1368== by 0x804BA0C: main (ex19.c:96) ==1368== ==1368== Use of uninitialised value of size 4 ==1368== at 0x87F55A4: PMIU_writeline (simple_pmiutil.c:184) ==1368== by 0x87E19C7: GetResponse (simple_pmi.c:1049) ==1368== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) ==1368== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) ==1368== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) ==1368== by 0x87C7554: MPID_Init (mpid_init.c:92) ==1368== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==1368== by 0x879F15A: PMPI_Init (init.c:106) ==1368== by 0x80AEE12: PetscInitialize (pinit.c:561) ==1368== by 0x804BA0C: main (ex19.c:96) ==1368== ==1368== Syscall param write(count) contains uninitialised byte(s) ==1368== at 0x41B1EB3: __write_nocancel (syscall-template.S:82) ==1368== by 0x87E19C7: GetResponse (simple_pmi.c:1049) ==1368== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) ==1368== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) ==1368== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) ==1368== by 0x87C7554: MPID_Init (mpid_init.c:92) ==1368== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==1368== by 0x879F15A: PMPI_Init (init.c:106) ==1368== by 0x80AEE12: PetscInitialize (pinit.c:561) ==1368== by 0x804BA0C: main (ex19.c:96) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x87F566C: PMIU_writeline (simple_pmiutil.c:197) ==1368== by 0x87E19C7: GetResponse (simple_pmi.c:1049) ==1368== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) ==1368== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) ==1368== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) ==1368== by 0x87C7554: MPID_Init (mpid_init.c:92) ==1368== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==1368== by 0x879F15A: PMPI_Init (init.c:106) 
==1368== by 0x80AEE12: PetscInitialize (pinit.c:561) ==1368== by 0x804BA0C: main (ex19.c:96) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x43BE348: __strcmp_ssse3 (strcmp-ssse3.S:150) ==1368== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) ==1368== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) ==1368== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) ==1368== by 0x87C7554: MPID_Init (mpid_init.c:92) ==1368== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==1368== by 0x879F15A: PMPI_Init (init.c:106) ==1368== by 0x80AEE12: PetscInitialize (pinit.c:561) ==1368== by 0x804BA0C: main (ex19.c:96) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x43BF5B7: __strcmp_ssse3 (strcmp-ssse3.S:1887) ==1368== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) ==1368== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) ==1368== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) ==1368== by 0x87C7554: MPID_Init (mpid_init.c:92) ==1368== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==1368== by 0x879F15A: PMPI_Init (init.c:106) ==1368== by 0x80AEE12: PetscInitialize (pinit.c:561) ==1368== by 0x804BA0C: main (ex19.c:96) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x43BF5C0: __strcmp_ssse3 (strcmp-ssse3.S:1890) ==1368== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) ==1368== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) ==1368== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) ==1368== by 0x87C7554: MPID_Init (mpid_init.c:92) ==1368== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==1368== by 0x879F15A: PMPI_Init (init.c:106) ==1368== by 0x80AEE12: PetscInitialize (pinit.c:561) ==1368== by 0x804BA0C: main (ex19.c:96) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x43BF5C5: __strcmp_ssse3 (strcmp-ssse3.S:1893) ==1368== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) ==1368== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) ==1368== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) ==1368== by 0x87C7554: MPID_Init (mpid_init.c:92) ==1368== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==1368== by 0x879F15A: PMPI_Init (init.c:106) ==1368== by 0x80AEE12: PetscInitialize (pinit.c:561) ==1368== by 0x804BA0C: main (ex19.c:96) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x43BF5CA: __strcmp_ssse3 (strcmp-ssse3.S:1896) ==1368== by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543) ==1368== by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568) ==1368== by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48) ==1368== by 0x87C7554: MPID_Init (mpid_init.c:92) ==1368== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==1368== by 0x879F15A: PMPI_Init (init.c:106) ==1368== by 0x80AEE12: PetscInitialize (pinit.c:561) ==1368== by 0x804BA0C: main (ex19.c:96) ==1368== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x4319794: __strlen_sse2 (strlen.S:93) ==1369== by 0x87E2BD2: T.206 (simple_pmi.c:985) ==1369== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) ==1369== by 0x87C742A: MPID_Init (mpid_init.c:331) ==1369== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==1369== by 0x879F15A: PMPI_Init (init.c:106) ==1369== by 0x80AEE12: PetscInitialize (pinit.c:561) ==1369== by 0x804BA0C: main (ex19.c:96) ==1369== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x43197CB: __strlen_sse2 (strlen.S:116) ==1369== by 0x87E2BD2: T.206 (simple_pmi.c:985) ==1369== by 0x87E2FF4: PMI_Init 
(simple_pmi.c:206) ==1369== by 0x87C742A: MPID_Init (mpid_init.c:331) ==1369== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==1369== by 0x879F15A: PMPI_Init (init.c:106) ==1369== by 0x80AEE12: PetscInitialize (pinit.c:561) ==1369== by 0x804BA0C: main (ex19.c:96) ==1369== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x87F559E: PMIU_writeline (simple_pmiutil.c:180) ==1369== by 0x87E2BD2: T.206 (simple_pmi.c:985) ==1369== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) ==1369== by 0x87C742A: MPID_Init (mpid_init.c:331) ==1369== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==1369== by 0x879F15A: PMPI_Init (init.c:106) ==1369== by 0x80AEE12: PetscInitialize (pinit.c:561) ==1369== by 0x804BA0C: main (ex19.c:96) ==1369== ==1369== Use of uninitialised value of size 4 ==1369== at 0x87F55A4: PMIU_writeline (simple_pmiutil.c:184) ==1369== by 0x87E2BD2: T.206 (simple_pmi.c:985) ==1369== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) ==1369== by 0x87C742A: MPID_Init (mpid_init.c:331) ==1369== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==1369== by 0x879F15A: PMPI_Init (init.c:106) ==1369== by 0x80AEE12: PetscInitialize (pinit.c:561) ==1369== by 0x804BA0C: main (ex19.c:96) ==1369== ==1369== Syscall param write(count) contains uninitialised byte(s) ==1369== at 0x41B1EB3: __write_nocancel (syscall-template.S:82) ==1369== by 0x87E2BD2: T.206 (simple_pmi.c:985) ==1369== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) ==1369== by 0x87C742A: MPID_Init (mpid_init.c:331) ==1369== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==1369== by 0x879F15A: PMPI_Init (init.c:106) ==1369== by 0x80AEE12: PetscInitialize (pinit.c:561) ==1369== by 0x804BA0C: main (ex19.c:96) ==1369== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x87F566C: PMIU_writeline (simple_pmiutil.c:197) ==1369== by 0x87E2BD2: T.206 (simple_pmi.c:985) ==1369== by 0x87E2FF4: PMI_Init (simple_pmi.c:206) ==1369== by 0x87C742A: MPID_Init (mpid_init.c:331) ==1369== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==1369== by 0x879F15A: PMPI_Init (init.c:106) ==1369== by 0x80AEE12: PetscInitialize (pinit.c:561) ==1369== by 0x804BA0C: main (ex19.c:96) ==1369== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x43BE348: __strcmp_ssse3 (strcmp-ssse3.S:150) ==1369== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) ==1369== by 0x87C74C7: MPID_Init (mpid_init.c:381) ==1369== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==1369== by 0x879F15A: PMPI_Init (init.c:106) ==1369== by 0x80AEE12: PetscInitialize (pinit.c:561) ==1369== by 0x804BA0C: main (ex19.c:96) ==1369== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x43BF5B7: __strcmp_ssse3 (strcmp-ssse3.S:1887) ==1369== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) ==1369== by 0x87C74C7: MPID_Init (mpid_init.c:381) ==1369== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==1369== by 0x879F15A: PMPI_Init (init.c:106) ==1369== by 0x80AEE12: PetscInitialize (pinit.c:561) ==1369== by 0x804BA0C: main (ex19.c:96) ==1369== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x43BF5C0: __strcmp_ssse3 (strcmp-ssse3.S:1890) ==1369== by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435) ==1369== by 0x87C74C7: MPID_Init (mpid_init.c:381) ==1369== by 0x879F65C: MPIR_Init_thread (initthread.c:288) ==1369== by 0x879F15A: PMPI_Init (init.c:106) ==1369== by 0x80AEE12: PetscInitialize (pinit.c:561) ==1369== by 0x804BA0C: main (ex19.c:96) ==1369== ==1369== 
Conditional jump or move depends on uninitialised value(s)
==1369==    at 0x43BF5C5: __strcmp_ssse3 (strcmp-ssse3.S:1893)
==1369==    by 0x87E2678: PMI_KVS_Get_my_name (simple_pmi.c:435)
==1369==    by 0x87C74C7: MPID_Init (mpid_init.c:381)
==1369==    by 0x879F65C: MPIR_Init_thread (initthread.c:288)
==1369==    by 0x879F15A: PMPI_Init (init.c:106)
==1369==    by 0x80AEE12: PetscInitialize (pinit.c:561)
==1369==    by 0x804BA0C: main (ex19.c:96)
==1369==
(The same report follows for strcmp-ssse3.S:1896, and again for strcmp-ssse3.S:150/1887/1890/1893/1896 with PMI_KVS_Get_my_name reached through MPIDI_PG_InitConnKVS (mpidi_pg.c:753) and MPID_Init (mpid_init.c:417).)
==1369== Conditional jump or move depends on uninitialised value(s)
==1369==    at 0x4319863: __GI_strlen (strlen.S:138)
==1369==    by 0x438AD43: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:191)
==1369==    by 0x438A685: gethostbyname (getXXbyYY.c:117)
==1369==    by 0x87CC54F: MPIDU_CH3U_GetSockInterfaceAddr (ch3u_getinterfaces.c:130)
==1369==    by 0x87CBC56: MPIDI_CH3U_Get_business_card_sock (ch3u_connect_sock.c:428)
==1369==    by 0x87F413B: MPIDI_CH3U_Init_sock (ch3u_init_sock.c:79)
==1369==    by 0x87EE511: MPIDI_CH3_Init (ch3_init.c:43)
==1369==    by 0x87C7554: MPID_Init (mpid_init.c:92)
==1369==    by 0x879F65C: MPIR_Init_thread (initthread.c:288)
==1369==    by 0x879F15A: PMPI_Init (init.c:106)
==1369==    by 0x80AEE12: PetscInitialize (pinit.c:561)
==1369==    by 0x804BA0C: main (ex19.c:96)
==1369==
(Repeated for strlen.S:144.)
==1369== Invalid read of size 4
==1369==    at 0x431983B: __GI_strlen (strlen.S:115)
==1369==    by 0x43843CE: __nss_lookup (nsswitch.c:191)
==1369==    by 0x438596E: __nss_hosts_lookup2 (XXX-lookup.c:76)
==1369==    by 0x438AF0F: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:200)
==1369==    by 0x438A685: gethostbyname (getXXbyYY.c:117)
==1369==    by 0x87CC54F: MPIDU_CH3U_GetSockInterfaceAddr (ch3u_getinterfaces.c:130)
(remaining frames as in the previous report, down to PMPI_Init)
==1369==  Address 0x44308f8 is 40 bytes inside a block of size 42 alloc'd
==1369==    at 0x4023BF3: malloc (vg_replace_malloc.c:195)
==1369==    by 0x4384583: nss_parse_service_list (nsswitch.c:622)
==1369==    by 0x4384E71: __nss_database_lookup (nsswitch.c:775)
==1369==    by 0x43859AD: __nss_hosts_lookup2 (XXX-lookup.c:71)
==1369==    by 0x438AF0F: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:200)
(remaining frames as above, through MPIR_Init_thread)
==1369==
==1369== Conditional jump or move depends on uninitialised value(s)
==1369==    at 0x400B217: _dl_relocate_object (do-rel.h:104)
==1369==    by 0x4011D15: dl_open_worker (dl-open.c:367)
==1369==    by 0x400D875: _dl_catch_error (dl-error.c:178)
==1369==    by 0x4011675: _dl_open (dl-open.c:583)
==1369==    by 0x43AA4A1: do_dlopen (dl-libc.c:86)
==1369==    by 0x400D875: _dl_catch_error (dl-error.c:178)
==1369==    by 0x43AA5A0: dlerror_run (dl-libc.c:47)
==1369==    by 0x43AA6BA: __libc_dlopen_mode (dl-libc.c:160)
==1369==    by 0x43842E4: __nss_lookup_function (nsswitch.c:405)
==1369==    by 0x43843CE: __nss_lookup (nsswitch.c:191)
==1369==    by 0x438596E: __nss_hosts_lookup2 (XXX-lookup.c:76)
==1369==    by 0x438AF0F: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:200)
==1369==
(Repeated for do-rel.h:117.)
==1369== Conditional jump or move depends on uninitialised value(s)
==1369==    at 0x4319794: __strlen_sse2 (strlen.S:93)
==1369==    by 0x87E19C7: GetResponse (simple_pmi.c:1049)
==1369==    by 0x87E2846: PMI_KVS_Put (simple_pmi.c:543)
==1369==    by 0x87CAF27: MPIDI_PG_SetConnInfo (mpidi_pg.c:568)
==1369==    by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48)
==1369==    by 0x87C7554: MPID_Init (mpid_init.c:92)
==1369==    by 0x879F65C: MPIR_Init_thread (initthread.c:288)
==1369==    by 0x879F15A: PMPI_Init (init.c:106)
==1369==    by 0x80AEE12: PetscInitialize (pinit.c:561)
==1369==    by 0x804BA0C: main (ex19.c:96)
==1369==
(With the same GetResponse / PMI_KVS_Put / MPIDI_PG_SetConnInfo stack there are further reports: "Conditional jump" at __strlen_sse2 (strlen.S:116) and at PMIU_writeline (simple_pmiutil.c:180 and 197), "Use of uninitialised value of size 4" at PMIU_writeline (simple_pmiutil.c:184), "Syscall param write(count) contains uninitialised byte(s)" at __write_nocancel (syscall-template.S:82), and "Conditional jump" at __strcmp_ssse3 (strcmp-ssse3.S:150/1887/1890/1893/1896) called directly from PMI_KVS_Put (simple_pmi.c:543).)
==1368== Conditional jump or move depends on uninitialised value(s)
==1368==    at 0x43BE348: __strcmp_ssse3 (strcmp-ssse3.S:150)
==1368==    by 0x87CAF44: MPIDI_PG_SetConnInfo (mpidi_pg.c:579)
==1368==    by 0x87EE522: MPIDI_CH3_Init (ch3_init.c:48)
==1368==    by 0x87C7554: MPID_Init (mpid_init.c:92)
==1368==    by 0x879F65C: MPIR_Init_thread (initthread.c:288)
==1368==    by 0x879F15A: PMPI_Init (init.c:106)
==1368==    by 0x80AEE12: PetscInitialize (pinit.c:561)
==1368==    by 0x804BA0C: main (ex19.c:96)
==1368==
(The same report repeats for strcmp-ssse3.S:1887/1890/1893/1896/1899, and the whole group also comes from ==1369==; the two ranks' output is interleaved in the original log.)
==1368== Invalid read of size 4
==1368==    at 0x431983B: __GI_strlen (strlen.S:115)
==1368==    by 0x43843CE: __nss_lookup (nsswitch.c:191)
==1368==    by 0x438564E: __nss_passwd_lookup2 (XXX-lookup.c:76)
==1368==    by 0x433D0DE: getpwuid_r@@GLIBC_2.1.2 (getXXbyYY_r.c:200)
==1368==    by 0x433C92E: getpwuid (getXXbyYY.c:117)
==1368==    by 0x80BC53A: PetscGetUserName (fuser.c:66)
==1368==    by 0x80822FF: PetscErrorPrintfInitialize (errtrace.c:68)
==1368==    by 0x80AEED1: PetscInitialize (pinit.c:576)
==1368==    by 0x804BA0C: main (ex19.c:96)
==1368==  Address 0x4430700 is 40 bytes inside a block of size 43 alloc'd
==1368==    at 0x4023BF3: malloc (vg_replace_malloc.c:195)
==1368==    by 0x4384583: nss_parse_service_list (nsswitch.c:622)
==1368==    by 0x4384E71: __nss_database_lookup (nsswitch.c:775)
==1368==    by 0x43859AD: __nss_hosts_lookup2 (XXX-lookup.c:71)
==1368==    by 0x438AF0F: gethostbyname_r@@GLIBC_2.1.2 (getXXbyYY_r.c:200)
==1368==    by 0x438A685: gethostbyname (getXXbyYY.c:117)
==1368==    by 0x87CC54F: MPIDU_CH3U_GetSockInterfaceAddr (ch3u_getinterfaces.c:130)
==1368==    by 0x87CBC56: MPIDI_CH3U_Get_business_card_sock (ch3u_connect_sock.c:428)
==1368==    by 0x87F413B: MPIDI_CH3U_Init_sock (ch3u_init_sock.c:79)
==1368==    by 0x87EE511: MPIDI_CH3_Init (ch3_init.c:43)
==1368==    by 0x87C7554: MPID_Init (mpid_init.c:92)
==1368==    by 0x879F65C: MPIR_Init_thread (initthread.c:288)
==1368==
(Identical report from ==1369==.)
==1368== Conditional jump or move depends on uninitialised value(s)
==1368==    at 0x4319794: __strlen_sse2 (strlen.S:93)
==1368==    by 0x80885F9: PetscVSNPrintf (mprint.c:95)
==1368==    by 0x8088B22: PetscSNPrintf (mprint.c:228)
==1368==    by 0x8082456: PetscErrorPrintfInitialize (errtrace.c:71)
==1368==    by 0x80AEED1: PetscInitialize (pinit.c:576)
==1368==    by 0x804BA0C: main (ex19.c:96)
==1368==
(Repeated for strlen.S:116, and from ==1369==.)
==1368== Conditional jump or move depends on uninitialised value(s)
==1368==    at 0x4319794: __strlen_sse2 (strlen.S:93)
==1368==    by 0x80A0906: PetscOptionsFindPair_Private (options.c:987)
==1368==    by 0x80A1E0E: PetscOptionsGetTruth (options.c:1304)
==1368==    by 0x80824DE: PetscErrorPrintfInitialize (errtrace.c:73)
==1368==    by 0x80AEED1: PetscInitialize (pinit.c:576)
==1368==    by 0x804BA0C: main (ex19.c:96)
==1368==
(The same __strlen_sse2 / PetscOptionsFindPair_Private report also appears via PetscErrorPrintfInitialize (errtrace.c:77) and via PetscOptionsInsert (options.c:516) under PetscInitialize (pinit.c:629), on both ranks.)
==1368== Conditional jump or move depends on uninitialised value(s)
==1368==    at 0x4319794: __strlen_sse2 (strlen.S:93)
==1368==    by 0x87E19C7: GetResponse (simple_pmi.c:1049)
==1368==    by 0x87E420A: PMI_KVS_Get (simple_pmi.c:572)
==1368==    by 0x87CA9F9: getConnInfoKVS (mpidi_pg.c:622)
==1368==    by 0x87CC11A: MPIDI_CH3I_VC_post_sockconnect (ch3u_connect_sock.c:1097)
==1368==    by 0x87EEF6B: MPIDI_CH3_iStartMsg (ch3_istartmsg.c:177)
==1368==    by 0x87C6341: MPIDI_CH3_EagerContigShortSend (ch3u_eager.c:250)
==1368==    by 0x87C8AC6: MPID_Send (mpid_send.c:115)
==1368==    by 0x879376B: MPIC_Send (helper_fns.c:34)
==1368==    by 0x878BB6A: MPIR_Bcast (bcast.c:227)
==1368==    by 0x878C6A1: PMPI_Bcast (bcast.c:761)
==1368==    by 0x809C467: PetscOptionsInsertFile (options.c:436)
==1368==
(With the same PMI_KVS_Get / getConnInfoKVS / PMPI_Bcast / PetscOptionsInsertFile stack there are further reports: "Conditional jump" at __strlen_sse2 (strlen.S:116) and at PMIU_writeline (simple_pmiutil.c:180 and 197), "Use of uninitialised value of size 4" at PMIU_writeline (simple_pmiutil.c:184), "Syscall param write(count) contains uninitialised byte(s)" at __write_nocancel (syscall-template.S:82), and "Conditional jump" at __strcmp_ssse3 (strcmp-ssse3.S:150/1887/1890/1893/1896) called directly from PMI_KVS_Get, the latter with PetscOptionsInsert (options.c:522) as the next frame.)
==1368== Invalid read of size 8
==1368==    at 0x4319785: __strlen_sse2 (strlen.S:87)
==1368==    by 0x87BD379: MPIDI_CH3I_Progress_handle_sock_event (ch3_progress.c:776)
==1368==    by 0x87BD7BD: MPIDI_CH3I_Progress (ch3_progress.c:212)
==1368==    by 0x8793060: MPIC_Wait (helper_fns.c:269)
==1368==    by 0x879377E: MPIC_Send (helper_fns.c:38)
==1368==    by 0x878BB6A: MPIR_Bcast (bcast.c:227)
==1368==    by 0x878C6A1: PMPI_Bcast (bcast.c:761)
==1368==    by 0x809C467: PetscOptionsInsertFile (options.c:436)
==1368==    by 0x809D14A: PetscOptionsInsert (options.c:522)
==1368==    by 0x80AF4BD: PetscInitialize (pinit.c:629)
==1368==    by 0x804BA0C: main (ex19.c:96)
==1368==  Address 0x442e0d0 is 8 bytes before a block of size 257 alloc'd
==1368==    at 0x4023BF3: malloc (vg_replace_malloc.c:195)
==1368==    by 0x87C74AE: MPID_Init (mpid_init.c:373)
==1368==    by 0x879F65C: MPIR_Init_thread (initthread.c:288)
==1368==    by 0x879F15A: PMPI_Init (init.c:106)
==1368==    by 0x80AEE12: PetscInitialize (pinit.c:561)
==1368==    by 0x804BA0C: main (ex19.c:96)
==1368==
(With the same MPIC_Wait / MPIR_Bcast / PetscOptionsInsertFile stack there are further reports: "Conditional jump" at __strlen_sse2 (strlen.S:104) and at MPIDU_Sock_wait (socki_util.i:543), and "Syscall param writev(vector) points to uninitialised byte(s)" and "writev(vector[...]) points to uninitialised byte(s)" at writev (writev.c:51) via MPIDU_Sock_wait (sock_wait.i:693) and MPIDI_CH3I_Progress (ch3_progress.c:187); the writev addresses, 0x4462754 (68 bytes in) and 0x4462728 (24 bytes in), lie inside a block of size 72 alloc'd in MPIDI_CH3I_Connection_alloc (ch3u_connect_sock.c:160) via MPIDI_CH3I_Sock_connect (ch3u_connect_sock.c:1164) and MPIDI_CH3I_VC_post_sockconnect (ch3u_connect_sock.c:1102).)
==1369== Conditional jump or move depends on uninitialised value(s)
==1369==    at 0x43BE348: __strcmp_ssse3 (strcmp-ssse3.S:150)
==1369==    by 0x87CA206: MPIDI_PG_Find (mpidi_pg.c:341)
==1369==    by 0x87CB66D: MPIDI_CH3_Sockconn_handle_connopen_event (ch3u_connect_sock.c:883)
==1369==    by 0x87BD3FA: MPIDI_CH3I_Progress_handle_sock_event (ch3_progress.c:639)
==1369==    by 0x87BD7BD: MPIDI_CH3I_Progress (ch3_progress.c:212)
==1369==    by 0x8793060: MPIC_Wait (helper_fns.c:269)
==1369==    by 0x8793626: MPIC_Recv (helper_fns.c:74)
==1369==    by 0x878C049: MPIR_Bcast (bcast.c:195)
==1369==    by 0x878C6A1: PMPI_Bcast (bcast.c:761)
==1369==    by 0x809C467: PetscOptionsInsertFile (options.c:436)
==1369==    by 0x809D14A: PetscOptionsInsert (options.c:522)
==1369==    by 0x80AF4BD: PetscInitialize (pinit.c:629)
==1369==
(Repeated for strcmp-ssse3.S:1887/1890/1893/1896.)
==1368== Invalid read of size 8
==1368==    at 0x43BE324: __strcmp_ssse3 (strcmp-ssse3.S:142)
==1368==    by 0x809F18B: PetscOptionsSetValue (options.c:803)
==1368==    by 0x809DD95: PetscOptionsInsert (options.c:588)
==1368==    by 0x80AF4BD: PetscInitialize (pinit.c:629)
==1368==    by 0x804BA0C: main (ex19.c:96)
==1368==  Address 0x44c0d88 is 8 bytes inside a block of size 10 alloc'd
==1368==    at 0x4023BF3: malloc (vg_replace_malloc.c:195)
==1368==    by 0x809F5F2: PetscOptionsSetValue (options.c:829)
==1368==    by 0x809DD95: PetscOptionsInsert (options.c:588)
==1368==    by 0x80AF4BD: PetscInitialize (pinit.c:629)
==1368==    by 0x804BA0C: main (ex19.c:96)
==1368==
(A second read at strcmp-ssse3.S:144 hits address 0x44c0d90, 6 bytes after the same block; ==1369== reports the same pair at addresses 0x4433c38 and 0x4433c40.)
==1368== Invalid read of size 8
==1368==    at 0x4319785: __strlen_sse2 (strlen.S:87)
==1368==    by 0x8099BCC: PetscOptionsAtol (options.c:152)
==1368==    by 0x80A1E9D: PetscOptionsGetTruth (options.c:1310)
==1368==    by 0x80A88F7: PetscOptionsCheckInitial_Private (init.c:244)
==1368==    by 0x80AF618: PetscInitialize (pinit.c:639)
==1368==    by 0x804BA0C: main (ex19.c:96)
==1368==  Address 0x44c0d40 is 8 bytes before a block of size 4 alloc'd
==1368==    at 0x4023BF3: malloc (vg_replace_malloc.c:195)
==1368==    by 0x809F6EB: PetscOptionsSetValue (options.c:833)
==1368==    by 0x809DD95: PetscOptionsInsert (options.c:588)
==1368==    by 0x80AF4BD: PetscInitialize (pinit.c:629)
==1368==    by 0x804BA0C: main (ex19.c:96)
==1368==
(==1369== reports the same at address 0x4433bf0.)
==1368== Conditional jump or move depends on uninitialised value(s)
==1368==    at 0x4319794: __strlen_sse2 (strlen.S:93)
==1368==    by 0x80A0906: PetscOptionsFindPair_Private (options.c:987)
==1368==    by 0x80A1E0E: PetscOptionsGetTruth (options.c:1304)
==1368==    by 0x80A8875: PetscOptionsCheckInitial_Private (init.c:242)
==1368==    by 0x80AF618: PetscInitialize (pinit.c:639)
==1368==    by 0x804BA0C: main (ex19.c:96)
==1368==
(Both ranks emit this same __strlen_sse2 / PetscOptionsFindPair_Private (options.c:987) report, occasionally at PetscOptionsFindPair_Private (options.c:989) instead, for every option query made during start-up: via PetscOptionsGetTruth (options.c:1304) from PetscOptionsCheckInitial_Private at init.c:242, 257, 264, 267, 320, 323, 326, 329, 334, 439, 468, 469 and 499; via PetscOptionsHasName (options.c:1092) from PetscInitialize (pinit.c:635) and from init.c:281, 282, 283, 409, 410, 470 and 504; and via PetscOptionsGetString (options.c:1693) from init.c:341, 350, 352, 353, 402, 452 and 474, and through PetscSetDisplay (pdisplay.c:99) from init.c:276; all under PetscInitialize (pinit.c:639) and main (ex19.c:96).)
==1369== Conditional jump or move depends on
uninitialised value(s) ==1369== at 0x4319794: __strlen_sse2 (strlen.S:93) ==1369== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) ==1369== by 0x80A27DF: PetscOptionsGetReal (options.c:1419) ==1369== by 0x80AC610: PetscOptionsCheckInitial_Private (init.c:554) ==1369== by 0x80AF618: PetscInitialize (pinit.c:639) ==1369== by 0x804BA0C: main (ex19.c:96) ==1369== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x4319794: __strlen_sse2 (strlen.S:93) ==1369== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) ==1369== by 0x80A3E53: PetscOptionsGetString (options.c:1693) ==1369== by 0x80AC70C: PetscOptionsCheckInitial_Private (init.c:559) ==1369== by 0x80AF618: PetscInitialize (pinit.c:639) ==1369== by 0x804BA0C: main (ex19.c:96) ==1369== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x4319794: __strlen_sse2 (strlen.S:93) ==1368== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) ==1368== by 0x80A1074: PetscOptionsHasName (options.c:1092) ==1368== by 0x80AB315: PetscOptionsCheckInitial_Private (init.c:504) ==1368== by 0x80AF618: PetscInitialize (pinit.c:639) ==1368== by 0x804BA0C: main (ex19.c:96) ==1368== ==1368== ==1368== More than 100 errors detected. Subsequent errors ==1368== will still be recorded, but in less detail than before. ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x4319794: __strlen_sse2 (strlen.S:93) ==1369== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) ==1369== by 0x80A1074: PetscOptionsHasName (options.c:1092) ==1369== by 0x80C52B1: PetscLogBegin_Private (plog.c:196) ==1369== by 0x80AF677: PetscInitialize (pinit.c:643) ==1369== by 0x804BA0C: main (ex19.c:96) ==1369== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x4319794: __strlen_sse2 (strlen.S:93) ==1369== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) ==1369== by 0x80A1074: PetscOptionsHasName (options.c:1092) ==1369== by 0x80C532E: PetscLogBegin_Private (plog.c:200) ==1369== by 0x80AF677: PetscInitialize (pinit.c:643) ==1369== by 0x804BA0C: main (ex19.c:96) ==1369== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x4319794: __strlen_sse2 (strlen.S:93) ==1369== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) ==1369== by 0x80A420A: PetscOptionsGetStringArray (options.c:1756) ==1369== by 0x808387F: PetscInitialize_DynamicLibraries (reg.c:80) ==1369== by 0x80AF6D6: PetscInitialize (pinit.c:650) ==1369== by 0x804BA0C: main (ex19.c:96) ==1369== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x4319794: __strlen_sse2 (strlen.S:93) ==1369== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) ==1369== by 0x80A3E53: PetscOptionsGetString (options.c:1693) ==1369== by 0x81DE8F5: PetscInitializePackage (dlregispetsc.c:58) ==1369== by 0x8083A20: PetscInitialize_DynamicLibraries (reg.c:93) ==1369== by 0x80AF6D6: PetscInitialize (pinit.c:650) ==1369== by 0x804BA0C: main (ex19.c:96) ==1369== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x4319794: __strlen_sse2 (strlen.S:93) ==1369== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) ==1369== by 0x80A3E53: PetscOptionsGetString (options.c:1693) ==1369== by 0x81DEA77: PetscInitializePackage (dlregispetsc.c:66) ==1369== by 0x8083A20: PetscInitialize_DynamicLibraries (reg.c:93) ==1369== by 0x80AF6D6: PetscInitialize (pinit.c:650) ==1369== by 0x804BA0C: main (ex19.c:96) ==1369== 
==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x4319794: __strlen_sse2 (strlen.S:93) ==1369== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) ==1369== by 0x80A420A: PetscOptionsGetStringArray (options.c:1756) ==1369== by 0x8083AA5: PetscInitialize_DynamicLibraries (reg.c:117) ==1369== by 0x80AF6D6: PetscInitialize (pinit.c:650) ==1369== by 0x804BA0C: main (ex19.c:96) ==1369== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x4319794: __strlen_sse2 (strlen.S:93) ==1369== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) ==1369== by 0x80A1074: PetscOptionsHasName (options.c:1092) ==1369== by 0x80AC9CD: PetscOptionsCheckInitial_Components (pinit.c:57) ==1369== by 0x80AF916: PetscInitialize (pinit.c:657) ==1369== by 0x804BA0C: main (ex19.c:96) ==1369== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x4319794: __strlen_sse2 (strlen.S:93) ==1369== by 0x80A0906: PetscOptionsFindPair_Private (options.c:987) ==1369== by 0x80A1074: PetscOptionsHasName (options.c:1092) ==1369== by 0x80B290F: PetscOptionsBegin_Private (aoptions.c:44) ==1369== by 0x80A4EED: PetscOptionsSetFromOptions (options.c:1890) ==1369== by 0x80AF975: PetscInitialize (pinit.c:659) ==1369== by 0x804BA0C: main (ex19.c:96) ==1369== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x4319794: __strlen_sse2 (strlen.S:93) ==1369== by 0x80A066B: PetscOptionsFindPair_Private (options.c:967) ==1369== by 0x80A3E53: PetscOptionsGetString (options.c:1693) ==1369== by 0x80B5ECA: PetscOptionsString (aoptions.c:522) ==1369== by 0x80A4FDB: PetscOptionsSetFromOptions (options.c:1891) ==1369== by 0x80AF975: PetscInitialize (pinit.c:659) ==1369== by 0x804BA0C: main (ex19.c:96) ==1369== ==1369== ==1369== More than 100 errors detected. Subsequent errors ==1369== will still be recorded, but in less detail than before. 
==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x40249EA: strncat (mc_replace_strmem.c:202) ==1369== by 0x80BE6E2: PetscStrncat (str.c:205) ==1369== by 0x80A06F1: PetscOptionsFindPair_Private (options.c:968) ==1369== by 0x80A3E53: PetscOptionsGetString (options.c:1693) ==1369== by 0x80B5ECA: PetscOptionsString (aoptions.c:522) ==1369== by 0x80A4FDB: PetscOptionsSetFromOptions (options.c:1891) ==1369== by 0x80AF975: PetscInitialize (pinit.c:659) ==1369== by 0x804BA0C: main (ex19.c:96) ==1369== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x4024A18: strncat (mc_replace_strmem.c:202) ==1369== by 0x80BE6E2: PetscStrncat (str.c:205) ==1369== by 0x80A06F1: PetscOptionsFindPair_Private (options.c:968) ==1369== by 0x80A3E53: PetscOptionsGetString (options.c:1693) ==1369== by 0x80B5ECA: PetscOptionsString (aoptions.c:522) ==1369== by 0x80A4FDB: PetscOptionsSetFromOptions (options.c:1891) ==1369== by 0x80AF975: PetscInitialize (pinit.c:659) ==1369== by 0x804BA0C: main (ex19.c:96) ==1369== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x4319794: __strlen_sse2 (strlen.S:93) ==1368== by 0x80A066B: PetscOptionsFindPair_Private (options.c:967) ==1368== by 0x80A3E53: PetscOptionsGetString (options.c:1693) ==1368== by 0x80B5ECA: PetscOptionsString (aoptions.c:522) ==1368== by 0x80A4FDB: PetscOptionsSetFromOptions (options.c:1891) ==1368== by 0x80AF975: PetscInitialize (pinit.c:659) ==1368== by 0x804BA0C: main (ex19.c:96) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x40249EA: strncat (mc_replace_strmem.c:202) ==1368== by 0x80BE6E2: PetscStrncat (str.c:205) ==1368== by 0x80A06F1: PetscOptionsFindPair_Private (options.c:968) ==1368== by 0x80A3E53: PetscOptionsGetString (options.c:1693) ==1368== by 0x80B5ECA: PetscOptionsString (aoptions.c:522) ==1368== by 0x80A4FDB: PetscOptionsSetFromOptions (options.c:1891) ==1368== by 0x80AF975: PetscInitialize (pinit.c:659) ==1368== by 0x804BA0C: main (ex19.c:96) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x4024A18: strncat (mc_replace_strmem.c:202) ==1368== by 0x80BE6E2: PetscStrncat (str.c:205) ==1368== by 0x80A06F1: PetscOptionsFindPair_Private (options.c:968) ==1368== by 0x80A3E53: PetscOptionsGetString (options.c:1693) ==1368== by 0x80B5ECA: PetscOptionsString (aoptions.c:522) ==1368== by 0x80A4FDB: PetscOptionsSetFromOptions (options.c:1891) ==1368== by 0x80AF975: PetscInitialize (pinit.c:659) ==1368== by 0x804BA0C: main (ex19.c:96) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x4319794: __strlen_sse2 (strlen.S:93) ==1368== by 0x80BDE83: PetscStrallocpy (str.c:79) ==1368== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) ==1368== by 0x80841CC: PetscFListAdd (reg.c:201) ==1368== by 0x81843D0: DARegister (dareg.c:104) ==1368== by 0x818476B: DARegisterAll (daregall.c:32) ==1368== by 0x855F08B: DMInitializePackage (dlregisdm.c:80) ==1368== by 0x815F1D1: DACreate (dacreate.c:173) ==1368== by 0x81558E2: DACreate2d (da2.c:1837) ==1368== by 0x804BE2A: main (ex19.c:107) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x43197AD: __strlen_sse2 (strlen.S:104) ==1368== by 0x80BDE83: PetscStrallocpy (str.c:79) ==1368== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) ==1368== by 0x80841CC: PetscFListAdd (reg.c:201) ==1368== by 0x81843D0: DARegister (dareg.c:104) ==1368== by 0x818476B: 
DARegisterAll (daregall.c:32) ==1368== by 0x855F08B: DMInitializePackage (dlregisdm.c:80) ==1368== by 0x815F1D1: DACreate (dacreate.c:173) ==1368== by 0x81558E2: DACreate2d (da2.c:1837) ==1368== by 0x804BE2A: main (ex19.c:107) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x80BDEDB: PetscStrallocpy (str.c:80) ==1368== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) ==1368== by 0x80841CC: PetscFListAdd (reg.c:201) ==1368== by 0x81843D0: DARegister (dareg.c:104) ==1368== by 0x818476B: DARegisterAll (daregall.c:32) ==1368== by 0x855F08B: DMInitializePackage (dlregisdm.c:80) ==1368== by 0x815F1D1: DACreate (dacreate.c:173) ==1368== by 0x81558E2: DACreate2d (da2.c:1837) ==1368== by 0x804BE2A: main (ex19.c:107) ==1368== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x4319794: __strlen_sse2 (strlen.S:93) ==1369== by 0x80BDE83: PetscStrallocpy (str.c:79) ==1369== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) ==1369== by 0x80841CC: PetscFListAdd (reg.c:201) ==1369== by 0x81843D0: DARegister (dareg.c:104) ==1369== by 0x818476B: DARegisterAll (daregall.c:32) ==1369== by 0x855F08B: DMInitializePackage (dlregisdm.c:80) ==1369== by 0x815F1D1: DACreate (dacreate.c:173) ==1369== by 0x81558E2: DACreate2d (da2.c:1837) ==1369== by 0x804BE2A: main (ex19.c:107) ==1369== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x43197AD: __strlen_sse2 (strlen.S:104) ==1369== by 0x80BDE83: PetscStrallocpy (str.c:79) ==1369== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) ==1369== by 0x80841CC: PetscFListAdd (reg.c:201) ==1369== by 0x81843D0: DARegister (dareg.c:104) ==1369== by 0x818476B: DARegisterAll (daregall.c:32) ==1369== by 0x855F08B: DMInitializePackage (dlregisdm.c:80) ==1369== by 0x815F1D1: DACreate (dacreate.c:173) ==1369== by 0x81558E2: DACreate2d (da2.c:1837) ==1369== by 0x804BE2A: main (ex19.c:107) ==1369== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x80BDEDB: PetscStrallocpy (str.c:80) ==1369== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) ==1369== by 0x80841CC: PetscFListAdd (reg.c:201) ==1369== by 0x81843D0: DARegister (dareg.c:104) ==1369== by 0x818476B: DARegisterAll (daregall.c:32) ==1369== by 0x855F08B: DMInitializePackage (dlregisdm.c:80) ==1369== by 0x815F1D1: DACreate (dacreate.c:173) ==1369== by 0x81558E2: DACreate2d (da2.c:1837) ==1369== by 0x804BE2A: main (ex19.c:107) ==1369== ==1368== Invalid read of size 8 ==1368== at 0x43197A0: __strlen_sse2 (strlen.S:99) ==1368== by 0x8099209: PetscOptionsAtoi (options.c:70) ==1368== by 0x80A13CE: PetscOptionsGetInt (options.c:1138) ==1368== by 0x80B5AA5: PetscOptionsInt (aoptions.c:473) ==1368== by 0x815E8BC: DASetFromOptions (dacreate.c:109) ==1368== by 0x8155C96: DACreate2d (da2.c:1847) ==1368== by 0x804BE2A: main (ex19.c:107) ==1368== Address 0x44c0dc0 is 0 bytes inside a block of size 3 alloc'd ==1368== at 0x4023BF3: malloc (vg_replace_malloc.c:195) ==1368== by 0x809F6EB: PetscOptionsSetValue (options.c:833) ==1368== by 0x809DD95: PetscOptionsInsert (options.c:588) ==1368== by 0x80AF4BD: PetscInitialize (pinit.c:629) ==1368== by 0x804BA0C: main (ex19.c:96) ==1368== ==1368== Invalid read of size 8 ==1368== at 0x4319785: __strlen_sse2 (strlen.S:87) ==1368== by 0x8099209: PetscOptionsAtoi (options.c:70) ==1368== by 0x80A13CE: PetscOptionsGetInt (options.c:1138) ==1368== by 0x80B5AA5: PetscOptionsInt (aoptions.c:473) ==1368== by 0x815E96A: DASetFromOptions (dacreate.c:114) ==1368== by 
0x8155C96: DACreate2d (da2.c:1847) ==1368== by 0x804BE2A: main (ex19.c:107) ==1368== Address 0x44c0e30 is 8 bytes before a block of size 3 alloc'd ==1368== at 0x4023BF3: malloc (vg_replace_malloc.c:195) ==1368== by 0x809F6EB: PetscOptionsSetValue (options.c:833) ==1368== by 0x809DD95: PetscOptionsInsert (options.c:588) ==1368== by 0x80AF4BD: PetscInitialize (pinit.c:629) ==1368== by 0x804BA0C: main (ex19.c:96) ==1368== ==1369== Invalid read of size 8 ==1369== at 0x43197A0: __strlen_sse2 (strlen.S:99) ==1369== by 0x8099209: PetscOptionsAtoi (options.c:70) ==1369== by 0x80A13CE: PetscOptionsGetInt (options.c:1138) ==1369== by 0x80B5AA5: PetscOptionsInt (aoptions.c:473) ==1369== by 0x815E8BC: DASetFromOptions (dacreate.c:109) ==1369== by 0x8155C96: DACreate2d (da2.c:1847) ==1369== by 0x804BE2A: main (ex19.c:107) ==1369== Address 0x4433c70 is 0 bytes inside a block of size 3 alloc'd ==1369== at 0x4023BF3: malloc (vg_replace_malloc.c:195) ==1369== by 0x809F6EB: PetscOptionsSetValue (options.c:833) ==1369== by 0x809DD95: PetscOptionsInsert (options.c:588) ==1369== by 0x80AF4BD: PetscInitialize (pinit.c:629) ==1369== by 0x804BA0C: main (ex19.c:96) ==1369== ==1369== Invalid read of size 8 ==1369== at 0x4319785: __strlen_sse2 (strlen.S:87) ==1369== by 0x8099209: PetscOptionsAtoi (options.c:70) ==1369== by 0x80A13CE: PetscOptionsGetInt (options.c:1138) ==1369== by 0x80B5AA5: PetscOptionsInt (aoptions.c:473) ==1369== by 0x815E96A: DASetFromOptions (dacreate.c:114) ==1369== by 0x8155C96: DACreate2d (da2.c:1847) ==1369== by 0x804BE2A: main (ex19.c:107) ==1369== Address 0x4433ce0 is 8 bytes before a block of size 3 alloc'd ==1369== at 0x4023BF3: malloc (vg_replace_malloc.c:195) ==1369== by 0x809F6EB: PetscOptionsSetValue (options.c:833) ==1369== by 0x809DD95: PetscOptionsInsert (options.c:588) ==1369== by 0x80AF4BD: PetscInitialize (pinit.c:629) ==1369== by 0x804BA0C: main (ex19.c:96) ==1369== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x43197BC: __strlen_sse2 (strlen.S:110) ==1368== by 0x80BDE83: PetscStrallocpy (str.c:79) ==1368== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) ==1368== by 0x80846F3: PetscFListAdd (reg.c:238) ==1368== by 0x829D9AC: MatRegister (matreg.c:139) ==1368== by 0x86C6837: MatRegisterAll (matregis.c:85) ==1368== by 0x82A0B71: MatInitializePackage (dlregismat.c:80) ==1368== by 0x8542386: MatCreate (gcreate.c:72) ==1368== by 0x8171B37: DAGetInterpolation_2D_Q1 (dainterp.c:308) ==1368== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==1368== by 0x8197E93: DMGetInterpolation (dm.c:144) ==1368== by 0x81BD897: DMMGSetDM (damg.c:250) ==1368== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x43197BC: __strlen_sse2 (strlen.S:110) ==1369== by 0x80BDE83: PetscStrallocpy (str.c:79) ==1369== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) ==1369== by 0x80846F3: PetscFListAdd (reg.c:238) ==1369== by 0x829D9AC: MatRegister (matreg.c:139) ==1369== by 0x86C6837: MatRegisterAll (matregis.c:85) ==1369== by 0x82A0B71: MatInitializePackage (dlregismat.c:80) ==1369== by 0x8542386: MatCreate (gcreate.c:72) ==1369== by 0x8171B37: DAGetInterpolation_2D_Q1 (dainterp.c:308) ==1369== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==1369== by 0x8197E93: DMGetInterpolation (dm.c:144) ==1369== by 0x81BD897: DMMGSetDM (damg.c:250) ==1369== ==1368== Invalid read of size 8 ==1368== at 0x43BF10D: __strcmp_ssse3 (strcmp-ssse3.S:1446) ==1368== by 0x80842B8: PetscFListAdd (reg.c:223) ==1368== by 0x8096702: 
PetscObjectComposeFunction_Petsc (inherit.c:227) ==1368== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1368== by 0x834D30B: MatCreate_MPIAIJ (mpiaij.c:5096) ==1368== by 0x829D1E0: MatSetType (matreg.c:65) ==1368== by 0x86EE543: MatCreate_AIJ (aijtype.c:35) ==1368== by 0x829D1E0: MatSetType (matreg.c:65) ==1368== by 0x8171C4C: DAGetInterpolation_2D_Q1 (dainterp.c:310) ==1368== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==1368== by 0x8197E93: DMGetInterpolation (dm.c:144) ==1368== by 0x81BD897: DMMGSetDM (damg.c:250) ==1368== Address 0x456b7c8 is 24 bytes inside a block of size 28 alloc'd ==1368== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1368== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1368== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1368== by 0x8084689: PetscFListAdd (reg.c:237) ==1368== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1368== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1368== by 0x834D296: MatCreate_MPIAIJ (mpiaij.c:5093) ==1368== by 0x829D1E0: MatSetType (matreg.c:65) ==1368== by 0x86EE543: MatCreate_AIJ (aijtype.c:35) ==1368== by 0x829D1E0: MatSetType (matreg.c:65) ==1368== by 0x8171C4C: DAGetInterpolation_2D_Q1 (dainterp.c:310) ==1368== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==1368== ==1369== Invalid read of size 8 ==1369== at 0x43BF10D: __strcmp_ssse3 (strcmp-ssse3.S:1446) ==1369== by 0x80842B8: PetscFListAdd (reg.c:223) ==1369== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1369== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1369== by 0x834D30B: MatCreate_MPIAIJ (mpiaij.c:5096) ==1369== by 0x829D1E0: MatSetType (matreg.c:65) ==1369== by 0x86EE543: MatCreate_AIJ (aijtype.c:35) ==1369== by 0x829D1E0: MatSetType (matreg.c:65) ==1369== by 0x8171C4C: DAGetInterpolation_2D_Q1 (dainterp.c:310) ==1369== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==1369== by 0x8197E93: DMGetInterpolation (dm.c:144) ==1369== by 0x81BD897: DMMGSetDM (damg.c:250) ==1369== Address 0x44db848 is 24 bytes inside a block of size 28 alloc'd ==1369== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1369== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1369== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1369== by 0x8084689: PetscFListAdd (reg.c:237) ==1369== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1369== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1369== by 0x834D296: MatCreate_MPIAIJ (mpiaij.c:5093) ==1369== by 0x829D1E0: MatSetType (matreg.c:65) ==1369== by 0x86EE543: MatCreate_AIJ (aijtype.c:35) ==1369== by 0x829D1E0: MatSetType (matreg.c:65) ==1369== by 0x8171C4C: DAGetInterpolation_2D_Q1 (dainterp.c:310) ==1369== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==1369== ==1368== Invalid read of size 8 ==1368== at 0x43BE415: __strcmp_ssse3 (strcmp-ssse3.S:225) ==1368== by 0x8085399: PetscFListFind (reg.c:375) ==1368== by 0x80968C7: PetscObjectQueryFunction_Petsc (inherit.c:238) ==1368== by 0x80978B5: PetscObjectQueryFunction (inherit.c:376) ==1368== by 0x833E82B: MatMPIAIJSetPreallocation (mpiaij.c:3486) ==1368== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) ==1368== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==1368== by 0x8197E93: DMGetInterpolation (dm.c:144) ==1368== by 0x81BD897: DMMGSetDM (damg.c:250) ==1368== by 0x804BEA0: main (ex19.c:108) ==1368== Address 0x456b7c8 is 24 bytes inside a block of size 28 alloc'd ==1368== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1368== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1368== 
by 0x80BDF14: PetscStrallocpy (str.c:80) ==1368== by 0x8084689: PetscFListAdd (reg.c:237) ==1368== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1368== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1368== by 0x834D296: MatCreate_MPIAIJ (mpiaij.c:5093) ==1368== by 0x829D1E0: MatSetType (matreg.c:65) ==1368== by 0x86EE543: MatCreate_AIJ (aijtype.c:35) ==1368== by 0x829D1E0: MatSetType (matreg.c:65) ==1368== by 0x8171C4C: DAGetInterpolation_2D_Q1 (dainterp.c:310) ==1368== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==1368== ==1368== Invalid read of size 8 ==1368== at 0x43BE421: __strcmp_ssse3 (strcmp-ssse3.S:228) ==1368== by 0x8085399: PetscFListFind (reg.c:375) ==1368== by 0x80968C7: PetscObjectQueryFunction_Petsc (inherit.c:238) ==1368== by 0x80978B5: PetscObjectQueryFunction (inherit.c:376) ==1368== by 0x833E82B: MatMPIAIJSetPreallocation (mpiaij.c:3486) ==1368== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) ==1368== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==1368== by 0x8197E93: DMGetInterpolation (dm.c:144) ==1368== by 0x81BD897: DMMGSetDM (damg.c:250) ==1368== by 0x804BEA0: main (ex19.c:108) ==1368== Address 0x456c758 is 24 bytes inside a block of size 28 alloc'd ==1368== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1368== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1368== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1368== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) ==1368== by 0x8085186: PetscFListFind (reg.c:356) ==1368== by 0x80968C7: PetscObjectQueryFunction_Petsc (inherit.c:238) ==1368== by 0x80978B5: PetscObjectQueryFunction (inherit.c:376) ==1368== by 0x833E82B: MatMPIAIJSetPreallocation (mpiaij.c:3486) ==1368== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) ==1368== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==1368== by 0x8197E93: DMGetInterpolation (dm.c:144) ==1368== by 0x81BD897: DMMGSetDM (damg.c:250) ==1368== ==1368== Invalid read of size 8 ==1368== at 0x43BE421: __strcmp_ssse3 (strcmp-ssse3.S:228) ==1368== by 0x8085406: PetscFListFind (reg.c:376) ==1368== by 0x80968C7: PetscObjectQueryFunction_Petsc (inherit.c:238) ==1368== by 0x80978B5: PetscObjectQueryFunction (inherit.c:376) ==1368== by 0x833E82B: MatMPIAIJSetPreallocation (mpiaij.c:3486) ==1368== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) ==1368== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==1368== by 0x8197E93: DMGetInterpolation (dm.c:144) ==1368== by 0x81BD897: DMMGSetDM (damg.c:250) ==1368== by 0x804BEA0: main (ex19.c:108) ==1368== Address 0x456c758 is 24 bytes inside a block of size 28 alloc'd ==1368== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1368== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1368== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1368== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) ==1368== by 0x8085186: PetscFListFind (reg.c:356) ==1368== by 0x80968C7: PetscObjectQueryFunction_Petsc (inherit.c:238) ==1368== by 0x80978B5: PetscObjectQueryFunction (inherit.c:376) ==1368== by 0x833E82B: MatMPIAIJSetPreallocation (mpiaij.c:3486) ==1368== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) ==1368== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==1368== by 0x8197E93: DMGetInterpolation (dm.c:144) ==1368== by 0x81BD897: DMMGSetDM (damg.c:250) ==1368== ==1369== Invalid read of size 8 ==1369== at 0x43BE415: __strcmp_ssse3 (strcmp-ssse3.S:225) ==1369== by 0x8085399: PetscFListFind (reg.c:375) ==1369== by 0x80968C7: PetscObjectQueryFunction_Petsc (inherit.c:238) ==1369== by 
0x80978B5: PetscObjectQueryFunction (inherit.c:376) ==1369== by 0x833E82B: MatMPIAIJSetPreallocation (mpiaij.c:3486) ==1369== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) ==1369== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==1369== by 0x8197E93: DMGetInterpolation (dm.c:144) ==1369== by 0x81BD897: DMMGSetDM (damg.c:250) ==1369== by 0x804BEA0: main (ex19.c:108) ==1369== Address 0x44db848 is 24 bytes inside a block of size 28 alloc'd ==1369== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1369== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1369== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1369== by 0x8084689: PetscFListAdd (reg.c:237) ==1369== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1369== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1369== by 0x834D296: MatCreate_MPIAIJ (mpiaij.c:5093) ==1369== by 0x829D1E0: MatSetType (matreg.c:65) ==1369== by 0x86EE543: MatCreate_AIJ (aijtype.c:35) ==1369== by 0x829D1E0: MatSetType (matreg.c:65) ==1369== by 0x8171C4C: DAGetInterpolation_2D_Q1 (dainterp.c:310) ==1369== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==1369== ==1369== Invalid read of size 8 ==1369== at 0x43BE421: __strcmp_ssse3 (strcmp-ssse3.S:228) ==1369== by 0x8085399: PetscFListFind (reg.c:375) ==1369== by 0x80968C7: PetscObjectQueryFunction_Petsc (inherit.c:238) ==1369== by 0x80978B5: PetscObjectQueryFunction (inherit.c:376) ==1369== by 0x833E82B: MatMPIAIJSetPreallocation (mpiaij.c:3486) ==1369== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) ==1369== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==1369== by 0x8197E93: DMGetInterpolation (dm.c:144) ==1369== by 0x81BD897: DMMGSetDM (damg.c:250) ==1369== by 0x804BEA0: main (ex19.c:108) ==1369== Address 0x44dc7d8 is 24 bytes inside a block of size 28 alloc'd ==1369== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1369== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1369== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1369== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) ==1369== by 0x8085186: PetscFListFind (reg.c:356) ==1369== by 0x80968C7: PetscObjectQueryFunction_Petsc (inherit.c:238) ==1369== by 0x80978B5: PetscObjectQueryFunction (inherit.c:376) ==1369== by 0x833E82B: MatMPIAIJSetPreallocation (mpiaij.c:3486) ==1369== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) ==1369== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==1369== by 0x8197E93: DMGetInterpolation (dm.c:144) ==1369== by 0x81BD897: DMMGSetDM (damg.c:250) ==1369== ==1369== Invalid read of size 8 ==1369== at 0x43BE421: __strcmp_ssse3 (strcmp-ssse3.S:228) ==1369== by 0x8085406: PetscFListFind (reg.c:376) ==1369== by 0x80968C7: PetscObjectQueryFunction_Petsc (inherit.c:238) ==1369== by 0x80978B5: PetscObjectQueryFunction (inherit.c:376) ==1369== by 0x833E82B: MatMPIAIJSetPreallocation (mpiaij.c:3486) ==1369== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) ==1369== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==1369== by 0x8197E93: DMGetInterpolation (dm.c:144) ==1369== by 0x81BD897: DMMGSetDM (damg.c:250) ==1369== by 0x804BEA0: main (ex19.c:108) ==1369== Address 0x44dc7d8 is 24 bytes inside a block of size 28 alloc'd ==1369== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1369== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1369== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1369== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) ==1369== by 0x8085186: PetscFListFind (reg.c:356) ==1369== by 0x80968C7: PetscObjectQueryFunction_Petsc (inherit.c:238) ==1369== by 
0x80978B5: PetscObjectQueryFunction (inherit.c:376) ==1369== by 0x833E82B: MatMPIAIJSetPreallocation (mpiaij.c:3486) ==1369== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) ==1369== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==1369== by 0x8197E93: DMGetInterpolation (dm.c:144) ==1369== by 0x81BD897: DMMGSetDM (damg.c:250) ==1369== ==1368== Invalid read of size 8 ==1368== at 0x43BE32D: __strcmp_ssse3 (strcmp-ssse3.S:144) ==1368== by 0x80842B8: PetscFListAdd (reg.c:223) ==1368== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1368== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1368== by 0x82BE3A6: MatCreate_SeqAIJ (aij.c:3363) ==1368== by 0x829D1E0: MatSetType (matreg.c:65) ==1368== by 0x83393A9: MatMPIAIJSetPreallocation_MPIAIJ (mpiaij.c:2789) ==1368== by 0x833E8AA: MatMPIAIJSetPreallocation (mpiaij.c:3488) ==1368== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) ==1368== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==1368== by 0x8197E93: DMGetInterpolation (dm.c:144) ==1368== by 0x81BD897: DMMGSetDM (damg.c:250) ==1368== Address 0x456d3a0 is 16 bytes inside a block of size 21 alloc'd ==1368== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1368== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1368== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1368== by 0x8084162: PetscFListAdd (reg.c:200) ==1368== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1368== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1368== by 0x82BE331: MatCreate_SeqAIJ (aij.c:3360) ==1368== by 0x829D1E0: MatSetType (matreg.c:65) ==1368== by 0x83393A9: MatMPIAIJSetPreallocation_MPIAIJ (mpiaij.c:2789) ==1368== by 0x833E8AA: MatMPIAIJSetPreallocation (mpiaij.c:3488) ==1368== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) ==1368== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==1368== ==1369== Invalid read of size 8 ==1368== Invalid read of size 8 ==1369== at 0x43BE32D: __strcmp_ssse3 (strcmp-ssse3.S:144) ==1369== by 0x80842B8: PetscFListAdd (reg.c:223) ==1369== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1369== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1369== by 0x82BE3A6: MatCreate_SeqAIJ (aij.c:3363) ==1369== by 0x829D1E0: MatSetType (matreg.c:65) ==1369== by 0x83393A9: MatMPIAIJSetPreallocation_MPIAIJ (mpiaij.c:2789) ==1369== by 0x833E8AA: MatMPIAIJSetPreallocation (mpiaij.c:3488) ==1369== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) ==1369== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==1369== by 0x8197E93: DMGetInterpolation (dm.c:144) ==1369== by 0x81BD897: DMMGSetDM (damg.c:250) ==1369== Address 0x44dd420 is 16 bytes inside a block of size 21 alloc'd ==1369== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1369== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1369== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1369== by 0x8084162: PetscFListAdd (reg.c:200) ==1369== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1369== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1369== by 0x82BE331: MatCreate_SeqAIJ (aij.c:3360) ==1369== by 0x829D1E0: MatSetType (matreg.c:65) ==1369== by 0x83393A9: MatMPIAIJSetPreallocation_MPIAIJ (mpiaij.c:2789) ==1369== by 0x833E8AA: MatMPIAIJSetPreallocation (mpiaij.c:3488) ==1369== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) ==1369== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==1369== ==1368== at 0x43BE421: __strcmp_ssse3 (strcmp-ssse3.S:228) ==1368== by 0x80842B8: PetscFListAdd (reg.c:223) 
==1368== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1368== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1368== by 0x82BE8D4: MatCreate_SeqAIJ (aij.c:3399) ==1368== by 0x829D1E0: MatSetType (matreg.c:65) ==1368== by 0x83393A9: MatMPIAIJSetPreallocation_MPIAIJ (mpiaij.c:2789) ==1368== by 0x833E8AA: MatMPIAIJSetPreallocation (mpiaij.c:3488) ==1368== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) ==1368== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==1368== by 0x8197E93: DMGetInterpolation (dm.c:144) ==1368== by 0x81BD897: DMMGSetDM (damg.c:250) ==1368== Address 0x456e758 is 24 bytes inside a block of size 28 alloc'd ==1368== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1368== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1368== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1368== by 0x8084689: PetscFListAdd (reg.c:237) ==1368== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1368== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1368== by 0x82BE85F: MatCreate_SeqAIJ (aij.c:3396) ==1368== by 0x829D1E0: MatSetType (matreg.c:65) ==1368== by 0x83393A9: MatMPIAIJSetPreallocation_MPIAIJ (mpiaij.c:2789) ==1368== by 0x833E8AA: MatMPIAIJSetPreallocation (mpiaij.c:3488) ==1368== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) ==1368== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==1368== ==1369== Invalid read of size 8 ==1369== at 0x43BE421: __strcmp_ssse3 (strcmp-ssse3.S:228) ==1369== by 0x80842B8: PetscFListAdd (reg.c:223) ==1369== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1369== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1369== by 0x82BE8D4: MatCreate_SeqAIJ (aij.c:3399) ==1369== by 0x829D1E0: MatSetType (matreg.c:65) ==1369== by 0x83393A9: MatMPIAIJSetPreallocation_MPIAIJ (mpiaij.c:2789) ==1369== by 0x833E8AA: MatMPIAIJSetPreallocation (mpiaij.c:3488) ==1369== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) ==1369== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==1369== by 0x8197E93: DMGetInterpolation (dm.c:144) ==1369== by 0x81BD897: DMMGSetDM (damg.c:250) ==1369== Address 0x44de7d8 is 24 bytes inside a block of size 28 alloc'd ==1369== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1369== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1369== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1369== by 0x8084689: PetscFListAdd (reg.c:237) ==1369== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1369== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1369== by 0x82BE85F: MatCreate_SeqAIJ (aij.c:3396) ==1369== by 0x829D1E0: MatSetType (matreg.c:65) ==1369== by 0x83393A9: MatMPIAIJSetPreallocation_MPIAIJ (mpiaij.c:2789) ==1369== by 0x833E8AA: MatMPIAIJSetPreallocation (mpiaij.c:3488) ==1369== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) ==1369== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==1369== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x80BDEDB: PetscStrallocpy (str.c:80) ==1368== by 0x80A6193: PetscObjectSetOptionsPrefix (prefix.c:37) ==1368== by 0x80A6564: PetscObjectAppendOptionsPrefix (prefix.c:70) ==1368== by 0x87169FB: PCAppendOptionsPrefix (precon.c:1242) ==1368== by 0x861C3B2: KSPAppendOptionsPrefix (itcl.c:121) ==1368== by 0x85A5056: PCMGSetLevels (mg.c:195) ==1368== by 0x81BE9AD: DMMGSetUpLevel (damg.c:379) ==1368== by 0x81C5787: DMMGSetSNES (damgsnes.c:648) ==1368== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==1368== by 0x804C4FF: main (ex19.c:140) ==1368== 
==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x80BDEDB: PetscStrallocpy (str.c:80) ==1369== by 0x80A6193: PetscObjectSetOptionsPrefix (prefix.c:37) ==1369== by 0x80A6564: PetscObjectAppendOptionsPrefix (prefix.c:70) ==1369== by 0x87169FB: PCAppendOptionsPrefix (precon.c:1242) ==1369== by 0x861C3B2: KSPAppendOptionsPrefix (itcl.c:121) ==1369== by 0x85A5056: PCMGSetLevels (mg.c:195) ==1369== by 0x81BE9AD: DMMGSetUpLevel (damg.c:379) ==1369== by 0x81C5787: DMMGSetSNES (damgsnes.c:648) ==1369== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==1369== by 0x804C4FF: main (ex19.c:140) ==1369== ==1368== Invalid read of size 8 ==1368== at 0x43BEC7D: __strcmp_ssse3 (strcmp-ssse3.S:1021) ==1368== by 0x80842B8: PetscFListAdd (reg.c:223) ==1368== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1368== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1368== by 0x8668C0F: KSPDestroy_GMRES (gmres.c:299) ==1368== by 0x8675E35: KSPDestroy_FGMRES (fgmres.c:341) ==1368== by 0x8635CA0: KSPSetType (itcreate.c:569) ==1368== by 0x81C5A98: DMMGSetSNES (damgsnes.c:668) ==1368== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==1368== by 0x804C4FF: main (ex19.c:140) ==1368== Address 0x462eb88 is 24 bytes inside a block of size 31 alloc'd ==1368== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1368== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1369== Invalid read of size 8 ==1369== at 0x43BEC7D: __strcmp_ssse3 (strcmp-ssse3.S:1021) ==1369== by 0x80842B8: PetscFListAdd (reg.c:223) ==1369== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1369== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1369== by 0x8668C0F: KSPDestroy_GMRES (gmres.c:299) ==1369== by 0x8675E35: KSPDestroy_FGMRES (fgmres.c:341) ==1369== by 0x8635CA0: KSPSetType (itcreate.c:569) ==1369== by 0x81C5A98: DMMGSetSNES (damgsnes.c:668) ==1369== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==1369== by 0x804C4FF: main (ex19.c:140) ==1369== Address 0x459ca88 is 24 bytes inside a block of size 31 alloc'd ==1369== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1369== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1369== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1369== by 0x8084689: PetscFListAdd (reg.c:237) ==1369== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1369== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1369== by 0x8678393: KSPCreate_FGMRES (fgmres.c:744) ==1369== by 0x8635D92: KSPSetType (itcreate.c:576) ==1369== by 0x81BE7AA: DMMGSetUpLevel (damg.c:372) ==1369== by 0x81C5787: DMMGSetSNES (damgsnes.c:648) ==1369== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==1369== by 0x804C4FF: main (ex19.c:140) ==1369== ==1368== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1368== by 0x8084689: PetscFListAdd (reg.c:237) ==1368== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1368== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1368== by 0x8678393: KSPCreate_FGMRES (fgmres.c:744) ==1368== by 0x8635D92: KSPSetType (itcreate.c:576) ==1368== by 0x81BE7AA: DMMGSetUpLevel (damg.c:372) ==1368== by 0x81C5787: DMMGSetSNES (damgsnes.c:648) ==1368== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==1368== by 0x804C4FF: main (ex19.c:140) ==1368== ==1369== Invalid read of size 8 ==1369== at 0x43BE320: __strcmp_ssse3 (strcmp-ssse3.S:141) ==1369== by 0x8085399: PetscFListFind (reg.c:375) ==1369== by 0x858986E: PCSetType (pcset.c:66) ==1369== by 0x81C5BCA: DMMGSetSNES 
(damgsnes.c:672) ==1369== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==1369== by 0x804C4FF: main (ex19.c:140) ==1369== Address 0x450e678 is 8 bytes inside a block of size 10 alloc'd ==1369== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1369== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1369== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1369== by 0x8084689: PetscFListAdd (reg.c:237) ==1369== by 0x8718EF5: PCRegister (precon.c:1537) ==1369== by 0x858B07B: PCRegisterAll (pcregis.c:95) ==1369== by 0x8642C37: PCInitializePackage (dlregisksp.c:60) ==1369== by 0x870E72D: PCCreate (precon.c:299) ==1369== by 0x862ADD8: KSPGetPC (itfunc.c:1251) ==1369== by 0x861BF6A: KSPSetOptionsPrefix (itcl.c:87) ==1369== by 0x81AD4A9: SNESSetOptionsPrefix (snes.c:2529) ==1369== by 0x81C52C6: DMMGSetSNES (damgsnes.c:612) ==1369== ==1369== Invalid read of size 8 ==1369== at 0x43BE324: __strcmp_ssse3 (strcmp-ssse3.S:142) ==1369== by 0x8085399: PetscFListFind (reg.c:375) ==1369== by 0x858986E: PCSetType (pcset.c:66) ==1369== by 0x81C5BCA: DMMGSetSNES (damgsnes.c:672) ==1369== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==1369== by 0x804C4FF: main (ex19.c:140) ==1369== Address 0x459e088 is 8 bytes inside a block of size 10 alloc'd ==1369== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1369== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1369== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1369== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) ==1369== by 0x8085186: PetscFListFind (reg.c:356) ==1369== by 0x858986E: PCSetType (pcset.c:66) ==1369== by 0x81C5BCA: DMMGSetSNES (damgsnes.c:672) ==1369== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==1369== by 0x804C4FF: main (ex19.c:140) ==1369== ==1369== Invalid read of size 8 ==1369== at 0x43BE328: __strcmp_ssse3 (strcmp-ssse3.S:143) ==1369== by 0x8085399: PetscFListFind (reg.c:375) ==1369== by 0x858986E: PCSetType (pcset.c:66) ==1369== by 0x81C5BCA: DMMGSetSNES (damgsnes.c:672) ==1369== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==1369== by 0x804C4FF: main (ex19.c:140) ==1369== Address 0x450e680 is 6 bytes after a block of size 10 alloc'd ==1369== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1369== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1369== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1369== by 0x8084689: PetscFListAdd (reg.c:237) ==1369== by 0x8718EF5: PCRegister (precon.c:1537) ==1369== by 0x858B07B: PCRegisterAll (pcregis.c:95) ==1369== by 0x8642C37: PCInitializePackage (dlregisksp.c:60) ==1369== by 0x870E72D: PCCreate (precon.c:299) ==1369== by 0x862ADD8: KSPGetPC (itfunc.c:1251) ==1369== by 0x861BF6A: KSPSetOptionsPrefix (itcl.c:87) ==1369== by 0x81AD4A9: SNESSetOptionsPrefix (snes.c:2529) ==1369== by 0x81C52C6: DMMGSetSNES (damgsnes.c:612) ==1369== ==1369== Invalid read of size 8 ==1369== at 0x43BE32D: __strcmp_ssse3 (strcmp-ssse3.S:144) ==1369== by 0x8085399: PetscFListFind (reg.c:375) ==1369== by 0x858986E: PCSetType (pcset.c:66) ==1369== by 0x81C5BCA: DMMGSetSNES (damgsnes.c:672) ==1369== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==1369== by 0x804C4FF: main (ex19.c:140) ==1369== Address 0x459e090 is 6 bytes after a block of size 10 alloc'd ==1369== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1369== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1369== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1369== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) ==1369== by 0x8085186: PetscFListFind (reg.c:356) ==1369== by 0x858986E: PCSetType (pcset.c:66) ==1369== by 0x81C5BCA: 
DMMGSetSNES (damgsnes.c:672) ==1369== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==1369== by 0x804C4FF: main (ex19.c:140) ==1369== ==1368== Invalid read of size 8 ==1368== at 0x43BE320: __strcmp_ssse3 (strcmp-ssse3.S:141) ==1368== by 0x8085399: PetscFListFind (reg.c:375) ==1368== by 0x858986E: PCSetType (pcset.c:66) ==1368== by 0x81C5BCA: DMMGSetSNES (damgsnes.c:672) ==1368== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==1368== by 0x804C4FF: main (ex19.c:140) ==1368== Address 0x45a0778 is 8 bytes inside a block of size 10 alloc'd ==1368== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1368== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1368== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1368== by 0x8084689: PetscFListAdd (reg.c:237) ==1368== by 0x8718EF5: PCRegister (precon.c:1537) ==1368== by 0x858B07B: PCRegisterAll (pcregis.c:95) ==1368== by 0x8642C37: PCInitializePackage (dlregisksp.c:60) ==1368== by 0x870E72D: PCCreate (precon.c:299) ==1368== by 0x862ADD8: KSPGetPC (itfunc.c:1251) ==1368== by 0x861BF6A: KSPSetOptionsPrefix (itcl.c:87) ==1368== by 0x81AD4A9: SNESSetOptionsPrefix (snes.c:2529) ==1368== by 0x81C52C6: DMMGSetSNES (damgsnes.c:612) ==1368== ==1368== Invalid read of size 8 ==1368== at 0x43BE324: __strcmp_ssse3 (strcmp-ssse3.S:142) ==1368== by 0x8085399: PetscFListFind (reg.c:375) ==1368== by 0x858986E: PCSetType (pcset.c:66) ==1368== by 0x81C5BCA: DMMGSetSNES (damgsnes.c:672) ==1368== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==1368== by 0x804C4FF: main (ex19.c:140) ==1368== Address 0x4630188 is 8 bytes inside a block of size 10 alloc'd ==1368== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1368== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1368== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1368== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) ==1368== by 0x8085186: PetscFListFind (reg.c:356) ==1368== by 0x858986E: PCSetType (pcset.c:66) ==1368== by 0x81C5BCA: DMMGSetSNES (damgsnes.c:672) ==1368== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==1368== by 0x804C4FF: main (ex19.c:140) ==1368== ==1368== Invalid read of size 8 ==1368== at 0x43BE328: __strcmp_ssse3 (strcmp-ssse3.S:143) ==1368== by 0x8085399: PetscFListFind (reg.c:375) ==1368== by 0x858986E: PCSetType (pcset.c:66) ==1368== by 0x81C5BCA: DMMGSetSNES (damgsnes.c:672) ==1368== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==1368== by 0x804C4FF: main (ex19.c:140) ==1368== Address 0x45a0780 is 6 bytes after a block of size 10 alloc'd ==1368== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1368== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1368== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1368== by 0x8084689: PetscFListAdd (reg.c:237) ==1368== by 0x8718EF5: PCRegister (precon.c:1537) ==1368== by 0x858B07B: PCRegisterAll (pcregis.c:95) ==1368== by 0x8642C37: PCInitializePackage (dlregisksp.c:60) ==1368== by 0x870E72D: PCCreate (precon.c:299) ==1368== by 0x862ADD8: KSPGetPC (itfunc.c:1251) ==1368== by 0x861BF6A: KSPSetOptionsPrefix (itcl.c:87) ==1368== by 0x81AD4A9: SNESSetOptionsPrefix (snes.c:2529) ==1368== by 0x81C52C6: DMMGSetSNES (damgsnes.c:612) ==1368== ==1368== Invalid read of size 8 ==1368== at 0x43BE32D: __strcmp_ssse3 (strcmp-ssse3.S:144) ==1368== by 0x8085399: PetscFListFind (reg.c:375) ==1368== by 0x858986E: PCSetType (pcset.c:66) ==1368== by 0x81C5BCA: DMMGSetSNES (damgsnes.c:672) ==1368== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==1368== by 0x804C4FF: main (ex19.c:140) ==1368== Address 0x4630190 is 6 bytes after a 
block of size 10 alloc'd ==1368== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1368== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1368== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1368== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) ==1368== by 0x8085186: PetscFListFind (reg.c:356) ==1368== by 0x858986E: PCSetType (pcset.c:66) ==1368== by 0x81C5BCA: DMMGSetSNES (damgsnes.c:672) ==1368== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==1368== by 0x804C4FF: main (ex19.c:140) ==1368== ==1369== Invalid read of size 8 ==1369== at 0x43B674F: __memcpy_ssse3 (memcpy-ssse3.S:703) ==1369== by 0x87923CB: MPIR_Allgatherv (allgatherv.c:340) ==1369== by 0x879274D: PMPI_Allgatherv (allgatherv.c:997) ==1369== by 0x83631F0: MatFDColoringCreate_MPIAIJ (fdmpiaij.c:89) ==1369== by 0x854A377: MatFDColoringCreate (fdmatrix.c:385) ==1369== by 0x81C5E65: DMMGSetSNES (damgsnes.c:712) ==1369== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==1369== by 0x804C4FF: main (ex19.c:140) ==1369== Address 0x47cd128 is 0 bytes after a block of size 720 alloc'd ==1369== at 0x4023BF3: malloc (vg_replace_malloc.c:195) ==1369== by 0x8791B68: MPIR_Allgatherv (allgatherv.c:143) ==1369== by 0x879274D: PMPI_Allgatherv (allgatherv.c:997) ==1369== by 0x83631F0: MatFDColoringCreate_MPIAIJ (fdmpiaij.c:89) ==1369== by 0x854A377: MatFDColoringCreate (fdmatrix.c:385) ==1369== by 0x81C5E65: DMMGSetSNES (damgsnes.c:712) ==1369== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==1369== by 0x804C4FF: main (ex19.c:140) ==1369== ==1368== Invalid read of size 8 ==1368== at 0x43B677F: __memcpy_ssse3 (memcpy-ssse3.S:715) ==1368== by 0x8791BD2: MPIR_Allgatherv (allgatherv.c:160) ==1368== by 0x879274D: PMPI_Allgatherv (allgatherv.c:997) ==1368== by 0x83631F0: MatFDColoringCreate_MPIAIJ (fdmpiaij.c:89) ==1368== by 0x854A377: MatFDColoringCreate (fdmatrix.c:385) ==1368== by 0x81C5E65: DMMGSetSNES (damgsnes.c:712) ==1368== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==1368== by 0x804C4FF: main (ex19.c:140) ==1368== Address 0x4920208 is 0 bytes after a block of size 1,416 alloc'd ==1368== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1368== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1368== by 0x8132724: ISCreateGeneral (general.c:342) ==1368== by 0x813BE92: ISColoringGetIS (iscoloring.c:161) ==1368== by 0x836235A: MatFDColoringCreate_MPIAIJ (fdmpiaij.c:30) ==1368== by 0x854A377: MatFDColoringCreate (fdmatrix.c:385) ==1368== by 0x81C5E65: DMMGSetSNES (damgsnes.c:712) ==1368== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==1368== by 0x804C4FF: main (ex19.c:140) ==1368== lid velocity = 0.000287274, prandtl # = 1, grashof # = 1 ==1368== Invalid read of size 8 ==1368== at 0x43197A0: __strlen_sse2 (strlen.S:99) ==1368== by 0x80BDE83: PetscStrallocpy (str.c:79) ==1368== by 0x80B2778: PetscOptionsBegin_Private (aoptions.c:38) ==1368== by 0x858A3C6: PCSetFromOptions (pcset.c:170) ==1368== by 0x861D3B6: KSPSetFromOptions (itcl.c:323) ==1368== by 0x85A8052: PCSetUp_MG (mg.c:490) ==1368== by 0x8714039: PCSetUp (precon.c:795) ==1368== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1368== by 0x8622C90: KSPSolve (itfunc.c:353) ==1368== by 0x81B0295: SNES_KSPSolve (snes.c:2944) ==1368== by 0x86A2C12: SNESSolve_LS (ls.c:191) ==1368== by 0x81AB5EC: SNESSolve (snes.c:2255) ==1368== Address 0x4858f38 is 8 bytes inside a block of size 11 alloc'd ==1368== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1368== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1368== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1368== by 
0x80A6193: PetscObjectSetOptionsPrefix (prefix.c:37) ==1368== by 0x80A6564: PetscObjectAppendOptionsPrefix (prefix.c:70) ==1368== by 0x87169FB: PCAppendOptionsPrefix (precon.c:1242) ==1368== by 0x861C3B2: KSPAppendOptionsPrefix (itcl.c:121) ==1368== by 0x85A4D63: PCMGSetLevels (mg.c:180) ==1368== by 0x81BE9AD: DMMGSetUpLevel (damg.c:379) ==1368== by 0x81C5787: DMMGSetSNES (damgsnes.c:648) ==1368== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==1368== by 0x804C4FF: main (ex19.c:140) ==1368== ==1368== Invalid read of size 8 ==1368== at 0x43BE32D: __strcmp_ssse3 (strcmp-ssse3.S:144) ==1368== by 0x8085406: PetscFListFind (reg.c:376) ==1368== by 0x80968C7: PetscObjectQueryFunction_Petsc (inherit.c:238) ==1368== by 0x80978B5: PetscObjectQueryFunction (inherit.c:376) ==1368== by 0x870C60A: PCGetDefaultType_Private (precon.c:25) ==1368== by 0x858A499: PCSetFromOptions (pcset.c:172) ==1368== by 0x861D3B6: KSPSetFromOptions (itcl.c:323) ==1368== by 0x85A8052: PCSetUp_MG (mg.c:490) ==1368== by 0x8714039: PCSetUp (precon.c:795) ==1368== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1368== by 0x8622C90: KSPSolve (itfunc.c:353) ==1368== by 0x81B0295: SNES_KSPSolve (snes.c:2944) ==1368== Address 0x4cfa520 is 16 bytes inside a block of size 22 alloc'd ==1368== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1368== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1368== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1368== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) ==1368== by 0x8085186: PetscFListFind (reg.c:356) ==1368== by 0x80968C7: PetscObjectQueryFunction_Petsc (inherit.c:238) ==1368== by 0x80978B5: PetscObjectQueryFunction (inherit.c:376) ==1368== by 0x870C60A: PCGetDefaultType_Private (precon.c:25) ==1368== by 0x858A499: PCSetFromOptions (pcset.c:172) ==1368== by 0x861D3B6: KSPSetFromOptions (itcl.c:323) ==1368== by 0x85A8052: PCSetUp_MG (mg.c:490) ==1368== by 0x8714039: PCSetUp (precon.c:795) ==1368== ==1369== Invalid read of size 8 ==1369== at 0x43197A0: __strlen_sse2 (strlen.S:99) ==1369== by 0x80BDE83: PetscStrallocpy (str.c:79) ==1369== by 0x80B2778: PetscOptionsBegin_Private (aoptions.c:38) ==1369== by 0x858A3C6: PCSetFromOptions (pcset.c:170) ==1369== by 0x861D3B6: KSPSetFromOptions (itcl.c:323) ==1369== by 0x85A8052: PCSetUp_MG (mg.c:490) ==1369== by 0x8714039: PCSetUp (precon.c:795) ==1369== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1369== by 0x8622C90: KSPSolve (itfunc.c:353) ==1369== by 0x81B0295: SNES_KSPSolve (snes.c:2944) ==1369== by 0x86A2C12: SNESSolve_LS (ls.c:191) ==1369== by 0x81AB5EC: SNESSolve (snes.c:2255) ==1369== Address 0x4797eb8 is 8 bytes inside a block of size 11 alloc'd ==1369== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1369== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1369== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1369== by 0x80A6193: PetscObjectSetOptionsPrefix (prefix.c:37) ==1369== by 0x80A6564: PetscObjectAppendOptionsPrefix (prefix.c:70) ==1369== by 0x87169FB: PCAppendOptionsPrefix (precon.c:1242) ==1369== by 0x861C3B2: KSPAppendOptionsPrefix (itcl.c:121) ==1369== by 0x85A4D63: PCMGSetLevels (mg.c:180) ==1369== by 0x81BE9AD: DMMGSetUpLevel (damg.c:379) ==1369== by 0x81C5787: DMMGSetSNES (damgsnes.c:648) ==1369== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==1369== by 0x804C4FF: main (ex19.c:140) ==1369== ==1369== Invalid read of size 8 ==1369== at 0x43BE32D: __strcmp_ssse3 (strcmp-ssse3.S:144) ==1369== by 0x8085406: PetscFListFind (reg.c:376) ==1369== by 0x80968C7: PetscObjectQueryFunction_Petsc (inherit.c:238) ==1369== by 
0x80978B5: PetscObjectQueryFunction (inherit.c:376) ==1369== by 0x870C60A: PCGetDefaultType_Private (precon.c:25) ==1369== by 0x858A499: PCSetFromOptions (pcset.c:172) ==1369== by 0x861D3B6: KSPSetFromOptions (itcl.c:323) ==1369== by 0x85A8052: PCSetUp_MG (mg.c:490) ==1369== by 0x8714039: PCSetUp (precon.c:795) ==1369== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1369== by 0x8622C90: KSPSolve (itfunc.c:353) ==1369== by 0x81B0295: SNES_KSPSolve (snes.c:2944) ==1369== Address 0x4c39110 is 16 bytes inside a block of size 22 alloc'd ==1369== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1369== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1369== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1369== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) ==1369== by 0x8085186: PetscFListFind (reg.c:356) ==1369== by 0x80968C7: PetscObjectQueryFunction_Petsc (inherit.c:238) ==1369== by 0x80978B5: PetscObjectQueryFunction (inherit.c:376) ==1369== by 0x870C60A: PCGetDefaultType_Private (precon.c:25) ==1369== by 0x858A499: PCSetFromOptions (pcset.c:172) ==1369== by 0x861D3B6: KSPSetFromOptions (itcl.c:323) ==1369== by 0x85A8052: PCSetUp_MG (mg.c:490) ==1369== by 0x8714039: PCSetUp (precon.c:795) ==1369== ==1368== Invalid read of size 8 ==1368== at 0x43197A0: __strlen_sse2 (strlen.S:99) ==1368== by 0x80A671F: PetscObjectAppendOptionsPrefix (prefix.c:76) ==1368== by 0x87169FB: PCAppendOptionsPrefix (precon.c:1242) ==1368== by 0x861C3B2: KSPAppendOptionsPrefix (itcl.c:121) ==1368== by 0x859F3D9: PCSetUp_BJacobi_Singleblock (bjacobi.c:905) ==1368== by 0x85992B8: PCSetUp_BJacobi (bjacobi.c:180) ==1368== by 0x8714039: PCSetUp (precon.c:795) ==1368== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1368== by 0x85A8E40: PCSetUp_MG (mg.c:556) ==1368== by 0x8714039: PCSetUp (precon.c:795) ==1368== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1368== by 0x8622C90: KSPSolve (itfunc.c:353) ==1368== Address 0x4d567c8 is 8 bytes inside a block of size 13 alloc'd ==1368== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1368== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1368== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1368== by 0x80A6193: PetscObjectSetOptionsPrefix (prefix.c:37) ==1368== by 0x87166FA: PCSetOptionsPrefix (precon.c:1209) ==1368== by 0x861BFD3: KSPSetOptionsPrefix (itcl.c:88) ==1368== by 0x859F375: PCSetUp_BJacobi_Singleblock (bjacobi.c:904) ==1368== by 0x85992B8: PCSetUp_BJacobi (bjacobi.c:180) ==1368== by 0x8714039: PCSetUp (precon.c:795) ==1368== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1368== by 0x85A8E40: PCSetUp_MG (mg.c:556) ==1368== by 0x8714039: PCSetUp (precon.c:795) ==1368== ==1369== Invalid read of size 8 ==1369== at 0x43197A0: __strlen_sse2 (strlen.S:99) ==1369== by 0x80A671F: PetscObjectAppendOptionsPrefix (prefix.c:76) ==1369== by 0x87169FB: PCAppendOptionsPrefix (precon.c:1242) ==1369== by 0x861C3B2: KSPAppendOptionsPrefix (itcl.c:121) ==1369== by 0x859F3D9: PCSetUp_BJacobi_Singleblock (bjacobi.c:905) ==1369== by 0x85992B8: PCSetUp_BJacobi (bjacobi.c:180) ==1369== by 0x8714039: PCSetUp (precon.c:795) ==1369== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1369== by 0x85A8E40: PCSetUp_MG (mg.c:556) ==1369== by 0x8714039: PCSetUp (precon.c:795) ==1369== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1369== by 0x8622C90: KSPSolve (itfunc.c:353) ==1369== Address 0x4c42198 is 8 bytes inside a block of size 13 alloc'd ==1369== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1369== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1369== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1369== by 0x80A6193: PetscObjectSetOptionsPrefix 
(prefix.c:37) ==1369== by 0x87166FA: PCSetOptionsPrefix (precon.c:1209) ==1369== by 0x861BFD3: KSPSetOptionsPrefix (itcl.c:88) ==1369== by 0x859F375: PCSetUp_BJacobi_Singleblock (bjacobi.c:904) ==1369== by 0x85992B8: PCSetUp_BJacobi (bjacobi.c:180) ==1369== by 0x8714039: PCSetUp (precon.c:795) ==1369== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1369== by 0x85A8E40: PCSetUp_MG (mg.c:556) ==1369== by 0x8714039: PCSetUp (precon.c:795) ==1369== ==1368== Invalid read of size 8 ==1368== at 0x43197AF: __strlen_sse2 (strlen.S:106) ==1368== by 0x80BDE83: PetscStrallocpy (str.c:79) ==1368== by 0x80B2778: PetscOptionsBegin_Private (aoptions.c:38) ==1368== by 0x858A3C6: PCSetFromOptions (pcset.c:170) ==1368== by 0x861D3B6: KSPSetFromOptions (itcl.c:323) ==1368== by 0x859F978: PCSetUp_BJacobi_Singleblock (bjacobi.c:944) ==1368== by 0x85992B8: PCSetUp_BJacobi (bjacobi.c:180) ==1368== by 0x8714039: PCSetUp (precon.c:795) ==1368== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1368== by 0x85A8E40: PCSetUp_MG (mg.c:556) ==1368== by 0x8714039: PCSetUp (precon.c:795) ==1368== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1368== Address 0x4d568d0 is 16 bytes inside a block of size 17 alloc'd ==1368== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1368== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1368== by 0x80A67BF: PetscObjectAppendOptionsPrefix (prefix.c:77) ==1368== by 0x87169FB: PCAppendOptionsPrefix (precon.c:1242) ==1368== by 0x861C3B2: KSPAppendOptionsPrefix (itcl.c:121) ==1368== by 0x859F3D9: PCSetUp_BJacobi_Singleblock (bjacobi.c:905) ==1368== by 0x85992B8: PCSetUp_BJacobi (bjacobi.c:180) ==1368== by 0x8714039: PCSetUp (precon.c:795) ==1368== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1368== by 0x85A8E40: PCSetUp_MG (mg.c:556) ==1368== by 0x8714039: PCSetUp (precon.c:795) ==1368== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1368== ==1369== Invalid read of size 8 ==1369== at 0x43197AF: __strlen_sse2 (strlen.S:106) ==1369== by 0x80BDE83: PetscStrallocpy (str.c:79) ==1369== by 0x80B2778: PetscOptionsBegin_Private (aoptions.c:38) ==1369== by 0x858A3C6: PCSetFromOptions (pcset.c:170) ==1369== by 0x861D3B6: KSPSetFromOptions (itcl.c:323) ==1369== by 0x859F978: PCSetUp_BJacobi_Singleblock (bjacobi.c:944) ==1369== by 0x85992B8: PCSetUp_BJacobi (bjacobi.c:180) ==1369== by 0x8714039: PCSetUp (precon.c:795) ==1369== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1369== by 0x85A8E40: PCSetUp_MG (mg.c:556) ==1369== by 0x8714039: PCSetUp (precon.c:795) ==1369== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1369== Address 0x4c422a0 is 16 bytes inside a block of size 17 alloc'd ==1369== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1369== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1369== by 0x80A67BF: PetscObjectAppendOptionsPrefix (prefix.c:77) ==1369== by 0x87169FB: PCAppendOptionsPrefix (precon.c:1242) ==1369== by 0x861C3B2: KSPAppendOptionsPrefix (itcl.c:121) ==1369== by 0x859F3D9: PCSetUp_BJacobi_Singleblock (bjacobi.c:905) ==1369== by 0x85992B8: PCSetUp_BJacobi (bjacobi.c:180) ==1369== by 0x8714039: PCSetUp (precon.c:795) ==1369== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1369== by 0x85A8E40: PCSetUp_MG (mg.c:556) ==1369== by 0x8714039: PCSetUp (precon.c:795) ==1369== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1369== ==1368== Invalid read of size 8 ==1368== at 0x43BE324: __strcmp_ssse3 (strcmp-ssse3.S:142) ==1368== by 0x8093FC2: PetscTypeCompare (destroy.c:254) ==1368== by 0x85A956A: PCSetUp_MG (mg.c:585) ==1368== by 0x8714039: PCSetUp (precon.c:795) ==1368== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1368== by 0x8622C90: KSPSolve 
(itfunc.c:353) ==1368== by 0x81B0295: SNES_KSPSolve (snes.c:2944) ==1368== by 0x86A2C12: SNESSolve_LS (ls.c:191) ==1368== by 0x81AB5EC: SNESSolve (snes.c:2255) ==1368== by 0x81C4919: DMMGSolveSNES (damgsnes.c:510) ==1368== by 0x81BDF6C: DMMGSolve (damg.c:313) ==1368== by 0x804C9A7: main (ex19.c:155) ==1368== Address 0x48598d8 is 8 bytes inside a block of size 10 alloc'd ==1368== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1368== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1368== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1368== by 0x80902DB: PetscObjectChangeTypeName (pname.c:114) ==1368== by 0x8589AEE: PCSetType (pcset.c:79) ==1368== by 0x85A4F3C: PCMGSetLevels (mg.c:187) ==1368== by 0x81BE9AD: DMMGSetUpLevel (damg.c:379) ==1368== by 0x81C5787: DMMGSetSNES (damgsnes.c:648) ==1368== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==1368== by 0x804C4FF: main (ex19.c:140) ==1368== ==1368== Invalid read of size 8 ==1368== at 0x43BE32D: __strcmp_ssse3 (strcmp-ssse3.S:144) ==1368== by 0x8093FC2: PetscTypeCompare (destroy.c:254) ==1368== by 0x85A956A: PCSetUp_MG (mg.c:585) ==1368== by 0x8714039: PCSetUp (precon.c:795) ==1368== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1368== by 0x8622C90: KSPSolve (itfunc.c:353) ==1368== by 0x81B0295: SNES_KSPSolve (snes.c:2944) ==1368== by 0x86A2C12: SNESSolve_LS (ls.c:191) ==1368== by 0x81AB5EC: SNESSolve (snes.c:2255) ==1368== by 0x81C4919: DMMGSolveSNES (damgsnes.c:510) ==1368== by 0x81BDF6C: DMMGSolve (damg.c:313) ==1368== by 0x804C9A7: main (ex19.c:155) ==1368== Address 0x48598e0 is 6 bytes after a block of size 10 alloc'd ==1368== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1368== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1368== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1368== by 0x80902DB: PetscObjectChangeTypeName (pname.c:114) ==1368== by 0x8589AEE: PCSetType (pcset.c:79) ==1368== by 0x85A4F3C: PCMGSetLevels (mg.c:187) ==1368== by 0x81BE9AD: DMMGSetUpLevel (damg.c:379) ==1368== by 0x81C5787: DMMGSetSNES (damgsnes.c:648) ==1368== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==1368== by 0x804C4FF: main (ex19.c:140) ==1368== ==1369== Invalid read of size 8 ==1369== at 0x43BE324: __strcmp_ssse3 (strcmp-ssse3.S:142) ==1369== by 0x8093FC2: PetscTypeCompare (destroy.c:254) ==1369== by 0x85A956A: PCSetUp_MG (mg.c:585) ==1369== by 0x8714039: PCSetUp (precon.c:795) ==1369== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1369== by 0x8622C90: KSPSolve (itfunc.c:353) ==1369== by 0x81B0295: SNES_KSPSolve (snes.c:2944) ==1369== by 0x86A2C12: SNESSolve_LS (ls.c:191) ==1369== by 0x81AB5EC: SNESSolve (snes.c:2255) ==1369== by 0x81C4919: DMMGSolveSNES (damgsnes.c:510) ==1369== by 0x81BDF6C: DMMGSolve (damg.c:313) ==1369== by 0x804C9A7: main (ex19.c:155) ==1369== Address 0x4798858 is 8 bytes inside a block of size 10 alloc'd ==1369== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1369== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1369== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1369== by 0x80902DB: PetscObjectChangeTypeName (pname.c:114) ==1369== by 0x8589AEE: PCSetType (pcset.c:79) ==1369== by 0x85A4F3C: PCMGSetLevels (mg.c:187) ==1369== by 0x81BE9AD: DMMGSetUpLevel (damg.c:379) ==1369== by 0x81C5787: DMMGSetSNES (damgsnes.c:648) ==1369== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==1369== by 0x804C4FF: main (ex19.c:140) ==1369== ==1369== Invalid read of size 8 ==1369== at 0x43BE32D: __strcmp_ssse3 (strcmp-ssse3.S:144) ==1369== by 0x8093FC2: PetscTypeCompare (destroy.c:254) ==1369== by 0x85A956A: PCSetUp_MG (mg.c:585) ==1369== 
by 0x8714039: PCSetUp (precon.c:795) ==1369== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1369== by 0x8622C90: KSPSolve (itfunc.c:353) ==1369== by 0x81B0295: SNES_KSPSolve (snes.c:2944) ==1369== by 0x86A2C12: SNESSolve_LS (ls.c:191) ==1369== by 0x81AB5EC: SNESSolve (snes.c:2255) ==1369== by 0x81C4919: DMMGSolveSNES (damgsnes.c:510) ==1369== by 0x81BDF6C: DMMGSolve (damg.c:313) ==1369== by 0x804C9A7: main (ex19.c:155) ==1369== Address 0x4798860 is 6 bytes after a block of size 10 alloc'd ==1369== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1369== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1369== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1369== by 0x80902DB: PetscObjectChangeTypeName (pname.c:114) ==1369== by 0x8589AEE: PCSetType (pcset.c:79) ==1369== by 0x85A4F3C: PCMGSetLevels (mg.c:187) ==1369== by 0x81BE9AD: DMMGSetUpLevel (damg.c:379) ==1369== by 0x81C5787: DMMGSetSNES (damgsnes.c:648) ==1369== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==1369== by 0x804C4FF: main (ex19.c:140) ==1369== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x43BE348: __strcmp_ssse3 (strcmp-ssse3.S:150) ==1369== by 0x80842B8: PetscFListAdd (reg.c:223) ==1369== by 0x8086A53: PetscFListDuplicate (reg.c:596) ==1369== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) ==1369== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) ==1369== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) ==1369== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) ==1369== by 0x85E7785: PCSetUp_ILU (ilu.c:204) ==1369== by 0x8714039: PCSetUp (precon.c:795) ==1369== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1369== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) ==1369== by 0x8714602: PCSetUpOnBlocks (precon.c:828) ==1369== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x43BF5B7: __strcmp_ssse3 (strcmp-ssse3.S:1887) ==1369== by 0x80842B8: PetscFListAdd (reg.c:223) ==1369== by 0x8086A53: PetscFListDuplicate (reg.c:596) ==1369== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) ==1369== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) ==1369== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) ==1369== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) ==1369== by 0x85E7785: PCSetUp_ILU (ilu.c:204) ==1369== by 0x8714039: PCSetUp (precon.c:795) ==1369== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1369== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) ==1369== by 0x8714602: PCSetUpOnBlocks (precon.c:828) ==1369== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x43BF669: __strcmp_ssse3 (strcmp-ssse3.S:2007) ==1369== by 0x80842B8: PetscFListAdd (reg.c:223) ==1369== by 0x8086A53: PetscFListDuplicate (reg.c:596) ==1369== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) ==1369== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) ==1369== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) ==1369== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) ==1369== by 0x85E7785: PCSetUp_ILU (ilu.c:204) ==1369== by 0x8714039: PCSetUp (precon.c:795) ==1369== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1369== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) ==1369== by 0x8714602: PCSetUpOnBlocks (precon.c:828) ==1369== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x43BF66E: __strcmp_ssse3 (strcmp-ssse3.S:2010) ==1369== by 0x80842B8: PetscFListAdd (reg.c:223) ==1369== by 0x8086A53: 
PetscFListDuplicate (reg.c:596) ==1369== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) ==1369== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) ==1369== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) ==1369== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) ==1369== by 0x85E7785: PCSetUp_ILU (ilu.c:204) ==1369== by 0x8714039: PCSetUp (precon.c:795) ==1369== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1369== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) ==1369== by 0x8714602: PCSetUpOnBlocks (precon.c:828) ==1369== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x43BF673: __strcmp_ssse3 (strcmp-ssse3.S:2013) ==1369== by 0x80842B8: PetscFListAdd (reg.c:223) ==1369== by 0x8086A53: PetscFListDuplicate (reg.c:596) ==1369== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) ==1369== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) ==1369== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) ==1369== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) ==1369== by 0x85E7785: PCSetUp_ILU (ilu.c:204) ==1369== by 0x8714039: PCSetUp (precon.c:795) ==1369== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1369== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) ==1369== by 0x8714602: PCSetUpOnBlocks (precon.c:828) ==1369== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x43BF678: __strcmp_ssse3 (strcmp-ssse3.S:2016) ==1369== by 0x80842B8: PetscFListAdd (reg.c:223) ==1369== by 0x8086A53: PetscFListDuplicate (reg.c:596) ==1369== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) ==1369== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) ==1369== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) ==1369== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) ==1369== by 0x85E7785: PCSetUp_ILU (ilu.c:204) ==1369== by 0x8714039: PCSetUp (precon.c:795) ==1369== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1369== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) ==1369== by 0x8714602: PCSetUpOnBlocks (precon.c:828) ==1369== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x43BF67D: __strcmp_ssse3 (strcmp-ssse3.S:2019) ==1369== by 0x80842B8: PetscFListAdd (reg.c:223) ==1369== by 0x8086A53: PetscFListDuplicate (reg.c:596) ==1369== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) ==1369== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) ==1369== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) ==1369== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) ==1369== by 0x85E7785: PCSetUp_ILU (ilu.c:204) ==1369== by 0x8714039: PCSetUp (precon.c:795) ==1369== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1369== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) ==1369== by 0x8714602: PCSetUpOnBlocks (precon.c:828) ==1369== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x43BF12F: __strcmp_ssse3 (strcmp-ssse3.S:1456) ==1369== by 0x80842B8: PetscFListAdd (reg.c:223) ==1369== by 0x8086A53: PetscFListDuplicate (reg.c:596) ==1369== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) ==1369== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) ==1369== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) ==1369== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) ==1369== by 0x85E7785: PCSetUp_ILU (ilu.c:204) ==1369== by 0x8714039: PCSetUp (precon.c:795) ==1369== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1369== by 
0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) ==1369== by 0x8714602: PCSetUpOnBlocks (precon.c:828) ==1369== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x43BF5C0: __strcmp_ssse3 (strcmp-ssse3.S:1890) ==1369== by 0x80842B8: PetscFListAdd (reg.c:223) ==1369== by 0x8086A53: PetscFListDuplicate (reg.c:596) ==1369== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) ==1369== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) ==1369== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) ==1369== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) ==1369== by 0x85E7785: PCSetUp_ILU (ilu.c:204) ==1369== by 0x8714039: PCSetUp (precon.c:795) ==1369== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1369== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) ==1369== by 0x8714602: PCSetUpOnBlocks (precon.c:828) ==1369== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x43BF5C5: __strcmp_ssse3 (strcmp-ssse3.S:1893) ==1369== by 0x80842B8: PetscFListAdd (reg.c:223) ==1369== by 0x8086A53: PetscFListDuplicate (reg.c:596) ==1369== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) ==1369== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) ==1369== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) ==1369== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) ==1369== by 0x85E7785: PCSetUp_ILU (ilu.c:204) ==1369== by 0x8714039: PCSetUp (precon.c:795) ==1369== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1369== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) ==1369== by 0x8714602: PCSetUpOnBlocks (precon.c:828) ==1369== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x43BF5CA: __strcmp_ssse3 (strcmp-ssse3.S:1896) ==1369== by 0x80842B8: PetscFListAdd (reg.c:223) ==1369== by 0x8086A53: PetscFListDuplicate (reg.c:596) ==1369== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) ==1369== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) ==1369== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) ==1369== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) ==1369== by 0x85E7785: PCSetUp_ILU (ilu.c:204) ==1369== by 0x8714039: PCSetUp (precon.c:795) ==1369== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1369== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) ==1369== by 0x8714602: PCSetUpOnBlocks (precon.c:828) ==1369== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x43BF5CF: __strcmp_ssse3 (strcmp-ssse3.S:1899) ==1369== by 0x80842B8: PetscFListAdd (reg.c:223) ==1369== by 0x8086A53: PetscFListDuplicate (reg.c:596) ==1369== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) ==1369== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) ==1369== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) ==1369== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) ==1369== by 0x85E7785: PCSetUp_ILU (ilu.c:204) ==1369== by 0x8714039: PCSetUp (precon.c:795) ==1369== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1369== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) ==1369== by 0x8714602: PCSetUpOnBlocks (precon.c:828) ==1369== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x43BF5D4: __strcmp_ssse3 (strcmp-ssse3.S:1902) ==1369== by 0x80842B8: PetscFListAdd (reg.c:223) ==1369== by 0x8086A53: PetscFListDuplicate (reg.c:596) ==1369== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) ==1369== by 0x82D1840: 
MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) ==1369== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) ==1369== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) ==1369== by 0x85E7785: PCSetUp_ILU (ilu.c:204) ==1369== by 0x8714039: PCSetUp (precon.c:795) ==1369== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1369== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) ==1369== by 0x8714602: PCSetUpOnBlocks (precon.c:828) ==1369== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x43BF5D9: __strcmp_ssse3 (strcmp-ssse3.S:1905) ==1369== by 0x80842B8: PetscFListAdd (reg.c:223) ==1369== by 0x8086A53: PetscFListDuplicate (reg.c:596) ==1369== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) ==1369== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) ==1369== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) ==1369== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) ==1369== by 0x85E7785: PCSetUp_ILU (ilu.c:204) ==1369== by 0x8714039: PCSetUp (precon.c:795) ==1369== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1369== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) ==1369== by 0x8714602: PCSetUpOnBlocks (precon.c:828) ==1369==
==1369== Invalid read of size 8 ==1369== at 0x43197BE: __strlen_sse2 (strlen.S:112) ==1369== by 0x80BDE83: PetscStrallocpy (str.c:79) ==1369== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) ==1369== by 0x808432D: PetscFListAdd (reg.c:225) ==1369== by 0x8086A53: PetscFListDuplicate (reg.c:596) ==1369== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) ==1369== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) ==1369== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) ==1369== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) ==1369== by 0x85E7785: PCSetUp_ILU (ilu.c:204) ==1369== by 0x8714039: PCSetUp (precon.c:795) ==1369== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1369== Address 0x45afcd0 is 32 bytes inside a block of size 35 alloc'd ==1369== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1369== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1369== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1369== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) ==1369== by 0x80846F3: PetscFListAdd (reg.c:238) ==1369== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1369== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1369== by 0x82BE3A6: MatCreate_SeqAIJ (aij.c:3363) ==1369== by 0x829D1E0: MatSetType (matreg.c:65) ==1369== by 0x83393A9: MatMPIAIJSetPreallocation_MPIAIJ (mpiaij.c:2789) ==1369== by 0x833E8AA: MatMPIAIJSetPreallocation (mpiaij.c:3488) ==1369== by 0x818A96F: DAGetMatrix2d_MPIAIJ (fdda.c:779) ==1369==
==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x43BF5DE: __strcmp_ssse3 (strcmp-ssse3.S:1908) ==1369== by 0x80842B8: PetscFListAdd (reg.c:223) ==1369== by 0x8086A53: PetscFListDuplicate (reg.c:596) ==1369== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) ==1369== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) ==1369== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) ==1369== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) ==1369== by 0x85E7785: PCSetUp_ILU (ilu.c:204) ==1369== by 0x8714039: PCSetUp (precon.c:795) ==1369== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1369== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) ==1369== by 0x8714602: PCSetUpOnBlocks (precon.c:828) ==1369==
==1369== Invalid read of size 8 ==1369== at 0x43BF165: __strcmp_ssse3 (strcmp-ssse3.S:1474) ==1369== by 0x80842B8: PetscFListAdd (reg.c:223) ==1369== by 0x8086A53: PetscFListDuplicate (reg.c:596) ==1369== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) ==1369== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) ==1369== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) ==1369== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) ==1369== by 0x85E7785: PCSetUp_ILU (ilu.c:204) ==1369== by 0x8714039: PCSetUp (precon.c:795) ==1369== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1369== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) ==1369== by 0x8714602: PCSetUpOnBlocks (precon.c:828) ==1369== Address 0x541be90 is 32 bytes inside a block of size 37 alloc'd ==1369== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1369== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1369== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1369== by 0x8084689: PetscFListAdd (reg.c:237) ==1369== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1369== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1369== by 0x82BEA33: MatCreate_SeqAIJ (aij.c:3408) ==1369== by 0x829D1E0: MatSetType (matreg.c:65) ==1369== by 0x82C3982: MatGetFactor_seqaij_petsc (aijfact.c:118) ==1369== by 0x8276893: MatGetFactor (matrix.c:3649) ==1369== by 0x85E7687: PCSetUp_ILU (ilu.c:202) ==1369== by 0x8714039: PCSetUp (precon.c:795) ==1369==
==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x43BF18F: __strcmp_ssse3 (strcmp-ssse3.S:1485) ==1369== by 0x80842B8: PetscFListAdd (reg.c:223) ==1369== by 0x8086A53: PetscFListDuplicate (reg.c:596) ==1369== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) ==1369== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) ==1369== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) ==1369== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) ==1369== by 0x85E7785: PCSetUp_ILU (ilu.c:204) ==1369== by 0x8714039: PCSetUp (precon.c:795) ==1369== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1369== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) ==1369== by 0x8714602: PCSetUpOnBlocks (precon.c:828) ==1369==
==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x43BE348: __strcmp_ssse3 (strcmp-ssse3.S:150) ==1368== by 0x80842B8: PetscFListAdd (reg.c:223) ==1368== by 0x8086A53: PetscFListDuplicate (reg.c:596) ==1368== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) ==1368== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) ==1368== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) ==1368== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) ==1368== by 0x85E7785: PCSetUp_ILU (ilu.c:204) ==1368== by 0x8714039: PCSetUp (precon.c:795) ==1368== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1368== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) ==1368== by 0x8714602: PCSetUpOnBlocks (precon.c:828) ==1368==
==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x43BF5B7: __strcmp_ssse3 (strcmp-ssse3.S:1887) ==1368== by 0x80842B8: PetscFListAdd (reg.c:223) ==1368== by 0x8086A53: PetscFListDuplicate (reg.c:596) ==1368== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) ==1368== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) ==1368== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) ==1368== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) ==1368== by 0x85E7785: PCSetUp_ILU (ilu.c:204) ==1368== by 0x8714039: PCSetUp (precon.c:795) ==1368== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1368== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) ==1368== by 0x8714602: PCSetUpOnBlocks (precon.c:828) ==1368==
==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x43BF66E: __strcmp_ssse3 (strcmp-ssse3.S:2010) ==1368== by 0x80842B8: PetscFListAdd (reg.c:223) ==1368== by 0x8086A53: PetscFListDuplicate (reg.c:596) ==1368== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) ==1368== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) ==1368== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) ==1368== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) ==1368== by 0x85E7785: PCSetUp_ILU (ilu.c:204) ==1368== by 0x8714039: PCSetUp (precon.c:795) ==1368== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1368== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) ==1368== by 0x8714602: PCSetUpOnBlocks (precon.c:828) ==1368==
==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x43BF678: __strcmp_ssse3 (strcmp-ssse3.S:2016) ==1368== by 0x80842B8: PetscFListAdd (reg.c:223) ==1368== by 0x8086A53: PetscFListDuplicate (reg.c:596) ==1368== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) ==1368== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) ==1368== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) ==1368== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) ==1368== by 0x85E7785: PCSetUp_ILU (ilu.c:204) ==1368== by 0x8714039: PCSetUp (precon.c:795) ==1368== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1368== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) ==1368== by 0x8714602: PCSetUpOnBlocks (precon.c:828) ==1368==
==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x43BF67D: __strcmp_ssse3 (strcmp-ssse3.S:2019) ==1368== by 0x80842B8: PetscFListAdd (reg.c:223) ==1368== by 0x8086A53: PetscFListDuplicate (reg.c:596) ==1368== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) ==1368== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) ==1368== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) ==1368== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) ==1368== by 0x85E7785: PCSetUp_ILU (ilu.c:204) ==1368== by 0x8714039: PCSetUp (precon.c:795) ==1368== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1368== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) ==1368== by 0x8714602: PCSetUpOnBlocks (precon.c:828) ==1368==
==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x43BF12F: __strcmp_ssse3 (strcmp-ssse3.S:1456) ==1368== by 0x80842B8: PetscFListAdd (reg.c:223) ==1368== by 0x8086A53: PetscFListDuplicate (reg.c:596) ==1368== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) ==1368== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) ==1368== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) ==1368== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) ==1368== by 0x85E7785: PCSetUp_ILU (ilu.c:204) ==1368== by 0x8714039: PCSetUp (precon.c:795) ==1368== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1368== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) ==1368== by 0x8714602: PCSetUpOnBlocks (precon.c:828) ==1368==
==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x43BF5C0: __strcmp_ssse3 (strcmp-ssse3.S:1890) ==1368== by 0x80842B8: PetscFListAdd (reg.c:223) ==1368== by 0x8086A53: PetscFListDuplicate (reg.c:596) ==1368== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) ==1368== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) ==1368== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) ==1368== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) ==1368== by 0x85E7785: PCSetUp_ILU (ilu.c:204) ==1368== by 0x8714039: PCSetUp (precon.c:795) ==1368== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1368== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) ==1368== by 0x8714602: PCSetUpOnBlocks (precon.c:828) ==1368==
==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x43BF5C5: __strcmp_ssse3 (strcmp-ssse3.S:1893) ==1368== by 0x80842B8: PetscFListAdd (reg.c:223) ==1368== by 0x8086A53: PetscFListDuplicate (reg.c:596) ==1368== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) ==1368== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) ==1368== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) ==1368== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) ==1368== by 0x85E7785: PCSetUp_ILU (ilu.c:204) ==1368== by 0x8714039: PCSetUp (precon.c:795) ==1368== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1368== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) ==1368== by 0x8714602: PCSetUpOnBlocks (precon.c:828) ==1368==
==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x43BF5CA: __strcmp_ssse3 (strcmp-ssse3.S:1896) ==1368== by 0x80842B8: PetscFListAdd (reg.c:223) ==1368== by 0x8086A53: PetscFListDuplicate (reg.c:596) ==1368== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) ==1368== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) ==1368== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) ==1368== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) ==1368== by 0x85E7785: PCSetUp_ILU (ilu.c:204) ==1368== by 0x8714039: PCSetUp (precon.c:795) ==1368== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1368== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock
(bjacobi.c:753) ==1368== by 0x8714602: PCSetUpOnBlocks (precon.c:828) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x43BF5CF: __strcmp_ssse3 (strcmp-ssse3.S:1899) ==1368== by 0x80842B8: PetscFListAdd (reg.c:223) ==1368== by 0x8086A53: PetscFListDuplicate (reg.c:596) ==1368== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) ==1368== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) ==1368== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) ==1368== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) ==1368== by 0x85E7785: PCSetUp_ILU (ilu.c:204) ==1368== by 0x8714039: PCSetUp (precon.c:795) ==1368== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1368== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) ==1368== by 0x8714602: PCSetUpOnBlocks (precon.c:828) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x43BF5D4: __strcmp_ssse3 (strcmp-ssse3.S:1902) ==1368== by 0x80842B8: PetscFListAdd (reg.c:223) ==1368== by 0x8086A53: PetscFListDuplicate (reg.c:596) ==1368== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) ==1368== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) ==1368== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) ==1368== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) ==1368== by 0x85E7785: PCSetUp_ILU (ilu.c:204) ==1368== by 0x8714039: PCSetUp (precon.c:795) ==1368== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1368== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) ==1368== by 0x8714602: PCSetUpOnBlocks (precon.c:828) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x43BF5D9: __strcmp_ssse3 (strcmp-ssse3.S:1905) ==1368== by 0x80842B8: PetscFListAdd (reg.c:223) ==1368== by 0x8086A53: PetscFListDuplicate (reg.c:596) ==1368== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) ==1368== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) ==1368== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) ==1368== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) ==1368== by 0x85E7785: PCSetUp_ILU (ilu.c:204) ==1368== by 0x8714039: PCSetUp (precon.c:795) ==1368== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1368== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) ==1368== by 0x8714602: PCSetUpOnBlocks (precon.c:828) ==1368== ==1368== Invalid read of size 8 ==1368== at 0x43197BE: __strlen_sse2 (strlen.S:112) ==1368== by 0x80BDE83: PetscStrallocpy (str.c:79) ==1368== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) ==1368== by 0x808432D: PetscFListAdd (reg.c:225) ==1368== by 0x8086A53: PetscFListDuplicate (reg.c:596) ==1368== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) ==1368== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) ==1368== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) ==1368== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) ==1368== by 0x85E7785: PCSetUp_ILU (ilu.c:204) ==1368== by 0x8714039: PCSetUp (precon.c:795) ==1368== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1368== Address 0x4642530 is 32 bytes inside a block of size 35 alloc'd ==1368== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1368== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1368== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1368== by 0x808368E: PetscFListGetPathAndFunction (reg.c:24) ==1368== by 0x80846F3: PetscFListAdd (reg.c:238) ==1368== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1368== by 0x8097550: 
PetscObjectComposeFunction (inherit.c:340) ==1368== by 0x82BE3A6: MatCreate_SeqAIJ (aij.c:3363) ==1368== by 0x829D1E0: MatSetType (matreg.c:65) ==1368== by 0x83393A9: MatMPIAIJSetPreallocation_MPIAIJ (mpiaij.c:2789) ==1368== by 0x833E8AA: MatMPIAIJSetPreallocation (mpiaij.c:3488) ==1368== by 0x818A96F: DAGetMatrix2d_MPIAIJ (fdda.c:779) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x43BF5DE: __strcmp_ssse3 (strcmp-ssse3.S:1908) ==1368== by 0x80842B8: PetscFListAdd (reg.c:223) ==1368== by 0x8086A53: PetscFListDuplicate (reg.c:596) ==1368== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) ==1368== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) ==1368== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) ==1368== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) ==1368== by 0x85E7785: PCSetUp_ILU (ilu.c:204) ==1368== by 0x8714039: PCSetUp (precon.c:795) ==1368== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1368== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) ==1368== by 0x8714602: PCSetUpOnBlocks (precon.c:828) ==1368== ==1368== Invalid read of size 8 ==1368== at 0x43BF165: __strcmp_ssse3 (strcmp-ssse3.S:1474) ==1368== by 0x80842B8: PetscFListAdd (reg.c:223) ==1368== by 0x8086A53: PetscFListDuplicate (reg.c:596) ==1368== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) ==1368== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) ==1368== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) ==1368== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) ==1368== by 0x85E7785: PCSetUp_ILU (ilu.c:204) ==1368== by 0x8714039: PCSetUp (precon.c:795) ==1368== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1368== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) ==1368== by 0x8714602: PCSetUpOnBlocks (precon.c:828) ==1368== Address 0x54138b0 is 32 bytes inside a block of size 37 alloc'd ==1368== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1368== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1368== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1368== by 0x8084689: PetscFListAdd (reg.c:237) ==1368== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1368== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1368== by 0x82BEA33: MatCreate_SeqAIJ (aij.c:3408) ==1368== by 0x829D1E0: MatSetType (matreg.c:65) ==1368== by 0x82C3982: MatGetFactor_seqaij_petsc (aijfact.c:118) ==1368== by 0x8276893: MatGetFactor (matrix.c:3649) ==1368== by 0x85E7687: PCSetUp_ILU (ilu.c:202) ==1368== by 0x8714039: PCSetUp (precon.c:795) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x43BF18F: __strcmp_ssse3 (strcmp-ssse3.S:1485) ==1368== by 0x80842B8: PetscFListAdd (reg.c:223) ==1368== by 0x8086A53: PetscFListDuplicate (reg.c:596) ==1368== by 0x82BFA68: MatDuplicateNoCreate_SeqAIJ (aij.c:3511) ==1368== by 0x82D1840: MatILUFactorSymbolic_SeqAIJ_ilu0 (aijfact.c:1630) ==1368== by 0x82D2300: MatILUFactorSymbolic_SeqAIJ (aijfact.c:1731) ==1368== by 0x82866FB: MatILUFactorSymbolic (matrix.c:5464) ==1368== by 0x85E7785: PCSetUp_ILU (ilu.c:204) ==1368== by 0x8714039: PCSetUp (precon.c:795) ==1368== by 0x8621C54: KSPSetUp (itfunc.c:237) ==1368== by 0x859D9CE: PCSetUpOnBlocks_BJacobi_Singleblock (bjacobi.c:753) ==1368== by 0x8714602: PCSetUpOnBlocks (precon.c:828) ==1368== Number of Newton iterations = 2 ==1369== Invalid read of size 8 ==1369== at 0x43BEFE9: __strcmp_ssse3 (strcmp-ssse3.S:1339) ==1369== by 0x80842B8: PetscFListAdd (reg.c:223) ==1369== by 
0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1369== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1369== by 0x86A4028: SNESDestroy_LS (ls.c:322) ==1369== by 0x81A578E: SNESDestroy (snes.c:1406) ==1369== by 0x8093606: PetscObjectDestroy (destroy.c:172) ==1369== by 0x81BCE39: DMMGDestroy (damg.c:179) ==1369== by 0x804CBD4: main (ex19.c:174) ==1369== Address 0x4b28898 is 24 bytes inside a block of size 28 alloc'd ==1369== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1369== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1369== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1369== by 0x8084689: PetscFListAdd (reg.c:237) ==1369== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1369== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1369== by 0x86AA093: SNESCreate_LS (ls.c:1199) ==1369== by 0x81AC1EF: SNESSetType (snes.c:2353) ==1369== by 0x819BDE2: SNESSetFromOptions (snes.c:306) ==1369== by 0x81C69A8: DMMGSetFromOptions (damgsnes.c:818) ==1369== by 0x804C56B: main (ex19.c:141) ==1369== ==1369== Invalid read of size 8 ==1369== at 0x43BF015: __strcmp_ssse3 (strcmp-ssse3.S:1354) ==1369== by 0x80842B8: PetscFListAdd (reg.c:223) ==1369== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1369== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1369== by 0x86A4028: SNESDestroy_LS (ls.c:322) ==1369== by 0x81A578E: SNESDestroy (snes.c:1406) ==1369== by 0x8093606: PetscObjectDestroy (destroy.c:172) ==1369== by 0x81BCE39: DMMGDestroy (damg.c:179) ==1369== by 0x804CBD4: main (ex19.c:174) ==1369== Address 0x4b28898 is 24 bytes inside a block of size 28 alloc'd ==1369== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1369== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1369== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1369== by 0x8084689: PetscFListAdd (reg.c:237) ==1369== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1369== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1369== by 0x86AA093: SNESCreate_LS (ls.c:1199) ==1369== by 0x81AC1EF: SNESSetType (snes.c:2353) ==1369== by 0x819BDE2: SNESSetFromOptions (snes.c:306) ==1369== by 0x81C69A8: DMMGSetFromOptions (damgsnes.c:818) ==1369== by 0x804C56B: main (ex19.c:141) ==1369== ==1369== Invalid read of size 8 ==1369== at 0x43BF04A: __strcmp_ssse3 (strcmp-ssse3.S:1369) ==1369== by 0x80842B8: PetscFListAdd (reg.c:223) ==1369== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1369== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1369== by 0x86A4028: SNESDestroy_LS (ls.c:322) ==1369== by 0x81A578E: SNESDestroy (snes.c:1406) ==1369== by 0x8093606: PetscObjectDestroy (destroy.c:172) ==1369== by 0x81BCE39: DMMGDestroy (damg.c:179) ==1369== by 0x804CBD4: main (ex19.c:174) ==1369== Address 0x4b288a0 is 4 bytes after a block of size 28 alloc'd ==1369== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1369== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1369== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1369== by 0x8084689: PetscFListAdd (reg.c:237) ==1369== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1369== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1369== by 0x86AA093: SNESCreate_LS (ls.c:1199) ==1369== by 0x81AC1EF: SNESSetType (snes.c:2353) ==1369== by 0x819BDE2: SNESSetFromOptions (snes.c:306) ==1369== by 0x81C69A8: DMMGSetFromOptions (damgsnes.c:818) ==1369== by 0x804C56B: main (ex19.c:141) ==1369== ==1368== Invalid read of size 8 ==1368== at 0x43BEFE9: __strcmp_ssse3 (strcmp-ssse3.S:1339) 
==1368== by 0x80842B8: PetscFListAdd (reg.c:223) ==1368== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1368== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1368== by 0x86A4028: SNESDestroy_LS (ls.c:322) ==1368== by 0x81A578E: SNESDestroy (snes.c:1406) ==1368== by 0x8093606: PetscObjectDestroy (destroy.c:172) ==1368== by 0x81BCE39: DMMGDestroy (damg.c:179) ==1368== by 0x804CBD4: main (ex19.c:174) ==1368== Address 0x4be1e28 is 24 bytes inside a block of size 28 alloc'd ==1368== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1368== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1368== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1368== by 0x8084689: PetscFListAdd (reg.c:237) ==1368== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1368== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1368== by 0x86AA093: SNESCreate_LS (ls.c:1199) ==1368== by 0x81AC1EF: SNESSetType (snes.c:2353) ==1368== by 0x819BDE2: SNESSetFromOptions (snes.c:306) ==1368== by 0x81C69A8: DMMGSetFromOptions (damgsnes.c:818) ==1368== by 0x804C56B: main (ex19.c:141) ==1368== ==1368== Invalid read of size 8 ==1368== at 0x43BF015: __strcmp_ssse3 (strcmp-ssse3.S:1354) ==1368== by 0x80842B8: PetscFListAdd (reg.c:223) ==1368== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1368== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1368== by 0x86A4028: SNESDestroy_LS (ls.c:322) ==1368== by 0x81A578E: SNESDestroy (snes.c:1406) ==1368== by 0x8093606: PetscObjectDestroy (destroy.c:172) ==1368== by 0x81BCE39: DMMGDestroy (damg.c:179) ==1368== by 0x804CBD4: main (ex19.c:174) ==1368== Address 0x4be1e28 is 24 bytes inside a block of size 28 alloc'd ==1368== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1368== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1368== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1368== by 0x8084689: PetscFListAdd (reg.c:237) ==1368== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1368== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1368== by 0x86AA093: SNESCreate_LS (ls.c:1199) ==1368== by 0x81AC1EF: SNESSetType (snes.c:2353) ==1368== by 0x819BDE2: SNESSetFromOptions (snes.c:306) ==1368== by 0x81C69A8: DMMGSetFromOptions (damgsnes.c:818) ==1368== by 0x804C56B: main (ex19.c:141) ==1368== ==1368== Invalid read of size 8 ==1368== at 0x43BF04A: __strcmp_ssse3 (strcmp-ssse3.S:1369) ==1368== by 0x80842B8: PetscFListAdd (reg.c:223) ==1368== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1368== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1368== by 0x86A4028: SNESDestroy_LS (ls.c:322) ==1368== by 0x81A578E: SNESDestroy (snes.c:1406) ==1368== by 0x8093606: PetscObjectDestroy (destroy.c:172) ==1368== by 0x81BCE39: DMMGDestroy (damg.c:179) ==1368== by 0x804CBD4: main (ex19.c:174) ==1368== Address 0x4be1e30 is 4 bytes after a block of size 28 alloc'd ==1368== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1368== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1368== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1368== by 0x8084689: PetscFListAdd (reg.c:237) ==1368== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1368== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1368== by 0x86AA093: SNESCreate_LS (ls.c:1199) ==1368== by 0x81AC1EF: SNESSetType (snes.c:2353) ==1368== by 0x819BDE2: SNESSetFromOptions (snes.c:306) ==1368== by 0x81C69A8: DMMGSetFromOptions (damgsnes.c:818) ==1368== by 0x804C56B: main (ex19.c:141) ==1368== ==1369== Invalid read of size 8 
==1369== at 0x43BEDAD: __strcmp_ssse3 (strcmp-ssse3.S:1128) ==1369== by 0x80842B8: PetscFListAdd (reg.c:223) ==1369== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1369== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1369== by 0x830D1DF: MatDestroy_SeqAIJ_Inode (inode2.c:62) ==1369== by 0x82AB1E1: MatDestroy_SeqAIJ (aij.c:810) ==1369== by 0x82595D1: MatDestroy (matrix.c:876) ==1369== by 0x8327843: MatDestroy_MPIAIJ (mpiaij.c:900) ==1369== by 0x82595D1: MatDestroy (matrix.c:876) ==1369== by 0x8470F9F: MatDestroy_MPIMAIJ (maij.c:152) ==1369== by 0x82595D1: MatDestroy (matrix.c:876) ==1369== by 0x85A560D: PCDestroy_MG_Private (mg.c:232) ==1369== Address 0x44df298 is 24 bytes inside a block of size 26 alloc'd ==1369== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1369== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1369== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1369== by 0x8084689: PetscFListAdd (reg.c:237) ==1369== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1369== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1369== by 0x830D905: MatCreate_SeqAIJ_Inode (inode2.c:101) ==1369== by 0x82BEB04: MatCreate_SeqAIJ (aij.c:3414) ==1369== by 0x829D1E0: MatSetType (matreg.c:65) ==1369== by 0x83393A9: MatMPIAIJSetPreallocation_MPIAIJ (mpiaij.c:2789) ==1369== by 0x833E8AA: MatMPIAIJSetPreallocation (mpiaij.c:3488) ==1369== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) ==1369== ==1369== Invalid read of size 8 ==1369== at 0x43BEDA9: __strcmp_ssse3 (strcmp-ssse3.S:1127) ==1369== by 0x80842B8: PetscFListAdd (reg.c:223) ==1369== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1369== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1369== by 0x82AB357: MatDestroy_SeqAIJ (aij.c:815) ==1369== by 0x82595D1: MatDestroy (matrix.c:876) ==1369== by 0x8327843: MatDestroy_MPIAIJ (mpiaij.c:900) ==1369== by 0x82595D1: MatDestroy (matrix.c:876) ==1369== by 0x8470F9F: MatDestroy_MPIMAIJ (maij.c:152) ==1369== by 0x82595D1: MatDestroy (matrix.c:876) ==1369== by 0x85A560D: PCDestroy_MG_Private (mg.c:232) ==1369== by 0x85A5A49: PCDestroy_MG (mg.c:257) ==1369== Address 0x44dd918 is 24 bytes inside a block of size 28 alloc'd ==1369== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1369== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1369== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1369== by 0x8084689: PetscFListAdd (reg.c:237) ==1369== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1369== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1369== by 0x82BE442: MatCreate_SeqAIJ (aij.c:3369) ==1369== by 0x829D1E0: MatSetType (matreg.c:65) ==1369== by 0x83393A9: MatMPIAIJSetPreallocation_MPIAIJ (mpiaij.c:2789) ==1369== by 0x833E8AA: MatMPIAIJSetPreallocation (mpiaij.c:3488) ==1369== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) ==1369== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==1369== ==1369== Invalid read of size 8 ==1369== at 0x43BEDD5: __strcmp_ssse3 (strcmp-ssse3.S:1142) ==1369== by 0x80842B8: PetscFListAdd (reg.c:223) ==1369== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1369== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1369== by 0x82AB357: MatDestroy_SeqAIJ (aij.c:815) ==1369== by 0x82595D1: MatDestroy (matrix.c:876) ==1369== by 0x8327843: MatDestroy_MPIAIJ (mpiaij.c:900) ==1369== by 0x82595D1: MatDestroy (matrix.c:876) ==1369== by 0x8470F9F: MatDestroy_MPIMAIJ (maij.c:152) ==1369== by 0x82595D1: MatDestroy (matrix.c:876) ==1369== by 
0x85A560D: PCDestroy_MG_Private (mg.c:232) ==1369== by 0x85A5A49: PCDestroy_MG (mg.c:257) ==1369== Address 0x44dd918 is 24 bytes inside a block of size 28 alloc'd ==1369== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1369== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1369== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1369== by 0x8084689: PetscFListAdd (reg.c:237) ==1369== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1369== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1369== by 0x82BE442: MatCreate_SeqAIJ (aij.c:3369) ==1369== by 0x829D1E0: MatSetType (matreg.c:65) ==1369== by 0x83393A9: MatMPIAIJSetPreallocation_MPIAIJ (mpiaij.c:2789) ==1369== by 0x833E8AA: MatMPIAIJSetPreallocation (mpiaij.c:3488) ==1369== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) ==1369== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==1369== ==1369== Invalid read of size 8 ==1369== at 0x43BEE0A: __strcmp_ssse3 (strcmp-ssse3.S:1157) ==1369== by 0x80842B8: PetscFListAdd (reg.c:223) ==1369== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1369== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1369== by 0x82AB357: MatDestroy_SeqAIJ (aij.c:815) ==1369== by 0x82595D1: MatDestroy (matrix.c:876) ==1369== by 0x8327843: MatDestroy_MPIAIJ (mpiaij.c:900) ==1369== by 0x82595D1: MatDestroy (matrix.c:876) ==1369== by 0x8470F9F: MatDestroy_MPIMAIJ (maij.c:152) ==1369== by 0x82595D1: MatDestroy (matrix.c:876) ==1369== by 0x85A560D: PCDestroy_MG_Private (mg.c:232) ==1369== by 0x85A5A49: PCDestroy_MG (mg.c:257) ==1369== Address 0x44dd920 is 4 bytes after a block of size 28 alloc'd ==1369== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1369== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1369== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1369== by 0x8084689: PetscFListAdd (reg.c:237) ==1369== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1369== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1369== by 0x82BE442: MatCreate_SeqAIJ (aij.c:3369) ==1369== by 0x829D1E0: MatSetType (matreg.c:65) ==1369== by 0x83393A9: MatMPIAIJSetPreallocation_MPIAIJ (mpiaij.c:2789) ==1369== by 0x833E8AA: MatMPIAIJSetPreallocation (mpiaij.c:3488) ==1369== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) ==1369== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==1369== ==1369== Invalid read of size 8 ==1369== at 0x43BF109: __strcmp_ssse3 (strcmp-ssse3.S:1445) ==1369== by 0x80842B8: PetscFListAdd (reg.c:223) ==1369== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1369== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1369== by 0x82AB59B: MatDestroy_SeqAIJ (aij.c:820) ==1369== by 0x82595D1: MatDestroy (matrix.c:876) ==1369== by 0x8327843: MatDestroy_MPIAIJ (mpiaij.c:900) ==1369== by 0x82595D1: MatDestroy (matrix.c:876) ==1369== by 0x8470F9F: MatDestroy_MPIMAIJ (maij.c:152) ==1369== by 0x82595D1: MatDestroy (matrix.c:876) ==1369== by 0x85A560D: PCDestroy_MG_Private (mg.c:232) ==1369== by 0x85A5A49: PCDestroy_MG (mg.c:257) ==1369== Address 0x44de148 is 24 bytes inside a block of size 31 alloc'd ==1369== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1369== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1369== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1369== by 0x8084689: PetscFListAdd (reg.c:237) ==1369== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1369== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1369== by 0x82BE68B: MatCreate_SeqAIJ (aij.c:3384) ==1369== by 0x829D1E0: 
MatSetType (matreg.c:65) ==1369== by 0x83393A9: MatMPIAIJSetPreallocation_MPIAIJ (mpiaij.c:2789) ==1369== by 0x833E8AA: MatMPIAIJSetPreallocation (mpiaij.c:3488) ==1369== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) ==1369== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==1369== ==1369== Invalid read of size 8 ==1369== at 0x43BF135: __strcmp_ssse3 (strcmp-ssse3.S:1460) ==1369== by 0x80842B8: PetscFListAdd (reg.c:223) ==1369== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1369== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1369== by 0x82AB59B: MatDestroy_SeqAIJ (aij.c:820) ==1369== by 0x82595D1: MatDestroy (matrix.c:876) ==1369== by 0x8327843: MatDestroy_MPIAIJ (mpiaij.c:900) ==1369== by 0x82595D1: MatDestroy (matrix.c:876) ==1369== by 0x8470F9F: MatDestroy_MPIMAIJ (maij.c:152) ==1369== by 0x82595D1: MatDestroy (matrix.c:876) ==1369== by 0x85A560D: PCDestroy_MG_Private (mg.c:232) ==1369== by 0x85A5A49: PCDestroy_MG (mg.c:257) ==1369== Address 0x44de148 is 24 bytes inside a block of size 31 alloc'd ==1369== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1369== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1369== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1369== by 0x8084689: PetscFListAdd (reg.c:237) ==1369== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1369== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1369== by 0x82BE68B: MatCreate_SeqAIJ (aij.c:3384) ==1369== by 0x829D1E0: MatSetType (matreg.c:65) ==1369== by 0x83393A9: MatMPIAIJSetPreallocation_MPIAIJ (mpiaij.c:2789) ==1369== by 0x833E8AA: MatMPIAIJSetPreallocation (mpiaij.c:3488) ==1369== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) ==1369== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==1369== ==1369== Invalid read of size 8 ==1369== at 0x43BF16A: __strcmp_ssse3 (strcmp-ssse3.S:1475) ==1369== by 0x80842B8: PetscFListAdd (reg.c:223) ==1369== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1369== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1369== by 0x82AB59B: MatDestroy_SeqAIJ (aij.c:820) ==1369== by 0x82595D1: MatDestroy (matrix.c:876) ==1369== by 0x8327843: MatDestroy_MPIAIJ (mpiaij.c:900) ==1369== by 0x82595D1: MatDestroy (matrix.c:876) ==1369== by 0x8470F9F: MatDestroy_MPIMAIJ (maij.c:152) ==1369== by 0x82595D1: MatDestroy (matrix.c:876) ==1369== by 0x85A560D: PCDestroy_MG_Private (mg.c:232) ==1369== by 0x85A5A49: PCDestroy_MG (mg.c:257) ==1369== Address 0x44de150 is 1 bytes after a block of size 31 alloc'd ==1369== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1369== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1369== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1369== by 0x8084689: PetscFListAdd (reg.c:237) ==1369== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1369== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1369== by 0x82BE68B: MatCreate_SeqAIJ (aij.c:3384) ==1369== by 0x829D1E0: MatSetType (matreg.c:65) ==1369== by 0x83393A9: MatMPIAIJSetPreallocation_MPIAIJ (mpiaij.c:2789) ==1369== by 0x833E8AA: MatMPIAIJSetPreallocation (mpiaij.c:3488) ==1369== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) ==1369== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==1369== ==1369== Invalid read of size 8 ==1369== at 0x43BF46D: __strcmp_ssse3 (strcmp-ssse3.S:1766) ==1369== by 0x80842B8: PetscFListAdd (reg.c:223) ==1369== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1369== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1369== by 
0x8327F64: MatDestroy_MPIAIJ (mpiaij.c:919) ==1369== by 0x82595D1: MatDestroy (matrix.c:876) ==1369== by 0x8470F9F: MatDestroy_MPIMAIJ (maij.c:152) ==1369== by 0x82595D1: MatDestroy (matrix.c:876) ==1369== by 0x85A560D: PCDestroy_MG_Private (mg.c:232) ==1369== by 0x85A5A49: PCDestroy_MG (mg.c:257) ==1369== by 0x870CC31: PCDestroy (precon.c:83) ==1369== by 0x8627601: KSPDestroy (itfunc.c:695) ==1369== Address 0x44db848 is 24 bytes inside a block of size 28 alloc'd ==1369== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1369== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1369== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1369== by 0x8084689: PetscFListAdd (reg.c:237) ==1369== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1369== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1369== by 0x834D296: MatCreate_MPIAIJ (mpiaij.c:5093) ==1369== by 0x829D1E0: MatSetType (matreg.c:65) ==1369== by 0x86EE543: MatCreate_AIJ (aijtype.c:35) ==1369== by 0x829D1E0: MatSetType (matreg.c:65) ==1369== by 0x8171C4C: DAGetInterpolation_2D_Q1 (dainterp.c:310) ==1369== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==1369== ==1369== Invalid read of size 8 ==1369== at 0x43BF229: __strcmp_ssse3 (strcmp-ssse3.S:1552) ==1369== by 0x80842B8: PetscFListAdd (reg.c:223) ==1369== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1369== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1369== by 0x83280C0: MatDestroy_MPIAIJ (mpiaij.c:922) ==1369== by 0x82595D1: MatDestroy (matrix.c:876) ==1369== by 0x8470F9F: MatDestroy_MPIMAIJ (maij.c:152) ==1369== by 0x82595D1: MatDestroy (matrix.c:876) ==1369== by 0x85A560D: PCDestroy_MG_Private (mg.c:232) ==1369== by 0x85A5A49: PCDestroy_MG (mg.c:257) ==1369== by 0x870CC31: PCDestroy (precon.c:83) ==1369== by 0x8627601: KSPDestroy (itfunc.c:695) ==1369== Address 0x44dc098 is 24 bytes inside a block of size 29 alloc'd ==1369== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1369== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1369== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1369== by 0x8084689: PetscFListAdd (reg.c:237) ==1369== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1369== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1369== by 0x834D4DF: MatCreate_MPIAIJ (mpiaij.c:5108) ==1369== by 0x829D1E0: MatSetType (matreg.c:65) ==1369== by 0x86EE543: MatCreate_AIJ (aijtype.c:35) ==1369== by 0x829D1E0: MatSetType (matreg.c:65) ==1369== by 0x8171C4C: DAGetInterpolation_2D_Q1 (dainterp.c:310) ==1369== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==1369== ==1368== Invalid read of size 8 ==1368== at 0x43BEDAD: __strcmp_ssse3 (strcmp-ssse3.S:1128) ==1368== by 0x80842B8: PetscFListAdd (reg.c:223) ==1368== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1368== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1368== by 0x830D1DF: MatDestroy_SeqAIJ_Inode (inode2.c:62) ==1368== by 0x82AB1E1: MatDestroy_SeqAIJ (aij.c:810) ==1368== by 0x82595D1: MatDestroy (matrix.c:876) ==1368== by 0x8327843: MatDestroy_MPIAIJ (mpiaij.c:900) ==1368== by 0x82595D1: MatDestroy (matrix.c:876) ==1368== by 0x8470F9F: MatDestroy_MPIMAIJ (maij.c:152) ==1368== by 0x82595D1: MatDestroy (matrix.c:876) ==1368== by 0x85A560D: PCDestroy_MG_Private (mg.c:232) ==1368== Address 0x456f218 is 24 bytes inside a block of size 26 alloc'd ==1368== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1368== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1368== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1368== by 0x8084689: 
PetscFListAdd (reg.c:237) ==1368== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1368== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1368== by 0x830D905: MatCreate_SeqAIJ_Inode (inode2.c:101) ==1368== by 0x82BEB04: MatCreate_SeqAIJ (aij.c:3414) ==1368== by 0x829D1E0: MatSetType (matreg.c:65) ==1368== by 0x83393A9: MatMPIAIJSetPreallocation_MPIAIJ (mpiaij.c:2789) ==1368== by 0x833E8AA: MatMPIAIJSetPreallocation (mpiaij.c:3488) ==1368== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) ==1368== ==1368== Invalid read of size 8 ==1368== at 0x43BEDA9: __strcmp_ssse3 (strcmp-ssse3.S:1127) ==1368== by 0x80842B8: PetscFListAdd (reg.c:223) ==1368== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1368== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1368== by 0x82AB357: MatDestroy_SeqAIJ (aij.c:815) ==1368== by 0x82595D1: MatDestroy (matrix.c:876) ==1368== by 0x8327843: MatDestroy_MPIAIJ (mpiaij.c:900) ==1368== by 0x82595D1: MatDestroy (matrix.c:876) ==1368== by 0x8470F9F: MatDestroy_MPIMAIJ (maij.c:152) ==1368== by 0x82595D1: MatDestroy (matrix.c:876) ==1368== by 0x85A560D: PCDestroy_MG_Private (mg.c:232) ==1368== by 0x85A5A49: PCDestroy_MG (mg.c:257) ==1368== Address 0x456d898 is 24 bytes inside a block of size 28 alloc'd ==1368== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1368== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1368== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1368== by 0x8084689: PetscFListAdd (reg.c:237) ==1368== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1368== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1368== by 0x82BE442: MatCreate_SeqAIJ (aij.c:3369) ==1368== by 0x829D1E0: MatSetType (matreg.c:65) ==1368== by 0x83393A9: MatMPIAIJSetPreallocation_MPIAIJ (mpiaij.c:2789) ==1368== by 0x833E8AA: MatMPIAIJSetPreallocation (mpiaij.c:3488) ==1368== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) ==1368== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==1368== ==1368== Invalid read of size 8 ==1368== at 0x43BEDD5: __strcmp_ssse3 (strcmp-ssse3.S:1142) ==1368== by 0x80842B8: PetscFListAdd (reg.c:223) ==1368== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1368== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1368== by 0x82AB357: MatDestroy_SeqAIJ (aij.c:815) ==1368== by 0x82595D1: MatDestroy (matrix.c:876) ==1368== by 0x8327843: MatDestroy_MPIAIJ (mpiaij.c:900) ==1368== by 0x82595D1: MatDestroy (matrix.c:876) ==1368== by 0x8470F9F: MatDestroy_MPIMAIJ (maij.c:152) ==1368== by 0x82595D1: MatDestroy (matrix.c:876) ==1368== by 0x85A560D: PCDestroy_MG_Private (mg.c:232) ==1368== by 0x85A5A49: PCDestroy_MG (mg.c:257) ==1368== Address 0x456d898 is 24 bytes inside a block of size 28 alloc'd ==1368== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1368== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1368== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1368== by 0x8084689: PetscFListAdd (reg.c:237) ==1368== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1368== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1368== by 0x82BE442: MatCreate_SeqAIJ (aij.c:3369) ==1368== by 0x829D1E0: MatSetType (matreg.c:65) ==1368== by 0x83393A9: MatMPIAIJSetPreallocation_MPIAIJ (mpiaij.c:2789) ==1368== by 0x833E8AA: MatMPIAIJSetPreallocation (mpiaij.c:3488) ==1368== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) ==1368== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==1368== ==1368== Invalid read of size 8 ==1368== at 
0x43BEE0A: __strcmp_ssse3 (strcmp-ssse3.S:1157) ==1368== by 0x80842B8: PetscFListAdd (reg.c:223) ==1368== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1368== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1368== by 0x82AB357: MatDestroy_SeqAIJ (aij.c:815) ==1368== by 0x82595D1: MatDestroy (matrix.c:876) ==1368== by 0x8327843: MatDestroy_MPIAIJ (mpiaij.c:900) ==1368== by 0x82595D1: MatDestroy (matrix.c:876) ==1368== by 0x8470F9F: MatDestroy_MPIMAIJ (maij.c:152) ==1368== by 0x82595D1: MatDestroy (matrix.c:876) ==1368== by 0x85A560D: PCDestroy_MG_Private (mg.c:232) ==1368== by 0x85A5A49: PCDestroy_MG (mg.c:257) ==1368== Address 0x456d8a0 is 4 bytes after a block of size 28 alloc'd ==1368== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1368== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1368== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1368== by 0x8084689: PetscFListAdd (reg.c:237) ==1368== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1368== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1368== by 0x82BE442: MatCreate_SeqAIJ (aij.c:3369) ==1368== by 0x829D1E0: MatSetType (matreg.c:65) ==1368== by 0x83393A9: MatMPIAIJSetPreallocation_MPIAIJ (mpiaij.c:2789) ==1368== by 0x833E8AA: MatMPIAIJSetPreallocation (mpiaij.c:3488) ==1368== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) ==1368== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==1368== ==1368== Invalid read of size 8 ==1368== at 0x43BF109: __strcmp_ssse3 (strcmp-ssse3.S:1445) ==1368== by 0x80842B8: PetscFListAdd (reg.c:223) ==1368== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1368== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1368== by 0x82AB59B: MatDestroy_SeqAIJ (aij.c:820) ==1368== by 0x82595D1: MatDestroy (matrix.c:876) ==1368== by 0x8327843: MatDestroy_MPIAIJ (mpiaij.c:900) ==1368== by 0x82595D1: MatDestroy (matrix.c:876) ==1368== by 0x8470F9F: MatDestroy_MPIMAIJ (maij.c:152) ==1368== by 0x82595D1: MatDestroy (matrix.c:876) ==1368== by 0x85A560D: PCDestroy_MG_Private (mg.c:232) ==1368== by 0x85A5A49: PCDestroy_MG (mg.c:257) ==1368== Address 0x456e0c8 is 24 bytes inside a block of size 31 alloc'd ==1368== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1368== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1368== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1368== by 0x8084689: PetscFListAdd (reg.c:237) ==1368== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1368== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1368== by 0x82BE68B: MatCreate_SeqAIJ (aij.c:3384) ==1368== by 0x829D1E0: MatSetType (matreg.c:65) ==1368== by 0x83393A9: MatMPIAIJSetPreallocation_MPIAIJ (mpiaij.c:2789) ==1368== by 0x833E8AA: MatMPIAIJSetPreallocation (mpiaij.c:3488) ==1368== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) ==1368== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==1368== ==1368== Invalid read of size 8 ==1368== at 0x43BF135: __strcmp_ssse3 (strcmp-ssse3.S:1460) ==1368== by 0x80842B8: PetscFListAdd (reg.c:223) ==1368== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1368== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1368== by 0x82AB59B: MatDestroy_SeqAIJ (aij.c:820) ==1368== by 0x82595D1: MatDestroy (matrix.c:876) ==1368== by 0x8327843: MatDestroy_MPIAIJ (mpiaij.c:900) ==1368== by 0x82595D1: MatDestroy (matrix.c:876) ==1368== by 0x8470F9F: MatDestroy_MPIMAIJ (maij.c:152) ==1368== by 0x82595D1: MatDestroy (matrix.c:876) ==1368== by 0x85A560D: PCDestroy_MG_Private 
(mg.c:232) ==1368== by 0x85A5A49: PCDestroy_MG (mg.c:257) ==1368== Address 0x456e0c8 is 24 bytes inside a block of size 31 alloc'd ==1368== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1368== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1368== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1368== by 0x8084689: PetscFListAdd (reg.c:237) ==1368== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1368== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1368== by 0x82BE68B: MatCreate_SeqAIJ (aij.c:3384) ==1368== by 0x829D1E0: MatSetType (matreg.c:65) ==1368== by 0x83393A9: MatMPIAIJSetPreallocation_MPIAIJ (mpiaij.c:2789) ==1368== by 0x833E8AA: MatMPIAIJSetPreallocation (mpiaij.c:3488) ==1368== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) ==1368== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==1368== ==1368== Invalid read of size 8 ==1368== at 0x43BF16A: __strcmp_ssse3 (strcmp-ssse3.S:1475) ==1368== by 0x80842B8: PetscFListAdd (reg.c:223) ==1368== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1368== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1368== by 0x82AB59B: MatDestroy_SeqAIJ (aij.c:820) ==1368== by 0x82595D1: MatDestroy (matrix.c:876) ==1368== by 0x8327843: MatDestroy_MPIAIJ (mpiaij.c:900) ==1368== by 0x82595D1: MatDestroy (matrix.c:876) ==1368== by 0x8470F9F: MatDestroy_MPIMAIJ (maij.c:152) ==1368== by 0x82595D1: MatDestroy (matrix.c:876) ==1368== by 0x85A560D: PCDestroy_MG_Private (mg.c:232) ==1368== by 0x85A5A49: PCDestroy_MG (mg.c:257) ==1368== Address 0x456e0d0 is 1 bytes after a block of size 31 alloc'd ==1368== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1368== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1368== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1368== by 0x8084689: PetscFListAdd (reg.c:237) ==1368== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1368== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1368== by 0x82BE68B: MatCreate_SeqAIJ (aij.c:3384) ==1368== by 0x829D1E0: MatSetType (matreg.c:65) ==1368== by 0x83393A9: MatMPIAIJSetPreallocation_MPIAIJ (mpiaij.c:2789) ==1368== by 0x833E8AA: MatMPIAIJSetPreallocation (mpiaij.c:3488) ==1368== by 0x8171D37: DAGetInterpolation_2D_Q1 (dainterp.c:312) ==1368== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==1368== ==1368== Invalid read of size 8 ==1368== at 0x43BF46D: __strcmp_ssse3 (strcmp-ssse3.S:1766) ==1368== by 0x80842B8: PetscFListAdd (reg.c:223) ==1368== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1368== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1368== by 0x8327F64: MatDestroy_MPIAIJ (mpiaij.c:919) ==1368== by 0x82595D1: MatDestroy (matrix.c:876) ==1368== by 0x8470F9F: MatDestroy_MPIMAIJ (maij.c:152) ==1368== by 0x82595D1: MatDestroy (matrix.c:876) ==1368== by 0x85A560D: PCDestroy_MG_Private (mg.c:232) ==1368== by 0x85A5A49: PCDestroy_MG (mg.c:257) ==1368== by 0x870CC31: PCDestroy (precon.c:83) ==1368== by 0x8627601: KSPDestroy (itfunc.c:695) ==1368== Address 0x456b7c8 is 24 bytes inside a block of size 28 alloc'd ==1368== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1368== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1368== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1368== by 0x8084689: PetscFListAdd (reg.c:237) ==1368== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1368== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1368== by 0x834D296: MatCreate_MPIAIJ (mpiaij.c:5093) ==1368== by 0x829D1E0: MatSetType (matreg.c:65) ==1368== by 
0x86EE543: MatCreate_AIJ (aijtype.c:35) ==1368== by 0x829D1E0: MatSetType (matreg.c:65) ==1368== by 0x8171C4C: DAGetInterpolation_2D_Q1 (dainterp.c:310) ==1368== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==1368== ==1368== Invalid read of size 8 ==1368== at 0x43BF229: __strcmp_ssse3 (strcmp-ssse3.S:1552) ==1368== by 0x80842B8: PetscFListAdd (reg.c:223) ==1368== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1368== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1368== by 0x83280C0: MatDestroy_MPIAIJ (mpiaij.c:922) ==1368== by 0x82595D1: MatDestroy (matrix.c:876) ==1368== by 0x8470F9F: MatDestroy_MPIMAIJ (maij.c:152) ==1368== by 0x82595D1: MatDestroy (matrix.c:876) ==1368== by 0x85A560D: PCDestroy_MG_Private (mg.c:232) ==1368== by 0x85A5A49: PCDestroy_MG (mg.c:257) ==1368== by 0x870CC31: PCDestroy (precon.c:83) ==1368== by 0x8627601: KSPDestroy (itfunc.c:695) ==1368== Address 0x456c018 is 24 bytes inside a block of size 29 alloc'd ==1368== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1368== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1368== by 0x80BDF14: PetscStrallocpy (str.c:80) ==1368== by 0x8084689: PetscFListAdd (reg.c:237) ==1368== by 0x8096702: PetscObjectComposeFunction_Petsc (inherit.c:227) ==1368== by 0x8097550: PetscObjectComposeFunction (inherit.c:340) ==1368== by 0x834D4DF: MatCreate_MPIAIJ (mpiaij.c:5108) ==1368== by 0x829D1E0: MatSetType (matreg.c:65) ==1368== by 0x86EE543: MatCreate_AIJ (aijtype.c:35) ==1368== by 0x829D1E0: MatSetType (matreg.c:65) ==1368== by 0x8171C4C: DAGetInterpolation_2D_Q1 (dainterp.c:310) ==1368== by 0x8179179: DAGetInterpolation (dainterp.c:879) ==1368== ==1369== Invalid read of size 8 ==1369== at 0x43B674F: __memcpy_ssse3 (memcpy-ssse3.S:703) ==1369== by 0x8791BD2: MPIR_Allgatherv (allgatherv.c:160) ==1369== by 0x879274D: PMPI_Allgatherv (allgatherv.c:997) ==1369== by 0x83631F0: MatFDColoringCreate_MPIAIJ (fdmpiaij.c:89) ==1369== by 0x854A377: MatFDColoringCreate (fdmatrix.c:385) ==1369== by 0x81C5E65: DMMGSetSNES (damgsnes.c:712) ==1369== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==1369== by 0x804C4FF: main (ex19.c:140) ==1369== Address 0x4c37f38 is 0 bytes after a block of size 360 alloc'd ==1369== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1369== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1369== by 0x8132724: ISCreateGeneral (general.c:342) ==1369== by 0x813BE92: ISColoringGetIS (iscoloring.c:161) ==1369== by 0x836235A: MatFDColoringCreate_MPIAIJ (fdmpiaij.c:30) ==1369== by 0x854A377: MatFDColoringCreate (fdmatrix.c:385) ==1369== by 0x81C5E65: DMMGSetSNES (damgsnes.c:712) ==1369== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==1369== by 0x804C4FF: main (ex19.c:140) ==1369== ==1368== Invalid read of size 8 ==1368== at 0x43B674F: __memcpy_ssse3 (memcpy-ssse3.S:703) ==1368== by 0x8791BD2: MPIR_Allgatherv (allgatherv.c:160) ==1368== by 0x879274D: PMPI_Allgatherv (allgatherv.c:997) ==1368== by 0x83631F0: MatFDColoringCreate_MPIAIJ (fdmpiaij.c:89) ==1368== by 0x854A377: MatFDColoringCreate (fdmatrix.c:385) ==1368== by 0x81C5E65: DMMGSetSNES (damgsnes.c:712) ==1368== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==1368== by 0x804C4FF: main (ex19.c:140) ==1368== Address 0x4cb0c68 is 0 bytes after a block of size 360 alloc'd ==1368== at 0x4022E01: memalign (vg_replace_malloc.c:532) ==1368== by 0x808B1A0: PetscMallocAlign (mal.c:30) ==1368== by 0x8132724: ISCreateGeneral (general.c:342) ==1368== by 0x813BE92: ISColoringGetIS (iscoloring.c:161) ==1368== by 0x836235A: 
MatFDColoringCreate_MPIAIJ (fdmpiaij.c:30) ==1368== by 0x854A377: MatFDColoringCreate (fdmatrix.c:385) ==1368== by 0x81C5E65: DMMGSetSNES (damgsnes.c:712) ==1368== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==1368== by 0x804C4FF: main (ex19.c:140) ==1368== ==1368== Invalid read of size 8 ==1368== at 0x43B674F: __memcpy_ssse3 (memcpy-ssse3.S:703) ==1368== by 0x87923CB: MPIR_Allgatherv (allgatherv.c:340) ==1368== by 0x879274D: PMPI_Allgatherv (allgatherv.c:997) ==1368== by 0x83631F0: MatFDColoringCreate_MPIAIJ (fdmpiaij.c:89) ==1368== by 0x854A377: MatFDColoringCreate (fdmatrix.c:385) ==1368== by 0x81C5E65: DMMGSetSNES (damgsnes.c:712) ==1368== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==1368== by 0x804C4FF: main (ex19.c:140) ==1368== Address 0x4461998 is 0 bytes after a block of size 720 alloc'd ==1368== at 0x4023BF3: malloc (vg_replace_malloc.c:195) ==1368== by 0x8791B68: MPIR_Allgatherv (allgatherv.c:143) ==1368== by 0x879274D: PMPI_Allgatherv (allgatherv.c:997) ==1368== by 0x83631F0: MatFDColoringCreate_MPIAIJ (fdmpiaij.c:89) ==1368== by 0x854A377: MatFDColoringCreate (fdmatrix.c:385) ==1368== by 0x81C5E65: DMMGSetSNES (damgsnes.c:712) ==1368== by 0x81C71AE: DMMGSetSNESLocal_Private (damgsnes.c:952) ==1368== by 0x804C4FF: main (ex19.c:140) ==1368== lid velocity = 0.000287274, prandtl # = 1, grashof # = 1 Number of Newton iterations = 2 ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x80A0A4A: PetscOptionsFindPair_Private (options.c:989) ==1368== by 0x80A3E53: PetscOptionsGetString (options.c:1693) ==1368== by 0x80B0A16: PetscFinalize (pinit.c:829) ==1368== by 0x804CCA7: main (ex19.c:181) ==1368== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x80A0A4A: PetscOptionsFindPair_Private (options.c:989) ==1369== by 0x80A3E53: PetscOptionsGetString (options.c:1693) ==1369== by 0x80B0A16: PetscFinalize (pinit.c:829) ==1369== by 0x804CCA7: main (ex19.c:181) ==1369== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x43BE348: __strcmp_ssse3 (strcmp-ssse3.S:150) ==1369== by 0x87E2922: PMI_Finalize (simple_pmi.c:398) ==1369== by 0x87CB316: MPIDI_PG_Finalize (mpidi_pg.c:92) ==1369== by 0x87C728D: MPID_Finalize (mpid_finalize.c:141) ==1369== by 0x879FA8C: PMPI_Finalize (finalize.c:158) ==1369== by 0x80B2276: PetscFinalize (pinit.c:973) ==1369== by 0x804CCA7: main (ex19.c:181) ==1369== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x43BF5B7: __strcmp_ssse3 (strcmp-ssse3.S:1887) ==1369== by 0x87E2922: PMI_Finalize (simple_pmi.c:398) ==1369== by 0x87CB316: MPIDI_PG_Finalize (mpidi_pg.c:92) ==1369== by 0x87C728D: MPID_Finalize (mpid_finalize.c:141) ==1369== by 0x879FA8C: PMPI_Finalize (finalize.c:158) ==1369== by 0x80B2276: PetscFinalize (pinit.c:973) ==1369== by 0x804CCA7: main (ex19.c:181) ==1369== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x43BF5C0: __strcmp_ssse3 (strcmp-ssse3.S:1890) ==1369== by 0x87E2922: PMI_Finalize (simple_pmi.c:398) ==1369== by 0x87CB316: MPIDI_PG_Finalize (mpidi_pg.c:92) ==1369== by 0x87C728D: MPID_Finalize (mpid_finalize.c:141) ==1369== by 0x879FA8C: PMPI_Finalize (finalize.c:158) ==1369== by 0x80B2276: PetscFinalize (pinit.c:973) ==1369== by 0x804CCA7: main (ex19.c:181) ==1369== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x43BF5C5: __strcmp_ssse3 (strcmp-ssse3.S:1893) ==1369== by 0x87E2922: PMI_Finalize (simple_pmi.c:398) ==1369== by 0x87CB316: 
MPIDI_PG_Finalize (mpidi_pg.c:92) ==1369== by 0x87C728D: MPID_Finalize (mpid_finalize.c:141) ==1369== by 0x879FA8C: PMPI_Finalize (finalize.c:158) ==1369== by 0x80B2276: PetscFinalize (pinit.c:973) ==1369== by 0x804CCA7: main (ex19.c:181) ==1369== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x43BF5CA: __strcmp_ssse3 (strcmp-ssse3.S:1896) ==1369== by 0x87E2922: PMI_Finalize (simple_pmi.c:398) ==1369== by 0x87CB316: MPIDI_PG_Finalize (mpidi_pg.c:92) ==1369== by 0x87C728D: MPID_Finalize (mpid_finalize.c:141) ==1369== by 0x879FA8C: PMPI_Finalize (finalize.c:158) ==1369== by 0x80B2276: PetscFinalize (pinit.c:973) ==1369== by 0x804CCA7: main (ex19.c:181) ==1369== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x43BF5CF: __strcmp_ssse3 (strcmp-ssse3.S:1899) ==1369== by 0x87E2922: PMI_Finalize (simple_pmi.c:398) ==1369== by 0x87CB316: MPIDI_PG_Finalize (mpidi_pg.c:92) ==1369== by 0x87C728D: MPID_Finalize (mpid_finalize.c:141) ==1369== by 0x879FA8C: PMPI_Finalize (finalize.c:158) ==1369== by 0x80B2276: PetscFinalize (pinit.c:973) ==1369== by 0x804CCA7: main (ex19.c:181) ==1369== ==1369== Conditional jump or move depends on uninitialised value(s) ==1369== at 0x43BF5D4: __strcmp_ssse3 (strcmp-ssse3.S:1902) ==1369== by 0x87E2922: PMI_Finalize (simple_pmi.c:398) ==1369== by 0x87CB316: MPIDI_PG_Finalize (mpidi_pg.c:92) ==1369== by 0x87C728D: MPID_Finalize (mpid_finalize.c:141) ==1369== by 0x879FA8C: PMPI_Finalize (finalize.c:158) ==1369== by 0x80B2276: PetscFinalize (pinit.c:973) ==1369== by 0x804CCA7: main (ex19.c:181) ==1369== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x43BE348: __strcmp_ssse3 (strcmp-ssse3.S:150) ==1368== by 0x87E2922: PMI_Finalize (simple_pmi.c:398) ==1368== by 0x87CB316: MPIDI_PG_Finalize (mpidi_pg.c:92) ==1368== by 0x87C728D: MPID_Finalize (mpid_finalize.c:141) ==1368== by 0x879FA8C: PMPI_Finalize (finalize.c:158) ==1368== by 0x80B2276: PetscFinalize (pinit.c:973) ==1368== by 0x804CCA7: main (ex19.c:181) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x43BF5B7: __strcmp_ssse3 (strcmp-ssse3.S:1887) ==1368== by 0x87E2922: PMI_Finalize (simple_pmi.c:398) ==1368== by 0x87CB316: MPIDI_PG_Finalize (mpidi_pg.c:92) ==1368== by 0x87C728D: MPID_Finalize (mpid_finalize.c:141) ==1368== by 0x879FA8C: PMPI_Finalize (finalize.c:158) ==1368== by 0x80B2276: PetscFinalize (pinit.c:973) ==1368== by 0x804CCA7: main (ex19.c:181) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x43BF5C0: __strcmp_ssse3 (strcmp-ssse3.S:1890) ==1368== by 0x87E2922: PMI_Finalize (simple_pmi.c:398) ==1368== by 0x87CB316: MPIDI_PG_Finalize (mpidi_pg.c:92) ==1368== by 0x87C728D: MPID_Finalize (mpid_finalize.c:141) ==1368== by 0x879FA8C: PMPI_Finalize (finalize.c:158) ==1368== by 0x80B2276: PetscFinalize (pinit.c:973) ==1368== by 0x804CCA7: main (ex19.c:181) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x43BF5C5: __strcmp_ssse3 (strcmp-ssse3.S:1893) ==1368== by 0x87E2922: PMI_Finalize (simple_pmi.c:398) ==1368== by 0x87CB316: MPIDI_PG_Finalize (mpidi_pg.c:92) ==1368== by 0x87C728D: MPID_Finalize (mpid_finalize.c:141) ==1368== by 0x879FA8C: PMPI_Finalize (finalize.c:158) ==1368== by 0x80B2276: PetscFinalize (pinit.c:973) ==1368== by 0x804CCA7: main (ex19.c:181) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x43BF5CA: __strcmp_ssse3 (strcmp-ssse3.S:1896) 
==1368== by 0x87E2922: PMI_Finalize (simple_pmi.c:398) ==1368== by 0x87CB316: MPIDI_PG_Finalize (mpidi_pg.c:92) ==1368== by 0x87C728D: MPID_Finalize (mpid_finalize.c:141) ==1368== by 0x879FA8C: PMPI_Finalize (finalize.c:158) ==1368== by 0x80B2276: PetscFinalize (pinit.c:973) ==1368== by 0x804CCA7: main (ex19.c:181) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x43BF5CF: __strcmp_ssse3 (strcmp-ssse3.S:1899) ==1368== by 0x87E2922: PMI_Finalize (simple_pmi.c:398) ==1368== by 0x87CB316: MPIDI_PG_Finalize (mpidi_pg.c:92) ==1368== by 0x87C728D: MPID_Finalize (mpid_finalize.c:141) ==1368== by 0x879FA8C: PMPI_Finalize (finalize.c:158) ==1368== by 0x80B2276: PetscFinalize (pinit.c:973) ==1368== by 0x804CCA7: main (ex19.c:181) ==1368== ==1368== Conditional jump or move depends on uninitialised value(s) ==1368== at 0x43BF5D4: __strcmp_ssse3 (strcmp-ssse3.S:1902) ==1368== by 0x87E2922: PMI_Finalize (simple_pmi.c:398) ==1368== by 0x87CB316: MPIDI_PG_Finalize (mpidi_pg.c:92) ==1368== by 0x87C728D: MPID_Finalize (mpid_finalize.c:141) ==1368== by 0x879FA8C: PMPI_Finalize (finalize.c:158) ==1368== by 0x80B2276: PetscFinalize (pinit.c:973) ==1368== by 0x804CCA7: main (ex19.c:181) ==1368== ==1369== ==1369== HEAP SUMMARY: ==1369== in use at exit: 160 bytes in 11 blocks ==1369== total heap usage: 59,069 allocs, 59,058 frees, 49,630,900 bytes allocated ==1369== ==1369== LEAK SUMMARY: ==1369== definitely lost: 40 bytes in 1 blocks ==1369== indirectly lost: 120 bytes in 10 blocks ==1369== possibly lost: 0 bytes in 0 blocks ==1369== still reachable: 0 bytes in 0 blocks ==1369== suppressed: 0 bytes in 0 blocks ==1369== Rerun with --leak-check=full to see details of leaked memory ==1369== ==1369== For counts of detected and suppressed errors, rerun with: -v ==1369== Use --track-origins=yes to see where uninitialised values come from ==1369== ERROR SUMMARY: 15830 errors from 166 contexts (suppressed: 0 from 0) ==1368== ==1368== HEAP SUMMARY: ==1368== in use at exit: 160 bytes in 11 blocks ==1368== total heap usage: 60,266 allocs, 60,255 frees, 51,015,236 bytes allocated ==1368== ==1368== LEAK SUMMARY: ==1368== definitely lost: 40 bytes in 1 blocks ==1368== indirectly lost: 120 bytes in 10 blocks ==1368== possibly lost: 0 bytes in 0 blocks ==1368== still reachable: 0 bytes in 0 blocks ==1368== suppressed: 0 bytes in 0 blocks ==1368== Rerun with --leak-check=full to see details of leaked memory ==1368== ==1368== For counts of detected and suppressed errors, rerun with: -v ==1368== Use --track-origins=yes to see where uninitialised values come from ==1368== ERROR SUMMARY: 15856 errors from 168 contexts (suppressed: 0 from 0) -------------------------------------------------------------- If it is workable for ubunbu 10.0.4, I might have to switch back to 8.04... Thanks a lot! Rebecca Quoting Satish Balay : > also - I'll suggest 'rm -rf $PETSC_ARCH externalpackages' and then > reinstall petsc > with '--with-cc=gcc --with-fc=gfortran' and retry. > > Satish > > On Tue, 7 Sep 2010, Satish Balay wrote: > >> BTW: One difference I see is: you have an old version of valgrind. >> >> Are you sure the upgrade from 8.04 to 10.04 happened correctly? Have >> you tried 'apt-get update; apt-get upgrade' to grab all the 10.04 >> updates? >> >> The version of valgrind I have is: >> >> balay at petsc:~ $ valgrind --version >> valgrind-3.6.0.SVN-Debian >> >> Satish >> >> On Tue, 7 Sep 2010, Satish Balay wrote: >> >> > I can't say why your build is behaving differently than mine. 
>> > >> > Perhaps there are clues in configure.log. If you send it to >> petsc-maint - I can take >> > a look. >> >> > > > > > I upgrade my laptop from ubuntu 8.04 LTS to 10.04, after >> the upgrade, I >> > > > > > reinstalled PETSc, but there are tons of valgrind errors >> coming out >> > > > > > even the code is unchanged. Then I tried with >> >> > > > > > rebecca at YuanWork:~/linux/code/twoway/twoway_brandnew/trunk/set_a$ >> > > > > > >> ~/soft/petsc-3.1-p4/externalpackages/mpich2-1.0.8/bin/mpiexec -np 2 >> > > > > > valgrind --tool=memcheck ./ex19.exe -malloc off -da_grid_x 30 >> > > > > > -da_grid_y 30 >> > > > > > ==2175== Memcheck, a memory error detector >> > > > > > ==2175== Copyright (C) 2002-2009, and GNU GPL'd, by >> Julian Seward et al. >> > > > > > ==2175== Using Valgrind-3.5.0 and LibVEX; rerun with -h >> for copyright >> > > > > > info >> > > > Rebecca Xuefei YUAN Department of Applied Physics and Applied Mathematics Columbia University Tel:917-399-8032 www.columbia.edu/~xy2102 From B.Sanderse at cwi.nl Tue Sep 7 10:53:15 2010 From: B.Sanderse at cwi.nl (Benjamin Sanderse) Date: Tue, 7 Sep 2010 09:53:15 -0600 Subject: [petsc-users] coupling with Matlab and parallel solution In-Reply-To: <30D58A3F-D8D9-4692-8BE9-BB8168201BD3@mcs.anl.gov> References: <1D7DA60F-2616-41F3-BC17-FD797E9A9AF2@cwi.nl> <87zkw1cej4.fsf@59A2.org> <96BE91D1-0EFF-4CF1-B1DA-0C5A2D13BFF3@cwi.nl> <6C3C8604-E079-40DC-ADCC-FACB91A277B2@mcs.anl.gov> <070410A8-7B2E-405E-B65D-0CC8521BB17C@cwi.nl> <209DE1A7-D893-450F-89BC-DFBE7B2F2C5D@cwi.nl> <6A83D0A7-1C42-4A91-909F-6B93509C5BF5@mcs.anl.gov> <61D23634-FF5A-4916-9055-9561F69CC36A@cwi.nl> <30D58A3F-D8D9-4692-8BE9-BB8168201BD3@mcs.anl.gov> Message-ID: <6F879BC3-2E8E-4B30-B4A8-CDB039BD3A1E@cwi.nl> Hi Barry, I am still not too happy with the execution in parallel. I am working under Linux (64 bits) and still using your approach with two command windows (since it gives the best debugging possibility). As I said, sometimes things work, but most of the time not. Here is the output of two successive runs -bash-4.0$ petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info [1] PetscInitialize(): PETSc successfully started: number of processors = 2 [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl [0] PetscInitialize(): PETSc successfully started: number of processors = 2 [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 [1] PetscCommDuplicate(): returning tag 2147483647 [0] PetscCommDuplicate(): returning tag 2147483647 [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl [0] PetscCommDuplicate(): returning tag 2147483646 [1] PetscCommDuplicate(): returning tag 2147483646 [1] PetscCommDuplicate(): returning tag 2147483641 [0] PetscCommDuplicate(): returning tag 2147483641 [0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs. [0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs. 
[1] PetscFinalize(): PetscFinalize() called [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850688 [1] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 [1] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374784 [0] PetscFinalize(): PetscFinalize() called [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850688 [0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 [0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374784 -bash-4.0$ netstat | grep 5005 -bash-4.0$ petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info [1] PetscInitialize(): PETSc successfully started: number of processors = 2 [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl [0] PetscInitialize(): PETSc successfully started: number of processors = 2 [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 [0] PetscCommDuplicate(): returning tag 2147483647 [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 [1] PetscCommDuplicate(): returning tag 2147483647 [1] PetscCommDuplicate(): returning tag 2147483646 [0] PetscCommDuplicate(): returning tag 2147483646 [0] PetscCommDuplicate(): returning tag 2147483641 [1] PetscCommDuplicate(): returning tag 2147483641 [0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs. [0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs. ^C -bash-4.0$ [0]0:Return code = 0, signaled with Interrupt [0]1:Return code = 0, signaled with Interrupt In both cases I first started the Matlab program. I am currently starting Matlab without a GUI, but with a GUI I have the same problems. As you can see, in the first case everything works fine, and Petsc finalizes and closes. Matlab gives me the correct output. The second case, run just a couple of seconds later, does not reach PetscFinalize and Matlab does not give the correct output. In between the two cases I checked if port 5005 was in use, and it was not. Do you have any more suggestions on how to get this to work properly? Benjamin Op 3 sep 2010, om 21:11 heeft Barry Smith het volgende geschreven: > > On Sep 3, 2010, at 4:32 PM, Benjamin Sanderse wrote: > >> Hi Barry, >> >> Thanks for your help! However, there are still some issues left. In other to test things, I simplified the program even more and now I am just sending a vector back and forth: matlab->petsc->matlab: >> >> fd = PETSC_VIEWER_SOCKET_WORLD; >> >> // load rhs vector >> ierr = VecLoad(fd,VECMPI,&b);CHKERRQ(ierr); >> >> // send to matlab >> ierr = VecView(b,fd);CHKERRQ(ierr); >> ierr = VecDestroy(b);CHKERRQ(ierr); >> >> >> - Your approach with two windows works *sometimes*. 
I removed the 'launch' statement and I executed my program 10 times, the first 2 times worked, and in all other cases I got this: >> >> petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info >> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >> [1] PetscCommDuplicate(): returning tag 2147483647 >> [0] PetscCommDuplicate(): returning tag 2147483647 >> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl >> [0] PetscOpenSocket(): Connection refused in attaching socket, trying again[0] PetscOpenSocket(): Connection refused in attaching socket, trying again[0] PetscOpenSocket(): Connection refused in attaching socket, trying again >> [0] PetscOpenSocket(): Connection refused in attaching socket, trying again^C >> -bash-4.0$ [0]0:Return code = 0, signaled with Interrupt >> [0]1:Return code = 0, signaled with Interrupt >> >> Every time I start the program I use close(socket) and clear all in Matlab, so the socket from the previous run should not be present anymore. It seems that the port gets corrupted after a couple of times? Matlab does not respond and I have to kill it and restart it manually. > > Sometimes when you close a socket connection it doesn't close for a very long time so that if you try to open it again it doesn't work. When it appears the socket can not be used try using netstat | grep 5005 to see if the socket is still active. > >> >> - If I include the launch statement, or just type >> system('mpiexec -n 2 ./petsc_poisson_par_barry2 &') >> the program never works. > > Are you sure mpiexec is in the path of system and it is the right one? The problem is that we are kind of cheating with system because we start a new job in the background and have no idea what the output is. Are you using unix and running Matlab on the command line or in a GUI? > > Barry > > >> >> Hope you can figure out what is going wrong. >> >> Ben >> >> >> Op 3 sep 2010, om 13:25 heeft Barry Smith het volgende geschreven: >> >>> >>> Ben >>> >>> Ok, I figured out the problem. It is not fundamental and mostly comes from not having a create way to debug this. >>> >>> The test vector you create is sequential then you try to view it back to Matlab with the parallel fd viewer. If you change to >>> ierr = VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,1,&test);CHKERRQ(ierr); >>> then the code runs. >>> >>> I've found (just now) that when I use launch all the output from the .c program gets lost which makes it impossible to figure out what has gone wrong. You can debug by running the two parts of the computation in two different windows. So comment out the launch from the matlab script and then in Matlab run the script (it will hang waiting for the socket to work) and in a separate terminal window run the .c program; for example petscmpiexec -n 2 ./ex1 -info Now you see exactly what is happening in the PETSc program. You can even use -start_in_debugger on the PETSc side to run the debugger on crashes. 
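(For reference, the PETSc half of the round-trip test being debugged here, assembled from the fragments quoted above, might look roughly like the sketch below. It is only a minimal illustration using the PETSc 3.1 calling sequences shown in this thread, not Benjamin's actual petsc_poisson_par_barry2.c; error checking and the help string are abbreviated.)

  static char help[] = "Receives a vector from Matlab over a socket and sends it back.\n";
  #include "petscvec.h"

  int main(int argc,char **args)
  {
    Vec            b;
    PetscViewer    fd;
    PetscErrorCode ierr;

    PetscInitialize(&argc,&args,(char*)0,help);
    fd   = PETSC_VIEWER_SOCKET_WORLD;             /* connects to the port the Matlab script listens on (5005 by default) */
    ierr = VecLoad(fd,VECMPI,&b);CHKERRQ(ierr);   /* receive the vector sent from Matlab as a parallel (MPI) vector */
    ierr = VecView(b,fd);CHKERRQ(ierr);           /* send it straight back to Matlab over the same socket */
    ierr = VecDestroy(b);CHKERRQ(ierr);
    ierr = PetscFinalize();
    return 0;
  }

(Run in a second terminal with, e.g., petscmpiexec -n 2 ./ex1 -info while the Matlab script sits waiting on the socket, exactly as described above.)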
>>> >>> I'll add this to the docs for launch >>> >>> Barry >>> >>> >>> On Sep 2, 2010, at 3:28 PM, Benjamin Sanderse wrote: >>> >>>> Hi Barry, >>>> >>>> I attached my matlab file, c file and makefile. First I generate the executable with 'make petsc_poisson_par_barry' and then I run test_petsc_par_barry.m. >>>> If you change MATMPIAIJ to MATAIJ and VECMPI to VECSEQ the code works fine. >>>> >>>> Thanks a lot, >>>> >>>> Benjamin >>>> >>>> >>>> >>>> Op 2 sep 2010, om 13:45 heeft Barry Smith het volgende geschreven: >>>> >>>>> >>>>> >>>>> Matlab is never aware the vector is parallel. Please send me the code and I'll figure out what is going on. >>>>> >>>>> Barry >>>>> >>>>> On Sep 2, 2010, at 2:07 PM, Benjamin Sanderse wrote: >>>>> >>>>>> That sounds great, but there is one issue I am encountering. I switched vector types to VECMPI and matrix type to MATMPIAIJ, but when running Matlab I get the following error: >>>>>> >>>>>> Found unrecogonized header 0 in file. If your file contains complex numbers >>>>>> then call PetscBinaryRead() with "complex" as the second argument >>>>>> Error in ==> PetscBinaryRead at 27 >>>>>> if nargin < 2 >>>>>> >>>>>> ??? Output argument "varargout" (and maybe others) not assigned during call to >>>>>> "/ufs/sanderse/Software/petsc-3.1-p4/bin/matlab/PetscBinaryRead.m>PetscBinaryRead". >>>>>> >>>>>> Error in ==> test_petsc_par at 57 >>>>>> x4 = PetscBinaryReady(PS); >>>>>> >>>>>> Could it be that Matlab does not understand the "parallel" vector which is returned by Petsc? Currently I have this done with VecView as follows: >>>>>> >>>>>> fd = PETSC_VIEWER_SOCKET_WORLD; >>>>>> ... >>>>>> KSPSolve(ksp,b,x); >>>>>> ... >>>>>> VecView(fd,x); >>>>>> >>>>>> Thanks for the help! >>>>>> >>>>>> Ben >>>>>> >>>>>> Op 2 sep 2010, om 10:09 heeft Barry Smith het volgende geschreven: >>>>>> >>>>>>> >>>>>>> On Sep 2, 2010, at 10:51 AM, Benjamin Sanderse wrote: >>>>>>> >>>>>>>> Hello all, >>>>>>>> >>>>>>>> I figured out the coupling with Matlab and I can send back and forth matrices and vectors between Petsc and Matlab. Actually, I send only once a matrix from Matlab to Petsc and then repeatedly send new right hand sides from Matlab->Petsc and the solution vector from Petsc->Matlab. That works great. >>>>>>>> I know want to see if the matrix that is send from (serial) Matlab to Petsc can be stored as a parallel matrix in Petsc so that subsequent computations with different right hand sides can be performed in parallel by Petsc. Does this simply work by using MatLoad and setting Mattype MPIAIJ? Or is something more fancy required? >>>>>>> >>>>>>> In theory this can be done using the same code as sequential only with parallel vectors VECMPI and matrices. MATMPIAIJ >>>>>>> >>>>>>> Barry >>>>>>> >>>>>>>> >>>>>>>> Thanks, >>>>>>>> >>>>>>>> Ben >>>>>>> >>>>>> >>>>> >>>> >>> >> > From balay at mcs.anl.gov Tue Sep 7 11:11:22 2010 From: balay at mcs.anl.gov (Satish Balay) Date: Tue, 7 Sep 2010 11:11:22 -0500 (CDT) Subject: [petsc-users] valgrind error comes out when upgrade from ubuntu 8.04 LTS to 10.04 In-Reply-To: <20100907114445.yr96rmlf4s4kscog@cubmail.cc.columbia.edu> References: <20100906203101.9i11ikcdkos4gss4@cubmail.cc.columbia.edu> <196CED0F-3AD9-487C-AC88-FD94624407ED@mcs.anl.gov> <20100906214634.fvvfvn63sos8gsgs@cubmail.cc.columbia.edu> <20100907114445.yr96rmlf4s4kscog@cubmail.cc.columbia.edu> Message-ID: On Tue, 7 Sep 2010, Rebecca Xuefei Yuan wrote: > Dear Satish, > > I reinstalled PETSc by "rm -rf petsc-3.1-p4" and downloaded > "petsc-lite-3.1-p4.tar.gz" to start over. 
> > The commands are > > > ./config/configure.py --with-cc=gcc --with-fc=gfortran > --download-f-blas-lapack=1 --download-mpich=1 > make PETSC_DIR=/home/rebecca/soft/petsc-3.1-p4 PETSC_ARCH=linux-gnu-c-debug > all > make PETSC_DIR=/home/rebecca/soft/petsc-3.1-p4 PETSC_ARCH=linux-gnu-c-debug > test > > When I check the valgrind version > > rebecca at YuanWork:~/soft$ valgrind --version > valgrind-3.5.0 > > When I check at > http://valgrind.org/ > > the current release is 3.5.0. How could you get 3.6? Are you using your own install of valgrind? [and not ubuntu install?] If so - did you reinstall it after your OS upgrade? Looks like valgrind packaged by ubuntu/debian is an SVN snapshot of upcoming 3.6 >>>>>>>> balay at petsc:~ $ dpkg -l valgrind Desired=Unknown/Install/Remove/Purge/Hold | Status=Not/Inst/Cfg-files/Unpacked/Failed-cfg/Half-inst/trig-aWait/Trig-pend |/ Err?=(none)/Reinst-required (Status,Err: uppercase=bad) ||/ Name Version Description +++-======================-======================-============================================================ ii valgrind 1:3.6.0~svn20100212-0u A memory debugger and profiler balay at petsc:~ $ <<<<<<< So your choice is to either: 1. install ubuntu valgrind [sudo apt-get install valgrind] - and delete personal copy 2. reinstall valgrind from source [for this upgraded OS] Satish From balay at mcs.anl.gov Tue Sep 7 12:56:30 2010 From: balay at mcs.anl.gov (Satish Balay) Date: Tue, 7 Sep 2010 12:56:30 -0500 (CDT) Subject: [petsc-users] valgrind error comes out when upgrade from ubuntu 8.04 LTS to 10.04 In-Reply-To: <20100907135226.kvq7npkxs0c88wsg@cubmail.cc.columbia.edu> References: <20100906203101.9i11ikcdkos4gss4@cubmail.cc.columbia.edu> <196CED0F-3AD9-487C-AC88-FD94624407ED@mcs.anl.gov> <20100906214634.fvvfvn63sos8gsgs@cubmail.cc.columbia.edu> <20100907114445.yr96rmlf4s4kscog@cubmail.cc.columbia.edu> <20100907121620.7kemh74dckggsoco@cubmail.cc.columbia.edu> <20100907135226.kvq7npkxs0c88wsg@cubmail.cc.columbia.edu> Message-ID: Ubuntu valgrind is at /usr/bin/valgrind. /usr/local/bin/valgrind might be a personal install. [perhaps compiled on 8.04] Try: ~/soft/petsc-3.1-p4/externalpackages/mpich2-1.0.8/bin/mpiexec -np 2 /usr/bin/valgrind --tool=memcheck ./ex19.exe -malloc off -da_grid_x 30 -da_grid_y 30 [and then delete valgrind installed at /usr/local] Satish On Tue, 7 Sep 2010, Rebecca Xuefei Yuan wrote: > It is > > > rebecca at YuanWork:~/soft$ which valgrind > /usr/local/bin/valgrind > > > R > Quoting Satish Balay : > > > what do you have for > > > > which valgrind > > > > Satish > > > > On Tue, 7 Sep 2010, Rebecca Xuefei Yuan wrote: > > > > > Dear Satish, > > > > > > I checked the update the valgrind as > > > > > > > > > rebecca at YuanWork:~/soft$ sudo apt-get install valgrind > > > [sudo] password for rebecca: > > > Reading package lists... Done > > > Building dependency tree > > > Reading state information... Done > > > valgrind is already the newest version. > > > The following packages were automatically installed and are no longer > > > required: > > > nabi ttf-wqy-zenhei ttf-takao-mincho libgcj8-dev ttf-alee gcj-4.2 > > > ttf-takao-gothic gij xfonts-wqy ttf-unfonts-extra ttf-kacst > > > Use 'apt-get autoremove' to remove them. > > > 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 
> > > rebecca at YuanWork:~/soft$ dpkg -l valgrind > > > Desired=Unknown/Install/Remove/Purge/Hold > > > | > > > Status=Not/Inst/Cfg-files/Unpacked/Failed-cfg/Half-inst/trig-aWait/Trig-pend > > > |/ Err?=(none)/Reinst-required (Status,Err: uppercase=bad) > > > ||/ Name Version Description > > > +++-==============-==============-============================================ > > > ii valgrind 1:3.6.0~svn201 A memory debugger and profiler > > > > > > > > > It is a self installed valgrind or installed through ubuntu 10.04? > > > > > > Thanks a lot! > > > > > > Rebecca > > > > > > > > > > > > > > > Quoting Satish Balay : > > > > > > > On Tue, 7 Sep 2010, Rebecca Xuefei Yuan wrote: > > > > > > > > > Dear Satish, > > > > > > > > > > I reinstalled PETSc by "rm -rf petsc-3.1-p4" and downloaded > > > > > "petsc-lite-3.1-p4.tar.gz" to start over. > > > > > > > > > > The commands are > > > > > > > > > > > > > > > ./config/configure.py --with-cc=gcc --with-fc=gfortran > > > > > --download-f-blas-lapack=1 --download-mpich=1 > > > > > make PETSC_DIR=/home/rebecca/soft/petsc-3.1-p4 > > > > > PETSC_ARCH=linux-gnu-c-debug > > > > > all > > > > > make PETSC_DIR=/home/rebecca/soft/petsc-3.1-p4 > > > > > PETSC_ARCH=linux-gnu-c-debug > > > > > test > > > > > > > > > > When I check the valgrind version > > > > > > > > > > rebecca at YuanWork:~/soft$ valgrind --version > > > > > valgrind-3.5.0 > > > > > > > > > > When I check at > > > > > http://valgrind.org/ > > > > > > > > > > the current release is 3.5.0. How could you get 3.6? > > > > > > > > Are you using your own install of valgrind? [and not ubuntu install?] > > > > If so - did you reinstall it after your OS upgrade? > > > > > > > > Looks like valgrind packaged by ubuntu/debian is an SVN snapshot of > > > > upcoming 3.6 > > > > > > > > > > > > > > > > > > > > balay at petsc:~ $ dpkg -l valgrind > > > > Desired=Unknown/Install/Remove/Purge/Hold > > > > | > > > > Status=Not/Inst/Cfg-files/Unpacked/Failed-cfg/Half-inst/trig-aWait/Trig-pend > > > > |/ Err?=(none)/Reinst-required (Status,Err: uppercase=bad) > > > > ||/ Name Version Description > > > > +++-======================-======================-============================================================ > > > > ii valgrind 1:3.6.0~svn20100212-0u A memory debugger and > > > > profiler > > > > balay at petsc:~ $ > > > > <<<<<<< > > > > > > > > So your choice is to either: > > > > > > > > 1. install ubuntu valgrind [sudo apt-get install valgrind] - and delete > > > > personal copy > > > > 2. 
reinstall valgrind from source [for this upgraded OS] > > > > > > > > > > > > Satish > > > > > > > > > > > > > > > > > > > > > > > > Rebecca Xuefei YUAN > > > Department of Applied Physics and Applied Mathematics > > > Columbia University > > > Tel:917-399-8032 > > > www.columbia.edu/~xy2102 > > > > > > > > > > > > > Rebecca Xuefei YUAN > Department of Applied Physics and Applied Mathematics > Columbia University > Tel:917-399-8032 > www.columbia.edu/~xy2102 > From xy2102 at columbia.edu Tue Sep 7 13:08:20 2010 From: xy2102 at columbia.edu (Rebecca Xuefei Yuan) Date: Tue, 07 Sep 2010 14:08:20 -0400 Subject: [petsc-users] valgrind error comes out when upgrade from ubuntu 8.04 LTS to 10.04 In-Reply-To: References: <20100906203101.9i11ikcdkos4gss4@cubmail.cc.columbia.edu> <196CED0F-3AD9-487C-AC88-FD94624407ED@mcs.anl.gov> <20100906214634.fvvfvn63sos8gsgs@cubmail.cc.columbia.edu> <20100907114445.yr96rmlf4s4kscog@cubmail.cc.columbia.edu> <20100907121620.7kemh74dckggsoco@cubmail.cc.columbia.edu> <20100907135226.kvq7npkxs0c88wsg@cubmail.cc.columbia.edu> Message-ID: <20100907140820.e6bczqpmasg40c0k@cubmail.cc.columbia.edu> Dear Satish, Thanks very much for your kind help! Those tons of errors did not show up when I use the valgrind located at /usr/bin/valgrind. Cheers, Rebecca Quoting Satish Balay : > Ubuntu valgrind is at /usr/bin/valgrind. > > /usr/local/bin/valgrind might be a personal install. [perhaps > compiled on 8.04] > > Try: > > ~/soft/petsc-3.1-p4/externalpackages/mpich2-1.0.8/bin/mpiexec -np 2 > /usr/bin/valgrind --tool=memcheck ./ex19.exe -malloc off -da_grid_x > 30 -da_grid_y 30 > > [and then delete valgrind installed at /usr/local] > > Satish > > On Tue, 7 Sep 2010, Rebecca Xuefei Yuan wrote: > >> It is >> >> >> rebecca at YuanWork:~/soft$ which valgrind >> /usr/local/bin/valgrind >> >> >> R >> Quoting Satish Balay : >> >> > what do you have for >> > >> > which valgrind >> > >> > Satish >> > >> > On Tue, 7 Sep 2010, Rebecca Xuefei Yuan wrote: >> > >> > > Dear Satish, >> > > >> > > I checked the update the valgrind as >> > > >> > > >> > > rebecca at YuanWork:~/soft$ sudo apt-get install valgrind >> > > [sudo] password for rebecca: >> > > Reading package lists... Done >> > > Building dependency tree >> > > Reading state information... Done >> > > valgrind is already the newest version. >> > > The following packages were automatically installed and are no longer >> > > required: >> > > nabi ttf-wqy-zenhei ttf-takao-mincho libgcj8-dev ttf-alee gcj-4.2 >> > > ttf-takao-gothic gij xfonts-wqy ttf-unfonts-extra ttf-kacst >> > > Use 'apt-get autoremove' to remove them. >> > > 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. >> > > rebecca at YuanWork:~/soft$ dpkg -l valgrind >> > > Desired=Unknown/Install/Remove/Purge/Hold >> > > | >> > > >> Status=Not/Inst/Cfg-files/Unpacked/Failed-cfg/Half-inst/trig-aWait/Trig-pend >> > > |/ Err?=(none)/Reinst-required (Status,Err: uppercase=bad) >> > > ||/ Name Version Description >> > > >> +++-==============-==============-============================================ >> > > ii valgrind 1:3.6.0~svn201 A memory debugger and profiler >> > > >> > > >> > > It is a self installed valgrind or installed through ubuntu 10.04? >> > > >> > > Thanks a lot! 
>> > > >> > > Rebecca >> > > >> > > >> > > >> > > >> > > Quoting Satish Balay : >> > > >> > > > On Tue, 7 Sep 2010, Rebecca Xuefei Yuan wrote: >> > > > >> > > > > Dear Satish, >> > > > > >> > > > > I reinstalled PETSc by "rm -rf petsc-3.1-p4" and downloaded >> > > > > "petsc-lite-3.1-p4.tar.gz" to start over. >> > > > > >> > > > > The commands are >> > > > > >> > > > > >> > > > > ./config/configure.py --with-cc=gcc --with-fc=gfortran >> > > > > --download-f-blas-lapack=1 --download-mpich=1 >> > > > > make PETSC_DIR=/home/rebecca/soft/petsc-3.1-p4 >> > > > > PETSC_ARCH=linux-gnu-c-debug >> > > > > all >> > > > > make PETSC_DIR=/home/rebecca/soft/petsc-3.1-p4 >> > > > > PETSC_ARCH=linux-gnu-c-debug >> > > > > test >> > > > > >> > > > > When I check the valgrind version >> > > > > >> > > > > rebecca at YuanWork:~/soft$ valgrind --version >> > > > > valgrind-3.5.0 >> > > > > >> > > > > When I check at >> > > > > http://valgrind.org/ >> > > > > >> > > > > the current release is 3.5.0. How could you get 3.6? >> > > > >> > > > Are you using your own install of valgrind? [and not ubuntu install?] >> > > > If so - did you reinstall it after your OS upgrade? >> > > > >> > > > Looks like valgrind packaged by ubuntu/debian is an SVN snapshot of >> > > > upcoming 3.6 >> > > > >> > > > > > > > > > > > >> > > > balay at petsc:~ $ dpkg -l valgrind >> > > > Desired=Unknown/Install/Remove/Purge/Hold >> > > > | >> > > > >> Status=Not/Inst/Cfg-files/Unpacked/Failed-cfg/Half-inst/trig-aWait/Trig-pend >> > > > |/ Err?=(none)/Reinst-required (Status,Err: uppercase=bad) >> > > > ||/ Name Version Description >> > > > >> +++-======================-======================-============================================================ >> > > > ii valgrind 1:3.6.0~svn20100212-0u A memory >> debugger and >> > > > profiler >> > > > balay at petsc:~ $ >> > > > <<<<<<< >> > > > >> > > > So your choice is to either: >> > > > >> > > > 1. install ubuntu valgrind [sudo apt-get install valgrind] - >> and delete >> > > > personal copy >> > > > 2. reinstall valgrind from source [for this upgraded OS] >> > > > >> > > > >> > > > Satish >> > > > >> > > > >> > > > >> > > >> > > >> > > >> > > Rebecca Xuefei YUAN >> > > Department of Applied Physics and Applied Mathematics >> > > Columbia University >> > > Tel:917-399-8032 >> > > www.columbia.edu/~xy2102 >> > > >> > >> > >> > >> >> >> >> Rebecca Xuefei YUAN >> Department of Applied Physics and Applied Mathematics >> Columbia University >> Tel:917-399-8032 >> www.columbia.edu/~xy2102 >> > > > Rebecca Xuefei YUAN Department of Applied Physics and Applied Mathematics Columbia University Tel:917-399-8032 www.columbia.edu/~xy2102 From stali at purdue.edu Tue Sep 7 15:24:04 2010 From: stali at purdue.edu (stali) Date: Tue, 7 Sep 2010 15:24:04 -0500 Subject: [petsc-users] nnz's in finite element stiffness matrix Message-ID: <757D09D9-A885-40CA-8D5D-806ECECB3CCE@purdue.edu> Petsc-users How can I efficiently calculate the _exact_ number of non-zeros that would be in the global sparse (stiffness) matrix given an unstructured mesh? Thanks From zhaonanavril at gmail.com Tue Sep 7 17:56:59 2010 From: zhaonanavril at gmail.com (NAN ZHAO) Date: Tue, 7 Sep 2010 16:56:59 -0600 Subject: [petsc-users] Error message after call VecSet(R_, zero) Message-ID: Dear all, I got a strange petsc error when I am running my code with petsc as a linear sover: [0]PETSC ERROR: --------------------- Error Message ------------------------------------ [0]PETSC ERROR: Object is in wrong state! 
[0]PETSC ERROR: You cannot call this after you have called VecSetValues() but before you have called VecAssemblyBegin/End()! [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: Petsc Release Version 2.3.3, Patch 15, Tue Sep 23 10:02:49 CDT 2008 HG revision: 31306062cd1a6f6a2496fccb4878f485c9b91760 [0]PETSC ERROR: See docs/changes/index.html for recent updates. [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [0]PETSC ERROR: See docs/index.html for manual pages. I* think this error is happening when I want to reset the vector and matrix to zero before solving the Ax=b,* *this is my code:* *PetscScalar zero = 0.;* * * * // Reset J, R and Y* * cout<<"reset jacobian to zero"< From bsmith at mcs.anl.gov Tue Sep 7 18:27:56 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Tue, 7 Sep 2010 18:27:56 -0500 Subject: [petsc-users] coupling with Matlab and parallel solution In-Reply-To: <6F879BC3-2E8E-4B30-B4A8-CDB039BD3A1E@cwi.nl> References: <1D7DA60F-2616-41F3-BC17-FD797E9A9AF2@cwi.nl> <87zkw1cej4.fsf@59A2.org> <96BE91D1-0EFF-4CF1-B1DA-0C5A2D13BFF3@cwi.nl> <6C3C8604-E079-40DC-ADCC-FACB91A277B2@mcs.anl.gov> <070410A8-7B2E-405E-B65D-0CC8521BB17C@cwi.nl> <209DE1A7-D893-450F-89BC-DFBE7B2F2C5D@cwi.nl> <6A83D0A7-1C42-4A91-909F-6B93509C5BF5@mcs.anl.gov> <61D23634-FF5A-4916-9055-9561F69CC36A@cwi.nl> <30D58A3F-D8D9-4692-8BE9-BB8168201BD3@mcs.anl.gov> <6F879BC3-2E8E-4B30-B4A8-CDB039BD3A1E@cwi.nl> Message-ID: Are you closing the socket on Matlab between to the two sets? Just checking. You can try running with a different port number each time to see if it is related to trying to reuse the port. Run with PetscOpenSocket(5006) and the PETSc program with -viewer_socket_port 5006 then run both with 5007 then with 5008 etc does this work smoothly? Let me know and the will tell me the next step to try, Barry On Sep 7, 2010, at 10:53 AM, Benjamin Sanderse wrote: > Hi Barry, > > I am still not too happy with the execution in parallel. I am working under Linux (64 bits) and still using your approach with two command windows (since it gives the best debugging possibility). > As I said, sometimes things work, but most of the time not. Here is the output of two successive runs > > -bash-4.0$ petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info > [1] PetscInitialize(): PETSc successfully started: number of processors = 2 > [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl > [0] PetscInitialize(): PETSc successfully started: number of processors = 2 > [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl > [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 > [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 > [1] PetscCommDuplicate(): returning tag 2147483647 > [0] PetscCommDuplicate(): returning tag 2147483647 > [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl > [0] PetscCommDuplicate(): returning tag 2147483646 > [1] PetscCommDuplicate(): returning tag 2147483646 > [1] PetscCommDuplicate(): returning tag 2147483641 > [0] PetscCommDuplicate(): returning tag 2147483641 > [0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs. > [0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs. 
> [1] PetscFinalize(): PetscFinalize() called > [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850688 > [1] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 > [1] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 > [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374784 > [0] PetscFinalize(): PetscFinalize() called > [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850688 > [0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 > [0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 > [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374784 > > > -bash-4.0$ netstat | grep 5005 > > > -bash-4.0$ petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info > [1] PetscInitialize(): PETSc successfully started: number of processors = 2 > [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl > [0] PetscInitialize(): PETSc successfully started: number of processors = 2 > [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl > [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 > [0] PetscCommDuplicate(): returning tag 2147483647 > [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl > [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 > [1] PetscCommDuplicate(): returning tag 2147483647 > [1] PetscCommDuplicate(): returning tag 2147483646 > [0] PetscCommDuplicate(): returning tag 2147483646 > [0] PetscCommDuplicate(): returning tag 2147483641 > [1] PetscCommDuplicate(): returning tag 2147483641 > [0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs. > [0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs. > ^C > -bash-4.0$ [0]0:Return code = 0, signaled with Interrupt > [0]1:Return code = 0, signaled with Interrupt > > > In both cases I first started the Matlab program. I am currently starting Matlab without a GUI, but with a GUI I have the same problems. > As you can see, in the first case everything works fine, and Petsc finalizes and closes. Matlab gives me the correct output. The second case, run just a couple of seconds later, does not reach PetscFinalize and Matlab does not give the correct output. In between the two cases I checked if port 5005 was in use, and it was not. > Do you have any more suggestions on how to get this to work properly? > > Benjamin > > Op 3 sep 2010, om 21:11 heeft Barry Smith het volgende geschreven: > >> >> On Sep 3, 2010, at 4:32 PM, Benjamin Sanderse wrote: >> >>> Hi Barry, >>> >>> Thanks for your help! However, there are still some issues left. In other to test things, I simplified the program even more and now I am just sending a vector back and forth: matlab->petsc->matlab: >>> >>> fd = PETSC_VIEWER_SOCKET_WORLD; >>> >>> // load rhs vector >>> ierr = VecLoad(fd,VECMPI,&b);CHKERRQ(ierr); >>> >>> // send to matlab >>> ierr = VecView(b,fd);CHKERRQ(ierr); >>> ierr = VecDestroy(b);CHKERRQ(ierr); >>> >>> >>> - Your approach with two windows works *sometimes*. 
I removed the 'launch' statement and I executed my program 10 times, the first 2 times worked, and in all other cases I got this: >>> >>> petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info >>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>> [1] PetscCommDuplicate(): returning tag 2147483647 >>> [0] PetscCommDuplicate(): returning tag 2147483647 >>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl >>> [0] PetscOpenSocket(): Connection refused in attaching socket, trying again[0] PetscOpenSocket(): Connection refused in attaching socket, trying again[0] PetscOpenSocket(): Connection refused in attaching socket, trying again >>> [0] PetscOpenSocket(): Connection refused in attaching socket, trying again^C >>> -bash-4.0$ [0]0:Return code = 0, signaled with Interrupt >>> [0]1:Return code = 0, signaled with Interrupt >>> >>> Every time I start the program I use close(socket) and clear all in Matlab, so the socket from the previous run should not be present anymore. It seems that the port gets corrupted after a couple of times? Matlab does not respond and I have to kill it and restart it manually. >> >> Sometimes when you close a socket connection it doesn't close for a very long time so that if you try to open it again it doesn't work. When it appears the socket can not be used try using netstat | grep 5005 to see if the socket is still active. >> >>> >>> - If I include the launch statement, or just type >>> system('mpiexec -n 2 ./petsc_poisson_par_barry2 &') >>> the program never works. >> >> Are you sure mpiexec is in the path of system and it is the right one? The problem is that we are kind of cheating with system because we start a new job in the background and have no idea what the output is. Are you using unix and running Matlab on the command line or in a GUI? >> >> Barry >> >> >>> >>> Hope you can figure out what is going wrong. >>> >>> Ben >>> >>> >>> Op 3 sep 2010, om 13:25 heeft Barry Smith het volgende geschreven: >>> >>>> >>>> Ben >>>> >>>> Ok, I figured out the problem. It is not fundamental and mostly comes from not having a create way to debug this. >>>> >>>> The test vector you create is sequential then you try to view it back to Matlab with the parallel fd viewer. If you change to >>>> ierr = VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,1,&test);CHKERRQ(ierr); >>>> then the code runs. >>>> >>>> I've found (just now) that when I use launch all the output from the .c program gets lost which makes it impossible to figure out what has gone wrong. You can debug by running the two parts of the computation in two different windows. So comment out the launch from the matlab script and then in Matlab run the script (it will hang waiting for the socket to work) and in a separate terminal window run the .c program; for example petscmpiexec -n 2 ./ex1 -info Now you see exactly what is happening in the PETSc program. You can even use -start_in_debugger on the PETSc side to run the debugger on crashes. 
>>>> >>>> I'll add this to the docs for launch >>>> >>>> Barry >>>> >>>> >>>> On Sep 2, 2010, at 3:28 PM, Benjamin Sanderse wrote: >>>> >>>>> Hi Barry, >>>>> >>>>> I attached my matlab file, c file and makefile. First I generate the executable with 'make petsc_poisson_par_barry' and then I run test_petsc_par_barry.m. >>>>> If you change MATMPIAIJ to MATAIJ and VECMPI to VECSEQ the code works fine. >>>>> >>>>> Thanks a lot, >>>>> >>>>> Benjamin >>>>> >>>>> >>>>> >>>>> Op 2 sep 2010, om 13:45 heeft Barry Smith het volgende geschreven: >>>>> >>>>>> >>>>>> >>>>>> Matlab is never aware the vector is parallel. Please send me the code and I'll figure out what is going on. >>>>>> >>>>>> Barry >>>>>> >>>>>> On Sep 2, 2010, at 2:07 PM, Benjamin Sanderse wrote: >>>>>> >>>>>>> That sounds great, but there is one issue I am encountering. I switched vector types to VECMPI and matrix type to MATMPIAIJ, but when running Matlab I get the following error: >>>>>>> >>>>>>> Found unrecogonized header 0 in file. If your file contains complex numbers >>>>>>> then call PetscBinaryRead() with "complex" as the second argument >>>>>>> Error in ==> PetscBinaryRead at 27 >>>>>>> if nargin < 2 >>>>>>> >>>>>>> ??? Output argument "varargout" (and maybe others) not assigned during call to >>>>>>> "/ufs/sanderse/Software/petsc-3.1-p4/bin/matlab/PetscBinaryRead.m>PetscBinaryRead". >>>>>>> >>>>>>> Error in ==> test_petsc_par at 57 >>>>>>> x4 = PetscBinaryReady(PS); >>>>>>> >>>>>>> Could it be that Matlab does not understand the "parallel" vector which is returned by Petsc? Currently I have this done with VecView as follows: >>>>>>> >>>>>>> fd = PETSC_VIEWER_SOCKET_WORLD; >>>>>>> ... >>>>>>> KSPSolve(ksp,b,x); >>>>>>> ... >>>>>>> VecView(fd,x); >>>>>>> >>>>>>> Thanks for the help! >>>>>>> >>>>>>> Ben >>>>>>> >>>>>>> Op 2 sep 2010, om 10:09 heeft Barry Smith het volgende geschreven: >>>>>>> >>>>>>>> >>>>>>>> On Sep 2, 2010, at 10:51 AM, Benjamin Sanderse wrote: >>>>>>>> >>>>>>>>> Hello all, >>>>>>>>> >>>>>>>>> I figured out the coupling with Matlab and I can send back and forth matrices and vectors between Petsc and Matlab. Actually, I send only once a matrix from Matlab to Petsc and then repeatedly send new right hand sides from Matlab->Petsc and the solution vector from Petsc->Matlab. That works great. >>>>>>>>> I know want to see if the matrix that is send from (serial) Matlab to Petsc can be stored as a parallel matrix in Petsc so that subsequent computations with different right hand sides can be performed in parallel by Petsc. Does this simply work by using MatLoad and setting Mattype MPIAIJ? Or is something more fancy required? >>>>>>>> >>>>>>>> In theory this can be done using the same code as sequential only with parallel vectors VECMPI and matrices. MATMPIAIJ >>>>>>>> >>>>>>>> Barry >>>>>>>> >>>>>>>>> >>>>>>>>> Thanks, >>>>>>>>> >>>>>>>>> Ben >>>>>>>> >>>>>>> >>>>>> >>>>> >>>> >>> >> > From bsmith at mcs.anl.gov Tue Sep 7 18:29:47 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Tue, 7 Sep 2010 18:29:47 -0500 Subject: [petsc-users] Error message after call VecSet(R_, zero) In-Reply-To: References: Message-ID: The error messages says it all. You must have calls to VecSetValues() before the VecSet() but not have a VecAssemblyBegin/End(). 
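For concreteness, a minimal sketch of the ordering being described, using the J_ and R_ names from the quoted code (the index and value below are made up purely for illustration; this is not the poster's actual code):

    PetscErrorCode ierr;
    PetscInt       row  = 0;              /* illustrative index, not from the original code */
    PetscScalar    val  = 1.0, zero = 0.0;

    ierr = VecSetValues(R_, 1, &row, &val, INSERT_VALUES);CHKERRQ(ierr);
    /* close the VecSetValues() phase before using the vector again */
    ierr = VecAssemblyBegin(R_);CHKERRQ(ierr);
    ierr = VecAssemblyEnd(R_);CHKERRQ(ierr);

    /* only now is it legal to reset the vector (and the matrix) */
    ierr = VecSet(R_, zero);CHKERRQ(ierr);
    ierr = MatZeroEntries(J_);CHKERRQ(ierr);

Without the VecAssemblyBegin/End() pair the vector is left in the "being assembled" state, which is exactly what the "Object is in wrong state" message is complaining about.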
After your VecSetValues() you always need to have VecAssemblyBegin/End() Barry On Sep 7, 2010, at 5:56 PM, NAN ZHAO wrote: > Dear all, > > I got a strange petsc error when I am running my code with petsc as a linear sover: > [0]PETSC ERROR: --------------------- Error Message ------------------------------------ > [0]PETSC ERROR: Object is in wrong state! > [0]PETSC ERROR: You cannot call this after you have called VecSetValues() but > before you have called VecAssemblyBegin/End()! > [0]PETSC ERROR: ------------------------------------------------------------------------ > [0]PETSC ERROR: Petsc Release Version 2.3.3, Patch 15, Tue Sep 23 10:02:49 CDT 2008 HG revision: 31306062cd1a6f6a2496fccb4878f485c9b91760 > [0]PETSC ERROR: See docs/changes/index.html for recent updates. > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [0]PETSC ERROR: See docs/index.html for manual pages. > I think this error is happening when I want to reset the vector and matrix to zero before solving the Ax=b, > this is my code: > PetscScalar zero = 0.; > > // Reset J, R and Y > cout<<"reset jacobian to zero"< MatZeroEntries(J_); > cout<<"reset R to zero"< VecSet(R_, zero); > cout<<"reset Y to zero"< VecSet(Y_, zero); > > Is anyone have some clue to solve this problem. > > Thanks, > > Nan -------------- next part -------------- An HTML attachment was scrubbed... URL: From makhijad at colorado.edu Tue Sep 7 19:28:07 2010 From: makhijad at colorado.edu (Dave Makhija) Date: Tue, 7 Sep 2010 18:28:07 -0600 Subject: [petsc-users] nnz's in finite element stiffness matrix In-Reply-To: <757D09D9-A885-40CA-8D5D-806ECECB3CCE@purdue.edu> References: <757D09D9-A885-40CA-8D5D-806ECECB3CCE@purdue.edu> Message-ID: I used to have a decent setup that worked as follows: 1. Build node->element table (For a given node, which elements contain this node. You may already have this) OR build a node connectivity table (For a given node, which nodes are connected). 2. Build element->node table (For a given element, which nodes are contained in this element. You probably already have this) 4. Loop over nodes and get the global DOFs contained in that node. For those DOF id rows, add the number of nodal DOFs for each unique node connected to the current node using the node->element and element->node table OR the node connectivity table. 5. Loop over elements and add the number of elemental DOFs to each nodal DOF global id row contained in this element using element->node table. Also add the number of elemental DOF's and the sum of the nodal DOF's to the elemental global DOF id rows. 6. Add contributions of multi-point constraints, i.e. Lagrange multiplier DOF's. Note that in parallel you may have to scatter values to global DOF ids owned by off-processors to get an accurate Onz. This setup as a whole can be pretty fast but can scale poorly if you don't have a good way of developing the node-element table or node connectivity since it requires some loops within loops. Another way is to use the PETSc preallocation macros such as MatPreallocateSet. You can essentially do a "dry run" of a Jacobian Matrix assembly into the preallocation macros. They can be tricky to use, so if you have problems you can simply look at the PETSc documentation for those macros and hand code them yourself. This strategy will overestimate the memory, but a matrix inversion will dwarf how much this will waste. 
I vaguely remember a PETSc archive asking how to free the unneeded memory if this is absolutely necessary, but I don't think anything really worked without a full matrix copy. If someone by chance knows how the Trilinos "OptimizeStorage" routine works for Epetra matricies they could potentially shed some light on how to do this - if it is even possible. Dave Makhija On Tue, Sep 7, 2010 at 2:24 PM, stali wrote: > Petsc-users > > How can I efficiently calculate the _exact_ number of non-zeros that would > be in the global sparse (stiffness) matrix given an unstructured mesh? > > Thanks > From knepley at gmail.com Wed Sep 8 01:29:00 2010 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 8 Sep 2010 08:29:00 +0200 Subject: [petsc-users] nnz's in finite element stiffness matrix In-Reply-To: References: <757D09D9-A885-40CA-8D5D-806ECECB3CCE@purdue.edu> Message-ID: On Wed, Sep 8, 2010 at 2:28 AM, Dave Makhija wrote: > I used to have a decent setup that worked as follows: > > 1. Build node->element table (For a given node, which elements contain > this node. You may already have this) OR build a node connectivity > table (For a given node, which nodes are connected). > 2. Build element->node table (For a given element, which nodes are > contained in this element. You probably already have this) > 4. Loop over nodes and get the global DOFs contained in that node. For > those DOF id rows, add the number of nodal DOFs for each unique node > connected to the current node using the node->element and > element->node table OR the node connectivity table. > 5. Loop over elements and add the number of elemental DOFs to each > nodal DOF global id row contained in this element using element->node > table. Also add the number of elemental DOF's and the sum of the nodal > DOF's to the elemental global DOF id rows. > 6. Add contributions of multi-point constraints, i.e. Lagrange multiplier > DOF's. > > Note that in parallel you may have to scatter values to global DOF ids > owned by off-processors to get an accurate Onz. This setup as a whole > can be pretty fast but can scale poorly if you don't have a good way > of developing the node-element table or node connectivity since it > requires some loops within loops. > > Another way is to use the PETSc preallocation macros such as > MatPreallocateSet. You can essentially do a "dry run" of a Jacobian > Matrix assembly into the preallocation macros. They can be tricky to > use, so if you have problems you can simply look at the PETSc > documentation for those macros and hand code them yourself. This > strategy will overestimate the memory, but a matrix inversion will > dwarf how much this will waste. I vaguely remember a PETSc archive > asking how to free the unneeded memory if this is absolutely > necessary, but I don't think anything really worked without a full > matrix copy. If someone by chance knows how the Trilinos > "OptimizeStorage" routine works for Epetra matricies they could > potentially shed some light on how to do this - if it is even > possible. If you can assemble the Jacobian, you can count the nonzeros. The only subtlety here is nonzeros coming from other processes. However, for finite elements with a normal cell division or finite differences with a normal vertex division, you need only count the nonzeros for rows you own since the partitions overlap on the boundary. Distinguishing the diagonal from off-diagonal block is easy since you know the rows you own. 
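To make the counting concrete, here is a small sketch (not code from this thread) for the simplest case of a P1 triangle mesh with one unknown per node, where the global node number equals the global row number. It uses the node->element and element->node tables described above; the function and argument names, and the O(N) scratch array, are illustrative simplifications rather than a scalable implementation:

    /* Exact d_nnz/o_nnz for the owned rows [rstart,rend).
       node2elem[i] lists the elements incident to owned node rstart+i,
       elem2node[3*e..3*e+2] are the three vertices of triangle e. */
    PetscErrorCode CountStiffnessNonzeros(PetscInt rstart,PetscInt rend,PetscInt Nglobal,
                                          const PetscInt *nincident,PetscInt **node2elem,
                                          const PetscInt *elem2node,
                                          PetscInt *d_nnz,PetscInt *o_nnz)
    {
      PetscInt       i,k,v,row,col,e,*stamp;
      PetscErrorCode ierr;

      PetscFunctionBegin;
      ierr = PetscMalloc(Nglobal*sizeof(PetscInt),&stamp);CHKERRQ(ierr);
      for (col=0; col<Nglobal; col++) stamp[col] = -1;
      for (i=0; i<rend-rstart; i++) {
        row = rstart + i;
        d_nnz[i] = o_nnz[i] = 0;
        for (k=0; k<nincident[i]; k++) {
          e = node2elem[i][k];
          for (v=0; v<3; v++) {
            col = elem2node[3*e+v];
            if (stamp[col] == row) continue;              /* already counted for this row */
            stamp[col] = row;
            if (col >= rstart && col < rend) d_nnz[i]++;  /* diagonal block */
            else                             o_nnz[i]++;  /* off-diagonal block */
          }
        }
      }
      ierr = PetscFree(stamp);CHKERRQ(ierr);
      PetscFunctionReturn(0);
    }

The two arrays can then be handed to MatMPIAIJSetPreallocation(A,0,d_nnz,0,o_nnz) so that the real assembly triggers no further mallocs; more degrees of freedom per node or Lagrange multiplier rows just add the corresponding counts, as in the steps listed above.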
The whole process takes a vanishing small time compared to assembly since you are not constructing the values. Matt > Dave Makhija > > > > On Tue, Sep 7, 2010 at 2:24 PM, stali wrote: > > Petsc-users > > > > How can I efficiently calculate the _exact_ number of non-zeros that > would > > be in the global sparse (stiffness) matrix given an unstructured mesh? > > > > Thanks > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From florian.rathgeber at googlemail.com Wed Sep 8 05:13:27 2010 From: florian.rathgeber at googlemail.com (Florian Rathgeber) Date: Wed, 08 Sep 2010 12:13:27 +0200 Subject: [petsc-users] Divergence termination of KSP iteration despite disabled convergence test Message-ID: <4C8761C7.7050708@gmail.com> -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA1 Hi, I instructed KSP (cg, jacobi) to run for a fixed number of CG iterations (for benchmarking purposes) by disabling convergence testing and norm computation as so: KSPSetConvergenceTest(ksp, KSPSkipConverged, PETSC_NULL, PETSC_NULL); KSPSetNormType(ksp, KSP_NORM_NO); This worked fine for all my tests but Poisson. I tried solving Poisson on P1, P2, and P3 elements with the right hand side 1 and no Dirichlet condition (the result was not supposed to be meaningful, it was just for benchmarking) and every time iteration terminated prematurely with reason KSP_DIVERGED_INDEFINITE_MAT (-10). To me, there are several strange issues with this: 1) I would expect KSPSkipConverged to disable the divergence check as well 2) I don't understand the divergence reason, how would the matrix suddenly become indefinite? 3) Increasing the divergence limit to a very high value (1e50) doesn't change anything Could someone elaborate on these issues? So my question is: how to safely disable this divergence test so that KSP runs for the number of iterations I expect it to run? Thank you for any hints Florian -----BEGIN PGP SIGNATURE----- Version: GnuPG v2.0.12 (MingW32) iEYEARECAAYFAkyHYccACgkQ8Z6llsctAxafpQCgmZsMxvCc5UYEVsO+ibRXLM+j 1cMAoMeBD/iz7JL0VZaz8sCVCqTFrOVE =dMKQ -----END PGP SIGNATURE----- -------------- next part -------------- A non-text attachment was scrubbed... Name: smime.p7s Type: application/pkcs7-signature Size: 3690 bytes Desc: S/MIME Cryptographic Signature URL: From jed at 59A2.org Wed Sep 8 05:28:33 2010 From: jed at 59A2.org (Jed Brown) Date: Wed, 08 Sep 2010 12:28:33 +0200 Subject: [petsc-users] Divergence termination of KSP iteration despite disabled convergence test In-Reply-To: <4C8761C7.7050708@gmail.com> References: <4C8761C7.7050708@gmail.com> Message-ID: <871v94a5ri.fsf@59A2.org> On Wed, 08 Sep 2010 12:13:27 +0200, Florian Rathgeber wrote: > -----BEGIN PGP SIGNED MESSAGE----- > Hash: SHA1 > > Hi, > > I instructed KSP (cg, jacobi) to run for a fixed number of CG iterations > (for benchmarking purposes) by disabling convergence testing and norm > computation as so: > > KSPSetConvergenceTest(ksp, KSPSkipConverged, PETSC_NULL, PETSC_NULL); > KSPSetNormType(ksp, KSP_NORM_NO); > > This worked fine for all my tests but Poisson. I tried solving Poisson > on P1, P2, and P3 elements with the right hand side 1 and no Dirichlet > condition (the result was not supposed to be meaningful, it was just for > benchmarking) and every time iteration terminated prematurely with > reason KSP_DIVERGED_INDEFINITE_MAT (-10). 
This does not come from a convergence test, CG computes the A-inner product and the algorithm doesn't make sense if this is not an inner product (detected when the algorithm computes (x,x)_A = x\cdot A\cdot x <= 0 for some x). > 1) I would expect KSPSkipConverged to disable the divergence check as well > 2) I don't understand the divergence reason, how would the matrix > suddenly become indefinite? Perhaps an assembly problem (didn't assembly what you thought), or (because it sounds like your matrix is singular), you got unlucky in your problem choice so as to find an x such that (x,x)_A = 0. You can check the matrix in a crude way with GMRES and -ksp_compute_eigenvalues, or write it out with -ksp_view_binary, read it in with Matlab/Octave, and check the spectrum there. > So my question is: how to safely disable this divergence test so that > KSP runs for the number of iterations I expect it to run? Change the matrix or change the Krylov method. Jed From B.Sanderse at cwi.nl Wed Sep 8 10:13:50 2010 From: B.Sanderse at cwi.nl (Benjamin Sanderse) Date: Wed, 8 Sep 2010 09:13:50 -0600 Subject: [petsc-users] coupling with Matlab and parallel solution In-Reply-To: References: <1D7DA60F-2616-41F3-BC17-FD797E9A9AF2@cwi.nl> <87zkw1cej4.fsf@59A2.org> <96BE91D1-0EFF-4CF1-B1DA-0C5A2D13BFF3@cwi.nl> <6C3C8604-E079-40DC-ADCC-FACB91A277B2@mcs.anl.gov> <070410A8-7B2E-405E-B65D-0CC8521BB17C@cwi.nl> <209DE1A7-D893-450F-89BC-DFBE7B2F2C5D@cwi.nl> <6A83D0A7-1C42-4A91-909F-6B93509C5BF5@mcs.anl.gov> <61D23634-FF5A-4916-9055-9561F69CC36A@cwi.nl> <30D58A3F-D8D9-4692-8BE9-BB8168201BD3@mcs.anl.gov> <6F879BC3-2E8E-4B30-B4A8-CDB039BD3A1E@cwi.nl> Message-ID: <9587193F-E30F-40F2-BF6A-4B690DA5066B@cwi.nl> Hi Barry, I am indeed closing the socket in Matlab between the two sets, using close(PS), where PS=PetscOpenSocket. I have tried different port numbers, but without guarantee of success. Sometimes it works, sometimes it doesn't. Often times the first time calling the PetscOpenSocket(portnumber) works, but even that is not guaranteed. I think there should be another solution. By the way, all these problems do not appear when using serial vectors instead of parallel. Ben Op 7 sep 2010, om 17:27 heeft Barry Smith het volgende geschreven: > > Are you closing the socket on Matlab between to the two sets? Just checking. > > You can try running with a different port number each time to see if it is related to trying to reuse the port. Run with PetscOpenSocket(5006) and the PETSc program with -viewer_socket_port 5006 > then run both with 5007 then with 5008 etc does this work smoothly? > > Let me know and the will tell me the next step to try, > > Barry > > On Sep 7, 2010, at 10:53 AM, Benjamin Sanderse wrote: > >> Hi Barry, >> >> I am still not too happy with the execution in parallel. I am working under Linux (64 bits) and still using your approach with two command windows (since it gives the best debugging possibility). >> As I said, sometimes things work, but most of the time not. 
Here is the output of two successive runs >> >> -bash-4.0$ petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info >> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >> [1] PetscCommDuplicate(): returning tag 2147483647 >> [0] PetscCommDuplicate(): returning tag 2147483647 >> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl >> [0] PetscCommDuplicate(): returning tag 2147483646 >> [1] PetscCommDuplicate(): returning tag 2147483646 >> [1] PetscCommDuplicate(): returning tag 2147483641 >> [0] PetscCommDuplicate(): returning tag 2147483641 >> [0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs. >> [0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs. >> [1] PetscFinalize(): PetscFinalize() called >> [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850688 >> [1] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 >> [1] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 >> [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374784 >> [0] PetscFinalize(): PetscFinalize() called >> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850688 >> [0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 >> [0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 >> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374784 >> >> >> -bash-4.0$ netstat | grep 5005 >> >> >> -bash-4.0$ petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info >> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >> [0] PetscCommDuplicate(): returning tag 2147483647 >> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl >> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >> [1] PetscCommDuplicate(): returning tag 2147483647 >> [1] PetscCommDuplicate(): returning tag 2147483646 >> [0] PetscCommDuplicate(): returning tag 2147483646 >> [0] PetscCommDuplicate(): returning tag 2147483641 >> [1] PetscCommDuplicate(): returning tag 2147483641 >> [0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs. >> [0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs. >> ^C >> -bash-4.0$ [0]0:Return code = 0, signaled with Interrupt >> [0]1:Return code = 0, signaled with Interrupt >> >> >> In both cases I first started the Matlab program. I am currently starting Matlab without a GUI, but with a GUI I have the same problems. >> As you can see, in the first case everything works fine, and Petsc finalizes and closes. Matlab gives me the correct output. 
The second case, run just a couple of seconds later, does not reach PetscFinalize and Matlab does not give the correct output. In between the two cases I checked if port 5005 was in use, and it was not. >> Do you have any more suggestions on how to get this to work properly? >> >> Benjamin >> >> Op 3 sep 2010, om 21:11 heeft Barry Smith het volgende geschreven: >> >>> >>> On Sep 3, 2010, at 4:32 PM, Benjamin Sanderse wrote: >>> >>>> Hi Barry, >>>> >>>> Thanks for your help! However, there are still some issues left. In other to test things, I simplified the program even more and now I am just sending a vector back and forth: matlab->petsc->matlab: >>>> >>>> fd = PETSC_VIEWER_SOCKET_WORLD; >>>> >>>> // load rhs vector >>>> ierr = VecLoad(fd,VECMPI,&b);CHKERRQ(ierr); >>>> >>>> // send to matlab >>>> ierr = VecView(b,fd);CHKERRQ(ierr); >>>> ierr = VecDestroy(b);CHKERRQ(ierr); >>>> >>>> >>>> - Your approach with two windows works *sometimes*. I removed the 'launch' statement and I executed my program 10 times, the first 2 times worked, and in all other cases I got this: >>>> >>>> petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info >>>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>> [1] PetscCommDuplicate(): returning tag 2147483647 >>>> [0] PetscCommDuplicate(): returning tag 2147483647 >>>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl >>>> [0] PetscOpenSocket(): Connection refused in attaching socket, trying again[0] PetscOpenSocket(): Connection refused in attaching socket, trying again[0] PetscOpenSocket(): Connection refused in attaching socket, trying again >>>> [0] PetscOpenSocket(): Connection refused in attaching socket, trying again^C >>>> -bash-4.0$ [0]0:Return code = 0, signaled with Interrupt >>>> [0]1:Return code = 0, signaled with Interrupt >>>> >>>> Every time I start the program I use close(socket) and clear all in Matlab, so the socket from the previous run should not be present anymore. It seems that the port gets corrupted after a couple of times? Matlab does not respond and I have to kill it and restart it manually. >>> >>> Sometimes when you close a socket connection it doesn't close for a very long time so that if you try to open it again it doesn't work. When it appears the socket can not be used try using netstat | grep 5005 to see if the socket is still active. >>> >>>> >>>> - If I include the launch statement, or just type >>>> system('mpiexec -n 2 ./petsc_poisson_par_barry2 &') >>>> the program never works. >>> >>> Are you sure mpiexec is in the path of system and it is the right one? The problem is that we are kind of cheating with system because we start a new job in the background and have no idea what the output is. Are you using unix and running Matlab on the command line or in a GUI? >>> >>> Barry >>> >>> >>>> >>>> Hope you can figure out what is going wrong. >>>> >>>> Ben >>>> >>>> >>>> Op 3 sep 2010, om 13:25 heeft Barry Smith het volgende geschreven: >>>> >>>>> >>>>> Ben >>>>> >>>>> Ok, I figured out the problem. 
It is not fundamental and mostly comes from not having a create way to debug this. >>>>> >>>>> The test vector you create is sequential then you try to view it back to Matlab with the parallel fd viewer. If you change to >>>>> ierr = VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,1,&test);CHKERRQ(ierr); >>>>> then the code runs. >>>>> >>>>> I've found (just now) that when I use launch all the output from the .c program gets lost which makes it impossible to figure out what has gone wrong. You can debug by running the two parts of the computation in two different windows. So comment out the launch from the matlab script and then in Matlab run the script (it will hang waiting for the socket to work) and in a separate terminal window run the .c program; for example petscmpiexec -n 2 ./ex1 -info Now you see exactly what is happening in the PETSc program. You can even use -start_in_debugger on the PETSc side to run the debugger on crashes. >>>>> >>>>> I'll add this to the docs for launch >>>>> >>>>> Barry >>>>> >>>>> >>>>> On Sep 2, 2010, at 3:28 PM, Benjamin Sanderse wrote: >>>>> >>>>>> Hi Barry, >>>>>> >>>>>> I attached my matlab file, c file and makefile. First I generate the executable with 'make petsc_poisson_par_barry' and then I run test_petsc_par_barry.m. >>>>>> If you change MATMPIAIJ to MATAIJ and VECMPI to VECSEQ the code works fine. >>>>>> >>>>>> Thanks a lot, >>>>>> >>>>>> Benjamin >>>>>> >>>>>> >>>>>> >>>>>> Op 2 sep 2010, om 13:45 heeft Barry Smith het volgende geschreven: >>>>>> >>>>>>> >>>>>>> >>>>>>> Matlab is never aware the vector is parallel. Please send me the code and I'll figure out what is going on. >>>>>>> >>>>>>> Barry >>>>>>> >>>>>>> On Sep 2, 2010, at 2:07 PM, Benjamin Sanderse wrote: >>>>>>> >>>>>>>> That sounds great, but there is one issue I am encountering. I switched vector types to VECMPI and matrix type to MATMPIAIJ, but when running Matlab I get the following error: >>>>>>>> >>>>>>>> Found unrecogonized header 0 in file. If your file contains complex numbers >>>>>>>> then call PetscBinaryRead() with "complex" as the second argument >>>>>>>> Error in ==> PetscBinaryRead at 27 >>>>>>>> if nargin < 2 >>>>>>>> >>>>>>>> ??? Output argument "varargout" (and maybe others) not assigned during call to >>>>>>>> "/ufs/sanderse/Software/petsc-3.1-p4/bin/matlab/PetscBinaryRead.m>PetscBinaryRead". >>>>>>>> >>>>>>>> Error in ==> test_petsc_par at 57 >>>>>>>> x4 = PetscBinaryReady(PS); >>>>>>>> >>>>>>>> Could it be that Matlab does not understand the "parallel" vector which is returned by Petsc? Currently I have this done with VecView as follows: >>>>>>>> >>>>>>>> fd = PETSC_VIEWER_SOCKET_WORLD; >>>>>>>> ... >>>>>>>> KSPSolve(ksp,b,x); >>>>>>>> ... >>>>>>>> VecView(fd,x); >>>>>>>> >>>>>>>> Thanks for the help! >>>>>>>> >>>>>>>> Ben >>>>>>>> >>>>>>>> Op 2 sep 2010, om 10:09 heeft Barry Smith het volgende geschreven: >>>>>>>> >>>>>>>>> >>>>>>>>> On Sep 2, 2010, at 10:51 AM, Benjamin Sanderse wrote: >>>>>>>>> >>>>>>>>>> Hello all, >>>>>>>>>> >>>>>>>>>> I figured out the coupling with Matlab and I can send back and forth matrices and vectors between Petsc and Matlab. Actually, I send only once a matrix from Matlab to Petsc and then repeatedly send new right hand sides from Matlab->Petsc and the solution vector from Petsc->Matlab. That works great. >>>>>>>>>> I know want to see if the matrix that is send from (serial) Matlab to Petsc can be stored as a parallel matrix in Petsc so that subsequent computations with different right hand sides can be performed in parallel by Petsc. 
Does this simply work by using MatLoad and setting Mattype MPIAIJ? Or is something more fancy required? >>>>>>>>> >>>>>>>>> In theory this can be done using the same code as sequential only with parallel vectors VECMPI and matrices. MATMPIAIJ >>>>>>>>> >>>>>>>>> Barry >>>>>>>>> >>>>>>>>>> >>>>>>>>>> Thanks, >>>>>>>>>> >>>>>>>>>> Ben >>>>>>>>> >>>>>>>> >>>>>>> >>>>>> >>>>> >>>> >>> >> > From keita at cray.com Wed Sep 8 12:10:30 2010 From: keita at cray.com (Keita Teranishi) Date: Wed, 8 Sep 2010 12:10:30 -0500 Subject: [petsc-users] PETSC_DLLEXPORT? Message-ID: <5D6E0DF460ACF34C88644E1EA91DCD0D01ABA090F6@CFEXMBX.americas.cray.com> Hi, I have just a quick question. Does "-with-dynamic" option assign any special symbol to PETSC_DLLEXPORT? What is the intention of the use? Thanks, ================================ Keita Teranishi Scientific Library Group Cray, Inc. keita at cray.com ================================ -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed Sep 8 12:14:27 2010 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 8 Sep 2010 19:14:27 +0200 Subject: [petsc-users] PETSC_DLLEXPORT? In-Reply-To: <5D6E0DF460ACF34C88644E1EA91DCD0D01ABA090F6@CFEXMBX.americas.cray.com> References: <5D6E0DF460ACF34C88644E1EA91DCD0D01ABA090F6@CFEXMBX.americas.cray.com> Message-ID: On Wed, Sep 8, 2010 at 7:10 PM, Keita Teranishi wrote: > Hi, > > > > I have just a quick question. Does ??with-dynamic? option assign any > special symbol to PETSC_DLLEXPORT? What is the intention of the use? > Now it is --with-dynamic-loading in dev. It instructs PETSc to load its own library dynamically (and any other specified by the user). Matt > > > Thanks, > > ================================ > Keita Teranishi > Scientific Library Group > Cray, Inc. > keita at cray.com > ================================ > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Wed Sep 8 12:15:59 2010 From: balay at mcs.anl.gov (Satish Balay) Date: Wed, 8 Sep 2010 12:15:59 -0500 (CDT) Subject: [petsc-users] PETSC_DLLEXPORT? In-Reply-To: References: <5D6E0DF460ACF34C88644E1EA91DCD0D01ABA090F6@CFEXMBX.americas.cray.com> Message-ID: On Wed, 8 Sep 2010, Matthew Knepley wrote: > On Wed, Sep 8, 2010 at 7:10 PM, Keita Teranishi wrote: > > > Hi, > > > > > > > > I have just a quick question. Does ??with-dynamic? option assign any > > special symbol to PETSC_DLLEXPORT? What is the intention of the use? > > > > Now it is --with-dynamic-loading in dev. It instructs PETSc to load its own > library dynamically > (and any other specified by the user). PETSC_DLLEXPORT was added as a placeholder for dlls on windows. [But we don't have petsc.dll yet..] 
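For illustration, the usual Windows idiom such a placeholder would expand to once a petsc.dll exists looks roughly like the following (the guard macro names here are made up; this is not the actual PETSc definition):

    #if defined(_WIN32) && defined(BUILDING_SHARED_PETSC)
    #  if defined(petsc_EXPORTS)            /* defined while compiling the DLL itself */
    #    define PETSC_DLLEXPORT __declspec(dllexport)
    #  else                                 /* defined for code linking against the DLL */
    #    define PETSC_DLLEXPORT __declspec(dllimport)
    #  endif
    #else
    #  define PETSC_DLLEXPORT               /* expands to nothing everywhere else */
    #endif

    PETSC_DLLEXPORT PetscErrorCode VecCreate(MPI_Comm,Vec*);   /* decorated declaration */

On non-Windows builds the macro is empty, which is why it can sit in the public headers as a harmless placeholder.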
Satish From bsmith at mcs.anl.gov Wed Sep 8 13:00:28 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Wed, 8 Sep 2010 13:00:28 -0500 Subject: [petsc-users] coupling with Matlab and parallel solution In-Reply-To: <9587193F-E30F-40F2-BF6A-4B690DA5066B@cwi.nl> References: <1D7DA60F-2616-41F3-BC17-FD797E9A9AF2@cwi.nl> <87zkw1cej4.fsf@59A2.org> <96BE91D1-0EFF-4CF1-B1DA-0C5A2D13BFF3@cwi.nl> <6C3C8604-E079-40DC-ADCC-FACB91A277B2@mcs.anl.gov> <070410A8-7B2E-405E-B65D-0CC8521BB17C@cwi.nl> <209DE1A7-D893-450F-89BC-DFBE7B2F2C5D@cwi.nl> <6A83D0A7-1C42-4A91-909F-6B93509C5BF5@mcs.anl.gov> <61D23634-FF5A-4916-9055-9561F69CC36A@cwi.nl> <30D58A3F-D8D9-4692-8BE9-BB8168201BD3@mcs.anl.gov> <6F879BC3-2E8E-4B30-B4A8-CDB039BD3A1E@cwi.nl> <9587193F-E30F-40F2-BF6A-4B690DA5066B@cwi.nl> Message-ID: On Sep 8, 2010, at 10:13 AM, Benjamin Sanderse wrote: > Hi Barry, > > I am indeed closing the socket in Matlab between the two sets, using close(PS), where PS=PetscOpenSocket. > I have tried different port numbers, but without guarantee of success. Sometimes it works, sometimes it doesn't. Often times the first time calling the PetscOpenSocket(portnumber) works, but even that is not guaranteed. I think there should be another solution. > By the way, all these problems do not appear when using serial vectors instead of parallel. That is strange. Only the first process ever opens the socket so in theory the fact that the PETSc code is parallel should not matter at all. Please send me your test code that causes trouble again and I'll see if I can reproduce the problem. Barry > > Ben > > Op 7 sep 2010, om 17:27 heeft Barry Smith het volgende geschreven: > >> >> Are you closing the socket on Matlab between to the two sets? Just checking. >> >> You can try running with a different port number each time to see if it is related to trying to reuse the port. Run with PetscOpenSocket(5006) and the PETSc program with -viewer_socket_port 5006 >> then run both with 5007 then with 5008 etc does this work smoothly? >> >> Let me know and the will tell me the next step to try, >> >> Barry >> >> On Sep 7, 2010, at 10:53 AM, Benjamin Sanderse wrote: >> >>> Hi Barry, >>> >>> I am still not too happy with the execution in parallel. I am working under Linux (64 bits) and still using your approach with two command windows (since it gives the best debugging possibility). >>> As I said, sometimes things work, but most of the time not. Here is the output of two successive runs >>> >>> -bash-4.0$ petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info >>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>> [1] PetscCommDuplicate(): returning tag 2147483647 >>> [0] PetscCommDuplicate(): returning tag 2147483647 >>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl >>> [0] PetscCommDuplicate(): returning tag 2147483646 >>> [1] PetscCommDuplicate(): returning tag 2147483646 >>> [1] PetscCommDuplicate(): returning tag 2147483641 >>> [0] PetscCommDuplicate(): returning tag 2147483641 >>> [0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs. 
>>> [0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs. >>> [1] PetscFinalize(): PetscFinalize() called >>> [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850688 >>> [1] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 >>> [1] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 >>> [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374784 >>> [0] PetscFinalize(): PetscFinalize() called >>> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850688 >>> [0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 >>> [0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 >>> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374784 >>> >>> >>> -bash-4.0$ netstat | grep 5005 >>> >>> >>> -bash-4.0$ petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info >>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>> [0] PetscCommDuplicate(): returning tag 2147483647 >>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl >>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>> [1] PetscCommDuplicate(): returning tag 2147483647 >>> [1] PetscCommDuplicate(): returning tag 2147483646 >>> [0] PetscCommDuplicate(): returning tag 2147483646 >>> [0] PetscCommDuplicate(): returning tag 2147483641 >>> [1] PetscCommDuplicate(): returning tag 2147483641 >>> [0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs. >>> [0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs. >>> ^C >>> -bash-4.0$ [0]0:Return code = 0, signaled with Interrupt >>> [0]1:Return code = 0, signaled with Interrupt >>> >>> >>> In both cases I first started the Matlab program. I am currently starting Matlab without a GUI, but with a GUI I have the same problems. >>> As you can see, in the first case everything works fine, and Petsc finalizes and closes. Matlab gives me the correct output. The second case, run just a couple of seconds later, does not reach PetscFinalize and Matlab does not give the correct output. In between the two cases I checked if port 5005 was in use, and it was not. >>> Do you have any more suggestions on how to get this to work properly? >>> >>> Benjamin >>> >>> Op 3 sep 2010, om 21:11 heeft Barry Smith het volgende geschreven: >>> >>>> >>>> On Sep 3, 2010, at 4:32 PM, Benjamin Sanderse wrote: >>>> >>>>> Hi Barry, >>>>> >>>>> Thanks for your help! However, there are still some issues left. In other to test things, I simplified the program even more and now I am just sending a vector back and forth: matlab->petsc->matlab: >>>>> >>>>> fd = PETSC_VIEWER_SOCKET_WORLD; >>>>> >>>>> // load rhs vector >>>>> ierr = VecLoad(fd,VECMPI,&b);CHKERRQ(ierr); >>>>> >>>>> // send to matlab >>>>> ierr = VecView(b,fd);CHKERRQ(ierr); >>>>> ierr = VecDestroy(b);CHKERRQ(ierr); >>>>> >>>>> >>>>> - Your approach with two windows works *sometimes*. 
I removed the 'launch' statement and I executed my program 10 times, the first 2 times worked, and in all other cases I got this: >>>>> >>>>> petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info >>>>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>> [1] PetscCommDuplicate(): returning tag 2147483647 >>>>> [0] PetscCommDuplicate(): returning tag 2147483647 >>>>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl >>>>> [0] PetscOpenSocket(): Connection refused in attaching socket, trying again[0] PetscOpenSocket(): Connection refused in attaching socket, trying again[0] PetscOpenSocket(): Connection refused in attaching socket, trying again >>>>> [0] PetscOpenSocket(): Connection refused in attaching socket, trying again^C >>>>> -bash-4.0$ [0]0:Return code = 0, signaled with Interrupt >>>>> [0]1:Return code = 0, signaled with Interrupt >>>>> >>>>> Every time I start the program I use close(socket) and clear all in Matlab, so the socket from the previous run should not be present anymore. It seems that the port gets corrupted after a couple of times? Matlab does not respond and I have to kill it and restart it manually. >>>> >>>> Sometimes when you close a socket connection it doesn't close for a very long time so that if you try to open it again it doesn't work. When it appears the socket can not be used try using netstat | grep 5005 to see if the socket is still active. >>>> >>>>> >>>>> - If I include the launch statement, or just type >>>>> system('mpiexec -n 2 ./petsc_poisson_par_barry2 &') >>>>> the program never works. >>>> >>>> Are you sure mpiexec is in the path of system and it is the right one? The problem is that we are kind of cheating with system because we start a new job in the background and have no idea what the output is. Are you using unix and running Matlab on the command line or in a GUI? >>>> >>>> Barry >>>> >>>> >>>>> >>>>> Hope you can figure out what is going wrong. >>>>> >>>>> Ben >>>>> >>>>> >>>>> Op 3 sep 2010, om 13:25 heeft Barry Smith het volgende geschreven: >>>>> >>>>>> >>>>>> Ben >>>>>> >>>>>> Ok, I figured out the problem. It is not fundamental and mostly comes from not having a create way to debug this. >>>>>> >>>>>> The test vector you create is sequential then you try to view it back to Matlab with the parallel fd viewer. If you change to >>>>>> ierr = VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,1,&test);CHKERRQ(ierr); >>>>>> then the code runs. >>>>>> >>>>>> I've found (just now) that when I use launch all the output from the .c program gets lost which makes it impossible to figure out what has gone wrong. You can debug by running the two parts of the computation in two different windows. So comment out the launch from the matlab script and then in Matlab run the script (it will hang waiting for the socket to work) and in a separate terminal window run the .c program; for example petscmpiexec -n 2 ./ex1 -info Now you see exactly what is happening in the PETSc program. You can even use -start_in_debugger on the PETSc side to run the debugger on crashes. 
>>>>>> >>>>>> I'll add this to the docs for launch >>>>>> >>>>>> Barry >>>>>> >>>>>> >>>>>> On Sep 2, 2010, at 3:28 PM, Benjamin Sanderse wrote: >>>>>> >>>>>>> Hi Barry, >>>>>>> >>>>>>> I attached my matlab file, c file and makefile. First I generate the executable with 'make petsc_poisson_par_barry' and then I run test_petsc_par_barry.m. >>>>>>> If you change MATMPIAIJ to MATAIJ and VECMPI to VECSEQ the code works fine. >>>>>>> >>>>>>> Thanks a lot, >>>>>>> >>>>>>> Benjamin >>>>>>> >>>>>>> >>>>>>> >>>>>>> Op 2 sep 2010, om 13:45 heeft Barry Smith het volgende geschreven: >>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> Matlab is never aware the vector is parallel. Please send me the code and I'll figure out what is going on. >>>>>>>> >>>>>>>> Barry >>>>>>>> >>>>>>>> On Sep 2, 2010, at 2:07 PM, Benjamin Sanderse wrote: >>>>>>>> >>>>>>>>> That sounds great, but there is one issue I am encountering. I switched vector types to VECMPI and matrix type to MATMPIAIJ, but when running Matlab I get the following error: >>>>>>>>> >>>>>>>>> Found unrecogonized header 0 in file. If your file contains complex numbers >>>>>>>>> then call PetscBinaryRead() with "complex" as the second argument >>>>>>>>> Error in ==> PetscBinaryRead at 27 >>>>>>>>> if nargin < 2 >>>>>>>>> >>>>>>>>> ??? Output argument "varargout" (and maybe others) not assigned during call to >>>>>>>>> "/ufs/sanderse/Software/petsc-3.1-p4/bin/matlab/PetscBinaryRead.m>PetscBinaryRead". >>>>>>>>> >>>>>>>>> Error in ==> test_petsc_par at 57 >>>>>>>>> x4 = PetscBinaryReady(PS); >>>>>>>>> >>>>>>>>> Could it be that Matlab does not understand the "parallel" vector which is returned by Petsc? Currently I have this done with VecView as follows: >>>>>>>>> >>>>>>>>> fd = PETSC_VIEWER_SOCKET_WORLD; >>>>>>>>> ... >>>>>>>>> KSPSolve(ksp,b,x); >>>>>>>>> ... >>>>>>>>> VecView(fd,x); >>>>>>>>> >>>>>>>>> Thanks for the help! >>>>>>>>> >>>>>>>>> Ben >>>>>>>>> >>>>>>>>> Op 2 sep 2010, om 10:09 heeft Barry Smith het volgende geschreven: >>>>>>>>> >>>>>>>>>> >>>>>>>>>> On Sep 2, 2010, at 10:51 AM, Benjamin Sanderse wrote: >>>>>>>>>> >>>>>>>>>>> Hello all, >>>>>>>>>>> >>>>>>>>>>> I figured out the coupling with Matlab and I can send back and forth matrices and vectors between Petsc and Matlab. Actually, I send only once a matrix from Matlab to Petsc and then repeatedly send new right hand sides from Matlab->Petsc and the solution vector from Petsc->Matlab. That works great. >>>>>>>>>>> I know want to see if the matrix that is send from (serial) Matlab to Petsc can be stored as a parallel matrix in Petsc so that subsequent computations with different right hand sides can be performed in parallel by Petsc. Does this simply work by using MatLoad and setting Mattype MPIAIJ? Or is something more fancy required? >>>>>>>>>> >>>>>>>>>> In theory this can be done using the same code as sequential only with parallel vectors VECMPI and matrices. 
MATMPIAIJ >>>>>>>>>> >>>>>>>>>> Barry >>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> Thanks, >>>>>>>>>>> >>>>>>>>>>> Ben >>>>>>>>>> >>>>>>>>> >>>>>>>> >>>>>>> >>>>>> >>>>> >>>> >>> >> > From B.Sanderse at cwi.nl Wed Sep 8 14:32:39 2010 From: B.Sanderse at cwi.nl (Benjamin Sanderse) Date: Wed, 8 Sep 2010 13:32:39 -0600 Subject: [petsc-users] coupling with Matlab and parallel solution In-Reply-To: References: <1D7DA60F-2616-41F3-BC17-FD797E9A9AF2@cwi.nl> <87zkw1cej4.fsf@59A2.org> <96BE91D1-0EFF-4CF1-B1DA-0C5A2D13BFF3@cwi.nl> <6C3C8604-E079-40DC-ADCC-FACB91A277B2@mcs.anl.gov> <070410A8-7B2E-405E-B65D-0CC8521BB17C@cwi.nl> <209DE1A7-D893-450F-89BC-DFBE7B2F2C5D@cwi.nl> <6A83D0A7-1C42-4A91-909F-6B93509C5BF5@mcs.anl.gov> <61D23634-FF5A-4916-9055-9561F69CC36A@cwi.nl> <30D58A3F-D8D9-4692-8BE9-BB8168201BD3@mcs.anl.gov> <6F879BC3-2E8E-4B30-B4A8-CDB039BD3A1E@cwi.nl> <9587193F-E30F-40F2-BF6A-4B690DA5066B@cwi.nl> Message-ID: That's also what I thought. I checked once again, and I found out that when I use petscmpiexec -n 1 the program works, but if I increase the number of processors to 2 it only works once in a while. I attached my test code. It is extremely simple and does nothing else than passing a vector to petsc and returning it to matlab. I run it as follows: shell #1 -bash-4.0$ make petsc_poisson_par_barry2 shell #2 -bash-4.0$ matlab -nojvm -nodisplay >> test_petsc_par_barry; shell #1 -bash-4.0$ petscmpiexec -n 2 ./petsc_poisson_par_barry2 -viewer_socket_port 5006 -info On lucky days, this works, on unlucky days, petsc will stop here: [1] PetscInitialize(): PETSc successfully started: number of processors = 2 [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl [0] PetscInitialize(): PETSc successfully started: number of processors = 2 [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 [1] PetscCommDuplicate(): returning tag 2147483647 [0] PetscCommDuplicate(): returning tag 2147483647 [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5006 machine borr.mas.cwi.nl [0] PetscCommDuplicate(): returning tag 2147483646 [1] PetscCommDuplicate(): returning tag 2147483646 [1] PetscCommDuplicate(): returning tag 2147483641 [0] PetscCommDuplicate(): returning tag 2147483641 [0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs. [0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs. ^C -------------- next part -------------- A non-text attachment was scrubbed... Name: makefile Type: application/octet-stream Size: 1335 bytes Desc: not available URL: -------------- next part -------------- -------------- next part -------------- A non-text attachment was scrubbed... Name: test_petsc_par_barry.m Type: application/octet-stream Size: 510 bytes Desc: not available URL: -------------- next part -------------- -------------- next part -------------- A non-text attachment was scrubbed... Name: petsc_poisson_par_barry2.c Type: application/octet-stream Size: 1108 bytes Desc: not available URL: -------------- next part -------------- Op 8 sep 2010, om 12:00 heeft Barry Smith het volgende geschreven: > > On Sep 8, 2010, at 10:13 AM, Benjamin Sanderse wrote: > >> Hi Barry, >> >> I am indeed closing the socket in Matlab between the two sets, using close(PS), where PS=PetscOpenSocket. >> I have tried different port numbers, but without guarantee of success. 
Sometimes it works, sometimes it doesn't. Often times the first time calling the PetscOpenSocket(portnumber) works, but even that is not guaranteed. I think there should be another solution. >> By the way, all these problems do not appear when using serial vectors instead of parallel. > > That is strange. Only the first process ever opens the socket so in theory the fact that the PETSc code is parallel should not matter at all. Please send me your test code that causes trouble again and I'll see if I can reproduce the problem. > > Barry > >> >> Ben >> >> Op 7 sep 2010, om 17:27 heeft Barry Smith het volgende geschreven: >> >>> >>> Are you closing the socket on Matlab between to the two sets? Just checking. >>> >>> You can try running with a different port number each time to see if it is related to trying to reuse the port. Run with PetscOpenSocket(5006) and the PETSc program with -viewer_socket_port 5006 >>> then run both with 5007 then with 5008 etc does this work smoothly? >>> >>> Let me know and the will tell me the next step to try, >>> >>> Barry >>> >>> On Sep 7, 2010, at 10:53 AM, Benjamin Sanderse wrote: >>> >>>> Hi Barry, >>>> >>>> I am still not too happy with the execution in parallel. I am working under Linux (64 bits) and still using your approach with two command windows (since it gives the best debugging possibility). >>>> As I said, sometimes things work, but most of the time not. Here is the output of two successive runs >>>> >>>> -bash-4.0$ petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info >>>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>> [1] PetscCommDuplicate(): returning tag 2147483647 >>>> [0] PetscCommDuplicate(): returning tag 2147483647 >>>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl >>>> [0] PetscCommDuplicate(): returning tag 2147483646 >>>> [1] PetscCommDuplicate(): returning tag 2147483646 >>>> [1] PetscCommDuplicate(): returning tag 2147483641 >>>> [0] PetscCommDuplicate(): returning tag 2147483641 >>>> [0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs. >>>> [0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs. 
>>>> [1] PetscFinalize(): PetscFinalize() called >>>> [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850688 >>>> [1] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 >>>> [1] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 >>>> [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374784 >>>> [0] PetscFinalize(): PetscFinalize() called >>>> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850688 >>>> [0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 >>>> [0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 >>>> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374784 >>>> >>>> >>>> -bash-4.0$ netstat | grep 5005 >>>> >>>> >>>> -bash-4.0$ petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info >>>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>> [0] PetscCommDuplicate(): returning tag 2147483647 >>>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl >>>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>> [1] PetscCommDuplicate(): returning tag 2147483647 >>>> [1] PetscCommDuplicate(): returning tag 2147483646 >>>> [0] PetscCommDuplicate(): returning tag 2147483646 >>>> [0] PetscCommDuplicate(): returning tag 2147483641 >>>> [1] PetscCommDuplicate(): returning tag 2147483641 >>>> [0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs. >>>> [0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs. >>>> ^C >>>> -bash-4.0$ [0]0:Return code = 0, signaled with Interrupt >>>> [0]1:Return code = 0, signaled with Interrupt >>>> >>>> >>>> In both cases I first started the Matlab program. I am currently starting Matlab without a GUI, but with a GUI I have the same problems. >>>> As you can see, in the first case everything works fine, and Petsc finalizes and closes. Matlab gives me the correct output. The second case, run just a couple of seconds later, does not reach PetscFinalize and Matlab does not give the correct output. In between the two cases I checked if port 5005 was in use, and it was not. >>>> Do you have any more suggestions on how to get this to work properly? >>>> >>>> Benjamin >>>> >>>> Op 3 sep 2010, om 21:11 heeft Barry Smith het volgende geschreven: >>>> >>>>> >>>>> On Sep 3, 2010, at 4:32 PM, Benjamin Sanderse wrote: >>>>> >>>>>> Hi Barry, >>>>>> >>>>>> Thanks for your help! However, there are still some issues left. In other to test things, I simplified the program even more and now I am just sending a vector back and forth: matlab->petsc->matlab: >>>>>> >>>>>> fd = PETSC_VIEWER_SOCKET_WORLD; >>>>>> >>>>>> // load rhs vector >>>>>> ierr = VecLoad(fd,VECMPI,&b);CHKERRQ(ierr); >>>>>> >>>>>> // send to matlab >>>>>> ierr = VecView(b,fd);CHKERRQ(ierr); >>>>>> ierr = VecDestroy(b);CHKERRQ(ierr); >>>>>> >>>>>> >>>>>> - Your approach with two windows works *sometimes*. 
I removed the 'launch' statement and I executed my program 10 times, the first 2 times worked, and in all other cases I got this: >>>>>> >>>>>> petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info >>>>>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>> [1] PetscCommDuplicate(): returning tag 2147483647 >>>>>> [0] PetscCommDuplicate(): returning tag 2147483647 >>>>>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl >>>>>> [0] PetscOpenSocket(): Connection refused in attaching socket, trying again[0] PetscOpenSocket(): Connection refused in attaching socket, trying again[0] PetscOpenSocket(): Connection refused in attaching socket, trying again >>>>>> [0] PetscOpenSocket(): Connection refused in attaching socket, trying again^C >>>>>> -bash-4.0$ [0]0:Return code = 0, signaled with Interrupt >>>>>> [0]1:Return code = 0, signaled with Interrupt >>>>>> >>>>>> Every time I start the program I use close(socket) and clear all in Matlab, so the socket from the previous run should not be present anymore. It seems that the port gets corrupted after a couple of times? Matlab does not respond and I have to kill it and restart it manually. >>>>> >>>>> Sometimes when you close a socket connection it doesn't close for a very long time so that if you try to open it again it doesn't work. When it appears the socket can not be used try using netstat | grep 5005 to see if the socket is still active. >>>>> >>>>>> >>>>>> - If I include the launch statement, or just type >>>>>> system('mpiexec -n 2 ./petsc_poisson_par_barry2 &') >>>>>> the program never works. >>>>> >>>>> Are you sure mpiexec is in the path of system and it is the right one? The problem is that we are kind of cheating with system because we start a new job in the background and have no idea what the output is. Are you using unix and running Matlab on the command line or in a GUI? >>>>> >>>>> Barry >>>>> >>>>> >>>>>> >>>>>> Hope you can figure out what is going wrong. >>>>>> >>>>>> Ben >>>>>> >>>>>> >>>>>> Op 3 sep 2010, om 13:25 heeft Barry Smith het volgende geschreven: >>>>>> >>>>>>> >>>>>>> Ben >>>>>>> >>>>>>> Ok, I figured out the problem. It is not fundamental and mostly comes from not having a create way to debug this. >>>>>>> >>>>>>> The test vector you create is sequential then you try to view it back to Matlab with the parallel fd viewer. If you change to >>>>>>> ierr = VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,1,&test);CHKERRQ(ierr); >>>>>>> then the code runs. >>>>>>> >>>>>>> I've found (just now) that when I use launch all the output from the .c program gets lost which makes it impossible to figure out what has gone wrong. You can debug by running the two parts of the computation in two different windows. So comment out the launch from the matlab script and then in Matlab run the script (it will hang waiting for the socket to work) and in a separate terminal window run the .c program; for example petscmpiexec -n 2 ./ex1 -info Now you see exactly what is happening in the PETSc program. 
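Put together, the fix described above makes the relevant part of the PETSc-side test look like the following sketch (the VecCreateMPI line is the one quoted verbatim above; the surrounding lines are illustrative petsc-3.1 calling sequences and not the actual ex1 or attached test code):

   Vec            test;
   PetscErrorCode ierr;
   /* create the test vector on PETSC_COMM_WORLD so it matches the parallel socket viewer */
   ierr = VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,1,&test);CHKERRQ(ierr);
   ierr = VecSet(test,1.0);CHKERRQ(ierr);                         /* any test value */
   /* send it back to Matlab through the parallel viewer; note the order: vector first, viewer second */
   ierr = VecView(test,PETSC_VIEWER_SOCKET_WORLD);CHKERRQ(ierr);
   ierr = VecDestroy(test);CHKERRQ(ierr);                         /* petsc-3.1: VecDestroy takes the Vec itself */

A sequential vector (created with VecCreateSeq on PETSC_COMM_SELF) viewed through PETSC_VIEWER_SOCKET_WORLD is exactly the combination that failed here.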
You can even use -start_in_debugger on the PETSc side to run the debugger on crashes. >>>>>>> >>>>>>> I'll add this to the docs for launch >>>>>>> >>>>>>> Barry >>>>>>> >>>>>>> >>>>>>> On Sep 2, 2010, at 3:28 PM, Benjamin Sanderse wrote: >>>>>>> >>>>>>>> Hi Barry, >>>>>>>> >>>>>>>> I attached my matlab file, c file and makefile. First I generate the executable with 'make petsc_poisson_par_barry' and then I run test_petsc_par_barry.m. >>>>>>>> If you change MATMPIAIJ to MATAIJ and VECMPI to VECSEQ the code works fine. >>>>>>>> >>>>>>>> Thanks a lot, >>>>>>>> >>>>>>>> Benjamin >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> Op 2 sep 2010, om 13:45 heeft Barry Smith het volgende geschreven: >>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> Matlab is never aware the vector is parallel. Please send me the code and I'll figure out what is going on. >>>>>>>>> >>>>>>>>> Barry >>>>>>>>> >>>>>>>>> On Sep 2, 2010, at 2:07 PM, Benjamin Sanderse wrote: >>>>>>>>> >>>>>>>>>> That sounds great, but there is one issue I am encountering. I switched vector types to VECMPI and matrix type to MATMPIAIJ, but when running Matlab I get the following error: >>>>>>>>>> >>>>>>>>>> Found unrecogonized header 0 in file. If your file contains complex numbers >>>>>>>>>> then call PetscBinaryRead() with "complex" as the second argument >>>>>>>>>> Error in ==> PetscBinaryRead at 27 >>>>>>>>>> if nargin < 2 >>>>>>>>>> >>>>>>>>>> ??? Output argument "varargout" (and maybe others) not assigned during call to >>>>>>>>>> "/ufs/sanderse/Software/petsc-3.1-p4/bin/matlab/PetscBinaryRead.m>PetscBinaryRead". >>>>>>>>>> >>>>>>>>>> Error in ==> test_petsc_par at 57 >>>>>>>>>> x4 = PetscBinaryReady(PS); >>>>>>>>>> >>>>>>>>>> Could it be that Matlab does not understand the "parallel" vector which is returned by Petsc? Currently I have this done with VecView as follows: >>>>>>>>>> >>>>>>>>>> fd = PETSC_VIEWER_SOCKET_WORLD; >>>>>>>>>> ... >>>>>>>>>> KSPSolve(ksp,b,x); >>>>>>>>>> ... >>>>>>>>>> VecView(fd,x); >>>>>>>>>> >>>>>>>>>> Thanks for the help! >>>>>>>>>> >>>>>>>>>> Ben >>>>>>>>>> >>>>>>>>>> Op 2 sep 2010, om 10:09 heeft Barry Smith het volgende geschreven: >>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> On Sep 2, 2010, at 10:51 AM, Benjamin Sanderse wrote: >>>>>>>>>>> >>>>>>>>>>>> Hello all, >>>>>>>>>>>> >>>>>>>>>>>> I figured out the coupling with Matlab and I can send back and forth matrices and vectors between Petsc and Matlab. Actually, I send only once a matrix from Matlab to Petsc and then repeatedly send new right hand sides from Matlab->Petsc and the solution vector from Petsc->Matlab. That works great. >>>>>>>>>>>> I know want to see if the matrix that is send from (serial) Matlab to Petsc can be stored as a parallel matrix in Petsc so that subsequent computations with different right hand sides can be performed in parallel by Petsc. Does this simply work by using MatLoad and setting Mattype MPIAIJ? Or is something more fancy required? >>>>>>>>>>> >>>>>>>>>>> In theory this can be done using the same code as sequential only with parallel vectors VECMPI and matrices. 
MATMPIAIJ >>>>>>>>>>> >>>>>>>>>>> Barry >>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> Thanks, >>>>>>>>>>>> >>>>>>>>>>>> Ben >>>>>>>>>>> >>>>>>>>>> >>>>>>>>> >>>>>>>> >>>>>>> >>>>>> >>>>> >>>> >>> >> > From andreas.hauffe at tu-dresden.de Thu Sep 9 09:46:43 2010 From: andreas.hauffe at tu-dresden.de (Andreas Hauffe) Date: Thu, 9 Sep 2010 16:46:43 +0200 Subject: [petsc-users] Error during matrix output Message-ID: <201009091646.44006.andreas.hauffe@tu-dresden.de> Hi, when I try to write a matrix to a binary file, I get the following error message. Something's wrong with the matrix. What does this mean? [0]PETSC ERROR: --------------------- Error Message ------------------------------------ [0]PETSC ERROR: Petsc has generated inconsistent data! [0]PETSC ERROR: bi[mbs]: 1589093 != 2*a->nz-mbs: 1573730 ! [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: Petsc Release Version 3.1.0, Patch 4, Fri Jul 30 14:42:02 CDT 2010 [0]PETSC ERROR: See docs/changes/index.html for recent updates. [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [0]PETSC ERROR: See docs/index.html for manual pages. Thanks, -- Andreas Hauffe ---------------------------------------------------------------------------------------------------- Technische Universit?t Dresden Institut f?r Luft- und Raumfahrttechnik / Institute of Aerospace Engineering Lehrstuhl f?r Luftfahrzeugtechnik / Chair of Aircraft Engineering D-01062 Dresden Germany phone : (++49)351 463 38496 fax : (++49)351 463 37263 mail : andreas.hauffe at tu-dresden.de Website : http://tu-dresden.de/mw/ilr/lft ---------------------------------------------------------------------------------------------------- From knepley at gmail.com Thu Sep 9 09:56:15 2010 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 9 Sep 2010 16:56:15 +0200 Subject: [petsc-users] Error during matrix output In-Reply-To: <201009091646.44006.andreas.hauffe@tu-dresden.de> References: <201009091646.44006.andreas.hauffe@tu-dresden.de> Message-ID: On Thu, Sep 9, 2010 at 4:46 PM, Andreas Hauffe wrote: > Hi, > > when I try to write a matrix to a binary file, I get the following error > message. Something's wrong with the matrix. What does this mean? > Please send the entire error message (it has a stack trace at the end). Matt > [0]PETSC ERROR: --------------------- Error Message > ------------------------------------ > [0]PETSC ERROR: Petsc has generated inconsistent data! > [0]PETSC ERROR: bi[mbs]: 1589093 != 2*a->nz-mbs: 1573730 > ! > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: Petsc Release Version 3.1.0, Patch 4, Fri Jul 30 14:42:02 > CDT > 2010 > [0]PETSC ERROR: See docs/changes/index.html for recent updates. > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [0]PETSC ERROR: See docs/index.html for manual pages. 
> > Thanks, > -- > Andreas Hauffe > > > ---------------------------------------------------------------------------------------------------- > Technische Universit?t Dresden > Institut f?r Luft- und Raumfahrttechnik / Institute of Aerospace > Engineering > Lehrstuhl f?r Luftfahrzeugtechnik / Chair of Aircraft Engineering > > D-01062 Dresden > Germany > > phone : (++49)351 463 38496 > fax : (++49)351 463 37263 > mail : andreas.hauffe at tu-dresden.de > Website : http://tu-dresden.de/mw/ilr/lft > > ---------------------------------------------------------------------------------------------------- > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Thu Sep 9 10:59:14 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Thu, 9 Sep 2010 10:59:14 -0500 Subject: [petsc-users] Error during matrix output In-Reply-To: <201009091646.44006.andreas.hauffe@tu-dresden.de> References: <201009091646.44006.andreas.hauffe@tu-dresden.de> Message-ID: Is it possible that your matrix does not have some diagonal entries? Barry On Sep 9, 2010, at 9:46 AM, Andreas Hauffe wrote: > Hi, > > when I try to write a matrix to a binary file, I get the following error > message. Something's wrong with the matrix. What does this mean? > > [0]PETSC ERROR: --------------------- Error Message > ------------------------------------ > [0]PETSC ERROR: Petsc has generated inconsistent data! > [0]PETSC ERROR: bi[mbs]: 1589093 != 2*a->nz-mbs: 1573730 > ! > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: Petsc Release Version 3.1.0, Patch 4, Fri Jul 30 14:42:02 CDT > 2010 > [0]PETSC ERROR: See docs/changes/index.html for recent updates. > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [0]PETSC ERROR: See docs/index.html for manual pages. > > Thanks, > -- > Andreas Hauffe > > ---------------------------------------------------------------------------------------------------- > Technische Universit?t Dresden > Institut f?r Luft- und Raumfahrttechnik / Institute of Aerospace Engineering > Lehrstuhl f?r Luftfahrzeugtechnik / Chair of Aircraft Engineering > > D-01062 Dresden > Germany > > phone : (++49)351 463 38496 > fax : (++49)351 463 37263 > mail : andreas.hauffe at tu-dresden.de > Website : http://tu-dresden.de/mw/ilr/lft > ---------------------------------------------------------------------------------------------------- From abhyshr at mcs.anl.gov Thu Sep 9 11:14:19 2010 From: abhyshr at mcs.anl.gov (Shri) Date: Thu, 9 Sep 2010 10:14:19 -0600 (GMT-06:00) Subject: [petsc-users] Fwd: Re: Question on zero sized DAs Message-ID: <868883873.632091284048859958.JavaMail.root@zimbra.anl.gov> Barry, Sorry i forgot to mention that i'm using petsc-3.1 and not petsc-dev. I'll add your change to petsc-3.1. Thanks, Shri ----- Barry Smith wrote: > > Pull and recompile in src/dm/da/src/ then try running again and report any problems to petsc-maint > Do you have a stencil width greater than 0? > If you are using a stencil width of zero we can try to bypass this error check and see if things go through. 
BTW: this is a petsc-maint question :-) or petsc-users Barry > > On Sep 8, 2010, at 8:56 PM, Shri wrote: > > > I have three DAs da1,da2, and da3 (1-dimensional DAs) for my power system project where da1 is for the generator subsystem, da2 is for the network subsystem, and da3 is for the load subsystem. Here,the load subsystem size is zero and hence i need to create da3 with zero nodes. However,DACreate1D does not allow zero nodes or zero degrees of freedom and gives an error > > > > [0]PETSC ERROR: --------------------- Error Message ------------------------------------ > > [0]PETSC ERROR: Argument out of range! > > [0]PETSC ERROR: More processors than data points! 1 0! > > [0]PETSC ERROR: ------------------------------------------------------------------------ > > [0]PETSC ERROR: Petsc Release Version 3.1.0, Patch 4, unknown > > [0]PETSC ERROR: See docs/changes/index.html for recent updates. > > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > > [0]PETSC ERROR: See docs/index.html for manual pages. > > [0]PETSC ERROR: ------------------------------------------------------------------------ > > [0]PETSC ERROR: /Users/Shri/TS-dev/TS1 on a osx-debug named dhcp55.swwn2.iit.edu by Shri Wed Sep 8 19:56:14 2010 > > [0]PETSC ERROR: Libraries linked from /Users/Shri/petsc-3.1/osx-debug/lib > > [0]PETSC ERROR: Configure run at Fri Jul 23 16:06:56 2010 > > [0]PETSC ERROR: Configure options --with-mpi-dir=/Users/Shri/packages/install/mpich2 --with-clanguage=cxx > > [0]PETSC ERROR: ------------------------------------------------------------------------ > > [0]PETSC ERROR: DACreate_1D() line 152 in src/dm/da/src/da1.c > > [0]PETSC ERROR: DASetType() line 48 in src/dm/da/src/dareg.c > > [0]PETSC ERROR: DASetTypeFromOptions_Private() line 65 in src/dm/da/src/dacreate.c > > [0]PETSC ERROR: DASetFromOptions() line 131 in src/dm/da/src/dacreate.c > > [0]PETSC ERROR: DACreate1d() line 393 in src/dm/da/src/da1.c > > [0]PETSC ERROR: CreateLoadSysDA() line 80 in createsubsysda.c > > [0]PETSC ERROR: main() line 64 in TS1.c > > application called MPI_Abort(MPI_COMM_WORLD, 63) - process 0 > > > > 1) Is it possible to create 1d DAs with zero nodes?? > > 2) I can try using vectors instead of DAs for the three subsystems since zero sized vectors can be created. However, i want to pack these three subsystem vectors later on similar to what the DMComposite object does. Did PETSc have a vector packer object similar to DMComposite in any of the previous releases? > > > > Thanks, > > Shri > From bsmith at mcs.anl.gov Thu Sep 9 13:41:33 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Thu, 9 Sep 2010 13:41:33 -0500 Subject: [petsc-users] Error during matrix output In-Reply-To: <201009091646.44006.andreas.hauffe@tu-dresden.de> References: <201009091646.44006.andreas.hauffe@tu-dresden.de> Message-ID: This was our error in the code, it did not handle sbaij matrices correct that had missing diagonal entries. It will be fixed in our next patch release (it is also fixed in petsc-dev). I've attached a new src/mat/impls/sbaij/seq/aijsbaij.c just drop it in that directory and run make in that directory and it should work. Any additional problems please report to petsc-maint at mcs.anl.gov Barry On Sep 9, 2010, at 9:46 AM, Andreas Hauffe wrote: > Hi, > > when I try to write a matrix to a binary file, I get the following error > message. Something's wrong with the matrix. What does this mean? 
> > [0]PETSC ERROR: --------------------- Error Message > ------------------------------------ > [0]PETSC ERROR: Petsc has generated inconsistent data! > [0]PETSC ERROR: bi[mbs]: 1589093 != 2*a->nz-mbs: 1573730 > ! > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: Petsc Release Version 3.1.0, Patch 4, Fri Jul 30 14:42:02 CDT > 2010 > [0]PETSC ERROR: See docs/changes/index.html for recent updates. > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [0]PETSC ERROR: See docs/index.html for manual pages. > > Thanks, > -- > Andreas Hauffe > > ---------------------------------------------------------------------------------------------------- > Technische Universit?t Dresden > Institut f?r Luft- und Raumfahrttechnik / Institute of Aerospace Engineering > Lehrstuhl f?r Luftfahrzeugtechnik / Chair of Aircraft Engineering > > D-01062 Dresden > Germany > > phone : (++49)351 463 38496 > fax : (++49)351 463 37263 > mail : andreas.hauffe at tu-dresden.de > Website : http://tu-dresden.de/mw/ilr/lft > ---------------------------------------------------------------------------------------------------- -------------- next part -------------- A non-text attachment was scrubbed... Name: aijsbaij.c Type: application/octet-stream Size: 9916 bytes Desc: not available URL: From bsmith at mcs.anl.gov Thu Sep 9 14:10:47 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Thu, 9 Sep 2010 14:10:47 -0500 Subject: [petsc-users] coupling with Matlab and parallel solution In-Reply-To: References: <1D7DA60F-2616-41F3-BC17-FD797E9A9AF2@cwi.nl> <87zkw1cej4.fsf@59A2.org> <96BE91D1-0EFF-4CF1-B1DA-0C5A2D13BFF3@cwi.nl> <6C3C8604-E079-40DC-ADCC-FACB91A277B2@mcs.anl.gov> <070410A8-7B2E-405E-B65D-0CC8521BB17C@cwi.nl> <209DE1A7-D893-450F-89BC-DFBE7B2F2C5D@cwi.nl> <6A83D0A7-1C42-4A91-909F-6B93509C5BF5@mcs.anl.gov> <61D23634-FF5A-4916-9055-9561F69CC36A@cwi.nl> <30D58A3F-D8D9-4692-8BE9-BB8168201BD3@mcs.anl.gov> <6F879BC3-2E8E-4B30-B4A8-CDB039BD3A1E@cwi.nl> <9587193F-E30F-40F2-BF6A-4B690DA5066B@cwi.nl> Message-ID: What OS are you using. On my Apple Mac I made a shell script loop calling petsc_poisson_par_barry2 multiple times and a similar loop in Matlab and start then both off (with parallel PETSc runs). It runs flawlessly, opening the socket sending and receiving, dozens of times in a row with several processes. I think that maybe you are using Linux? Barry On Sep 8, 2010, at 2:32 PM, Benjamin Sanderse wrote: > That's also what I thought. I checked once again, and I found out that when I use > > petscmpiexec -n 1 > > the program works, but if I increase the number of processors to 2 it only works once in a while. > > I attached my test code. It is extremely simple and does nothing else than passing a vector to petsc and returning it to matlab. 
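For reference, a minimal sketch of such a pass-through test, assembled from the calls quoted in this thread (petsc-3.1 calling sequences; illustrative only, not the actual attached file):

static char help[] = "Receives a vector from Matlab over a socket and sends it straight back.\n";

#include "petscvec.h"

int main(int argc,char **argv)
{
  PetscErrorCode ierr;
  PetscViewer    fd;
  Vec            b;

  ierr = PetscInitialize(&argc,&argv,PETSC_NULL,help);CHKERRQ(ierr);
  fd   = PETSC_VIEWER_SOCKET_WORLD;             /* connects to the port opened on the Matlab side with PetscOpenSocket */
  ierr = VecLoad(fd,VECMPI,&b);CHKERRQ(ierr);   /* read the incoming vector as a parallel (MPI) vector */
  ierr = VecView(b,fd);CHKERRQ(ierr);           /* echo it back; note the order: vector first, viewer second */
  ierr = VecDestroy(b);CHKERRQ(ierr);           /* petsc-3.1: VecDestroy takes the Vec itself */
  ierr = PetscFinalize();CHKERRQ(ierr);
  return 0;
}

On the Matlab side this pairs with PetscOpenSocket to open the connection, PetscBinaryRead to receive the echoed vector, and close on the socket afterwards, as in the runs described below.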
> > I run it as follows: > > shell #1 > -bash-4.0$ make petsc_poisson_par_barry2 > > shell #2 > -bash-4.0$ matlab -nojvm -nodisplay >>> test_petsc_par_barry; > > shell #1 > -bash-4.0$ petscmpiexec -n 2 ./petsc_poisson_par_barry2 -viewer_socket_port 5006 -info > > On lucky days, this works, on unlucky days, petsc will stop here: > > [1] PetscInitialize(): PETSc successfully started: number of processors = 2 > [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl > [0] PetscInitialize(): PETSc successfully started: number of processors = 2 > [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl > [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 > [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 > [1] PetscCommDuplicate(): returning tag 2147483647 > [0] PetscCommDuplicate(): returning tag 2147483647 > [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5006 machine borr.mas.cwi.nl > [0] PetscCommDuplicate(): returning tag 2147483646 > [1] PetscCommDuplicate(): returning tag 2147483646 > [1] PetscCommDuplicate(): returning tag 2147483641 > [0] PetscCommDuplicate(): returning tag 2147483641 > [0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs. > [0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs. > ^C > > > > > > > > Op 8 sep 2010, om 12:00 heeft Barry Smith het volgende geschreven: > >> >> On Sep 8, 2010, at 10:13 AM, Benjamin Sanderse wrote: >> >>> Hi Barry, >>> >>> I am indeed closing the socket in Matlab between the two sets, using close(PS), where PS=PetscOpenSocket. >>> I have tried different port numbers, but without guarantee of success. Sometimes it works, sometimes it doesn't. Often times the first time calling the PetscOpenSocket(portnumber) works, but even that is not guaranteed. I think there should be another solution. >>> By the way, all these problems do not appear when using serial vectors instead of parallel. >> >> That is strange. Only the first process ever opens the socket so in theory the fact that the PETSc code is parallel should not matter at all. Please send me your test code that causes trouble again and I'll see if I can reproduce the problem. >> >> Barry >> >>> >>> Ben >>> >>> Op 7 sep 2010, om 17:27 heeft Barry Smith het volgende geschreven: >>> >>>> >>>> Are you closing the socket on Matlab between to the two sets? Just checking. >>>> >>>> You can try running with a different port number each time to see if it is related to trying to reuse the port. Run with PetscOpenSocket(5006) and the PETSc program with -viewer_socket_port 5006 >>>> then run both with 5007 then with 5008 etc does this work smoothly? >>>> >>>> Let me know and the will tell me the next step to try, >>>> >>>> Barry >>>> >>>> On Sep 7, 2010, at 10:53 AM, Benjamin Sanderse wrote: >>>> >>>>> Hi Barry, >>>>> >>>>> I am still not too happy with the execution in parallel. I am working under Linux (64 bits) and still using your approach with two command windows (since it gives the best debugging possibility). >>>>> As I said, sometimes things work, but most of the time not. 
Here is the output of two successive runs >>>>> >>>>> -bash-4.0$ petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info >>>>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>> [1] PetscCommDuplicate(): returning tag 2147483647 >>>>> [0] PetscCommDuplicate(): returning tag 2147483647 >>>>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl >>>>> [0] PetscCommDuplicate(): returning tag 2147483646 >>>>> [1] PetscCommDuplicate(): returning tag 2147483646 >>>>> [1] PetscCommDuplicate(): returning tag 2147483641 >>>>> [0] PetscCommDuplicate(): returning tag 2147483641 >>>>> [0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs. >>>>> [0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs. >>>>> [1] PetscFinalize(): PetscFinalize() called >>>>> [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850688 >>>>> [1] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 >>>>> [1] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 >>>>> [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374784 >>>>> [0] PetscFinalize(): PetscFinalize() called >>>>> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850688 >>>>> [0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 >>>>> [0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 >>>>> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374784 >>>>> >>>>> >>>>> -bash-4.0$ netstat | grep 5005 >>>>> >>>>> >>>>> -bash-4.0$ petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info >>>>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>> [0] PetscCommDuplicate(): returning tag 2147483647 >>>>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl >>>>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>> [1] PetscCommDuplicate(): returning tag 2147483647 >>>>> [1] PetscCommDuplicate(): returning tag 2147483646 >>>>> [0] PetscCommDuplicate(): returning tag 2147483646 >>>>> [0] PetscCommDuplicate(): returning tag 2147483641 >>>>> [1] PetscCommDuplicate(): returning tag 2147483641 >>>>> [0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs. >>>>> [0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs. >>>>> ^C >>>>> -bash-4.0$ [0]0:Return code = 0, signaled with Interrupt >>>>> [0]1:Return code = 0, signaled with Interrupt >>>>> >>>>> >>>>> In both cases I first started the Matlab program. I am currently starting Matlab without a GUI, but with a GUI I have the same problems. 
>>>>> As you can see, in the first case everything works fine, and Petsc finalizes and closes. Matlab gives me the correct output. The second case, run just a couple of seconds later, does not reach PetscFinalize and Matlab does not give the correct output. In between the two cases I checked if port 5005 was in use, and it was not. >>>>> Do you have any more suggestions on how to get this to work properly? >>>>> >>>>> Benjamin >>>>> >>>>> Op 3 sep 2010, om 21:11 heeft Barry Smith het volgende geschreven: >>>>> >>>>>> >>>>>> On Sep 3, 2010, at 4:32 PM, Benjamin Sanderse wrote: >>>>>> >>>>>>> Hi Barry, >>>>>>> >>>>>>> Thanks for your help! However, there are still some issues left. In other to test things, I simplified the program even more and now I am just sending a vector back and forth: matlab->petsc->matlab: >>>>>>> >>>>>>> fd = PETSC_VIEWER_SOCKET_WORLD; >>>>>>> >>>>>>> // load rhs vector >>>>>>> ierr = VecLoad(fd,VECMPI,&b);CHKERRQ(ierr); >>>>>>> >>>>>>> // send to matlab >>>>>>> ierr = VecView(b,fd);CHKERRQ(ierr); >>>>>>> ierr = VecDestroy(b);CHKERRQ(ierr); >>>>>>> >>>>>>> >>>>>>> - Your approach with two windows works *sometimes*. I removed the 'launch' statement and I executed my program 10 times, the first 2 times worked, and in all other cases I got this: >>>>>>> >>>>>>> petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info >>>>>>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>> [1] PetscCommDuplicate(): returning tag 2147483647 >>>>>>> [0] PetscCommDuplicate(): returning tag 2147483647 >>>>>>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl >>>>>>> [0] PetscOpenSocket(): Connection refused in attaching socket, trying again[0] PetscOpenSocket(): Connection refused in attaching socket, trying again[0] PetscOpenSocket(): Connection refused in attaching socket, trying again >>>>>>> [0] PetscOpenSocket(): Connection refused in attaching socket, trying again^C >>>>>>> -bash-4.0$ [0]0:Return code = 0, signaled with Interrupt >>>>>>> [0]1:Return code = 0, signaled with Interrupt >>>>>>> >>>>>>> Every time I start the program I use close(socket) and clear all in Matlab, so the socket from the previous run should not be present anymore. It seems that the port gets corrupted after a couple of times? Matlab does not respond and I have to kill it and restart it manually. >>>>>> >>>>>> Sometimes when you close a socket connection it doesn't close for a very long time so that if you try to open it again it doesn't work. When it appears the socket can not be used try using netstat | grep 5005 to see if the socket is still active. >>>>>> >>>>>>> >>>>>>> - If I include the launch statement, or just type >>>>>>> system('mpiexec -n 2 ./petsc_poisson_par_barry2 &') >>>>>>> the program never works. >>>>>> >>>>>> Are you sure mpiexec is in the path of system and it is the right one? The problem is that we are kind of cheating with system because we start a new job in the background and have no idea what the output is. Are you using unix and running Matlab on the command line or in a GUI? 
>>>>>> >>>>>> Barry >>>>>> >>>>>> >>>>>>> >>>>>>> Hope you can figure out what is going wrong. >>>>>>> >>>>>>> Ben >>>>>>> >>>>>>> >>>>>>> Op 3 sep 2010, om 13:25 heeft Barry Smith het volgende geschreven: >>>>>>> >>>>>>>> >>>>>>>> Ben >>>>>>>> >>>>>>>> Ok, I figured out the problem. It is not fundamental and mostly comes from not having a create way to debug this. >>>>>>>> >>>>>>>> The test vector you create is sequential then you try to view it back to Matlab with the parallel fd viewer. If you change to >>>>>>>> ierr = VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,1,&test);CHKERRQ(ierr); >>>>>>>> then the code runs. >>>>>>>> >>>>>>>> I've found (just now) that when I use launch all the output from the .c program gets lost which makes it impossible to figure out what has gone wrong. You can debug by running the two parts of the computation in two different windows. So comment out the launch from the matlab script and then in Matlab run the script (it will hang waiting for the socket to work) and in a separate terminal window run the .c program; for example petscmpiexec -n 2 ./ex1 -info Now you see exactly what is happening in the PETSc program. You can even use -start_in_debugger on the PETSc side to run the debugger on crashes. >>>>>>>> >>>>>>>> I'll add this to the docs for launch >>>>>>>> >>>>>>>> Barry >>>>>>>> >>>>>>>> >>>>>>>> On Sep 2, 2010, at 3:28 PM, Benjamin Sanderse wrote: >>>>>>>> >>>>>>>>> Hi Barry, >>>>>>>>> >>>>>>>>> I attached my matlab file, c file and makefile. First I generate the executable with 'make petsc_poisson_par_barry' and then I run test_petsc_par_barry.m. >>>>>>>>> If you change MATMPIAIJ to MATAIJ and VECMPI to VECSEQ the code works fine. >>>>>>>>> >>>>>>>>> Thanks a lot, >>>>>>>>> >>>>>>>>> Benjamin >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> Op 2 sep 2010, om 13:45 heeft Barry Smith het volgende geschreven: >>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> Matlab is never aware the vector is parallel. Please send me the code and I'll figure out what is going on. >>>>>>>>>> >>>>>>>>>> Barry >>>>>>>>>> >>>>>>>>>> On Sep 2, 2010, at 2:07 PM, Benjamin Sanderse wrote: >>>>>>>>>> >>>>>>>>>>> That sounds great, but there is one issue I am encountering. I switched vector types to VECMPI and matrix type to MATMPIAIJ, but when running Matlab I get the following error: >>>>>>>>>>> >>>>>>>>>>> Found unrecogonized header 0 in file. If your file contains complex numbers >>>>>>>>>>> then call PetscBinaryRead() with "complex" as the second argument >>>>>>>>>>> Error in ==> PetscBinaryRead at 27 >>>>>>>>>>> if nargin < 2 >>>>>>>>>>> >>>>>>>>>>> ??? Output argument "varargout" (and maybe others) not assigned during call to >>>>>>>>>>> "/ufs/sanderse/Software/petsc-3.1-p4/bin/matlab/PetscBinaryRead.m>PetscBinaryRead". >>>>>>>>>>> >>>>>>>>>>> Error in ==> test_petsc_par at 57 >>>>>>>>>>> x4 = PetscBinaryReady(PS); >>>>>>>>>>> >>>>>>>>>>> Could it be that Matlab does not understand the "parallel" vector which is returned by Petsc? Currently I have this done with VecView as follows: >>>>>>>>>>> >>>>>>>>>>> fd = PETSC_VIEWER_SOCKET_WORLD; >>>>>>>>>>> ... >>>>>>>>>>> KSPSolve(ksp,b,x); >>>>>>>>>>> ... >>>>>>>>>>> VecView(fd,x); >>>>>>>>>>> >>>>>>>>>>> Thanks for the help! 
>>>>>>>>>>> >>>>>>>>>>> Ben >>>>>>>>>>> >>>>>>>>>>> Op 2 sep 2010, om 10:09 heeft Barry Smith het volgende geschreven: >>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> On Sep 2, 2010, at 10:51 AM, Benjamin Sanderse wrote: >>>>>>>>>>>> >>>>>>>>>>>>> Hello all, >>>>>>>>>>>>> >>>>>>>>>>>>> I figured out the coupling with Matlab and I can send back and forth matrices and vectors between Petsc and Matlab. Actually, I send only once a matrix from Matlab to Petsc and then repeatedly send new right hand sides from Matlab->Petsc and the solution vector from Petsc->Matlab. That works great. >>>>>>>>>>>>> I know want to see if the matrix that is send from (serial) Matlab to Petsc can be stored as a parallel matrix in Petsc so that subsequent computations with different right hand sides can be performed in parallel by Petsc. Does this simply work by using MatLoad and setting Mattype MPIAIJ? Or is something more fancy required? >>>>>>>>>>>> >>>>>>>>>>>> In theory this can be done using the same code as sequential only with parallel vectors VECMPI and matrices. MATMPIAIJ >>>>>>>>>>>> >>>>>>>>>>>> Barry >>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> Thanks, >>>>>>>>>>>>> >>>>>>>>>>>>> Ben >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>> >>>>>>>>> >>>>>>>> >>>>>>> >>>>>> >>>>> >>>> >>> >> > From B.Sanderse at cwi.nl Thu Sep 9 14:51:25 2010 From: B.Sanderse at cwi.nl (Benjamin Sanderse) Date: Thu, 9 Sep 2010 13:51:25 -0600 Subject: [petsc-users] coupling with Matlab and parallel solution In-Reply-To: References: <1D7DA60F-2616-41F3-BC17-FD797E9A9AF2@cwi.nl> <87zkw1cej4.fsf@59A2.org> <96BE91D1-0EFF-4CF1-B1DA-0C5A2D13BFF3@cwi.nl> <6C3C8604-E079-40DC-ADCC-FACB91A277B2@mcs.anl.gov> <070410A8-7B2E-405E-B65D-0CC8521BB17C@cwi.nl> <209DE1A7-D893-450F-89BC-DFBE7B2F2C5D@cwi.nl> <6A83D0A7-1C42-4A91-909F-6B93509C5BF5@mcs.anl.gov> <61D23634-FF5A-4916-9055-9561F69CC36A@cwi.nl> <30D58A3F-D8D9-4692-8BE9-BB8168201BD3@mcs.anl.gov> <6F879BC3-2E8E-4B30-B4A8-CDB039BD3A1E@cwi.nl> <9587193F-E30F-40F2-BF6A-4B690DA5066B@cwi.nl> Message-ID: <261FD0FD-7085-4C71-B2BD-7DB26ACDA853@cwi.nl> I have installed Petsc on a remote 64-bits Linux (Fedora). Might there be an issue with the 64-bits? My own computer is a Macbook Pro, but I am not running Petsc on it, because in the past I have had severe problems with getting mex-files to work under 64-bits Matlab. I could try to install it, though. Another option is that I try it on another remote Linux machine, also 64-bits, running Red Hat. What do you suggest? Op 9 sep 2010, om 13:10 heeft Barry Smith het volgende geschreven: > > What OS are you using. > > On my Apple Mac I made a shell script loop calling petsc_poisson_par_barry2 multiple times and a similar loop in Matlab and start then both off (with parallel PETSc runs). It runs flawlessly, opening the socket sending and receiving, dozens of times in a row with several processes. I think that maybe you are using Linux? > > > Barry > > > On Sep 8, 2010, at 2:32 PM, Benjamin Sanderse wrote: > >> That's also what I thought. I checked once again, and I found out that when I use >> >> petscmpiexec -n 1 >> >> the program works, but if I increase the number of processors to 2 it only works once in a while. >> >> I attached my test code. It is extremely simple and does nothing else than passing a vector to petsc and returning it to matlab. 
>> >> I run it as follows: >> >> shell #1 >> -bash-4.0$ make petsc_poisson_par_barry2 >> >> shell #2 >> -bash-4.0$ matlab -nojvm -nodisplay >>>> test_petsc_par_barry; >> >> shell #1 >> -bash-4.0$ petscmpiexec -n 2 ./petsc_poisson_par_barry2 -viewer_socket_port 5006 -info >> >> On lucky days, this works, on unlucky days, petsc will stop here: >> >> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >> [1] PetscCommDuplicate(): returning tag 2147483647 >> [0] PetscCommDuplicate(): returning tag 2147483647 >> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5006 machine borr.mas.cwi.nl >> [0] PetscCommDuplicate(): returning tag 2147483646 >> [1] PetscCommDuplicate(): returning tag 2147483646 >> [1] PetscCommDuplicate(): returning tag 2147483641 >> [0] PetscCommDuplicate(): returning tag 2147483641 >> [0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs. >> [0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs. >> ^C >> >> >> >> >> >> >> >> Op 8 sep 2010, om 12:00 heeft Barry Smith het volgende geschreven: >> >>> >>> On Sep 8, 2010, at 10:13 AM, Benjamin Sanderse wrote: >>> >>>> Hi Barry, >>>> >>>> I am indeed closing the socket in Matlab between the two sets, using close(PS), where PS=PetscOpenSocket. >>>> I have tried different port numbers, but without guarantee of success. Sometimes it works, sometimes it doesn't. Often times the first time calling the PetscOpenSocket(portnumber) works, but even that is not guaranteed. I think there should be another solution. >>>> By the way, all these problems do not appear when using serial vectors instead of parallel. >>> >>> That is strange. Only the first process ever opens the socket so in theory the fact that the PETSc code is parallel should not matter at all. Please send me your test code that causes trouble again and I'll see if I can reproduce the problem. >>> >>> Barry >>> >>>> >>>> Ben >>>> >>>> Op 7 sep 2010, om 17:27 heeft Barry Smith het volgende geschreven: >>>> >>>>> >>>>> Are you closing the socket on Matlab between to the two sets? Just checking. >>>>> >>>>> You can try running with a different port number each time to see if it is related to trying to reuse the port. Run with PetscOpenSocket(5006) and the PETSc program with -viewer_socket_port 5006 >>>>> then run both with 5007 then with 5008 etc does this work smoothly? >>>>> >>>>> Let me know and the will tell me the next step to try, >>>>> >>>>> Barry >>>>> >>>>> On Sep 7, 2010, at 10:53 AM, Benjamin Sanderse wrote: >>>>> >>>>>> Hi Barry, >>>>>> >>>>>> I am still not too happy with the execution in parallel. I am working under Linux (64 bits) and still using your approach with two command windows (since it gives the best debugging possibility). >>>>>> As I said, sometimes things work, but most of the time not. 
Here is the output of two successive runs >>>>>> >>>>>> -bash-4.0$ petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info >>>>>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>> [1] PetscCommDuplicate(): returning tag 2147483647 >>>>>> [0] PetscCommDuplicate(): returning tag 2147483647 >>>>>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl >>>>>> [0] PetscCommDuplicate(): returning tag 2147483646 >>>>>> [1] PetscCommDuplicate(): returning tag 2147483646 >>>>>> [1] PetscCommDuplicate(): returning tag 2147483641 >>>>>> [0] PetscCommDuplicate(): returning tag 2147483641 >>>>>> [0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs. >>>>>> [0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs. >>>>>> [1] PetscFinalize(): PetscFinalize() called >>>>>> [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850688 >>>>>> [1] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 >>>>>> [1] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 >>>>>> [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374784 >>>>>> [0] PetscFinalize(): PetscFinalize() called >>>>>> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850688 >>>>>> [0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 >>>>>> [0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 >>>>>> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374784 >>>>>> >>>>>> >>>>>> -bash-4.0$ netstat | grep 5005 >>>>>> >>>>>> >>>>>> -bash-4.0$ petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info >>>>>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>> [0] PetscCommDuplicate(): returning tag 2147483647 >>>>>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl >>>>>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>> [1] PetscCommDuplicate(): returning tag 2147483647 >>>>>> [1] PetscCommDuplicate(): returning tag 2147483646 >>>>>> [0] PetscCommDuplicate(): returning tag 2147483646 >>>>>> [0] PetscCommDuplicate(): returning tag 2147483641 >>>>>> [1] PetscCommDuplicate(): returning tag 2147483641 >>>>>> [0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs. >>>>>> [0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs. >>>>>> ^C >>>>>> -bash-4.0$ [0]0:Return code = 0, signaled with Interrupt >>>>>> [0]1:Return code = 0, signaled with Interrupt >>>>>> >>>>>> >>>>>> In both cases I first started the Matlab program. I am currently starting Matlab without a GUI, but with a GUI I have the same problems. 
>>>>>> As you can see, in the first case everything works fine, and Petsc finalizes and closes. Matlab gives me the correct output. The second case, run just a couple of seconds later, does not reach PetscFinalize and Matlab does not give the correct output. In between the two cases I checked if port 5005 was in use, and it was not. >>>>>> Do you have any more suggestions on how to get this to work properly? >>>>>> >>>>>> Benjamin >>>>>> >>>>>> Op 3 sep 2010, om 21:11 heeft Barry Smith het volgende geschreven: >>>>>> >>>>>>> >>>>>>> On Sep 3, 2010, at 4:32 PM, Benjamin Sanderse wrote: >>>>>>> >>>>>>>> Hi Barry, >>>>>>>> >>>>>>>> Thanks for your help! However, there are still some issues left. In other to test things, I simplified the program even more and now I am just sending a vector back and forth: matlab->petsc->matlab: >>>>>>>> >>>>>>>> fd = PETSC_VIEWER_SOCKET_WORLD; >>>>>>>> >>>>>>>> // load rhs vector >>>>>>>> ierr = VecLoad(fd,VECMPI,&b);CHKERRQ(ierr); >>>>>>>> >>>>>>>> // send to matlab >>>>>>>> ierr = VecView(b,fd);CHKERRQ(ierr); >>>>>>>> ierr = VecDestroy(b);CHKERRQ(ierr); >>>>>>>> >>>>>>>> >>>>>>>> - Your approach with two windows works *sometimes*. I removed the 'launch' statement and I executed my program 10 times, the first 2 times worked, and in all other cases I got this: >>>>>>>> >>>>>>>> petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info >>>>>>>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483647 >>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483647 >>>>>>>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl >>>>>>>> [0] PetscOpenSocket(): Connection refused in attaching socket, trying again[0] PetscOpenSocket(): Connection refused in attaching socket, trying again[0] PetscOpenSocket(): Connection refused in attaching socket, trying again >>>>>>>> [0] PetscOpenSocket(): Connection refused in attaching socket, trying again^C >>>>>>>> -bash-4.0$ [0]0:Return code = 0, signaled with Interrupt >>>>>>>> [0]1:Return code = 0, signaled with Interrupt >>>>>>>> >>>>>>>> Every time I start the program I use close(socket) and clear all in Matlab, so the socket from the previous run should not be present anymore. It seems that the port gets corrupted after a couple of times? Matlab does not respond and I have to kill it and restart it manually. >>>>>>> >>>>>>> Sometimes when you close a socket connection it doesn't close for a very long time so that if you try to open it again it doesn't work. When it appears the socket can not be used try using netstat | grep 5005 to see if the socket is still active. >>>>>>> >>>>>>>> >>>>>>>> - If I include the launch statement, or just type >>>>>>>> system('mpiexec -n 2 ./petsc_poisson_par_barry2 &') >>>>>>>> the program never works. >>>>>>> >>>>>>> Are you sure mpiexec is in the path of system and it is the right one? The problem is that we are kind of cheating with system because we start a new job in the background and have no idea what the output is. 
Are you using unix and running Matlab on the command line or in a GUI? >>>>>>> >>>>>>> Barry >>>>>>> >>>>>>> >>>>>>>> >>>>>>>> Hope you can figure out what is going wrong. >>>>>>>> >>>>>>>> Ben >>>>>>>> >>>>>>>> >>>>>>>> Op 3 sep 2010, om 13:25 heeft Barry Smith het volgende geschreven: >>>>>>>> >>>>>>>>> >>>>>>>>> Ben >>>>>>>>> >>>>>>>>> Ok, I figured out the problem. It is not fundamental and mostly comes from not having a create way to debug this. >>>>>>>>> >>>>>>>>> The test vector you create is sequential then you try to view it back to Matlab with the parallel fd viewer. If you change to >>>>>>>>> ierr = VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,1,&test);CHKERRQ(ierr); >>>>>>>>> then the code runs. >>>>>>>>> >>>>>>>>> I've found (just now) that when I use launch all the output from the .c program gets lost which makes it impossible to figure out what has gone wrong. You can debug by running the two parts of the computation in two different windows. So comment out the launch from the matlab script and then in Matlab run the script (it will hang waiting for the socket to work) and in a separate terminal window run the .c program; for example petscmpiexec -n 2 ./ex1 -info Now you see exactly what is happening in the PETSc program. You can even use -start_in_debugger on the PETSc side to run the debugger on crashes. >>>>>>>>> >>>>>>>>> I'll add this to the docs for launch >>>>>>>>> >>>>>>>>> Barry >>>>>>>>> >>>>>>>>> >>>>>>>>> On Sep 2, 2010, at 3:28 PM, Benjamin Sanderse wrote: >>>>>>>>> >>>>>>>>>> Hi Barry, >>>>>>>>>> >>>>>>>>>> I attached my matlab file, c file and makefile. First I generate the executable with 'make petsc_poisson_par_barry' and then I run test_petsc_par_barry.m. >>>>>>>>>> If you change MATMPIAIJ to MATAIJ and VECMPI to VECSEQ the code works fine. >>>>>>>>>> >>>>>>>>>> Thanks a lot, >>>>>>>>>> >>>>>>>>>> Benjamin >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> Op 2 sep 2010, om 13:45 heeft Barry Smith het volgende geschreven: >>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> Matlab is never aware the vector is parallel. Please send me the code and I'll figure out what is going on. >>>>>>>>>>> >>>>>>>>>>> Barry >>>>>>>>>>> >>>>>>>>>>> On Sep 2, 2010, at 2:07 PM, Benjamin Sanderse wrote: >>>>>>>>>>> >>>>>>>>>>>> That sounds great, but there is one issue I am encountering. I switched vector types to VECMPI and matrix type to MATMPIAIJ, but when running Matlab I get the following error: >>>>>>>>>>>> >>>>>>>>>>>> Found unrecogonized header 0 in file. If your file contains complex numbers >>>>>>>>>>>> then call PetscBinaryRead() with "complex" as the second argument >>>>>>>>>>>> Error in ==> PetscBinaryRead at 27 >>>>>>>>>>>> if nargin < 2 >>>>>>>>>>>> >>>>>>>>>>>> ??? Output argument "varargout" (and maybe others) not assigned during call to >>>>>>>>>>>> "/ufs/sanderse/Software/petsc-3.1-p4/bin/matlab/PetscBinaryRead.m>PetscBinaryRead". >>>>>>>>>>>> >>>>>>>>>>>> Error in ==> test_petsc_par at 57 >>>>>>>>>>>> x4 = PetscBinaryReady(PS); >>>>>>>>>>>> >>>>>>>>>>>> Could it be that Matlab does not understand the "parallel" vector which is returned by Petsc? Currently I have this done with VecView as follows: >>>>>>>>>>>> >>>>>>>>>>>> fd = PETSC_VIEWER_SOCKET_WORLD; >>>>>>>>>>>> ... >>>>>>>>>>>> KSPSolve(ksp,b,x); >>>>>>>>>>>> ... >>>>>>>>>>>> VecView(fd,x); >>>>>>>>>>>> >>>>>>>>>>>> Thanks for the help! 
>>>>>>>>>>>> >>>>>>>>>>>> Ben >>>>>>>>>>>> >>>>>>>>>>>> Op 2 sep 2010, om 10:09 heeft Barry Smith het volgende geschreven: >>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> On Sep 2, 2010, at 10:51 AM, Benjamin Sanderse wrote: >>>>>>>>>>>>> >>>>>>>>>>>>>> Hello all, >>>>>>>>>>>>>> >>>>>>>>>>>>>> I figured out the coupling with Matlab and I can send back and forth matrices and vectors between Petsc and Matlab. Actually, I send only once a matrix from Matlab to Petsc and then repeatedly send new right hand sides from Matlab->Petsc and the solution vector from Petsc->Matlab. That works great. >>>>>>>>>>>>>> I know want to see if the matrix that is send from (serial) Matlab to Petsc can be stored as a parallel matrix in Petsc so that subsequent computations with different right hand sides can be performed in parallel by Petsc. Does this simply work by using MatLoad and setting Mattype MPIAIJ? Or is something more fancy required? >>>>>>>>>>>>> >>>>>>>>>>>>> In theory this can be done using the same code as sequential only with parallel vectors VECMPI and matrices. MATMPIAIJ >>>>>>>>>>>>> >>>>>>>>>>>>> Barry >>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>> >>>>>>>>>>>>>> Ben >>>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>> >>>>>>>>> >>>>>>>> >>>>>>> >>>>>> >>>>> >>>> >>> >> > From bsmith at mcs.anl.gov Thu Sep 9 15:20:15 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Thu, 9 Sep 2010 15:20:15 -0500 Subject: [petsc-users] coupling with Matlab and parallel solution In-Reply-To: <261FD0FD-7085-4C71-B2BD-7DB26ACDA853@cwi.nl> References: <1D7DA60F-2616-41F3-BC17-FD797E9A9AF2@cwi.nl> <87zkw1cej4.fsf@59A2.org> <96BE91D1-0EFF-4CF1-B1DA-0C5A2D13BFF3@cwi.nl> <6C3C8604-E079-40DC-ADCC-FACB91A277B2@mcs.anl.gov> <070410A8-7B2E-405E-B65D-0CC8521BB17C@cwi.nl> <209DE1A7-D893-450F-89BC-DFBE7B2F2C5D@cwi.nl> <6A83D0A7-1C42-4A91-909F-6B93509C5BF5@mcs.anl.gov> <61D23634-FF5A-4916-9055-9561F69CC36A@cwi.nl> <30D58A3F-D8D9-4692-8BE9-BB8168201BD3@mcs.anl.gov> <6F879BC3-2E8E-4B30-B4A8-CDB039BD3A1E@cwi.nl> <9587193F-E30F-40F2-BF6A-4B690DA5066B@cwi.nl> <261FD0FD-7085-4C71-B2BD-7DB26ACDA853@cwi.nl> Message-ID: <25591182-B391-4B29-B14F-8CC3B62ABD7F@mcs.anl.gov> On Sep 9, 2010, at 2:51 PM, Benjamin Sanderse wrote: > I have installed Petsc on a remote 64-bits Linux (Fedora). Might there be an issue with the 64-bits? Shouldn't mater. > > My own computer is a Macbook Pro, but I am not running Petsc on it, because in the past I have had severe problems with getting mex-files to work under 64-bits Matlab. Yes, that is painful, Matlab never respected the Mac > I could try to install it, though. > Another option is that I try it on another remote Linux machine, also 64-bits, running Red Hat. You could try this to see if the same problem exists. > > What do you suggest? Are you sure it is hanging on the sockets not hanging on running MPI jobs one after each other? Barry > > > Op 9 sep 2010, om 13:10 heeft Barry Smith het volgende geschreven: > >> >> What OS are you using. >> >> On my Apple Mac I made a shell script loop calling petsc_poisson_par_barry2 multiple times and a similar loop in Matlab and start then both off (with parallel PETSc runs). It runs flawlessly, opening the socket sending and receiving, dozens of times in a row with several processes. I think that maybe you are using Linux? >> >> >> Barry >> >> >> On Sep 8, 2010, at 2:32 PM, Benjamin Sanderse wrote: >> >>> That's also what I thought. 
I checked once again, and I found out that when I use >>> >>> petscmpiexec -n 1 >>> >>> the program works, but if I increase the number of processors to 2 it only works once in a while. >>> >>> I attached my test code. It is extremely simple and does nothing else than passing a vector to petsc and returning it to matlab. >>> >>> I run it as follows: >>> >>> shell #1 >>> -bash-4.0$ make petsc_poisson_par_barry2 >>> >>> shell #2 >>> -bash-4.0$ matlab -nojvm -nodisplay >>>>> test_petsc_par_barry; >>> >>> shell #1 >>> -bash-4.0$ petscmpiexec -n 2 ./petsc_poisson_par_barry2 -viewer_socket_port 5006 -info >>> >>> On lucky days, this works, on unlucky days, petsc will stop here: >>> >>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>> [1] PetscCommDuplicate(): returning tag 2147483647 >>> [0] PetscCommDuplicate(): returning tag 2147483647 >>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5006 machine borr.mas.cwi.nl >>> [0] PetscCommDuplicate(): returning tag 2147483646 >>> [1] PetscCommDuplicate(): returning tag 2147483646 >>> [1] PetscCommDuplicate(): returning tag 2147483641 >>> [0] PetscCommDuplicate(): returning tag 2147483641 >>> [0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs. >>> [0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs. >>> ^C >>> >>> >>> >>> >>> >>> >>> >>> Op 8 sep 2010, om 12:00 heeft Barry Smith het volgende geschreven: >>> >>>> >>>> On Sep 8, 2010, at 10:13 AM, Benjamin Sanderse wrote: >>>> >>>>> Hi Barry, >>>>> >>>>> I am indeed closing the socket in Matlab between the two sets, using close(PS), where PS=PetscOpenSocket. >>>>> I have tried different port numbers, but without guarantee of success. Sometimes it works, sometimes it doesn't. Often times the first time calling the PetscOpenSocket(portnumber) works, but even that is not guaranteed. I think there should be another solution. >>>>> By the way, all these problems do not appear when using serial vectors instead of parallel. >>>> >>>> That is strange. Only the first process ever opens the socket so in theory the fact that the PETSc code is parallel should not matter at all. Please send me your test code that causes trouble again and I'll see if I can reproduce the problem. >>>> >>>> Barry >>>> >>>>> >>>>> Ben >>>>> >>>>> Op 7 sep 2010, om 17:27 heeft Barry Smith het volgende geschreven: >>>>> >>>>>> >>>>>> Are you closing the socket on Matlab between to the two sets? Just checking. >>>>>> >>>>>> You can try running with a different port number each time to see if it is related to trying to reuse the port. Run with PetscOpenSocket(5006) and the PETSc program with -viewer_socket_port 5006 >>>>>> then run both with 5007 then with 5008 etc does this work smoothly? >>>>>> >>>>>> Let me know and the will tell me the next step to try, >>>>>> >>>>>> Barry >>>>>> >>>>>> On Sep 7, 2010, at 10:53 AM, Benjamin Sanderse wrote: >>>>>> >>>>>>> Hi Barry, >>>>>>> >>>>>>> I am still not too happy with the execution in parallel. 
I am working under Linux (64 bits) and still using your approach with two command windows (since it gives the best debugging possibility). >>>>>>> As I said, sometimes things work, but most of the time not. Here is the output of two successive runs >>>>>>> >>>>>>> -bash-4.0$ petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info >>>>>>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>> [1] PetscCommDuplicate(): returning tag 2147483647 >>>>>>> [0] PetscCommDuplicate(): returning tag 2147483647 >>>>>>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl >>>>>>> [0] PetscCommDuplicate(): returning tag 2147483646 >>>>>>> [1] PetscCommDuplicate(): returning tag 2147483646 >>>>>>> [1] PetscCommDuplicate(): returning tag 2147483641 >>>>>>> [0] PetscCommDuplicate(): returning tag 2147483641 >>>>>>> [0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs. >>>>>>> [0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs. >>>>>>> [1] PetscFinalize(): PetscFinalize() called >>>>>>> [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850688 >>>>>>> [1] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 >>>>>>> [1] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 >>>>>>> [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374784 >>>>>>> [0] PetscFinalize(): PetscFinalize() called >>>>>>> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850688 >>>>>>> [0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 >>>>>>> [0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 >>>>>>> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374784 >>>>>>> >>>>>>> >>>>>>> -bash-4.0$ netstat | grep 5005 >>>>>>> >>>>>>> >>>>>>> -bash-4.0$ petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info >>>>>>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>> [0] PetscCommDuplicate(): returning tag 2147483647 >>>>>>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl >>>>>>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>> [1] PetscCommDuplicate(): returning tag 2147483647 >>>>>>> [1] PetscCommDuplicate(): returning tag 2147483646 >>>>>>> [0] PetscCommDuplicate(): returning tag 2147483646 >>>>>>> [0] PetscCommDuplicate(): returning tag 2147483641 >>>>>>> [1] PetscCommDuplicate(): returning tag 2147483641 >>>>>>> [0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs. >>>>>>> [0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs. 
>>>>>>> ^C >>>>>>> -bash-4.0$ [0]0:Return code = 0, signaled with Interrupt >>>>>>> [0]1:Return code = 0, signaled with Interrupt >>>>>>> >>>>>>> >>>>>>> In both cases I first started the Matlab program. I am currently starting Matlab without a GUI, but with a GUI I have the same problems. >>>>>>> As you can see, in the first case everything works fine, and Petsc finalizes and closes. Matlab gives me the correct output. The second case, run just a couple of seconds later, does not reach PetscFinalize and Matlab does not give the correct output. In between the two cases I checked if port 5005 was in use, and it was not. >>>>>>> Do you have any more suggestions on how to get this to work properly? >>>>>>> >>>>>>> Benjamin >>>>>>> >>>>>>> Op 3 sep 2010, om 21:11 heeft Barry Smith het volgende geschreven: >>>>>>> >>>>>>>> >>>>>>>> On Sep 3, 2010, at 4:32 PM, Benjamin Sanderse wrote: >>>>>>>> >>>>>>>>> Hi Barry, >>>>>>>>> >>>>>>>>> Thanks for your help! However, there are still some issues left. In other to test things, I simplified the program even more and now I am just sending a vector back and forth: matlab->petsc->matlab: >>>>>>>>> >>>>>>>>> fd = PETSC_VIEWER_SOCKET_WORLD; >>>>>>>>> >>>>>>>>> // load rhs vector >>>>>>>>> ierr = VecLoad(fd,VECMPI,&b);CHKERRQ(ierr); >>>>>>>>> >>>>>>>>> // send to matlab >>>>>>>>> ierr = VecView(b,fd);CHKERRQ(ierr); >>>>>>>>> ierr = VecDestroy(b);CHKERRQ(ierr); >>>>>>>>> >>>>>>>>> >>>>>>>>> - Your approach with two windows works *sometimes*. I removed the 'launch' statement and I executed my program 10 times, the first 2 times worked, and in all other cases I got this: >>>>>>>>> >>>>>>>>> petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info >>>>>>>>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>>>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>>>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>>>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>>>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>>>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483647 >>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483647 >>>>>>>>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl >>>>>>>>> [0] PetscOpenSocket(): Connection refused in attaching socket, trying again[0] PetscOpenSocket(): Connection refused in attaching socket, trying again[0] PetscOpenSocket(): Connection refused in attaching socket, trying again >>>>>>>>> [0] PetscOpenSocket(): Connection refused in attaching socket, trying again^C >>>>>>>>> -bash-4.0$ [0]0:Return code = 0, signaled with Interrupt >>>>>>>>> [0]1:Return code = 0, signaled with Interrupt >>>>>>>>> >>>>>>>>> Every time I start the program I use close(socket) and clear all in Matlab, so the socket from the previous run should not be present anymore. It seems that the port gets corrupted after a couple of times? Matlab does not respond and I have to kill it and restart it manually. >>>>>>>> >>>>>>>> Sometimes when you close a socket connection it doesn't close for a very long time so that if you try to open it again it doesn't work. When it appears the socket can not be used try using netstat | grep 5005 to see if the socket is still active. 
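On Linux a recently closed TCP listener typically lingers in the TIME_WAIT state for a minute or more, which can make an immediate rerun on the same port fail even when netstat no longer shows an established connection. The usual remedy on the listening side is to set SO_REUSEADDR before bind(). A generic POSIX sketch of that idiom follows; this is not PETSc or Matlab code, and the function name is purely illustrative.

    #include <string.h>
    #include <sys/socket.h>
    #include <netinet/in.h>
    #include <arpa/inet.h>

    /* Open a listening socket that can rebind to a port still in TIME_WAIT.
       Error checking is omitted for brevity. */
    int listen_on(int port)
    {
      int                fd  = socket(AF_INET, SOCK_STREAM, 0);
      int                yes = 1;
      struct sockaddr_in addr;

      setsockopt(fd, SOL_SOCKET, SO_REUSEADDR, &yes, sizeof(yes));
      memset(&addr, 0, sizeof(addr));
      addr.sin_family      = AF_INET;
      addr.sin_addr.s_addr = INADDR_ANY;
      addr.sin_port        = htons(port);
      bind(fd, (struct sockaddr *)&addr, sizeof(addr));
      listen(fd, 1);
      return fd; /* accept() the incoming PETSc connection on this descriptor */
    }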
>>>>>>>> >>>>>>>>> >>>>>>>>> - If I include the launch statement, or just type >>>>>>>>> system('mpiexec -n 2 ./petsc_poisson_par_barry2 &') >>>>>>>>> the program never works. >>>>>>>> >>>>>>>> Are you sure mpiexec is in the path of system and it is the right one? The problem is that we are kind of cheating with system because we start a new job in the background and have no idea what the output is. Are you using unix and running Matlab on the command line or in a GUI? >>>>>>>> >>>>>>>> Barry >>>>>>>> >>>>>>>> >>>>>>>>> >>>>>>>>> Hope you can figure out what is going wrong. >>>>>>>>> >>>>>>>>> Ben >>>>>>>>> >>>>>>>>> >>>>>>>>> Op 3 sep 2010, om 13:25 heeft Barry Smith het volgende geschreven: >>>>>>>>> >>>>>>>>>> >>>>>>>>>> Ben >>>>>>>>>> >>>>>>>>>> Ok, I figured out the problem. It is not fundamental and mostly comes from not having a create way to debug this. >>>>>>>>>> >>>>>>>>>> The test vector you create is sequential then you try to view it back to Matlab with the parallel fd viewer. If you change to >>>>>>>>>> ierr = VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,1,&test);CHKERRQ(ierr); >>>>>>>>>> then the code runs. >>>>>>>>>> >>>>>>>>>> I've found (just now) that when I use launch all the output from the .c program gets lost which makes it impossible to figure out what has gone wrong. You can debug by running the two parts of the computation in two different windows. So comment out the launch from the matlab script and then in Matlab run the script (it will hang waiting for the socket to work) and in a separate terminal window run the .c program; for example petscmpiexec -n 2 ./ex1 -info Now you see exactly what is happening in the PETSc program. You can even use -start_in_debugger on the PETSc side to run the debugger on crashes. >>>>>>>>>> >>>>>>>>>> I'll add this to the docs for launch >>>>>>>>>> >>>>>>>>>> Barry >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> On Sep 2, 2010, at 3:28 PM, Benjamin Sanderse wrote: >>>>>>>>>> >>>>>>>>>>> Hi Barry, >>>>>>>>>>> >>>>>>>>>>> I attached my matlab file, c file and makefile. First I generate the executable with 'make petsc_poisson_par_barry' and then I run test_petsc_par_barry.m. >>>>>>>>>>> If you change MATMPIAIJ to MATAIJ and VECMPI to VECSEQ the code works fine. >>>>>>>>>>> >>>>>>>>>>> Thanks a lot, >>>>>>>>>>> >>>>>>>>>>> Benjamin >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> Op 2 sep 2010, om 13:45 heeft Barry Smith het volgende geschreven: >>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> Matlab is never aware the vector is parallel. Please send me the code and I'll figure out what is going on. >>>>>>>>>>>> >>>>>>>>>>>> Barry >>>>>>>>>>>> >>>>>>>>>>>> On Sep 2, 2010, at 2:07 PM, Benjamin Sanderse wrote: >>>>>>>>>>>> >>>>>>>>>>>>> That sounds great, but there is one issue I am encountering. I switched vector types to VECMPI and matrix type to MATMPIAIJ, but when running Matlab I get the following error: >>>>>>>>>>>>> >>>>>>>>>>>>> Found unrecogonized header 0 in file. If your file contains complex numbers >>>>>>>>>>>>> then call PetscBinaryRead() with "complex" as the second argument >>>>>>>>>>>>> Error in ==> PetscBinaryRead at 27 >>>>>>>>>>>>> if nargin < 2 >>>>>>>>>>>>> >>>>>>>>>>>>> ??? Output argument "varargout" (and maybe others) not assigned during call to >>>>>>>>>>>>> "/ufs/sanderse/Software/petsc-3.1-p4/bin/matlab/PetscBinaryRead.m>PetscBinaryRead". 
>>>>>>>>>>>>> >>>>>>>>>>>>> Error in ==> test_petsc_par at 57 >>>>>>>>>>>>> x4 = PetscBinaryReady(PS); >>>>>>>>>>>>> >>>>>>>>>>>>> Could it be that Matlab does not understand the "parallel" vector which is returned by Petsc? Currently I have this done with VecView as follows: >>>>>>>>>>>>> >>>>>>>>>>>>> fd = PETSC_VIEWER_SOCKET_WORLD; >>>>>>>>>>>>> ... >>>>>>>>>>>>> KSPSolve(ksp,b,x); >>>>>>>>>>>>> ... >>>>>>>>>>>>> VecView(fd,x); >>>>>>>>>>>>> >>>>>>>>>>>>> Thanks for the help! >>>>>>>>>>>>> >>>>>>>>>>>>> Ben >>>>>>>>>>>>> >>>>>>>>>>>>> Op 2 sep 2010, om 10:09 heeft Barry Smith het volgende geschreven: >>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> On Sep 2, 2010, at 10:51 AM, Benjamin Sanderse wrote: >>>>>>>>>>>>>> >>>>>>>>>>>>>>> Hello all, >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> I figured out the coupling with Matlab and I can send back and forth matrices and vectors between Petsc and Matlab. Actually, I send only once a matrix from Matlab to Petsc and then repeatedly send new right hand sides from Matlab->Petsc and the solution vector from Petsc->Matlab. That works great. >>>>>>>>>>>>>>> I know want to see if the matrix that is send from (serial) Matlab to Petsc can be stored as a parallel matrix in Petsc so that subsequent computations with different right hand sides can be performed in parallel by Petsc. Does this simply work by using MatLoad and setting Mattype MPIAIJ? Or is something more fancy required? >>>>>>>>>>>>>> >>>>>>>>>>>>>> In theory this can be done using the same code as sequential only with parallel vectors VECMPI and matrices. MATMPIAIJ >>>>>>>>>>>>>> >>>>>>>>>>>>>> Barry >>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Ben >>>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>> >>>>>>>>> >>>>>>>> >>>>>>> >>>>>> >>>>> >>>> >>> >> > From balay at mcs.anl.gov Thu Sep 9 15:36:55 2010 From: balay at mcs.anl.gov (Satish Balay) Date: Thu, 9 Sep 2010 15:36:55 -0500 (CDT) Subject: [petsc-users] coupling with Matlab and parallel solution In-Reply-To: <25591182-B391-4B29-B14F-8CC3B62ABD7F@mcs.anl.gov> References: <1D7DA60F-2616-41F3-BC17-FD797E9A9AF2@cwi.nl> <87zkw1cej4.fsf@59A2.org> <96BE91D1-0EFF-4CF1-B1DA-0C5A2D13BFF3@cwi.nl> <6C3C8604-E079-40DC-ADCC-FACB91A277B2@mcs.anl.gov> <070410A8-7B2E-405E-B65D-0CC8521BB17C@cwi.nl> <209DE1A7-D893-450F-89BC-DFBE7B2F2C5D@cwi.nl> <6A83D0A7-1C42-4A91-909F-6B93509C5BF5@mcs.anl.gov> <61D23634-FF5A-4916-9055-9561F69CC36A@cwi.nl> <30D58A3F-D8D9-4692-8BE9-BB8168201BD3@mcs.anl.gov> <6F879BC3-2E8E-4B30-B4A8-CDB039BD3A1E@cwi.nl> <9587193F-E30F-40F2-BF6A-4B690DA5066B@cwi.nl> <261FD0FD-7085-4C71-B2BD-7DB26ACDA853@cwi.nl> <25591182-B391-4B29-B14F-8CC3B62ABD7F@mcs.anl.gov> Message-ID: I just attempted this on a linux64 box [ubuntu 10.04 with matlab version 7.10.0.499 (R2010a) 64-bit (glnxa64)] with petsc-3.1 - and the loops ran fine. -bash: for i in {1..50}; do mpiexec -n 2 ./petsc_poisson_par_barry2 -viewer_socket_port 20001; echo $i; done matlab: for i=1:500,;i; test_petsc_par_barry; end Perhaps you can try a high port number - and the same loops - and see if you get stuck on this machine. Satish On Thu, 9 Sep 2010, Barry Smith wrote: > > On Sep 9, 2010, at 2:51 PM, Benjamin Sanderse wrote: > > > I have installed Petsc on a remote 64-bits Linux (Fedora). Might there be an issue with the 64-bits? > > Shouldn't mater. > > > > > My own computer is a Macbook Pro, but I am not running Petsc on it, because in the past I have had severe problems with getting mex-files to work under 64-bits Matlab. 
> > Yes, that is painful, Matlab never respected the Mac > > > I could try to install it, though. > > Another option is that I try it on another remote Linux machine, also 64-bits, running Red Hat. > > You could try this to see if the same problem exists. > > > > > What do you suggest? > > Are you sure it is hanging on the sockets not hanging on running MPI jobs one after each other? > > Barry > > > > > > > Op 9 sep 2010, om 13:10 heeft Barry Smith het volgende geschreven: > > > >> > >> What OS are you using. > >> > >> On my Apple Mac I made a shell script loop calling petsc_poisson_par_barry2 multiple times and a similar loop in Matlab and start then both off (with parallel PETSc runs). It runs flawlessly, opening the socket sending and receiving, dozens of times in a row with several processes. I think that maybe you are using Linux? > >> > >> > >> Barry > >> > >> > >> On Sep 8, 2010, at 2:32 PM, Benjamin Sanderse wrote: > >> > >>> That's also what I thought. I checked once again, and I found out that when I use > >>> > >>> petscmpiexec -n 1 > >>> > >>> the program works, but if I increase the number of processors to 2 it only works once in a while. > >>> > >>> I attached my test code. It is extremely simple and does nothing else than passing a vector to petsc and returning it to matlab. > >>> > >>> I run it as follows: > >>> > >>> shell #1 > >>> -bash-4.0$ make petsc_poisson_par_barry2 > >>> > >>> shell #2 > >>> -bash-4.0$ matlab -nojvm -nodisplay > >>>>> test_petsc_par_barry; > >>> > >>> shell #1 > >>> -bash-4.0$ petscmpiexec -n 2 ./petsc_poisson_par_barry2 -viewer_socket_port 5006 -info > >>> > >>> On lucky days, this works, on unlucky days, petsc will stop here: > >>> > >>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 > >>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl > >>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 > >>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl > >>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 > >>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 > >>> [1] PetscCommDuplicate(): returning tag 2147483647 > >>> [0] PetscCommDuplicate(): returning tag 2147483647 > >>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5006 machine borr.mas.cwi.nl > >>> [0] PetscCommDuplicate(): returning tag 2147483646 > >>> [1] PetscCommDuplicate(): returning tag 2147483646 > >>> [1] PetscCommDuplicate(): returning tag 2147483641 > >>> [0] PetscCommDuplicate(): returning tag 2147483641 > >>> [0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs. > >>> [0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs. > >>> ^C > >>> > >>> > >>> > >>> > >>> > >>> > >>> > >>> Op 8 sep 2010, om 12:00 heeft Barry Smith het volgende geschreven: > >>> > >>>> > >>>> On Sep 8, 2010, at 10:13 AM, Benjamin Sanderse wrote: > >>>> > >>>>> Hi Barry, > >>>>> > >>>>> I am indeed closing the socket in Matlab between the two sets, using close(PS), where PS=PetscOpenSocket. > >>>>> I have tried different port numbers, but without guarantee of success. Sometimes it works, sometimes it doesn't. Often times the first time calling the PetscOpenSocket(portnumber) works, but even that is not guaranteed. I think there should be another solution. > >>>>> By the way, all these problems do not appear when using serial vectors instead of parallel. > >>>> > >>>> That is strange. 
Only the first process ever opens the socket so in theory the fact that the PETSc code is parallel should not matter at all. Please send me your test code that causes trouble again and I'll see if I can reproduce the problem. > >>>> > >>>> Barry > >>>> > >>>>> > >>>>> Ben > >>>>> > >>>>> Op 7 sep 2010, om 17:27 heeft Barry Smith het volgende geschreven: > >>>>> > >>>>>> > >>>>>> Are you closing the socket on Matlab between to the two sets? Just checking. > >>>>>> > >>>>>> You can try running with a different port number each time to see if it is related to trying to reuse the port. Run with PetscOpenSocket(5006) and the PETSc program with -viewer_socket_port 5006 > >>>>>> then run both with 5007 then with 5008 etc does this work smoothly? > >>>>>> > >>>>>> Let me know and the will tell me the next step to try, > >>>>>> > >>>>>> Barry > >>>>>> > >>>>>> On Sep 7, 2010, at 10:53 AM, Benjamin Sanderse wrote: > >>>>>> > >>>>>>> Hi Barry, > >>>>>>> > >>>>>>> I am still not too happy with the execution in parallel. I am working under Linux (64 bits) and still using your approach with two command windows (since it gives the best debugging possibility). > >>>>>>> As I said, sometimes things work, but most of the time not. Here is the output of two successive runs > >>>>>>> > >>>>>>> -bash-4.0$ petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info > >>>>>>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 > >>>>>>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl > >>>>>>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 > >>>>>>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl > >>>>>>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 > >>>>>>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 > >>>>>>> [1] PetscCommDuplicate(): returning tag 2147483647 > >>>>>>> [0] PetscCommDuplicate(): returning tag 2147483647 > >>>>>>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl > >>>>>>> [0] PetscCommDuplicate(): returning tag 2147483646 > >>>>>>> [1] PetscCommDuplicate(): returning tag 2147483646 > >>>>>>> [1] PetscCommDuplicate(): returning tag 2147483641 > >>>>>>> [0] PetscCommDuplicate(): returning tag 2147483641 > >>>>>>> [0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs. > >>>>>>> [0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs. 
> >>>>>>> [1] PetscFinalize(): PetscFinalize() called > >>>>>>> [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850688 > >>>>>>> [1] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 > >>>>>>> [1] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 > >>>>>>> [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374784 > >>>>>>> [0] PetscFinalize(): PetscFinalize() called > >>>>>>> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850688 > >>>>>>> [0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 > >>>>>>> [0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 > >>>>>>> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374784 > >>>>>>> > >>>>>>> > >>>>>>> -bash-4.0$ netstat | grep 5005 > >>>>>>> > >>>>>>> > >>>>>>> -bash-4.0$ petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info > >>>>>>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 > >>>>>>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl > >>>>>>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 > >>>>>>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl > >>>>>>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 > >>>>>>> [0] PetscCommDuplicate(): returning tag 2147483647 > >>>>>>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl > >>>>>>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 > >>>>>>> [1] PetscCommDuplicate(): returning tag 2147483647 > >>>>>>> [1] PetscCommDuplicate(): returning tag 2147483646 > >>>>>>> [0] PetscCommDuplicate(): returning tag 2147483646 > >>>>>>> [0] PetscCommDuplicate(): returning tag 2147483641 > >>>>>>> [1] PetscCommDuplicate(): returning tag 2147483641 > >>>>>>> [0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs. > >>>>>>> [0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs. > >>>>>>> ^C > >>>>>>> -bash-4.0$ [0]0:Return code = 0, signaled with Interrupt > >>>>>>> [0]1:Return code = 0, signaled with Interrupt > >>>>>>> > >>>>>>> > >>>>>>> In both cases I first started the Matlab program. I am currently starting Matlab without a GUI, but with a GUI I have the same problems. > >>>>>>> As you can see, in the first case everything works fine, and Petsc finalizes and closes. Matlab gives me the correct output. The second case, run just a couple of seconds later, does not reach PetscFinalize and Matlab does not give the correct output. In between the two cases I checked if port 5005 was in use, and it was not. > >>>>>>> Do you have any more suggestions on how to get this to work properly? > >>>>>>> > >>>>>>> Benjamin > >>>>>>> > >>>>>>> Op 3 sep 2010, om 21:11 heeft Barry Smith het volgende geschreven: > >>>>>>> > >>>>>>>> > >>>>>>>> On Sep 3, 2010, at 4:32 PM, Benjamin Sanderse wrote: > >>>>>>>> > >>>>>>>>> Hi Barry, > >>>>>>>>> > >>>>>>>>> Thanks for your help! However, there are still some issues left. 
In other to test things, I simplified the program even more and now I am just sending a vector back and forth: matlab->petsc->matlab: > >>>>>>>>> > >>>>>>>>> fd = PETSC_VIEWER_SOCKET_WORLD; > >>>>>>>>> > >>>>>>>>> // load rhs vector > >>>>>>>>> ierr = VecLoad(fd,VECMPI,&b);CHKERRQ(ierr); > >>>>>>>>> > >>>>>>>>> // send to matlab > >>>>>>>>> ierr = VecView(b,fd);CHKERRQ(ierr); > >>>>>>>>> ierr = VecDestroy(b);CHKERRQ(ierr); > >>>>>>>>> > >>>>>>>>> > >>>>>>>>> - Your approach with two windows works *sometimes*. I removed the 'launch' statement and I executed my program 10 times, the first 2 times worked, and in all other cases I got this: > >>>>>>>>> > >>>>>>>>> petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info > >>>>>>>>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 > >>>>>>>>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl > >>>>>>>>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 > >>>>>>>>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl > >>>>>>>>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 > >>>>>>>>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 > >>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483647 > >>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483647 > >>>>>>>>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl > >>>>>>>>> [0] PetscOpenSocket(): Connection refused in attaching socket, trying again[0] PetscOpenSocket(): Connection refused in attaching socket, trying again[0] PetscOpenSocket(): Connection refused in attaching socket, trying again > >>>>>>>>> [0] PetscOpenSocket(): Connection refused in attaching socket, trying again^C > >>>>>>>>> -bash-4.0$ [0]0:Return code = 0, signaled with Interrupt > >>>>>>>>> [0]1:Return code = 0, signaled with Interrupt > >>>>>>>>> > >>>>>>>>> Every time I start the program I use close(socket) and clear all in Matlab, so the socket from the previous run should not be present anymore. It seems that the port gets corrupted after a couple of times? Matlab does not respond and I have to kill it and restart it manually. > >>>>>>>> > >>>>>>>> Sometimes when you close a socket connection it doesn't close for a very long time so that if you try to open it again it doesn't work. When it appears the socket can not be used try using netstat | grep 5005 to see if the socket is still active. > >>>>>>>> > >>>>>>>>> > >>>>>>>>> - If I include the launch statement, or just type > >>>>>>>>> system('mpiexec -n 2 ./petsc_poisson_par_barry2 &') > >>>>>>>>> the program never works. > >>>>>>>> > >>>>>>>> Are you sure mpiexec is in the path of system and it is the right one? The problem is that we are kind of cheating with system because we start a new job in the background and have no idea what the output is. Are you using unix and running Matlab on the command line or in a GUI? > >>>>>>>> > >>>>>>>> Barry > >>>>>>>> > >>>>>>>> > >>>>>>>>> > >>>>>>>>> Hope you can figure out what is going wrong. > >>>>>>>>> > >>>>>>>>> Ben > >>>>>>>>> > >>>>>>>>> > >>>>>>>>> Op 3 sep 2010, om 13:25 heeft Barry Smith het volgende geschreven: > >>>>>>>>> > >>>>>>>>>> > >>>>>>>>>> Ben > >>>>>>>>>> > >>>>>>>>>> Ok, I figured out the problem. It is not fundamental and mostly comes from not having a create way to debug this. 
> >>>>>>>>>> > >>>>>>>>>> The test vector you create is sequential then you try to view it back to Matlab with the parallel fd viewer. If you change to > >>>>>>>>>> ierr = VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,1,&test);CHKERRQ(ierr); > >>>>>>>>>> then the code runs. > >>>>>>>>>> > >>>>>>>>>> I've found (just now) that when I use launch all the output from the .c program gets lost which makes it impossible to figure out what has gone wrong. You can debug by running the two parts of the computation in two different windows. So comment out the launch from the matlab script and then in Matlab run the script (it will hang waiting for the socket to work) and in a separate terminal window run the .c program; for example petscmpiexec -n 2 ./ex1 -info Now you see exactly what is happening in the PETSc program. You can even use -start_in_debugger on the PETSc side to run the debugger on crashes. > >>>>>>>>>> > >>>>>>>>>> I'll add this to the docs for launch > >>>>>>>>>> > >>>>>>>>>> Barry > >>>>>>>>>> > >>>>>>>>>> > >>>>>>>>>> On Sep 2, 2010, at 3:28 PM, Benjamin Sanderse wrote: > >>>>>>>>>> > >>>>>>>>>>> Hi Barry, > >>>>>>>>>>> > >>>>>>>>>>> I attached my matlab file, c file and makefile. First I generate the executable with 'make petsc_poisson_par_barry' and then I run test_petsc_par_barry.m. > >>>>>>>>>>> If you change MATMPIAIJ to MATAIJ and VECMPI to VECSEQ the code works fine. > >>>>>>>>>>> > >>>>>>>>>>> Thanks a lot, > >>>>>>>>>>> > >>>>>>>>>>> Benjamin > >>>>>>>>>>> > >>>>>>>>>>> > >>>>>>>>>>> > >>>>>>>>>>> Op 2 sep 2010, om 13:45 heeft Barry Smith het volgende geschreven: > >>>>>>>>>>> > >>>>>>>>>>>> > >>>>>>>>>>>> > >>>>>>>>>>>> Matlab is never aware the vector is parallel. Please send me the code and I'll figure out what is going on. > >>>>>>>>>>>> > >>>>>>>>>>>> Barry > >>>>>>>>>>>> > >>>>>>>>>>>> On Sep 2, 2010, at 2:07 PM, Benjamin Sanderse wrote: > >>>>>>>>>>>> > >>>>>>>>>>>>> That sounds great, but there is one issue I am encountering. I switched vector types to VECMPI and matrix type to MATMPIAIJ, but when running Matlab I get the following error: > >>>>>>>>>>>>> > >>>>>>>>>>>>> Found unrecogonized header 0 in file. If your file contains complex numbers > >>>>>>>>>>>>> then call PetscBinaryRead() with "complex" as the second argument > >>>>>>>>>>>>> Error in ==> PetscBinaryRead at 27 > >>>>>>>>>>>>> if nargin < 2 > >>>>>>>>>>>>> > >>>>>>>>>>>>> ??? Output argument "varargout" (and maybe others) not assigned during call to > >>>>>>>>>>>>> "/ufs/sanderse/Software/petsc-3.1-p4/bin/matlab/PetscBinaryRead.m>PetscBinaryRead". > >>>>>>>>>>>>> > >>>>>>>>>>>>> Error in ==> test_petsc_par at 57 > >>>>>>>>>>>>> x4 = PetscBinaryReady(PS); > >>>>>>>>>>>>> > >>>>>>>>>>>>> Could it be that Matlab does not understand the "parallel" vector which is returned by Petsc? Currently I have this done with VecView as follows: > >>>>>>>>>>>>> > >>>>>>>>>>>>> fd = PETSC_VIEWER_SOCKET_WORLD; > >>>>>>>>>>>>> ... > >>>>>>>>>>>>> KSPSolve(ksp,b,x); > >>>>>>>>>>>>> ... > >>>>>>>>>>>>> VecView(fd,x); > >>>>>>>>>>>>> > >>>>>>>>>>>>> Thanks for the help! > >>>>>>>>>>>>> > >>>>>>>>>>>>> Ben > >>>>>>>>>>>>> > >>>>>>>>>>>>> Op 2 sep 2010, om 10:09 heeft Barry Smith het volgende geschreven: > >>>>>>>>>>>>> > >>>>>>>>>>>>>> > >>>>>>>>>>>>>> On Sep 2, 2010, at 10:51 AM, Benjamin Sanderse wrote: > >>>>>>>>>>>>>> > >>>>>>>>>>>>>>> Hello all, > >>>>>>>>>>>>>>> > >>>>>>>>>>>>>>> I figured out the coupling with Matlab and I can send back and forth matrices and vectors between Petsc and Matlab. 
Actually, I send only once a matrix from Matlab to Petsc and then repeatedly send new right hand sides from Matlab->Petsc and the solution vector from Petsc->Matlab. That works great. > >>>>>>>>>>>>>>> I know want to see if the matrix that is send from (serial) Matlab to Petsc can be stored as a parallel matrix in Petsc so that subsequent computations with different right hand sides can be performed in parallel by Petsc. Does this simply work by using MatLoad and setting Mattype MPIAIJ? Or is something more fancy required? > >>>>>>>>>>>>>> > >>>>>>>>>>>>>> In theory this can be done using the same code as sequential only with parallel vectors VECMPI and matrices. MATMPIAIJ > >>>>>>>>>>>>>> > >>>>>>>>>>>>>> Barry > >>>>>>>>>>>>>> > >>>>>>>>>>>>>>> > >>>>>>>>>>>>>>> Thanks, > >>>>>>>>>>>>>>> > >>>>>>>>>>>>>>> Ben > >>>>>>>>>>>>>> > >>>>>>>>>>>>> > >>>>>>>>>>>> > >>>>>>>>>>> > >>>>>>>>>> > >>>>>>>>> > >>>>>>>> > >>>>>>> > >>>>>> > >>>>> > >>>> > >>> > >> > > > > From bourdin at lsu.edu Thu Sep 9 16:05:47 2010 From: bourdin at lsu.edu (Blaise Bourdin) Date: Thu, 9 Sep 2010 16:05:47 -0500 Subject: [petsc-users] [petsc4py] PETSc.Viewer().createHDF5 wipes out existing files Message-ID: Hi, Hopefully this does not get double posted. I sent the original email from the wrong account to the wrong list? I am trying to do HDF5 IO in petsc4py. I noticed that when I create a viewer using PETSc.Viewer().createHDF5(inputfile,comm= PETSc.COMM_WORLD) the file "outputfile" is wiped out and replaced with a empty hdf5 container, which is bad since I am trying to read forrm it... Unsurprisingly, I have no problems using the longer approach: h5in = PETSc.Viewer().create(PETSc.COMM_WORLD) h5in.setType(PETSc.Viewer.Type.HDF5) h5in.setFileMode(PETSc.Viewer.Mode.READ) h5in.setFileName(inputfile) Is this expected? Regards, Blaise -- Department of Mathematics and Center for Computation & Technology Louisiana State University, Baton Rouge, LA 70803, USA Tel. +1 (225) 578 1612, Fax +1 (225) 578 4276 http://www.math.lsu.edu/~bourdin From dalcinl at gmail.com Thu Sep 9 16:25:31 2010 From: dalcinl at gmail.com (Lisandro Dalcin) Date: Thu, 9 Sep 2010 18:25:31 -0300 Subject: [petsc-users] [petsc4py] PETSc.Viewer().createHDF5 wipes out existing files In-Reply-To: References: Message-ID: On 9 September 2010 18:05, Blaise Bourdin wrote: > Hi, > > Hopefully this does not get double posted. I sent the original email from the wrong account to the wrong list? > It got double posted. My reply below, just in case. > I am trying to do HDF5 IO in petsc4py. I noticed that when I create a viewer using > ?PETSc.Viewer().createHDF5(inputfile,comm= PETSc.COMM_WORLD) > the file "outputfile" is wiped out and replaced with a empty hdf5 container, which is bad since I am trying to read forrm it... > > Unsurprisingly, I have no problems using the longer approach: > ?h5in = PETSc.Viewer().create(PETSc.COMM_WORLD) > ?h5in.setType(PETSc.Viewer.Type.HDF5) > ?h5in.setFileMode(PETSc.Viewer.Mode.READ) > ?h5in.setFileName(inputfile) > > Is this expected? > Yes, it is expected. These create() methods default to mode=WRITE. What you and others think about this? 
To open in READ mode, just do this: PETSc.Viewer().createHDF5(inputfile, mode=PETSc.Viewer.Mode.READ, comm= PETSc.COMM_WORLD) -- Lisandro Dalcin --------------- CIMEC (INTEC/CONICET-UNL) Predio CONICET-Santa Fe Colectora RN 168 Km 472, Paraje El Pozo Tel: +54-342-4511594 (ext 1011) Tel/Fax: +54-342-4511169 From B.Sanderse at cwi.nl Thu Sep 9 16:37:57 2010 From: B.Sanderse at cwi.nl (Benjamin Sanderse) Date: Thu, 9 Sep 2010 15:37:57 -0600 Subject: [petsc-users] coupling with Matlab and parallel solution In-Reply-To: References: <1D7DA60F-2616-41F3-BC17-FD797E9A9AF2@cwi.nl> <87zkw1cej4.fsf@59A2.org> <96BE91D1-0EFF-4CF1-B1DA-0C5A2D13BFF3@cwi.nl> <6C3C8604-E079-40DC-ADCC-FACB91A277B2@mcs.anl.gov> <070410A8-7B2E-405E-B65D-0CC8521BB17C@cwi.nl> <209DE1A7-D893-450F-89BC-DFBE7B2F2C5D@cwi.nl> <6A83D0A7-1C42-4A91-909F-6B93509C5BF5@mcs.anl.gov> <61D23634-FF5A-4916-9055-9561F69CC36A@cwi.nl> <30D58A3F-D8D9-4692-8BE9-BB8168201BD3@mcs.anl.gov> <6F879BC3-2E8E-4B30-B4A8-CDB039BD3A1E@cwi.nl> <9587193F-E30F-40F2-BF6A-4B690DA5066B@cwi.nl> <261FD0FD-7085-4C71-B2BD-7DB26ACDA853@cwi.nl> <25591182-B391-4B29-B14F-8CC3B62ABD7F@mcs.anl.gov> Message-ID: Hi guys, I installed Petsc on another remote Linux machine, and guess what - it works flawlessly. Guess I have to talk to the IT department to have a look at the other machine. For now I can use the one that is working. Thanks for all the help! Ben Op 9 sep 2010, om 14:36 heeft Satish Balay het volgende geschreven: > I just attempted this on a linux64 box [ubuntu 10.04 with matlab > version 7.10.0.499 (R2010a) 64-bit (glnxa64)] with petsc-3.1 - and > the loops ran fine. > > -bash: > for i in {1..50}; do mpiexec -n 2 ./petsc_poisson_par_barry2 -viewer_socket_port 20001; echo $i; done > > matlab: > for i=1:500,;i; test_petsc_par_barry; end > > Perhaps you can try a high port number - and the same loops - and see if you get > stuck on this machine. > > Satish > > On Thu, 9 Sep 2010, Barry Smith wrote: > >> >> On Sep 9, 2010, at 2:51 PM, Benjamin Sanderse wrote: >> >>> I have installed Petsc on a remote 64-bits Linux (Fedora). Might there be an issue with the 64-bits? >> >> Shouldn't mater. >> >>> >>> My own computer is a Macbook Pro, but I am not running Petsc on it, because in the past I have had severe problems with getting mex-files to work under 64-bits Matlab. >> >> Yes, that is painful, Matlab never respected the Mac >> >>> I could try to install it, though. >>> Another option is that I try it on another remote Linux machine, also 64-bits, running Red Hat. >> >> You could try this to see if the same problem exists. >> >>> >>> What do you suggest? >> >> Are you sure it is hanging on the sockets not hanging on running MPI jobs one after each other? >> >> Barry >> >>> >>> >>> Op 9 sep 2010, om 13:10 heeft Barry Smith het volgende geschreven: >>> >>>> >>>> What OS are you using. >>>> >>>> On my Apple Mac I made a shell script loop calling petsc_poisson_par_barry2 multiple times and a similar loop in Matlab and start then both off (with parallel PETSc runs). It runs flawlessly, opening the socket sending and receiving, dozens of times in a row with several processes. I think that maybe you are using Linux? >>>> >>>> >>>> Barry >>>> >>>> >>>> On Sep 8, 2010, at 2:32 PM, Benjamin Sanderse wrote: >>>> >>>>> That's also what I thought. I checked once again, and I found out that when I use >>>>> >>>>> petscmpiexec -n 1 >>>>> >>>>> the program works, but if I increase the number of processors to 2 it only works once in a while. >>>>> >>>>> I attached my test code. 
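For reference, a minimal sketch of what such a pass-through test program looks like, pieced together from the fragments quoted earlier in this thread (petsc-3.1 calling conventions; the actual attachment may differ in details such as error handling):

    static char help[] = "Receives a vector from Matlab over a socket and sends it back.\n";

    #include "petscvec.h"

    int main(int argc,char **argv)
    {
      Vec            b;
      PetscViewer    fd;
      PetscErrorCode ierr;

      ierr = PetscInitialize(&argc,&argv,(char *)0,help);CHKERRQ(ierr);
      fd   = PETSC_VIEWER_SOCKET_WORLD;            /* honours -viewer_socket_port */
      ierr = VecLoad(fd,VECMPI,&b);CHKERRQ(ierr);  /* receive the vector from Matlab */
      ierr = VecView(b,fd);CHKERRQ(ierr);          /* send it straight back */
      ierr = VecDestroy(b);CHKERRQ(ierr);
      ierr = PetscFinalize();CHKERRQ(ierr);
      return 0;
    }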
It is extremely simple and does nothing else than passing a vector to petsc and returning it to matlab. >>>>> >>>>> I run it as follows: >>>>> >>>>> shell #1 >>>>> -bash-4.0$ make petsc_poisson_par_barry2 >>>>> >>>>> shell #2 >>>>> -bash-4.0$ matlab -nojvm -nodisplay >>>>>>> test_petsc_par_barry; >>>>> >>>>> shell #1 >>>>> -bash-4.0$ petscmpiexec -n 2 ./petsc_poisson_par_barry2 -viewer_socket_port 5006 -info >>>>> >>>>> On lucky days, this works, on unlucky days, petsc will stop here: >>>>> >>>>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>> [1] PetscCommDuplicate(): returning tag 2147483647 >>>>> [0] PetscCommDuplicate(): returning tag 2147483647 >>>>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5006 machine borr.mas.cwi.nl >>>>> [0] PetscCommDuplicate(): returning tag 2147483646 >>>>> [1] PetscCommDuplicate(): returning tag 2147483646 >>>>> [1] PetscCommDuplicate(): returning tag 2147483641 >>>>> [0] PetscCommDuplicate(): returning tag 2147483641 >>>>> [0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs. >>>>> [0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs. >>>>> ^C >>>>> >>>>> >>>>> >>>>> >>>>> >>>>> >>>>> >>>>> Op 8 sep 2010, om 12:00 heeft Barry Smith het volgende geschreven: >>>>> >>>>>> >>>>>> On Sep 8, 2010, at 10:13 AM, Benjamin Sanderse wrote: >>>>>> >>>>>>> Hi Barry, >>>>>>> >>>>>>> I am indeed closing the socket in Matlab between the two sets, using close(PS), where PS=PetscOpenSocket. >>>>>>> I have tried different port numbers, but without guarantee of success. Sometimes it works, sometimes it doesn't. Often times the first time calling the PetscOpenSocket(portnumber) works, but even that is not guaranteed. I think there should be another solution. >>>>>>> By the way, all these problems do not appear when using serial vectors instead of parallel. >>>>>> >>>>>> That is strange. Only the first process ever opens the socket so in theory the fact that the PETSc code is parallel should not matter at all. Please send me your test code that causes trouble again and I'll see if I can reproduce the problem. >>>>>> >>>>>> Barry >>>>>> >>>>>>> >>>>>>> Ben >>>>>>> >>>>>>> Op 7 sep 2010, om 17:27 heeft Barry Smith het volgende geschreven: >>>>>>> >>>>>>>> >>>>>>>> Are you closing the socket on Matlab between to the two sets? Just checking. >>>>>>>> >>>>>>>> You can try running with a different port number each time to see if it is related to trying to reuse the port. Run with PetscOpenSocket(5006) and the PETSc program with -viewer_socket_port 5006 >>>>>>>> then run both with 5007 then with 5008 etc does this work smoothly? >>>>>>>> >>>>>>>> Let me know and the will tell me the next step to try, >>>>>>>> >>>>>>>> Barry >>>>>>>> >>>>>>>> On Sep 7, 2010, at 10:53 AM, Benjamin Sanderse wrote: >>>>>>>> >>>>>>>>> Hi Barry, >>>>>>>>> >>>>>>>>> I am still not too happy with the execution in parallel. I am working under Linux (64 bits) and still using your approach with two command windows (since it gives the best debugging possibility). 
>>>>>>>>> As I said, sometimes things work, but most of the time not. Here is the output of two successive runs >>>>>>>>> >>>>>>>>> -bash-4.0$ petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info >>>>>>>>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>>>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>>>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>>>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>>>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>>>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483647 >>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483647 >>>>>>>>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl >>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483646 >>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483646 >>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483641 >>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483641 >>>>>>>>> [0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs. >>>>>>>>> [0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs. >>>>>>>>> [1] PetscFinalize(): PetscFinalize() called >>>>>>>>> [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850688 >>>>>>>>> [1] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 >>>>>>>>> [1] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 >>>>>>>>> [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374784 >>>>>>>>> [0] PetscFinalize(): PetscFinalize() called >>>>>>>>> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850688 >>>>>>>>> [0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 >>>>>>>>> [0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 >>>>>>>>> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374784 >>>>>>>>> >>>>>>>>> >>>>>>>>> -bash-4.0$ netstat | grep 5005 >>>>>>>>> >>>>>>>>> >>>>>>>>> -bash-4.0$ petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info >>>>>>>>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>>>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>>>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>>>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>>>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483647 >>>>>>>>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl >>>>>>>>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483647 >>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483646 >>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483646 >>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483641 >>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483641 >>>>>>>>> [0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs. >>>>>>>>> [0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs. 
>>>>>>>>> ^C >>>>>>>>> -bash-4.0$ [0]0:Return code = 0, signaled with Interrupt >>>>>>>>> [0]1:Return code = 0, signaled with Interrupt >>>>>>>>> >>>>>>>>> >>>>>>>>> In both cases I first started the Matlab program. I am currently starting Matlab without a GUI, but with a GUI I have the same problems. >>>>>>>>> As you can see, in the first case everything works fine, and Petsc finalizes and closes. Matlab gives me the correct output. The second case, run just a couple of seconds later, does not reach PetscFinalize and Matlab does not give the correct output. In between the two cases I checked if port 5005 was in use, and it was not. >>>>>>>>> Do you have any more suggestions on how to get this to work properly? >>>>>>>>> >>>>>>>>> Benjamin >>>>>>>>> >>>>>>>>> Op 3 sep 2010, om 21:11 heeft Barry Smith het volgende geschreven: >>>>>>>>> >>>>>>>>>> >>>>>>>>>> On Sep 3, 2010, at 4:32 PM, Benjamin Sanderse wrote: >>>>>>>>>> >>>>>>>>>>> Hi Barry, >>>>>>>>>>> >>>>>>>>>>> Thanks for your help! However, there are still some issues left. In other to test things, I simplified the program even more and now I am just sending a vector back and forth: matlab->petsc->matlab: >>>>>>>>>>> >>>>>>>>>>> fd = PETSC_VIEWER_SOCKET_WORLD; >>>>>>>>>>> >>>>>>>>>>> // load rhs vector >>>>>>>>>>> ierr = VecLoad(fd,VECMPI,&b);CHKERRQ(ierr); >>>>>>>>>>> >>>>>>>>>>> // send to matlab >>>>>>>>>>> ierr = VecView(b,fd);CHKERRQ(ierr); >>>>>>>>>>> ierr = VecDestroy(b);CHKERRQ(ierr); >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> - Your approach with two windows works *sometimes*. I removed the 'launch' statement and I executed my program 10 times, the first 2 times worked, and in all other cases I got this: >>>>>>>>>>> >>>>>>>>>>> petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info >>>>>>>>>>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>>>>>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>>>>>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>>>>>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>>>>>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>>>>>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483647 >>>>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483647 >>>>>>>>>>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl >>>>>>>>>>> [0] PetscOpenSocket(): Connection refused in attaching socket, trying again[0] PetscOpenSocket(): Connection refused in attaching socket, trying again[0] PetscOpenSocket(): Connection refused in attaching socket, trying again >>>>>>>>>>> [0] PetscOpenSocket(): Connection refused in attaching socket, trying again^C >>>>>>>>>>> -bash-4.0$ [0]0:Return code = 0, signaled with Interrupt >>>>>>>>>>> [0]1:Return code = 0, signaled with Interrupt >>>>>>>>>>> >>>>>>>>>>> Every time I start the program I use close(socket) and clear all in Matlab, so the socket from the previous run should not be present anymore. It seems that the port gets corrupted after a couple of times? Matlab does not respond and I have to kill it and restart it manually. >>>>>>>>>> >>>>>>>>>> Sometimes when you close a socket connection it doesn't close for a very long time so that if you try to open it again it doesn't work. 
When it appears the socket can not be used try using netstat | grep 5005 to see if the socket is still active. >>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> - If I include the launch statement, or just type >>>>>>>>>>> system('mpiexec -n 2 ./petsc_poisson_par_barry2 &') >>>>>>>>>>> the program never works. >>>>>>>>>> >>>>>>>>>> Are you sure mpiexec is in the path of system and it is the right one? The problem is that we are kind of cheating with system because we start a new job in the background and have no idea what the output is. Are you using unix and running Matlab on the command line or in a GUI? >>>>>>>>>> >>>>>>>>>> Barry >>>>>>>>>> >>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> Hope you can figure out what is going wrong. >>>>>>>>>>> >>>>>>>>>>> Ben >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> Op 3 sep 2010, om 13:25 heeft Barry Smith het volgende geschreven: >>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> Ben >>>>>>>>>>>> >>>>>>>>>>>> Ok, I figured out the problem. It is not fundamental and mostly comes from not having a create way to debug this. >>>>>>>>>>>> >>>>>>>>>>>> The test vector you create is sequential then you try to view it back to Matlab with the parallel fd viewer. If you change to >>>>>>>>>>>> ierr = VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,1,&test);CHKERRQ(ierr); >>>>>>>>>>>> then the code runs. >>>>>>>>>>>> >>>>>>>>>>>> I've found (just now) that when I use launch all the output from the .c program gets lost which makes it impossible to figure out what has gone wrong. You can debug by running the two parts of the computation in two different windows. So comment out the launch from the matlab script and then in Matlab run the script (it will hang waiting for the socket to work) and in a separate terminal window run the .c program; for example petscmpiexec -n 2 ./ex1 -info Now you see exactly what is happening in the PETSc program. You can even use -start_in_debugger on the PETSc side to run the debugger on crashes. >>>>>>>>>>>> >>>>>>>>>>>> I'll add this to the docs for launch >>>>>>>>>>>> >>>>>>>>>>>> Barry >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> On Sep 2, 2010, at 3:28 PM, Benjamin Sanderse wrote: >>>>>>>>>>>> >>>>>>>>>>>>> Hi Barry, >>>>>>>>>>>>> >>>>>>>>>>>>> I attached my matlab file, c file and makefile. First I generate the executable with 'make petsc_poisson_par_barry' and then I run test_petsc_par_barry.m. >>>>>>>>>>>>> If you change MATMPIAIJ to MATAIJ and VECMPI to VECSEQ the code works fine. >>>>>>>>>>>>> >>>>>>>>>>>>> Thanks a lot, >>>>>>>>>>>>> >>>>>>>>>>>>> Benjamin >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> Op 2 sep 2010, om 13:45 heeft Barry Smith het volgende geschreven: >>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> Matlab is never aware the vector is parallel. Please send me the code and I'll figure out what is going on. >>>>>>>>>>>>>> >>>>>>>>>>>>>> Barry >>>>>>>>>>>>>> >>>>>>>>>>>>>> On Sep 2, 2010, at 2:07 PM, Benjamin Sanderse wrote: >>>>>>>>>>>>>> >>>>>>>>>>>>>>> That sounds great, but there is one issue I am encountering. I switched vector types to VECMPI and matrix type to MATMPIAIJ, but when running Matlab I get the following error: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Found unrecogonized header 0 in file. If your file contains complex numbers >>>>>>>>>>>>>>> then call PetscBinaryRead() with "complex" as the second argument >>>>>>>>>>>>>>> Error in ==> PetscBinaryRead at 27 >>>>>>>>>>>>>>> if nargin < 2 >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> ??? 
Output argument "varargout" (and maybe others) not assigned during call to >>>>>>>>>>>>>>> "/ufs/sanderse/Software/petsc-3.1-p4/bin/matlab/PetscBinaryRead.m>PetscBinaryRead". >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Error in ==> test_petsc_par at 57 >>>>>>>>>>>>>>> x4 = PetscBinaryReady(PS); >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Could it be that Matlab does not understand the "parallel" vector which is returned by Petsc? Currently I have this done with VecView as follows: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> fd = PETSC_VIEWER_SOCKET_WORLD; >>>>>>>>>>>>>>> ... >>>>>>>>>>>>>>> KSPSolve(ksp,b,x); >>>>>>>>>>>>>>> ... >>>>>>>>>>>>>>> VecView(fd,x); >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Thanks for the help! >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Ben >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Op 2 sep 2010, om 10:09 heeft Barry Smith het volgende geschreven: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> On Sep 2, 2010, at 10:51 AM, Benjamin Sanderse wrote: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Hello all, >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> I figured out the coupling with Matlab and I can send back and forth matrices and vectors between Petsc and Matlab. Actually, I send only once a matrix from Matlab to Petsc and then repeatedly send new right hand sides from Matlab->Petsc and the solution vector from Petsc->Matlab. That works great. >>>>>>>>>>>>>>>>> I know want to see if the matrix that is send from (serial) Matlab to Petsc can be stored as a parallel matrix in Petsc so that subsequent computations with different right hand sides can be performed in parallel by Petsc. Does this simply work by using MatLoad and setting Mattype MPIAIJ? Or is something more fancy required? >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> In theory this can be done using the same code as sequential only with parallel vectors VECMPI and matrices. MATMPIAIJ >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Barry >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Ben >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>> >>>>>>>>> >>>>>>>> >>>>>>> >>>>>> >>>>> >>>> >>> >> >> > From xy2102 at columbia.edu Fri Sep 10 09:54:43 2010 From: xy2102 at columbia.edu (Rebecca Xuefei Yuan) Date: Fri, 10 Sep 2010 10:54:43 -0400 Subject: [petsc-users] DIVERGED_LS_FAILURE Message-ID: <20100910105443.952ldffipwko88go@cubmail.cc.columbia.edu> Dear all, I have the convergence history as 0 SNES Function norm 1.930639214656e-14 0 KSP Residual norm 1.930639214656e-14 1 KSP Residual norm 1.897969082784e-14 2 KSP Residual norm 1.890943705731e-14 3 KSP Residual norm 1.890499250642e-14 4 KSP Residual norm 1.882719263807e-14 5 KSP Residual norm 1.881639616800e-14 6 KSP Residual norm 1.881624492545e-14 7 KSP Residual norm 1.876497311488e-14 8 KSP Residual norm 1.864714377471e-14 9 KSP Residual norm 1.861061216430e-14 10 KSP Residual norm 1.856993815540e-14 11 KSP Residual norm 1.856245191617e-14 12 KSP Residual norm 1.855993722295e-14 13 KSP Residual norm 1.855756795426e-14 14 KSP Residual norm 1.855542358114e-14 15 KSP Residual norm 1.855188622531e-14 16 KSP Residual norm 1.854906019037e-14 17 KSP Residual norm 1.851397858397e-14 18 KSP Residual norm 1.851152033377e-14 19 KSP Residual norm 1.845662066987e-14 20 KSP Residual norm 1.810881990601e-14 21 KSP Residual norm 9.715701270773e-15 22 KSP Residual norm 2.134473951854e-15 Linear solve converged due to CONVERGED_RTOL iterations 22 Nonlinear solve did not converge due to DIVERGED_LS_FAILURE Why the linear solver is failed? Thanks a lot! 
Rebecca Xuefei YUAN Department of Applied Physics and Applied Mathematics Columbia University Tel:917-399-8032 www.columbia.edu/~xy2102 From bsmith at mcs.anl.gov Fri Sep 10 10:04:00 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Fri, 10 Sep 2010 10:04:00 -0500 Subject: [petsc-users] DIVERGED_LS_FAILURE In-Reply-To: <20100910105443.952ldffipwko88go@cubmail.cc.columbia.edu> References: <20100910105443.952ldffipwko88go@cubmail.cc.columbia.edu> Message-ID: <7302A973-505A-4592-9A3A-25EDC16F1A88@mcs.anl.gov> The linear solver did not fail to converge. It says it converged. The nonlinear solver failed because the line search failed, not the linear solver. This is common when the changes are tiny, tiny like they are here. You cannot expect that corrections that are the same size as the round error to improve the nonlinear system. That is your initial guess is so small you cannot do any better. I will change the LS in the message to line_search for clarity in petsc-dev Barry On Sep 10, 2010, at 9:54 AM, Rebecca Xuefei Yuan wrote: > Dear all, > > I have the convergence history as > > > > 0 SNES Function norm 1.930639214656e-14 > 0 KSP Residual norm 1.930639214656e-14 > 1 KSP Residual norm 1.897969082784e-14 > 2 KSP Residual norm 1.890943705731e-14 > 3 KSP Residual norm 1.890499250642e-14 > 4 KSP Residual norm 1.882719263807e-14 > 5 KSP Residual norm 1.881639616800e-14 > 6 KSP Residual norm 1.881624492545e-14 > 7 KSP Residual norm 1.876497311488e-14 > 8 KSP Residual norm 1.864714377471e-14 > 9 KSP Residual norm 1.861061216430e-14 > 10 KSP Residual norm 1.856993815540e-14 > 11 KSP Residual norm 1.856245191617e-14 > 12 KSP Residual norm 1.855993722295e-14 > 13 KSP Residual norm 1.855756795426e-14 > 14 KSP Residual norm 1.855542358114e-14 > 15 KSP Residual norm 1.855188622531e-14 > 16 KSP Residual norm 1.854906019037e-14 > 17 KSP Residual norm 1.851397858397e-14 > 18 KSP Residual norm 1.851152033377e-14 > 19 KSP Residual norm 1.845662066987e-14 > 20 KSP Residual norm 1.810881990601e-14 > 21 KSP Residual norm 9.715701270773e-15 > 22 KSP Residual norm 2.134473951854e-15 > Linear solve converged due to CONVERGED_RTOL iterations 22 > Nonlinear solve did not converge due to DIVERGED_LS_FAILURE > > Why the linear solver is failed? > > Thanks a lot! > > > > > > Rebecca Xuefei YUAN > Department of Applied Physics and Applied Mathematics > Columbia University > Tel:917-399-8032 > www.columbia.edu/~xy2102 > From xy2102 at columbia.edu Fri Sep 10 11:05:35 2010 From: xy2102 at columbia.edu (Rebecca Xuefei Yuan) Date: Fri, 10 Sep 2010 12:05:35 -0400 Subject: [petsc-users] DIVERGED_LS_FAILURE In-Reply-To: <7302A973-505A-4592-9A3A-25EDC16F1A88@mcs.anl.gov> References: <20100910105443.952ldffipwko88go@cubmail.cc.columbia.edu> <7302A973-505A-4592-9A3A-25EDC16F1A88@mcs.anl.gov> Message-ID: <20100910120535.il4k40meaok8ws8k@cubmail.cc.columbia.edu> Dear Barry, Thanks for your explanation! R Quoting Barry Smith : > > The linear solver did not fail to converge. It says it converged. > > The nonlinear solver failed because the line search failed, not > the linear solver. This is common when the changes are tiny, tiny > like they are here. You cannot expect that corrections that are the > same size as the round error to improve the nonlinear system. That > is your initial guess is so small you cannot do any better. 
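In code, one way a driver could recognise this situation is to inspect the converged reason together with the residual norm after SNESSolve(). A hedged sketch against the petsc-3.1 API, where the snes handle and the 1.e-12 cutoff are assumptions of this example rather than part of the original discussion:

    SNESConvergedReason reason;
    PetscReal           fnorm;

    ierr = SNESGetConvergedReason(snes,&reason);CHKERRQ(ierr);
    ierr = SNESGetFunctionNorm(snes,&fnorm);CHKERRQ(ierr);
    if (reason == SNES_DIVERGED_LS_FAILURE && fnorm < 1.e-12) {
      /* the residual is already at rounding level, so the line search has
         nothing left to gain; treat the solve as (trivially) converged */
    }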
> > I will change the LS in the message to line_search for clarity in > petsc-dev > > > > Barry > > On Sep 10, 2010, at 9:54 AM, Rebecca Xuefei Yuan wrote: > >> Dear all, >> >> I have the convergence history as >> >> >> >> 0 SNES Function norm 1.930639214656e-14 >> 0 KSP Residual norm 1.930639214656e-14 >> 1 KSP Residual norm 1.897969082784e-14 >> 2 KSP Residual norm 1.890943705731e-14 >> 3 KSP Residual norm 1.890499250642e-14 >> 4 KSP Residual norm 1.882719263807e-14 >> 5 KSP Residual norm 1.881639616800e-14 >> 6 KSP Residual norm 1.881624492545e-14 >> 7 KSP Residual norm 1.876497311488e-14 >> 8 KSP Residual norm 1.864714377471e-14 >> 9 KSP Residual norm 1.861061216430e-14 >> 10 KSP Residual norm 1.856993815540e-14 >> 11 KSP Residual norm 1.856245191617e-14 >> 12 KSP Residual norm 1.855993722295e-14 >> 13 KSP Residual norm 1.855756795426e-14 >> 14 KSP Residual norm 1.855542358114e-14 >> 15 KSP Residual norm 1.855188622531e-14 >> 16 KSP Residual norm 1.854906019037e-14 >> 17 KSP Residual norm 1.851397858397e-14 >> 18 KSP Residual norm 1.851152033377e-14 >> 19 KSP Residual norm 1.845662066987e-14 >> 20 KSP Residual norm 1.810881990601e-14 >> 21 KSP Residual norm 9.715701270773e-15 >> 22 KSP Residual norm 2.134473951854e-15 >> Linear solve converged due to CONVERGED_RTOL iterations 22 >> Nonlinear solve did not converge due to DIVERGED_LS_FAILURE >> >> Why the linear solver is failed? >> >> Thanks a lot! >> >> >> >> >> >> Rebecca Xuefei YUAN >> Department of Applied Physics and Applied Mathematics >> Columbia University >> Tel:917-399-8032 >> www.columbia.edu/~xy2102 >> > > > Rebecca Xuefei YUAN Department of Applied Physics and Applied Mathematics Columbia University Tel:917-399-8032 www.columbia.edu/~xy2102 From bourdin at lsu.edu Fri Sep 10 11:27:01 2010 From: bourdin at lsu.edu (Blaise Bourdin) Date: Fri, 10 Sep 2010 11:27:01 -0500 Subject: [petsc-users] Storing an array in a PetscBag Message-ID: <6844570C-8452-45B7-AEFE-775DDF47B791@lsu.edu> Hi, Is there an easy way to use PetscBag to manage arrays of user data? optimally, I would like to something like typedef struct { PetscInt n; PetscReal *values; } MyParameters; MyParameters *params and be able to register "values" in the bag so that I can specify its values as -n 3 -values 1.,2.,3. or -n 2 -values 1.,2. Right now, I can see how I can get n prior to creating the bag and register params->values[0] to params->values[n-1] so that the command line could become -n 3 -values0 1 -values1 2 -values2 3 or -n 2 -values0 1 -values1 2 Would that require adding PetscBagRegisterXXXArray function or is it feasible in the current implementation? Blaise -- Department of Mathematics and Center for Computation & Technology Louisiana State University, Baton Rouge, LA 70803, USA Tel. 
+1 (225) 578 1612, Fax +1 (225) 578 4276 http://www.math.lsu.edu/~bourdin From B.Sanderse at cwi.nl Fri Sep 10 11:27:17 2010 From: B.Sanderse at cwi.nl (Benjamin Sanderse) Date: Fri, 10 Sep 2010 10:27:17 -0600 Subject: [petsc-users] coupling with Matlab and parallel solution In-Reply-To: References: <1D7DA60F-2616-41F3-BC17-FD797E9A9AF2@cwi.nl> <87zkw1cej4.fsf@59A2.org> <96BE91D1-0EFF-4CF1-B1DA-0C5A2D13BFF3@cwi.nl> <6C3C8604-E079-40DC-ADCC-FACB91A277B2@mcs.anl.gov> <070410A8-7B2E-405E-B65D-0CC8521BB17C@cwi.nl> <209DE1A7-D893-450F-89BC-DFBE7B2F2C5D@cwi.nl> <6A83D0A7-1C42-4A91-909F-6B93509C5BF5@mcs.anl.gov> <61D23634-FF5A-4916-9055-9561F69CC36A@cwi.nl> <30D58A3F-D8D9-4692-8BE9-BB8168201BD3@mcs.anl.gov> <6F879BC3-2E8E-4B30-B4A8-CDB039BD3A1E@cwi.nl> <9587193F-E30F-40F2-BF6A-4B690DA5066B@cwi.nl> <261FD0FD-7085-4C71-B2BD-7DB26ACDA853@cwi.nl> <25591182-B391-4B29-B14F-8CC3B62ABD7F@mcs.anl.gov> Message-ID: <4BDFDB50-ED6D-4D44-A8F8-E14F0ED6C10F@cwi.nl> Hi guys, Still some other questions popping up: - Until now I have been using the 'two-shell' approach suggested by Barry for debugging purposes. This approach works fine, but in a later stage I would like to include the petsc execution command back in Matlab. I tried the following in Matlab: system('petscmpiexec -n 2 ./petsc_poisson_par -viewer_socket_port 5600 &); unix('petscmpiexec -n 2 ./petsc_poisson_par -viewer_socket_port 5600 &); In both cases this doesn't work, while issuing the command in a separate shell works fine. Any ideas? - I am using CG to solve a symmetric positive definite matrix. As preconditioner I normally use ICC (incomplete choleski), but apparently this is not implemented in parallel in Petsc. Suggestions on what to take as preconditioner? Thanks, Ben Op 9 sep 2010, om 14:36 heeft Satish Balay het volgende geschreven: > I just attempted this on a linux64 box [ubuntu 10.04 with matlab > version 7.10.0.499 (R2010a) 64-bit (glnxa64)] with petsc-3.1 - and > the loops ran fine. > > -bash: > for i in {1..50}; do mpiexec -n 2 ./petsc_poisson_par_barry2 -viewer_socket_port 20001; echo $i; done > > matlab: > for i=1:500,;i; test_petsc_par_barry; end > > Perhaps you can try a high port number - and the same loops - and see if you get > stuck on this machine. > > Satish > > On Thu, 9 Sep 2010, Barry Smith wrote: > >> >> On Sep 9, 2010, at 2:51 PM, Benjamin Sanderse wrote: >> >>> I have installed Petsc on a remote 64-bits Linux (Fedora). Might there be an issue with the 64-bits? >> >> Shouldn't mater. >> >>> >>> My own computer is a Macbook Pro, but I am not running Petsc on it, because in the past I have had severe problems with getting mex-files to work under 64-bits Matlab. >> >> Yes, that is painful, Matlab never respected the Mac >> >>> I could try to install it, though. >>> Another option is that I try it on another remote Linux machine, also 64-bits, running Red Hat. >> >> You could try this to see if the same problem exists. >> >>> >>> What do you suggest? >> >> Are you sure it is hanging on the sockets not hanging on running MPI jobs one after each other? >> >> Barry >> >>> >>> >>> Op 9 sep 2010, om 13:10 heeft Barry Smith het volgende geschreven: >>> >>>> >>>> What OS are you using. >>>> >>>> On my Apple Mac I made a shell script loop calling petsc_poisson_par_barry2 multiple times and a similar loop in Matlab and start then both off (with parallel PETSc runs). It runs flawlessly, opening the socket sending and receiving, dozens of times in a row with several processes. I think that maybe you are using Linux? 
>>>> >>>> >>>> Barry >>>> >>>> >>>> On Sep 8, 2010, at 2:32 PM, Benjamin Sanderse wrote: >>>> >>>>> That's also what I thought. I checked once again, and I found out that when I use >>>>> >>>>> petscmpiexec -n 1 >>>>> >>>>> the program works, but if I increase the number of processors to 2 it only works once in a while. >>>>> >>>>> I attached my test code. It is extremely simple and does nothing else than passing a vector to petsc and returning it to matlab. >>>>> >>>>> I run it as follows: >>>>> >>>>> shell #1 >>>>> -bash-4.0$ make petsc_poisson_par_barry2 >>>>> >>>>> shell #2 >>>>> -bash-4.0$ matlab -nojvm -nodisplay >>>>>>> test_petsc_par_barry; >>>>> >>>>> shell #1 >>>>> -bash-4.0$ petscmpiexec -n 2 ./petsc_poisson_par_barry2 -viewer_socket_port 5006 -info >>>>> >>>>> On lucky days, this works, on unlucky days, petsc will stop here: >>>>> >>>>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>> [1] PetscCommDuplicate(): returning tag 2147483647 >>>>> [0] PetscCommDuplicate(): returning tag 2147483647 >>>>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5006 machine borr.mas.cwi.nl >>>>> [0] PetscCommDuplicate(): returning tag 2147483646 >>>>> [1] PetscCommDuplicate(): returning tag 2147483646 >>>>> [1] PetscCommDuplicate(): returning tag 2147483641 >>>>> [0] PetscCommDuplicate(): returning tag 2147483641 >>>>> [0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs. >>>>> [0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs. >>>>> ^C >>>>> >>>>> >>>>> >>>>> >>>>> >>>>> >>>>> >>>>> Op 8 sep 2010, om 12:00 heeft Barry Smith het volgende geschreven: >>>>> >>>>>> >>>>>> On Sep 8, 2010, at 10:13 AM, Benjamin Sanderse wrote: >>>>>> >>>>>>> Hi Barry, >>>>>>> >>>>>>> I am indeed closing the socket in Matlab between the two sets, using close(PS), where PS=PetscOpenSocket. >>>>>>> I have tried different port numbers, but without guarantee of success. Sometimes it works, sometimes it doesn't. Often times the first time calling the PetscOpenSocket(portnumber) works, but even that is not guaranteed. I think there should be another solution. >>>>>>> By the way, all these problems do not appear when using serial vectors instead of parallel. >>>>>> >>>>>> That is strange. Only the first process ever opens the socket so in theory the fact that the PETSc code is parallel should not matter at all. Please send me your test code that causes trouble again and I'll see if I can reproduce the problem. >>>>>> >>>>>> Barry >>>>>> >>>>>>> >>>>>>> Ben >>>>>>> >>>>>>> Op 7 sep 2010, om 17:27 heeft Barry Smith het volgende geschreven: >>>>>>> >>>>>>>> >>>>>>>> Are you closing the socket on Matlab between to the two sets? Just checking. >>>>>>>> >>>>>>>> You can try running with a different port number each time to see if it is related to trying to reuse the port. Run with PetscOpenSocket(5006) and the PETSc program with -viewer_socket_port 5006 >>>>>>>> then run both with 5007 then with 5008 etc does this work smoothly? 
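The same port-cycling experiment can also be run with the port fixed on the PETSc side instead of on the command line. A minimal sketch in the petsc-3.1 API, assuming PetscViewerSocketOpen accepts PETSC_NULL for the machine name (meaning the local host) and using 5006 as a placeholder port:

    PetscViewer    fd;
    PetscErrorCode ierr;
    ierr = PetscViewerSocketOpen(PETSC_COMM_WORLD,PETSC_NULL,5006,&fd);CHKERRQ(ierr);
    /* VecLoad()/VecView() against fd, exactly as with PETSC_VIEWER_SOCKET_WORLD */
    ierr = PetscViewerDestroy(fd);CHKERRQ(ierr);

Changing the 5006 here and in PetscOpenSocket(5006) on the Matlab side together is the same test as cycling -viewer_socket_port through 5006, 5007, 5008.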
>>>>>>>> >>>>>>>> Let me know and the will tell me the next step to try, >>>>>>>> >>>>>>>> Barry >>>>>>>> >>>>>>>> On Sep 7, 2010, at 10:53 AM, Benjamin Sanderse wrote: >>>>>>>> >>>>>>>>> Hi Barry, >>>>>>>>> >>>>>>>>> I am still not too happy with the execution in parallel. I am working under Linux (64 bits) and still using your approach with two command windows (since it gives the best debugging possibility). >>>>>>>>> As I said, sometimes things work, but most of the time not. Here is the output of two successive runs >>>>>>>>> >>>>>>>>> -bash-4.0$ petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info >>>>>>>>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>>>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>>>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>>>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>>>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>>>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483647 >>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483647 >>>>>>>>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl >>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483646 >>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483646 >>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483641 >>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483641 >>>>>>>>> [0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs. >>>>>>>>> [0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs. >>>>>>>>> [1] PetscFinalize(): PetscFinalize() called >>>>>>>>> [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850688 >>>>>>>>> [1] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 >>>>>>>>> [1] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 >>>>>>>>> [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374784 >>>>>>>>> [0] PetscFinalize(): PetscFinalize() called >>>>>>>>> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850688 >>>>>>>>> [0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 >>>>>>>>> [0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 >>>>>>>>> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374784 >>>>>>>>> >>>>>>>>> >>>>>>>>> -bash-4.0$ netstat | grep 5005 >>>>>>>>> >>>>>>>>> >>>>>>>>> -bash-4.0$ petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info >>>>>>>>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>>>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>>>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>>>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>>>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483647 >>>>>>>>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl >>>>>>>>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483647 >>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483646 
>>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483646 >>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483641 >>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483641 >>>>>>>>> [0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs. >>>>>>>>> [0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs. >>>>>>>>> ^C >>>>>>>>> -bash-4.0$ [0]0:Return code = 0, signaled with Interrupt >>>>>>>>> [0]1:Return code = 0, signaled with Interrupt >>>>>>>>> >>>>>>>>> >>>>>>>>> In both cases I first started the Matlab program. I am currently starting Matlab without a GUI, but with a GUI I have the same problems. >>>>>>>>> As you can see, in the first case everything works fine, and Petsc finalizes and closes. Matlab gives me the correct output. The second case, run just a couple of seconds later, does not reach PetscFinalize and Matlab does not give the correct output. In between the two cases I checked if port 5005 was in use, and it was not. >>>>>>>>> Do you have any more suggestions on how to get this to work properly? >>>>>>>>> >>>>>>>>> Benjamin >>>>>>>>> >>>>>>>>> Op 3 sep 2010, om 21:11 heeft Barry Smith het volgende geschreven: >>>>>>>>> >>>>>>>>>> >>>>>>>>>> On Sep 3, 2010, at 4:32 PM, Benjamin Sanderse wrote: >>>>>>>>>> >>>>>>>>>>> Hi Barry, >>>>>>>>>>> >>>>>>>>>>> Thanks for your help! However, there are still some issues left. In other to test things, I simplified the program even more and now I am just sending a vector back and forth: matlab->petsc->matlab: >>>>>>>>>>> >>>>>>>>>>> fd = PETSC_VIEWER_SOCKET_WORLD; >>>>>>>>>>> >>>>>>>>>>> // load rhs vector >>>>>>>>>>> ierr = VecLoad(fd,VECMPI,&b);CHKERRQ(ierr); >>>>>>>>>>> >>>>>>>>>>> // send to matlab >>>>>>>>>>> ierr = VecView(b,fd);CHKERRQ(ierr); >>>>>>>>>>> ierr = VecDestroy(b);CHKERRQ(ierr); >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> - Your approach with two windows works *sometimes*. I removed the 'launch' statement and I executed my program 10 times, the first 2 times worked, and in all other cases I got this: >>>>>>>>>>> >>>>>>>>>>> petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info >>>>>>>>>>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>>>>>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>>>>>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>>>>>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>>>>>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>>>>>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483647 >>>>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483647 >>>>>>>>>>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl >>>>>>>>>>> [0] PetscOpenSocket(): Connection refused in attaching socket, trying again[0] PetscOpenSocket(): Connection refused in attaching socket, trying again[0] PetscOpenSocket(): Connection refused in attaching socket, trying again >>>>>>>>>>> [0] PetscOpenSocket(): Connection refused in attaching socket, trying again^C >>>>>>>>>>> -bash-4.0$ [0]0:Return code = 0, signaled with Interrupt >>>>>>>>>>> [0]1:Return code = 0, signaled with Interrupt >>>>>>>>>>> >>>>>>>>>>> Every time I start the program I use close(socket) and clear all in Matlab, so the socket from the previous run should not be present anymore. 
It seems that the port gets corrupted after a couple of times? Matlab does not respond and I have to kill it and restart it manually. >>>>>>>>>> >>>>>>>>>> Sometimes when you close a socket connection it doesn't close for a very long time so that if you try to open it again it doesn't work. When it appears the socket can not be used try using netstat | grep 5005 to see if the socket is still active. >>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> - If I include the launch statement, or just type >>>>>>>>>>> system('mpiexec -n 2 ./petsc_poisson_par_barry2 &') >>>>>>>>>>> the program never works. >>>>>>>>>> >>>>>>>>>> Are you sure mpiexec is in the path of system and it is the right one? The problem is that we are kind of cheating with system because we start a new job in the background and have no idea what the output is. Are you using unix and running Matlab on the command line or in a GUI? >>>>>>>>>> >>>>>>>>>> Barry >>>>>>>>>> >>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> Hope you can figure out what is going wrong. >>>>>>>>>>> >>>>>>>>>>> Ben >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> Op 3 sep 2010, om 13:25 heeft Barry Smith het volgende geschreven: >>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> Ben >>>>>>>>>>>> >>>>>>>>>>>> Ok, I figured out the problem. It is not fundamental and mostly comes from not having a create way to debug this. >>>>>>>>>>>> >>>>>>>>>>>> The test vector you create is sequential then you try to view it back to Matlab with the parallel fd viewer. If you change to >>>>>>>>>>>> ierr = VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,1,&test);CHKERRQ(ierr); >>>>>>>>>>>> then the code runs. >>>>>>>>>>>> >>>>>>>>>>>> I've found (just now) that when I use launch all the output from the .c program gets lost which makes it impossible to figure out what has gone wrong. You can debug by running the two parts of the computation in two different windows. So comment out the launch from the matlab script and then in Matlab run the script (it will hang waiting for the socket to work) and in a separate terminal window run the .c program; for example petscmpiexec -n 2 ./ex1 -info Now you see exactly what is happening in the PETSc program. You can even use -start_in_debugger on the PETSc side to run the debugger on crashes. >>>>>>>>>>>> >>>>>>>>>>>> I'll add this to the docs for launch >>>>>>>>>>>> >>>>>>>>>>>> Barry >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> On Sep 2, 2010, at 3:28 PM, Benjamin Sanderse wrote: >>>>>>>>>>>> >>>>>>>>>>>>> Hi Barry, >>>>>>>>>>>>> >>>>>>>>>>>>> I attached my matlab file, c file and makefile. First I generate the executable with 'make petsc_poisson_par_barry' and then I run test_petsc_par_barry.m. >>>>>>>>>>>>> If you change MATMPIAIJ to MATAIJ and VECMPI to VECSEQ the code works fine. >>>>>>>>>>>>> >>>>>>>>>>>>> Thanks a lot, >>>>>>>>>>>>> >>>>>>>>>>>>> Benjamin >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> Op 2 sep 2010, om 13:45 heeft Barry Smith het volgende geschreven: >>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> Matlab is never aware the vector is parallel. Please send me the code and I'll figure out what is going on. >>>>>>>>>>>>>> >>>>>>>>>>>>>> Barry >>>>>>>>>>>>>> >>>>>>>>>>>>>> On Sep 2, 2010, at 2:07 PM, Benjamin Sanderse wrote: >>>>>>>>>>>>>> >>>>>>>>>>>>>>> That sounds great, but there is one issue I am encountering. I switched vector types to VECMPI and matrix type to MATMPIAIJ, but when running Matlab I get the following error: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Found unrecogonized header 0 in file. 
If your file contains complex numbers >>>>>>>>>>>>>>> then call PetscBinaryRead() with "complex" as the second argument >>>>>>>>>>>>>>> Error in ==> PetscBinaryRead at 27 >>>>>>>>>>>>>>> if nargin < 2 >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> ??? Output argument "varargout" (and maybe others) not assigned during call to >>>>>>>>>>>>>>> "/ufs/sanderse/Software/petsc-3.1-p4/bin/matlab/PetscBinaryRead.m>PetscBinaryRead". >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Error in ==> test_petsc_par at 57 >>>>>>>>>>>>>>> x4 = PetscBinaryReady(PS); >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Could it be that Matlab does not understand the "parallel" vector which is returned by Petsc? Currently I have this done with VecView as follows: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> fd = PETSC_VIEWER_SOCKET_WORLD; >>>>>>>>>>>>>>> ... >>>>>>>>>>>>>>> KSPSolve(ksp,b,x); >>>>>>>>>>>>>>> ... >>>>>>>>>>>>>>> VecView(fd,x); >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Thanks for the help! >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Ben >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Op 2 sep 2010, om 10:09 heeft Barry Smith het volgende geschreven: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> On Sep 2, 2010, at 10:51 AM, Benjamin Sanderse wrote: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Hello all, >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> I figured out the coupling with Matlab and I can send back and forth matrices and vectors between Petsc and Matlab. Actually, I send only once a matrix from Matlab to Petsc and then repeatedly send new right hand sides from Matlab->Petsc and the solution vector from Petsc->Matlab. That works great. >>>>>>>>>>>>>>>>> I know want to see if the matrix that is send from (serial) Matlab to Petsc can be stored as a parallel matrix in Petsc so that subsequent computations with different right hand sides can be performed in parallel by Petsc. Does this simply work by using MatLoad and setting Mattype MPIAIJ? Or is something more fancy required? >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> In theory this can be done using the same code as sequential only with parallel vectors VECMPI and matrices. MATMPIAIJ >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Barry >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Ben >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>> >>>>>>>>> >>>>>>>> >>>>>>> >>>>>> >>>>> >>>> >>> >> >> > From jed at 59A2.org Fri Sep 10 11:38:32 2010 From: jed at 59A2.org (Jed Brown) Date: Fri, 10 Sep 2010 18:38:32 +0200 Subject: [petsc-users] coupling with Matlab and parallel solution In-Reply-To: <4BDFDB50-ED6D-4D44-A8F8-E14F0ED6C10F@cwi.nl> References: <1D7DA60F-2616-41F3-BC17-FD797E9A9AF2@cwi.nl> <87zkw1cej4.fsf@59A2.org> <96BE91D1-0EFF-4CF1-B1DA-0C5A2D13BFF3@cwi.nl> <6C3C8604-E079-40DC-ADCC-FACB91A277B2@mcs.anl.gov> <070410A8-7B2E-405E-B65D-0CC8521BB17C@cwi.nl> <209DE1A7-D893-450F-89BC-DFBE7B2F2C5D@cwi.nl> <6A83D0A7-1C42-4A91-909F-6B93509C5BF5@mcs.anl.gov> <61D23634-FF5A-4916-9055-9561F69CC36A@cwi.nl> <30D58A3F-D8D9-4692-8BE9-BB8168201BD3@mcs.anl.gov> <6F879BC3-2E8E-4B30-B4A8-CDB039BD3A1E@cwi.nl> <9587193F-E30F-40F2-BF6A-4B690DA5066B@cwi.nl> <261FD0FD-7085-4C71-B2BD-7DB26ACDA853@cwi.nl> <25591182-B391-4B29-B14F-8CC3B62ABD7F@mcs.anl.gov> <4BDFDB50-ED6D-4D44-A8F8-E14F0ED6C10F@cwi.nl> Message-ID: <878w39wo3b.fsf@59A2.org> On Fri, 10 Sep 2010 10:27:17 -0600, Benjamin Sanderse wrote: > - Until now I have been using the 'two-shell' approach suggested by Barry for debugging purposes. This approach works fine, but in a later stage I would like to include the petsc execution command back in Matlab. 
I tried the following in Matlab: > system('petscmpiexec -n 2 ./petsc_poisson_par -viewer_socket_port 5600 &); > > unix('petscmpiexec -n 2 ./petsc_poisson_par -viewer_socket_port 5600 &); > > In both cases this doesn't work, while issuing the command in a separate shell works fine. Any ideas? Please provide the output, "doesn't work" is not much information. > - I am using CG to solve a symmetric positive definite matrix. As preconditioner I normally use ICC (incomplete choleski), but apparently this is not implemented in parallel in Petsc. Suggestions on what to take as preconditioner? Start with block Jacobi + ICC (-pc_type bjacobi -sub_pc_type icc) or ASM + ICC (-pc_type asm -sub_pc_type icc). Jed From bsmith at mcs.anl.gov Fri Sep 10 13:05:48 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Fri, 10 Sep 2010 13:05:48 -0500 Subject: [petsc-users] coupling with Matlab and parallel solution In-Reply-To: <4BDFDB50-ED6D-4D44-A8F8-E14F0ED6C10F@cwi.nl> References: <1D7DA60F-2616-41F3-BC17-FD797E9A9AF2@cwi.nl> <87zkw1cej4.fsf@59A2.org> <96BE91D1-0EFF-4CF1-B1DA-0C5A2D13BFF3@cwi.nl> <6C3C8604-E079-40DC-ADCC-FACB91A277B2@mcs.anl.gov> <070410A8-7B2E-405E-B65D-0CC8521BB17C@cwi.nl> <209DE1A7-D893-450F-89BC-DFBE7B2F2C5D@cwi.nl> <6A83D0A7-1C42-4A91-909F-6B93509C5BF5@mcs.anl.gov> <61D23634-FF5A-4916-9055-9561F69CC36A@cwi.nl> <30D58A3F-D8D9-4692-8BE9-BB8168201BD3@mcs.anl.gov> <6F879BC3-2E8E-4B30-B4A8-CDB039BD3A1E@cwi.nl> <9587193F-E30F-40F2-BF6A-4B690DA5066B@cwi.nl> <261FD0FD-7085-4C71-B2BD-7DB26ACDA853@cwi.nl> <25591182-B391-4B29-B14F-8CC3B62ABD7F@mcs.anl.gov> <4BDFDB50-ED6D-4D44-A8F8-E14F0ED6C10F@cwi.nl> Message-ID: <8D1D3964-EA53-4C69-AECB-4763243786ED@mcs.anl.gov> On Sep 10, 2010, at 11:27 AM, Benjamin Sanderse wrote: > Hi guys, > > Still some other questions popping up: > > - Until now I have been using the 'two-shell' approach suggested by Barry for debugging purposes. This approach works fine, but in a later stage I would like to include the petsc execution command back in Matlab. I tried the following in Matlab: > system('petscmpiexec -n 2 ./petsc_poisson_par -viewer_socket_port 5600 &); > > unix('petscmpiexec -n 2 ./petsc_poisson_par -viewer_socket_port 5600 &); > > In both cases this doesn't work, while issuing the command in a separate shell works fine. Any ideas? Is petscmpiexec in your path when run inside system() and is PETSC_DIR and PETSC_ARCH set in there? Barry > > - I am using CG to solve a symmetric positive definite matrix. As preconditioner I normally use ICC (incomplete choleski), but apparently this is not implemented in parallel in Petsc. Suggestions on what to take as preconditioner? > > Thanks, > > Ben > > Op 9 sep 2010, om 14:36 heeft Satish Balay het volgende geschreven: > >> I just attempted this on a linux64 box [ubuntu 10.04 with matlab >> version 7.10.0.499 (R2010a) 64-bit (glnxa64)] with petsc-3.1 - and >> the loops ran fine. >> >> -bash: >> for i in {1..50}; do mpiexec -n 2 ./petsc_poisson_par_barry2 -viewer_socket_port 20001; echo $i; done >> >> matlab: >> for i=1:500,;i; test_petsc_par_barry; end >> >> Perhaps you can try a high port number - and the same loops - and see if you get >> stuck on this machine. >> >> Satish >> >> On Thu, 9 Sep 2010, Barry Smith wrote: >> >>> >>> On Sep 9, 2010, at 2:51 PM, Benjamin Sanderse wrote: >>> >>>> I have installed Petsc on a remote 64-bits Linux (Fedora). Might there be an issue with the 64-bits? >>> >>> Shouldn't mater. 
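As a code-level counterpart to the preconditioner options suggested above (-pc_type bjacobi -sub_pc_type icc, or -pc_type asm -sub_pc_type icc), here is a sketch of the CG plus block Jacobi setup in the petsc-3.1 API; the parallel matrix A and the vectors b, x are assumed to exist already:

    KSP            ksp;
    PC             pc;
    PetscErrorCode ierr;
    ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
    ierr = KSPSetOperators(ksp,A,A,SAME_NONZERO_PATTERN);CHKERRQ(ierr);
    ierr = KSPSetType(ksp,KSPCG);CHKERRQ(ierr);
    ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
    ierr = PCSetType(pc,PCBJACOBI);CHKERRQ(ierr);   /* or PCASM */
    ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);    /* picks up run-time options */
    ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);

The ICC factorization on the subdomain blocks is still easiest to request at run time with -sub_pc_type icc.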
>>> >>>> >>>> My own computer is a Macbook Pro, but I am not running Petsc on it, because in the past I have had severe problems with getting mex-files to work under 64-bits Matlab. >>> >>> Yes, that is painful, Matlab never respected the Mac >>> >>>> I could try to install it, though. >>>> Another option is that I try it on another remote Linux machine, also 64-bits, running Red Hat. >>> >>> You could try this to see if the same problem exists. >>> >>>> >>>> What do you suggest? >>> >>> Are you sure it is hanging on the sockets not hanging on running MPI jobs one after each other? >>> >>> Barry >>> >>>> >>>> >>>> Op 9 sep 2010, om 13:10 heeft Barry Smith het volgende geschreven: >>>> >>>>> >>>>> What OS are you using. >>>>> >>>>> On my Apple Mac I made a shell script loop calling petsc_poisson_par_barry2 multiple times and a similar loop in Matlab and start then both off (with parallel PETSc runs). It runs flawlessly, opening the socket sending and receiving, dozens of times in a row with several processes. I think that maybe you are using Linux? >>>>> >>>>> >>>>> Barry >>>>> >>>>> >>>>> On Sep 8, 2010, at 2:32 PM, Benjamin Sanderse wrote: >>>>> >>>>>> That's also what I thought. I checked once again, and I found out that when I use >>>>>> >>>>>> petscmpiexec -n 1 >>>>>> >>>>>> the program works, but if I increase the number of processors to 2 it only works once in a while. >>>>>> >>>>>> I attached my test code. It is extremely simple and does nothing else than passing a vector to petsc and returning it to matlab. >>>>>> >>>>>> I run it as follows: >>>>>> >>>>>> shell #1 >>>>>> -bash-4.0$ make petsc_poisson_par_barry2 >>>>>> >>>>>> shell #2 >>>>>> -bash-4.0$ matlab -nojvm -nodisplay >>>>>>>> test_petsc_par_barry; >>>>>> >>>>>> shell #1 >>>>>> -bash-4.0$ petscmpiexec -n 2 ./petsc_poisson_par_barry2 -viewer_socket_port 5006 -info >>>>>> >>>>>> On lucky days, this works, on unlucky days, petsc will stop here: >>>>>> >>>>>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>> [1] PetscCommDuplicate(): returning tag 2147483647 >>>>>> [0] PetscCommDuplicate(): returning tag 2147483647 >>>>>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5006 machine borr.mas.cwi.nl >>>>>> [0] PetscCommDuplicate(): returning tag 2147483646 >>>>>> [1] PetscCommDuplicate(): returning tag 2147483646 >>>>>> [1] PetscCommDuplicate(): returning tag 2147483641 >>>>>> [0] PetscCommDuplicate(): returning tag 2147483641 >>>>>> [0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs. >>>>>> [0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs. >>>>>> ^C >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> Op 8 sep 2010, om 12:00 heeft Barry Smith het volgende geschreven: >>>>>> >>>>>>> >>>>>>> On Sep 8, 2010, at 10:13 AM, Benjamin Sanderse wrote: >>>>>>> >>>>>>>> Hi Barry, >>>>>>>> >>>>>>>> I am indeed closing the socket in Matlab between the two sets, using close(PS), where PS=PetscOpenSocket. >>>>>>>> I have tried different port numbers, but without guarantee of success. Sometimes it works, sometimes it doesn't. 
Often times the first time calling the PetscOpenSocket(portnumber) works, but even that is not guaranteed. I think there should be another solution. >>>>>>>> By the way, all these problems do not appear when using serial vectors instead of parallel. >>>>>>> >>>>>>> That is strange. Only the first process ever opens the socket so in theory the fact that the PETSc code is parallel should not matter at all. Please send me your test code that causes trouble again and I'll see if I can reproduce the problem. >>>>>>> >>>>>>> Barry >>>>>>> >>>>>>>> >>>>>>>> Ben >>>>>>>> >>>>>>>> Op 7 sep 2010, om 17:27 heeft Barry Smith het volgende geschreven: >>>>>>>> >>>>>>>>> >>>>>>>>> Are you closing the socket on Matlab between to the two sets? Just checking. >>>>>>>>> >>>>>>>>> You can try running with a different port number each time to see if it is related to trying to reuse the port. Run with PetscOpenSocket(5006) and the PETSc program with -viewer_socket_port 5006 >>>>>>>>> then run both with 5007 then with 5008 etc does this work smoothly? >>>>>>>>> >>>>>>>>> Let me know and the will tell me the next step to try, >>>>>>>>> >>>>>>>>> Barry >>>>>>>>> >>>>>>>>> On Sep 7, 2010, at 10:53 AM, Benjamin Sanderse wrote: >>>>>>>>> >>>>>>>>>> Hi Barry, >>>>>>>>>> >>>>>>>>>> I am still not too happy with the execution in parallel. I am working under Linux (64 bits) and still using your approach with two command windows (since it gives the best debugging possibility). >>>>>>>>>> As I said, sometimes things work, but most of the time not. Here is the output of two successive runs >>>>>>>>>> >>>>>>>>>> -bash-4.0$ petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info >>>>>>>>>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>>>>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>>>>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>>>>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>>>>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>>>>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483647 >>>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483647 >>>>>>>>>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl >>>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483646 >>>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483646 >>>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483641 >>>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483641 >>>>>>>>>> [0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs. >>>>>>>>>> [0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs. 
>>>>>>>>>> [1] PetscFinalize(): PetscFinalize() called >>>>>>>>>> [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850688 >>>>>>>>>> [1] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 >>>>>>>>>> [1] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 >>>>>>>>>> [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374784 >>>>>>>>>> [0] PetscFinalize(): PetscFinalize() called >>>>>>>>>> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850688 >>>>>>>>>> [0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 >>>>>>>>>> [0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 >>>>>>>>>> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374784 >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> -bash-4.0$ netstat | grep 5005 >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> -bash-4.0$ petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info >>>>>>>>>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>>>>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>>>>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>>>>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>>>>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483647 >>>>>>>>>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl >>>>>>>>>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483647 >>>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483646 >>>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483646 >>>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483641 >>>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483641 >>>>>>>>>> [0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs. >>>>>>>>>> [0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs. >>>>>>>>>> ^C >>>>>>>>>> -bash-4.0$ [0]0:Return code = 0, signaled with Interrupt >>>>>>>>>> [0]1:Return code = 0, signaled with Interrupt >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> In both cases I first started the Matlab program. I am currently starting Matlab without a GUI, but with a GUI I have the same problems. >>>>>>>>>> As you can see, in the first case everything works fine, and Petsc finalizes and closes. Matlab gives me the correct output. The second case, run just a couple of seconds later, does not reach PetscFinalize and Matlab does not give the correct output. In between the two cases I checked if port 5005 was in use, and it was not. >>>>>>>>>> Do you have any more suggestions on how to get this to work properly? >>>>>>>>>> >>>>>>>>>> Benjamin >>>>>>>>>> >>>>>>>>>> Op 3 sep 2010, om 21:11 heeft Barry Smith het volgende geschreven: >>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> On Sep 3, 2010, at 4:32 PM, Benjamin Sanderse wrote: >>>>>>>>>>> >>>>>>>>>>>> Hi Barry, >>>>>>>>>>>> >>>>>>>>>>>> Thanks for your help! However, there are still some issues left. 
In other to test things, I simplified the program even more and now I am just sending a vector back and forth: matlab->petsc->matlab: >>>>>>>>>>>> >>>>>>>>>>>> fd = PETSC_VIEWER_SOCKET_WORLD; >>>>>>>>>>>> >>>>>>>>>>>> // load rhs vector >>>>>>>>>>>> ierr = VecLoad(fd,VECMPI,&b);CHKERRQ(ierr); >>>>>>>>>>>> >>>>>>>>>>>> // send to matlab >>>>>>>>>>>> ierr = VecView(b,fd);CHKERRQ(ierr); >>>>>>>>>>>> ierr = VecDestroy(b);CHKERRQ(ierr); >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> - Your approach with two windows works *sometimes*. I removed the 'launch' statement and I executed my program 10 times, the first 2 times worked, and in all other cases I got this: >>>>>>>>>>>> >>>>>>>>>>>> petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info >>>>>>>>>>>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>>>>>>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>>>>>>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>>>>>>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>>>>>>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>>>>>>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483647 >>>>>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483647 >>>>>>>>>>>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl >>>>>>>>>>>> [0] PetscOpenSocket(): Connection refused in attaching socket, trying again[0] PetscOpenSocket(): Connection refused in attaching socket, trying again[0] PetscOpenSocket(): Connection refused in attaching socket, trying again >>>>>>>>>>>> [0] PetscOpenSocket(): Connection refused in attaching socket, trying again^C >>>>>>>>>>>> -bash-4.0$ [0]0:Return code = 0, signaled with Interrupt >>>>>>>>>>>> [0]1:Return code = 0, signaled with Interrupt >>>>>>>>>>>> >>>>>>>>>>>> Every time I start the program I use close(socket) and clear all in Matlab, so the socket from the previous run should not be present anymore. It seems that the port gets corrupted after a couple of times? Matlab does not respond and I have to kill it and restart it manually. >>>>>>>>>>> >>>>>>>>>>> Sometimes when you close a socket connection it doesn't close for a very long time so that if you try to open it again it doesn't work. When it appears the socket can not be used try using netstat | grep 5005 to see if the socket is still active. >>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> - If I include the launch statement, or just type >>>>>>>>>>>> system('mpiexec -n 2 ./petsc_poisson_par_barry2 &') >>>>>>>>>>>> the program never works. >>>>>>>>>>> >>>>>>>>>>> Are you sure mpiexec is in the path of system and it is the right one? The problem is that we are kind of cheating with system because we start a new job in the background and have no idea what the output is. Are you using unix and running Matlab on the command line or in a GUI? >>>>>>>>>>> >>>>>>>>>>> Barry >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> Hope you can figure out what is going wrong. >>>>>>>>>>>> >>>>>>>>>>>> Ben >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> Op 3 sep 2010, om 13:25 heeft Barry Smith het volgende geschreven: >>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> Ben >>>>>>>>>>>>> >>>>>>>>>>>>> Ok, I figured out the problem. It is not fundamental and mostly comes from not having a create way to debug this. 
>>>>>>>>>>>>> >>>>>>>>>>>>> The test vector you create is sequential then you try to view it back to Matlab with the parallel fd viewer. If you change to >>>>>>>>>>>>> ierr = VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,1,&test);CHKERRQ(ierr); >>>>>>>>>>>>> then the code runs. >>>>>>>>>>>>> >>>>>>>>>>>>> I've found (just now) that when I use launch all the output from the .c program gets lost which makes it impossible to figure out what has gone wrong. You can debug by running the two parts of the computation in two different windows. So comment out the launch from the matlab script and then in Matlab run the script (it will hang waiting for the socket to work) and in a separate terminal window run the .c program; for example petscmpiexec -n 2 ./ex1 -info Now you see exactly what is happening in the PETSc program. You can even use -start_in_debugger on the PETSc side to run the debugger on crashes. >>>>>>>>>>>>> >>>>>>>>>>>>> I'll add this to the docs for launch >>>>>>>>>>>>> >>>>>>>>>>>>> Barry >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> On Sep 2, 2010, at 3:28 PM, Benjamin Sanderse wrote: >>>>>>>>>>>>> >>>>>>>>>>>>>> Hi Barry, >>>>>>>>>>>>>> >>>>>>>>>>>>>> I attached my matlab file, c file and makefile. First I generate the executable with 'make petsc_poisson_par_barry' and then I run test_petsc_par_barry.m. >>>>>>>>>>>>>> If you change MATMPIAIJ to MATAIJ and VECMPI to VECSEQ the code works fine. >>>>>>>>>>>>>> >>>>>>>>>>>>>> Thanks a lot, >>>>>>>>>>>>>> >>>>>>>>>>>>>> Benjamin >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> Op 2 sep 2010, om 13:45 heeft Barry Smith het volgende geschreven: >>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Matlab is never aware the vector is parallel. Please send me the code and I'll figure out what is going on. >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Barry >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> On Sep 2, 2010, at 2:07 PM, Benjamin Sanderse wrote: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> That sounds great, but there is one issue I am encountering. I switched vector types to VECMPI and matrix type to MATMPIAIJ, but when running Matlab I get the following error: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Found unrecogonized header 0 in file. If your file contains complex numbers >>>>>>>>>>>>>>>> then call PetscBinaryRead() with "complex" as the second argument >>>>>>>>>>>>>>>> Error in ==> PetscBinaryRead at 27 >>>>>>>>>>>>>>>> if nargin < 2 >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> ??? Output argument "varargout" (and maybe others) not assigned during call to >>>>>>>>>>>>>>>> "/ufs/sanderse/Software/petsc-3.1-p4/bin/matlab/PetscBinaryRead.m>PetscBinaryRead". >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Error in ==> test_petsc_par at 57 >>>>>>>>>>>>>>>> x4 = PetscBinaryReady(PS); >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Could it be that Matlab does not understand the "parallel" vector which is returned by Petsc? Currently I have this done with VecView as follows: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> fd = PETSC_VIEWER_SOCKET_WORLD; >>>>>>>>>>>>>>>> ... >>>>>>>>>>>>>>>> KSPSolve(ksp,b,x); >>>>>>>>>>>>>>>> ... >>>>>>>>>>>>>>>> VecView(fd,x); >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Thanks for the help! 
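Note that VecView takes the vector first and the viewer second, i.e. VecView(x,fd), matching the VecView(b,fd) call earlier in this thread. A sketch of that round trip in the petsc-3.1 API, assuming ksp has already been created and given the matrix:

    fd = PETSC_VIEWER_SOCKET_WORLD;
    ierr = VecLoad(fd,VECMPI,&b);CHKERRQ(ierr);   /* right-hand side sent by Matlab */
    ierr = VecDuplicate(b,&x);CHKERRQ(ierr);
    ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);
    ierr = VecView(x,fd);CHKERRQ(ierr);           /* solution back to Matlab */
    ierr = VecDestroy(b);CHKERRQ(ierr);
    ierr = VecDestroy(x);CHKERRQ(ierr);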
>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Ben >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Op 2 sep 2010, om 10:09 heeft Barry Smith het volgende geschreven: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> On Sep 2, 2010, at 10:51 AM, Benjamin Sanderse wrote: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Hello all, >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> I figured out the coupling with Matlab and I can send back and forth matrices and vectors between Petsc and Matlab. Actually, I send only once a matrix from Matlab to Petsc and then repeatedly send new right hand sides from Matlab->Petsc and the solution vector from Petsc->Matlab. That works great. >>>>>>>>>>>>>>>>>> I know want to see if the matrix that is send from (serial) Matlab to Petsc can be stored as a parallel matrix in Petsc so that subsequent computations with different right hand sides can be performed in parallel by Petsc. Does this simply work by using MatLoad and setting Mattype MPIAIJ? Or is something more fancy required? >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> In theory this can be done using the same code as sequential only with parallel vectors VECMPI and matrices. MATMPIAIJ >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Barry >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Ben >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>> >>>>>>>>> >>>>>>>> >>>>>>> >>>>>> >>>>> >>>> >>> >>> >> > From bsmith at mcs.anl.gov Fri Sep 10 13:44:29 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Fri, 10 Sep 2010 13:44:29 -0500 Subject: [petsc-users] Storing an array in a PetscBag In-Reply-To: <6844570C-8452-45B7-AEFE-775DDF47B791@lsu.edu> References: <6844570C-8452-45B7-AEFE-775DDF47B791@lsu.edu> Message-ID: We have always thought about bags as containing only very small, fixed amounts of data so no there is no mechanism to put in variable sized arrays. I hesitate to say we should add all that support and make it the bag such a complicated object. Barry On Sep 10, 2010, at 11:27 AM, Blaise Bourdin wrote: > Hi, > > Is there an easy way to use PetscBag to manage arrays of user data? > optimally, I would like to something like > typedef struct { > PetscInt n; > PetscReal *values; > } MyParameters; > MyParameters *params > > and be able to register "values" in the bag so that I can specify its values as > -n 3 -values 1.,2.,3. or -n 2 -values 1.,2. > > Right now, I can see how I can get n prior to creating the bag and register params->values[0] to params->values[n-1] so that the command line could become -n 3 -values0 1 -values1 2 -values2 3 or -n 2 -values0 1 -values1 2 > > Would that require adding PetscBagRegisterXXXArray function or is it feasible in the current implementation? > > Blaise > > -- > Department of Mathematics and Center for Computation & Technology > Louisiana State University, Baton Rouge, LA 70803, USA > Tel. 
+1 (225) 578 1612, Fax +1 (225) 578 4276 http://www.math.lsu.edu/~bourdin > > > > > > > From B.Sanderse at cwi.nl Fri Sep 10 14:57:45 2010 From: B.Sanderse at cwi.nl (Benjamin Sanderse) Date: Fri, 10 Sep 2010 13:57:45 -0600 Subject: [petsc-users] coupling with Matlab and parallel solution In-Reply-To: <8D1D3964-EA53-4C69-AECB-4763243786ED@mcs.anl.gov> References: <1D7DA60F-2616-41F3-BC17-FD797E9A9AF2@cwi.nl> <87zkw1cej4.fsf@59A2.org> <96BE91D1-0EFF-4CF1-B1DA-0C5A2D13BFF3@cwi.nl> <6C3C8604-E079-40DC-ADCC-FACB91A277B2@mcs.anl.gov> <070410A8-7B2E-405E-B65D-0CC8521BB17C@cwi.nl> <209DE1A7-D893-450F-89BC-DFBE7B2F2C5D@cwi.nl> <6A83D0A7-1C42-4A91-909F-6B93509C5BF5@mcs.anl.gov> <61D23634-FF5A-4916-9055-9561F69CC36A@cwi.nl> <30D58A3F-D8D9-4692-8BE9-BB8168201BD3@mcs.anl.gov> <6F879BC3-2E8E-4B30-B4A8-CDB039BD3A1E@cwi.nl> <9587193F-E30F-40F2-BF6A-4B690DA5066B@cwi.nl> <261FD0FD-7085-4C71-B2BD-7DB26ACDA853@cwi.nl> <25591182-B391-4B29-B14F-8CC3B62ABD7F@mcs.anl.gov> <4BDFDB50-ED6D-4D44-A8F8-E14F0ED6C10F@cwi.nl> <8D1D3964-EA53-4C69-AEC B-4763243786ED@mcs.anl.gov> Message-ID: <891D8036-CDCB-45A8-8A65-31F30ADE8258@cwi.nl> I tried the following in Matlab: >> system('printenv PETSC_DIR') /net/shareware/src/petsc-3.1-p4/ ans = 0 >> system('which petscmpiexec') /net/shareware/src/petsc-3.1-p4/bin//petscmpiexec ans = 0 >> system('printenv PETSC_ARCH') linux-gnu-c-debug ans = 0 So I guess that this should be ok when running system(). Op 10 sep 2010, om 12:05 heeft Barry Smith het volgende geschreven: > > On Sep 10, 2010, at 11:27 AM, Benjamin Sanderse wrote: > >> Hi guys, >> >> Still some other questions popping up: >> >> - Until now I have been using the 'two-shell' approach suggested by Barry for debugging purposes. This approach works fine, but in a later stage I would like to include the petsc execution command back in Matlab. I tried the following in Matlab: >> system('petscmpiexec -n 2 ./petsc_poisson_par -viewer_socket_port 5600 &); >> >> unix('petscmpiexec -n 2 ./petsc_poisson_par -viewer_socket_port 5600 &); >> >> In both cases this doesn't work, while issuing the command in a separate shell works fine. Any ideas? > > Is petscmpiexec in your path when run inside system() and is PETSC_DIR and PETSC_ARCH set in there? > > Barry > >> >> - I am using CG to solve a symmetric positive definite matrix. As preconditioner I normally use ICC (incomplete choleski), but apparently this is not implemented in parallel in Petsc. Suggestions on what to take as preconditioner? >> >> Thanks, >> >> Ben >> >> Op 9 sep 2010, om 14:36 heeft Satish Balay het volgende geschreven: >> >>> I just attempted this on a linux64 box [ubuntu 10.04 with matlab >>> version 7.10.0.499 (R2010a) 64-bit (glnxa64)] with petsc-3.1 - and >>> the loops ran fine. >>> >>> -bash: >>> for i in {1..50}; do mpiexec -n 2 ./petsc_poisson_par_barry2 -viewer_socket_port 20001; echo $i; done >>> >>> matlab: >>> for i=1:500,;i; test_petsc_par_barry; end >>> >>> Perhaps you can try a high port number - and the same loops - and see if you get >>> stuck on this machine. >>> >>> Satish >>> >>> On Thu, 9 Sep 2010, Barry Smith wrote: >>> >>>> >>>> On Sep 9, 2010, at 2:51 PM, Benjamin Sanderse wrote: >>>> >>>>> I have installed Petsc on a remote 64-bits Linux (Fedora). Might there be an issue with the 64-bits? >>>> >>>> Shouldn't mater. >>>> >>>>> >>>>> My own computer is a Macbook Pro, but I am not running Petsc on it, because in the past I have had severe problems with getting mex-files to work under 64-bits Matlab. 
>>>> >>>> Yes, that is painful, Matlab never respected the Mac >>>> >>>>> I could try to install it, though. >>>>> Another option is that I try it on another remote Linux machine, also 64-bits, running Red Hat. >>>> >>>> You could try this to see if the same problem exists. >>>> >>>>> >>>>> What do you suggest? >>>> >>>> Are you sure it is hanging on the sockets not hanging on running MPI jobs one after each other? >>>> >>>> Barry >>>> >>>>> >>>>> >>>>> Op 9 sep 2010, om 13:10 heeft Barry Smith het volgende geschreven: >>>>> >>>>>> >>>>>> What OS are you using. >>>>>> >>>>>> On my Apple Mac I made a shell script loop calling petsc_poisson_par_barry2 multiple times and a similar loop in Matlab and start then both off (with parallel PETSc runs). It runs flawlessly, opening the socket sending and receiving, dozens of times in a row with several processes. I think that maybe you are using Linux? >>>>>> >>>>>> >>>>>> Barry >>>>>> >>>>>> >>>>>> On Sep 8, 2010, at 2:32 PM, Benjamin Sanderse wrote: >>>>>> >>>>>>> That's also what I thought. I checked once again, and I found out that when I use >>>>>>> >>>>>>> petscmpiexec -n 1 >>>>>>> >>>>>>> the program works, but if I increase the number of processors to 2 it only works once in a while. >>>>>>> >>>>>>> I attached my test code. It is extremely simple and does nothing else than passing a vector to petsc and returning it to matlab. >>>>>>> >>>>>>> I run it as follows: >>>>>>> >>>>>>> shell #1 >>>>>>> -bash-4.0$ make petsc_poisson_par_barry2 >>>>>>> >>>>>>> shell #2 >>>>>>> -bash-4.0$ matlab -nojvm -nodisplay >>>>>>>>> test_petsc_par_barry; >>>>>>> >>>>>>> shell #1 >>>>>>> -bash-4.0$ petscmpiexec -n 2 ./petsc_poisson_par_barry2 -viewer_socket_port 5006 -info >>>>>>> >>>>>>> On lucky days, this works, on unlucky days, petsc will stop here: >>>>>>> >>>>>>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>> [1] PetscCommDuplicate(): returning tag 2147483647 >>>>>>> [0] PetscCommDuplicate(): returning tag 2147483647 >>>>>>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5006 machine borr.mas.cwi.nl >>>>>>> [0] PetscCommDuplicate(): returning tag 2147483646 >>>>>>> [1] PetscCommDuplicate(): returning tag 2147483646 >>>>>>> [1] PetscCommDuplicate(): returning tag 2147483641 >>>>>>> [0] PetscCommDuplicate(): returning tag 2147483641 >>>>>>> [0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs. >>>>>>> [0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs. >>>>>>> ^C >>>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>> Op 8 sep 2010, om 12:00 heeft Barry Smith het volgende geschreven: >>>>>>> >>>>>>>> >>>>>>>> On Sep 8, 2010, at 10:13 AM, Benjamin Sanderse wrote: >>>>>>>> >>>>>>>>> Hi Barry, >>>>>>>>> >>>>>>>>> I am indeed closing the socket in Matlab between the two sets, using close(PS), where PS=PetscOpenSocket. >>>>>>>>> I have tried different port numbers, but without guarantee of success. Sometimes it works, sometimes it doesn't. 
Often times the first time calling the PetscOpenSocket(portnumber) works, but even that is not guaranteed. I think there should be another solution. >>>>>>>>> By the way, all these problems do not appear when using serial vectors instead of parallel. >>>>>>>> >>>>>>>> That is strange. Only the first process ever opens the socket so in theory the fact that the PETSc code is parallel should not matter at all. Please send me your test code that causes trouble again and I'll see if I can reproduce the problem. >>>>>>>> >>>>>>>> Barry >>>>>>>> >>>>>>>>> >>>>>>>>> Ben >>>>>>>>> >>>>>>>>> Op 7 sep 2010, om 17:27 heeft Barry Smith het volgende geschreven: >>>>>>>>> >>>>>>>>>> >>>>>>>>>> Are you closing the socket on Matlab between to the two sets? Just checking. >>>>>>>>>> >>>>>>>>>> You can try running with a different port number each time to see if it is related to trying to reuse the port. Run with PetscOpenSocket(5006) and the PETSc program with -viewer_socket_port 5006 >>>>>>>>>> then run both with 5007 then with 5008 etc does this work smoothly? >>>>>>>>>> >>>>>>>>>> Let me know and the will tell me the next step to try, >>>>>>>>>> >>>>>>>>>> Barry >>>>>>>>>> >>>>>>>>>> On Sep 7, 2010, at 10:53 AM, Benjamin Sanderse wrote: >>>>>>>>>> >>>>>>>>>>> Hi Barry, >>>>>>>>>>> >>>>>>>>>>> I am still not too happy with the execution in parallel. I am working under Linux (64 bits) and still using your approach with two command windows (since it gives the best debugging possibility). >>>>>>>>>>> As I said, sometimes things work, but most of the time not. Here is the output of two successive runs >>>>>>>>>>> >>>>>>>>>>> -bash-4.0$ petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info >>>>>>>>>>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>>>>>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>>>>>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>>>>>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>>>>>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>>>>>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483647 >>>>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483647 >>>>>>>>>>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl >>>>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483646 >>>>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483646 >>>>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483641 >>>>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483641 >>>>>>>>>>> [0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs. >>>>>>>>>>> [0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs. 
>>>>>>>>>>> [1] PetscFinalize(): PetscFinalize() called >>>>>>>>>>> [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850688 >>>>>>>>>>> [1] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 >>>>>>>>>>> [1] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 >>>>>>>>>>> [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374784 >>>>>>>>>>> [0] PetscFinalize(): PetscFinalize() called >>>>>>>>>>> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850688 >>>>>>>>>>> [0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 >>>>>>>>>>> [0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 >>>>>>>>>>> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374784 >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> -bash-4.0$ netstat | grep 5005 >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> -bash-4.0$ petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info >>>>>>>>>>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>>>>>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>>>>>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>>>>>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>>>>>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483647 >>>>>>>>>>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl >>>>>>>>>>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483647 >>>>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483646 >>>>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483646 >>>>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483641 >>>>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483641 >>>>>>>>>>> [0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs. >>>>>>>>>>> [0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs. >>>>>>>>>>> ^C >>>>>>>>>>> -bash-4.0$ [0]0:Return code = 0, signaled with Interrupt >>>>>>>>>>> [0]1:Return code = 0, signaled with Interrupt >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> In both cases I first started the Matlab program. I am currently starting Matlab without a GUI, but with a GUI I have the same problems. >>>>>>>>>>> As you can see, in the first case everything works fine, and Petsc finalizes and closes. Matlab gives me the correct output. The second case, run just a couple of seconds later, does not reach PetscFinalize and Matlab does not give the correct output. In between the two cases I checked if port 5005 was in use, and it was not. >>>>>>>>>>> Do you have any more suggestions on how to get this to work properly? >>>>>>>>>>> >>>>>>>>>>> Benjamin >>>>>>>>>>> >>>>>>>>>>> Op 3 sep 2010, om 21:11 heeft Barry Smith het volgende geschreven: >>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> On Sep 3, 2010, at 4:32 PM, Benjamin Sanderse wrote: >>>>>>>>>>>> >>>>>>>>>>>>> Hi Barry, >>>>>>>>>>>>> >>>>>>>>>>>>> Thanks for your help! However, there are still some issues left. 
In other to test things, I simplified the program even more and now I am just sending a vector back and forth: matlab->petsc->matlab: >>>>>>>>>>>>> >>>>>>>>>>>>> fd = PETSC_VIEWER_SOCKET_WORLD; >>>>>>>>>>>>> >>>>>>>>>>>>> // load rhs vector >>>>>>>>>>>>> ierr = VecLoad(fd,VECMPI,&b);CHKERRQ(ierr); >>>>>>>>>>>>> >>>>>>>>>>>>> // send to matlab >>>>>>>>>>>>> ierr = VecView(b,fd);CHKERRQ(ierr); >>>>>>>>>>>>> ierr = VecDestroy(b);CHKERRQ(ierr); >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> - Your approach with two windows works *sometimes*. I removed the 'launch' statement and I executed my program 10 times, the first 2 times worked, and in all other cases I got this: >>>>>>>>>>>>> >>>>>>>>>>>>> petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info >>>>>>>>>>>>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>>>>>>>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>>>>>>>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>>>>>>>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>>>>>>>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>>>>>>>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483647 >>>>>>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483647 >>>>>>>>>>>>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl >>>>>>>>>>>>> [0] PetscOpenSocket(): Connection refused in attaching socket, trying again[0] PetscOpenSocket(): Connection refused in attaching socket, trying again[0] PetscOpenSocket(): Connection refused in attaching socket, trying again >>>>>>>>>>>>> [0] PetscOpenSocket(): Connection refused in attaching socket, trying again^C >>>>>>>>>>>>> -bash-4.0$ [0]0:Return code = 0, signaled with Interrupt >>>>>>>>>>>>> [0]1:Return code = 0, signaled with Interrupt >>>>>>>>>>>>> >>>>>>>>>>>>> Every time I start the program I use close(socket) and clear all in Matlab, so the socket from the previous run should not be present anymore. It seems that the port gets corrupted after a couple of times? Matlab does not respond and I have to kill it and restart it manually. >>>>>>>>>>>> >>>>>>>>>>>> Sometimes when you close a socket connection it doesn't close for a very long time so that if you try to open it again it doesn't work. When it appears the socket can not be used try using netstat | grep 5005 to see if the socket is still active. >>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> - If I include the launch statement, or just type >>>>>>>>>>>>> system('mpiexec -n 2 ./petsc_poisson_par_barry2 &') >>>>>>>>>>>>> the program never works. >>>>>>>>>>>> >>>>>>>>>>>> Are you sure mpiexec is in the path of system and it is the right one? The problem is that we are kind of cheating with system because we start a new job in the background and have no idea what the output is. Are you using unix and running Matlab on the command line or in a GUI? >>>>>>>>>>>> >>>>>>>>>>>> Barry >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> Hope you can figure out what is going wrong. >>>>>>>>>>>>> >>>>>>>>>>>>> Ben >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> Op 3 sep 2010, om 13:25 heeft Barry Smith het volgende geschreven: >>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> Ben >>>>>>>>>>>>>> >>>>>>>>>>>>>> Ok, I figured out the problem. It is not fundamental and mostly comes from not having a create way to debug this. 
>>>>>>>>>>>>>> >>>>>>>>>>>>>> The test vector you create is sequential then you try to view it back to Matlab with the parallel fd viewer. If you change to >>>>>>>>>>>>>> ierr = VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,1,&test);CHKERRQ(ierr); >>>>>>>>>>>>>> then the code runs. >>>>>>>>>>>>>> >>>>>>>>>>>>>> I've found (just now) that when I use launch all the output from the .c program gets lost which makes it impossible to figure out what has gone wrong. You can debug by running the two parts of the computation in two different windows. So comment out the launch from the matlab script and then in Matlab run the script (it will hang waiting for the socket to work) and in a separate terminal window run the .c program; for example petscmpiexec -n 2 ./ex1 -info Now you see exactly what is happening in the PETSc program. You can even use -start_in_debugger on the PETSc side to run the debugger on crashes. >>>>>>>>>>>>>> >>>>>>>>>>>>>> I'll add this to the docs for launch >>>>>>>>>>>>>> >>>>>>>>>>>>>> Barry >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> On Sep 2, 2010, at 3:28 PM, Benjamin Sanderse wrote: >>>>>>>>>>>>>> >>>>>>>>>>>>>>> Hi Barry, >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> I attached my matlab file, c file and makefile. First I generate the executable with 'make petsc_poisson_par_barry' and then I run test_petsc_par_barry.m. >>>>>>>>>>>>>>> If you change MATMPIAIJ to MATAIJ and VECMPI to VECSEQ the code works fine. >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Thanks a lot, >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Benjamin >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Op 2 sep 2010, om 13:45 heeft Barry Smith het volgende geschreven: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Matlab is never aware the vector is parallel. Please send me the code and I'll figure out what is going on. >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Barry >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> On Sep 2, 2010, at 2:07 PM, Benjamin Sanderse wrote: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> That sounds great, but there is one issue I am encountering. I switched vector types to VECMPI and matrix type to MATMPIAIJ, but when running Matlab I get the following error: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Found unrecogonized header 0 in file. If your file contains complex numbers >>>>>>>>>>>>>>>>> then call PetscBinaryRead() with "complex" as the second argument >>>>>>>>>>>>>>>>> Error in ==> PetscBinaryRead at 27 >>>>>>>>>>>>>>>>> if nargin < 2 >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> ??? Output argument "varargout" (and maybe others) not assigned during call to >>>>>>>>>>>>>>>>> "/ufs/sanderse/Software/petsc-3.1-p4/bin/matlab/PetscBinaryRead.m>PetscBinaryRead". >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Error in ==> test_petsc_par at 57 >>>>>>>>>>>>>>>>> x4 = PetscBinaryReady(PS); >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Could it be that Matlab does not understand the "parallel" vector which is returned by Petsc? Currently I have this done with VecView as follows: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> fd = PETSC_VIEWER_SOCKET_WORLD; >>>>>>>>>>>>>>>>> ... >>>>>>>>>>>>>>>>> KSPSolve(ksp,b,x); >>>>>>>>>>>>>>>>> ... >>>>>>>>>>>>>>>>> VecView(fd,x); >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Thanks for the help! 
>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Ben >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Op 2 sep 2010, om 10:09 heeft Barry Smith het volgende geschreven: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> On Sep 2, 2010, at 10:51 AM, Benjamin Sanderse wrote: >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Hello all, >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> I figured out the coupling with Matlab and I can send back and forth matrices and vectors between Petsc and Matlab. Actually, I send only once a matrix from Matlab to Petsc and then repeatedly send new right hand sides from Matlab->Petsc and the solution vector from Petsc->Matlab. That works great. >>>>>>>>>>>>>>>>>>> I know want to see if the matrix that is send from (serial) Matlab to Petsc can be stored as a parallel matrix in Petsc so that subsequent computations with different right hand sides can be performed in parallel by Petsc. Does this simply work by using MatLoad and setting Mattype MPIAIJ? Or is something more fancy required? >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> In theory this can be done using the same code as sequential only with parallel vectors VECMPI and matrices. MATMPIAIJ >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Barry >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Ben >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>> >>>>>>>>> >>>>>>>> >>>>>>> >>>>>> >>>>> >>>> >>>> >>> >> > From bsmith at mcs.anl.gov Fri Sep 10 15:53:00 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Fri, 10 Sep 2010 15:53:00 -0500 Subject: [petsc-users] coupling with Matlab and parallel solution In-Reply-To: <891D8036-CDCB-45A8-8A65-31F30ADE8258@cwi.nl> References: <1D7DA60F-2616-41F3-BC17-FD797E9A9AF2@cwi.nl> <87zkw1cej4.fsf@59A2.org> <96BE91D1-0EFF-4CF1-B1DA-0C5A2D13BFF3@cwi.nl> <6C3C8604-E079-40DC-ADCC-FACB91A277B2@mcs.anl.gov> <070410A8-7B2E-405E-B65D-0CC8521BB17C@cwi.nl> <209DE1A7-D893-450F-89BC-DFBE7B2F2C5D@cwi.nl> <6A83D0A7-1C42-4A91-909F-6B93509C5BF5@mcs.anl.gov> <61D23634-FF5A-4916-9055-9561F69CC36A@cwi.nl> <30D58A3F-D8D9-4692-8BE9-BB8168201BD3@mcs.anl.gov> <6F879BC3-2E8E-4B30-B4A8-CDB039BD3A1E@cwi.nl> <9587193F-E30F-40F2-BF6A-4B690DA5066B@cwi.nl> <261FD0FD-7085-4C71-B2BD-7DB26ACDA853@cwi.nl> <25591182-B391-4B29-B14F-8CC3B62ABD7F@mcs.anl.gov> <4BDFDB50-ED6D-4D44-A8F8-E14F0ED6C10F@cwi.nl> <8D1D3964-EA53-4C69-AEC B-4763243786ED@mcs.anl.gov> <891D8036-CDCB-45A8-8A65-31F30ADE8258@cwi.nl> Message-ID: I don't know if these means much. Suggest the following, since you cannot see the message returned by the system call. Try doing the mpiexec and redirecting the stdout and stderr into a file in /tmp and then look in that file to see that the error message is. Barry On Sep 10, 2010, at 2:57 PM, Benjamin Sanderse wrote: > I tried the following in Matlab: > >>> system('printenv PETSC_DIR') > /net/shareware/src/petsc-3.1-p4/ > ans = > 0 >>> system('which petscmpiexec') > /net/shareware/src/petsc-3.1-p4/bin//petscmpiexec > ans = > 0 >>> system('printenv PETSC_ARCH') > linux-gnu-c-debug > ans = > 0 > > So I guess that this should be ok when running system(). > > > > > Op 10 sep 2010, om 12:05 heeft Barry Smith het volgende geschreven: > >> >> On Sep 10, 2010, at 11:27 AM, Benjamin Sanderse wrote: >> >>> Hi guys, >>> >>> Still some other questions popping up: >>> >>> - Until now I have been using the 'two-shell' approach suggested by Barry for debugging purposes. 
This approach works fine, but in a later stage I would like to include the petsc execution command back in Matlab. I tried the following in Matlab: >>> system('petscmpiexec -n 2 ./petsc_poisson_par -viewer_socket_port 5600 &); >>> >>> unix('petscmpiexec -n 2 ./petsc_poisson_par -viewer_socket_port 5600 &); >>> >>> In both cases this doesn't work, while issuing the command in a separate shell works fine. Any ideas? >> >> Is petscmpiexec in your path when run inside system() and is PETSC_DIR and PETSC_ARCH set in there? >> >> Barry >> >>> >>> - I am using CG to solve a symmetric positive definite matrix. As preconditioner I normally use ICC (incomplete choleski), but apparently this is not implemented in parallel in Petsc. Suggestions on what to take as preconditioner? >>> >>> Thanks, >>> >>> Ben >>> >>> Op 9 sep 2010, om 14:36 heeft Satish Balay het volgende geschreven: >>> >>>> I just attempted this on a linux64 box [ubuntu 10.04 with matlab >>>> version 7.10.0.499 (R2010a) 64-bit (glnxa64)] with petsc-3.1 - and >>>> the loops ran fine. >>>> >>>> -bash: >>>> for i in {1..50}; do mpiexec -n 2 ./petsc_poisson_par_barry2 -viewer_socket_port 20001; echo $i; done >>>> >>>> matlab: >>>> for i=1:500,;i; test_petsc_par_barry; end >>>> >>>> Perhaps you can try a high port number - and the same loops - and see if you get >>>> stuck on this machine. >>>> >>>> Satish >>>> >>>> On Thu, 9 Sep 2010, Barry Smith wrote: >>>> >>>>> >>>>> On Sep 9, 2010, at 2:51 PM, Benjamin Sanderse wrote: >>>>> >>>>>> I have installed Petsc on a remote 64-bits Linux (Fedora). Might there be an issue with the 64-bits? >>>>> >>>>> Shouldn't mater. >>>>> >>>>>> >>>>>> My own computer is a Macbook Pro, but I am not running Petsc on it, because in the past I have had severe problems with getting mex-files to work under 64-bits Matlab. >>>>> >>>>> Yes, that is painful, Matlab never respected the Mac >>>>> >>>>>> I could try to install it, though. >>>>>> Another option is that I try it on another remote Linux machine, also 64-bits, running Red Hat. >>>>> >>>>> You could try this to see if the same problem exists. >>>>> >>>>>> >>>>>> What do you suggest? >>>>> >>>>> Are you sure it is hanging on the sockets not hanging on running MPI jobs one after each other? >>>>> >>>>> Barry >>>>> >>>>>> >>>>>> >>>>>> Op 9 sep 2010, om 13:10 heeft Barry Smith het volgende geschreven: >>>>>> >>>>>>> >>>>>>> What OS are you using. >>>>>>> >>>>>>> On my Apple Mac I made a shell script loop calling petsc_poisson_par_barry2 multiple times and a similar loop in Matlab and start then both off (with parallel PETSc runs). It runs flawlessly, opening the socket sending and receiving, dozens of times in a row with several processes. I think that maybe you are using Linux? >>>>>>> >>>>>>> >>>>>>> Barry >>>>>>> >>>>>>> >>>>>>> On Sep 8, 2010, at 2:32 PM, Benjamin Sanderse wrote: >>>>>>> >>>>>>>> That's also what I thought. I checked once again, and I found out that when I use >>>>>>>> >>>>>>>> petscmpiexec -n 1 >>>>>>>> >>>>>>>> the program works, but if I increase the number of processors to 2 it only works once in a while. >>>>>>>> >>>>>>>> I attached my test code. It is extremely simple and does nothing else than passing a vector to petsc and returning it to matlab. 
>>>>>>>> >>>>>>>> I run it as follows: >>>>>>>> >>>>>>>> shell #1 >>>>>>>> -bash-4.0$ make petsc_poisson_par_barry2 >>>>>>>> >>>>>>>> shell #2 >>>>>>>> -bash-4.0$ matlab -nojvm -nodisplay >>>>>>>>>> test_petsc_par_barry; >>>>>>>> >>>>>>>> shell #1 >>>>>>>> -bash-4.0$ petscmpiexec -n 2 ./petsc_poisson_par_barry2 -viewer_socket_port 5006 -info >>>>>>>> >>>>>>>> On lucky days, this works, on unlucky days, petsc will stop here: >>>>>>>> >>>>>>>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483647 >>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483647 >>>>>>>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5006 machine borr.mas.cwi.nl >>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483646 >>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483646 >>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483641 >>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483641 >>>>>>>> [0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs. >>>>>>>> [0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs. >>>>>>>> ^C >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> Op 8 sep 2010, om 12:00 heeft Barry Smith het volgende geschreven: >>>>>>>> >>>>>>>>> >>>>>>>>> On Sep 8, 2010, at 10:13 AM, Benjamin Sanderse wrote: >>>>>>>>> >>>>>>>>>> Hi Barry, >>>>>>>>>> >>>>>>>>>> I am indeed closing the socket in Matlab between the two sets, using close(PS), where PS=PetscOpenSocket. >>>>>>>>>> I have tried different port numbers, but without guarantee of success. Sometimes it works, sometimes it doesn't. Often times the first time calling the PetscOpenSocket(portnumber) works, but even that is not guaranteed. I think there should be another solution. >>>>>>>>>> By the way, all these problems do not appear when using serial vectors instead of parallel. >>>>>>>>> >>>>>>>>> That is strange. Only the first process ever opens the socket so in theory the fact that the PETSc code is parallel should not matter at all. Please send me your test code that causes trouble again and I'll see if I can reproduce the problem. >>>>>>>>> >>>>>>>>> Barry >>>>>>>>> >>>>>>>>>> >>>>>>>>>> Ben >>>>>>>>>> >>>>>>>>>> Op 7 sep 2010, om 17:27 heeft Barry Smith het volgende geschreven: >>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> Are you closing the socket on Matlab between to the two sets? Just checking. >>>>>>>>>>> >>>>>>>>>>> You can try running with a different port number each time to see if it is related to trying to reuse the port. Run with PetscOpenSocket(5006) and the PETSc program with -viewer_socket_port 5006 >>>>>>>>>>> then run both with 5007 then with 5008 etc does this work smoothly? >>>>>>>>>>> >>>>>>>>>>> Let me know and the will tell me the next step to try, >>>>>>>>>>> >>>>>>>>>>> Barry >>>>>>>>>>> >>>>>>>>>>> On Sep 7, 2010, at 10:53 AM, Benjamin Sanderse wrote: >>>>>>>>>>> >>>>>>>>>>>> Hi Barry, >>>>>>>>>>>> >>>>>>>>>>>> I am still not too happy with the execution in parallel. 
I am working under Linux (64 bits) and still using your approach with two command windows (since it gives the best debugging possibility). >>>>>>>>>>>> As I said, sometimes things work, but most of the time not. Here is the output of two successive runs >>>>>>>>>>>> >>>>>>>>>>>> -bash-4.0$ petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info >>>>>>>>>>>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>>>>>>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>>>>>>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>>>>>>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>>>>>>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>>>>>>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483647 >>>>>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483647 >>>>>>>>>>>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl >>>>>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483646 >>>>>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483646 >>>>>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483641 >>>>>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483641 >>>>>>>>>>>> [0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs. >>>>>>>>>>>> [0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs. >>>>>>>>>>>> [1] PetscFinalize(): PetscFinalize() called >>>>>>>>>>>> [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850688 >>>>>>>>>>>> [1] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 >>>>>>>>>>>> [1] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 >>>>>>>>>>>> [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374784 >>>>>>>>>>>> [0] PetscFinalize(): PetscFinalize() called >>>>>>>>>>>> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850688 >>>>>>>>>>>> [0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 >>>>>>>>>>>> [0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 >>>>>>>>>>>> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374784 >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> -bash-4.0$ netstat | grep 5005 >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> -bash-4.0$ petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info >>>>>>>>>>>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>>>>>>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>>>>>>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>>>>>>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>>>>>>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483647 >>>>>>>>>>>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl >>>>>>>>>>>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483647 >>>>>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483646 >>>>>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483646 >>>>>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483641 >>>>>>>>>>>> 
[1] PetscCommDuplicate(): returning tag 2147483641 >>>>>>>>>>>> [0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs. >>>>>>>>>>>> [0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs. >>>>>>>>>>>> ^C >>>>>>>>>>>> -bash-4.0$ [0]0:Return code = 0, signaled with Interrupt >>>>>>>>>>>> [0]1:Return code = 0, signaled with Interrupt >>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> In both cases I first started the Matlab program. I am currently starting Matlab without a GUI, but with a GUI I have the same problems. >>>>>>>>>>>> As you can see, in the first case everything works fine, and Petsc finalizes and closes. Matlab gives me the correct output. The second case, run just a couple of seconds later, does not reach PetscFinalize and Matlab does not give the correct output. In between the two cases I checked if port 5005 was in use, and it was not. >>>>>>>>>>>> Do you have any more suggestions on how to get this to work properly? >>>>>>>>>>>> >>>>>>>>>>>> Benjamin >>>>>>>>>>>> >>>>>>>>>>>> Op 3 sep 2010, om 21:11 heeft Barry Smith het volgende geschreven: >>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> On Sep 3, 2010, at 4:32 PM, Benjamin Sanderse wrote: >>>>>>>>>>>>> >>>>>>>>>>>>>> Hi Barry, >>>>>>>>>>>>>> >>>>>>>>>>>>>> Thanks for your help! However, there are still some issues left. In other to test things, I simplified the program even more and now I am just sending a vector back and forth: matlab->petsc->matlab: >>>>>>>>>>>>>> >>>>>>>>>>>>>> fd = PETSC_VIEWER_SOCKET_WORLD; >>>>>>>>>>>>>> >>>>>>>>>>>>>> // load rhs vector >>>>>>>>>>>>>> ierr = VecLoad(fd,VECMPI,&b);CHKERRQ(ierr); >>>>>>>>>>>>>> >>>>>>>>>>>>>> // send to matlab >>>>>>>>>>>>>> ierr = VecView(b,fd);CHKERRQ(ierr); >>>>>>>>>>>>>> ierr = VecDestroy(b);CHKERRQ(ierr); >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> - Your approach with two windows works *sometimes*. I removed the 'launch' statement and I executed my program 10 times, the first 2 times worked, and in all other cases I got this: >>>>>>>>>>>>>> >>>>>>>>>>>>>> petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info >>>>>>>>>>>>>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>>>>>>>>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>>>>>>>>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>>>>>>>>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>>>>>>>>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>>>>>>>>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483647 >>>>>>>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483647 >>>>>>>>>>>>>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl >>>>>>>>>>>>>> [0] PetscOpenSocket(): Connection refused in attaching socket, trying again[0] PetscOpenSocket(): Connection refused in attaching socket, trying again[0] PetscOpenSocket(): Connection refused in attaching socket, trying again >>>>>>>>>>>>>> [0] PetscOpenSocket(): Connection refused in attaching socket, trying again^C >>>>>>>>>>>>>> -bash-4.0$ [0]0:Return code = 0, signaled with Interrupt >>>>>>>>>>>>>> [0]1:Return code = 0, signaled with Interrupt >>>>>>>>>>>>>> >>>>>>>>>>>>>> Every time I start the program I use close(socket) and clear all in Matlab, so the socket from the previous run should not be present anymore. 
It seems that the port gets corrupted after a couple of times? Matlab does not respond and I have to kill it and restart it manually. >>>>>>>>>>>>> >>>>>>>>>>>>> Sometimes when you close a socket connection it doesn't close for a very long time so that if you try to open it again it doesn't work. When it appears the socket can not be used try using netstat | grep 5005 to see if the socket is still active. >>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> - If I include the launch statement, or just type >>>>>>>>>>>>>> system('mpiexec -n 2 ./petsc_poisson_par_barry2 &') >>>>>>>>>>>>>> the program never works. >>>>>>>>>>>>> >>>>>>>>>>>>> Are you sure mpiexec is in the path of system and it is the right one? The problem is that we are kind of cheating with system because we start a new job in the background and have no idea what the output is. Are you using unix and running Matlab on the command line or in a GUI? >>>>>>>>>>>>> >>>>>>>>>>>>> Barry >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> Hope you can figure out what is going wrong. >>>>>>>>>>>>>> >>>>>>>>>>>>>> Ben >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> Op 3 sep 2010, om 13:25 heeft Barry Smith het volgende geschreven: >>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Ben >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Ok, I figured out the problem. It is not fundamental and mostly comes from not having a create way to debug this. >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> The test vector you create is sequential then you try to view it back to Matlab with the parallel fd viewer. If you change to >>>>>>>>>>>>>>> ierr = VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,1,&test);CHKERRQ(ierr); >>>>>>>>>>>>>>> then the code runs. >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> I've found (just now) that when I use launch all the output from the .c program gets lost which makes it impossible to figure out what has gone wrong. You can debug by running the two parts of the computation in two different windows. So comment out the launch from the matlab script and then in Matlab run the script (it will hang waiting for the socket to work) and in a separate terminal window run the .c program; for example petscmpiexec -n 2 ./ex1 -info Now you see exactly what is happening in the PETSc program. You can even use -start_in_debugger on the PETSc side to run the debugger on crashes. >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> I'll add this to the docs for launch >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Barry >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> On Sep 2, 2010, at 3:28 PM, Benjamin Sanderse wrote: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Hi Barry, >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> I attached my matlab file, c file and makefile. First I generate the executable with 'make petsc_poisson_par_barry' and then I run test_petsc_par_barry.m. >>>>>>>>>>>>>>>> If you change MATMPIAIJ to MATAIJ and VECMPI to VECSEQ the code works fine. >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Thanks a lot, >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Benjamin >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Op 2 sep 2010, om 13:45 heeft Barry Smith het volgende geschreven: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Matlab is never aware the vector is parallel. Please send me the code and I'll figure out what is going on. >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Barry >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> On Sep 2, 2010, at 2:07 PM, Benjamin Sanderse wrote: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> That sounds great, but there is one issue I am encountering. 
I switched vector types to VECMPI and matrix type to MATMPIAIJ, but when running Matlab I get the following error: >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Found unrecogonized header 0 in file. If your file contains complex numbers >>>>>>>>>>>>>>>>>> then call PetscBinaryRead() with "complex" as the second argument >>>>>>>>>>>>>>>>>> Error in ==> PetscBinaryRead at 27 >>>>>>>>>>>>>>>>>> if nargin < 2 >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> ??? Output argument "varargout" (and maybe others) not assigned during call to >>>>>>>>>>>>>>>>>> "/ufs/sanderse/Software/petsc-3.1-p4/bin/matlab/PetscBinaryRead.m>PetscBinaryRead". >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Error in ==> test_petsc_par at 57 >>>>>>>>>>>>>>>>>> x4 = PetscBinaryReady(PS); >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Could it be that Matlab does not understand the "parallel" vector which is returned by Petsc? Currently I have this done with VecView as follows: >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> fd = PETSC_VIEWER_SOCKET_WORLD; >>>>>>>>>>>>>>>>>> ... >>>>>>>>>>>>>>>>>> KSPSolve(ksp,b,x); >>>>>>>>>>>>>>>>>> ... >>>>>>>>>>>>>>>>>> VecView(fd,x); >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Thanks for the help! >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Ben >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Op 2 sep 2010, om 10:09 heeft Barry Smith het volgende geschreven: >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> On Sep 2, 2010, at 10:51 AM, Benjamin Sanderse wrote: >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Hello all, >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> I figured out the coupling with Matlab and I can send back and forth matrices and vectors between Petsc and Matlab. Actually, I send only once a matrix from Matlab to Petsc and then repeatedly send new right hand sides from Matlab->Petsc and the solution vector from Petsc->Matlab. That works great. >>>>>>>>>>>>>>>>>>>> I know want to see if the matrix that is send from (serial) Matlab to Petsc can be stored as a parallel matrix in Petsc so that subsequent computations with different right hand sides can be performed in parallel by Petsc. Does this simply work by using MatLoad and setting Mattype MPIAIJ? Or is something more fancy required? >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> In theory this can be done using the same code as sequential only with parallel vectors VECMPI and matrices. MATMPIAIJ >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Barry >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Ben >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>> >>>>>>>>> >>>>>>>> >>>>>>> >>>>>> >>>>> >>>>> >>>> >>> >> > From mafunk at nmsu.edu Fri Sep 10 15:59:46 2010 From: mafunk at nmsu.edu (Matt Funk) Date: Fri, 10 Sep 2010 14:59:46 -0600 Subject: [petsc-users] using superlu_dist Message-ID: <201009101459.47077.mafunk@nmsu.edu> Hi, i was wondering on how i need to set the matrix type when i want to use the superlu_dist solver. Right now what i have is: if (m_preCondType == "LU_SUPERLU") { if (numProc() > 1) m_ierr = MatSetType(m_globalMatrix, MATAIJ); else { m_ierr = MatSetType(m_globalMatrix, MATSEQAIJ); } } This i believe is according to the table in the petsc users manual (p.82). Anyway, things work ok on 1 processor. However, when i try 8 processors (i.e. it tells me: [3]PETSC ERROR: --------------------- Error Message ------------------------------------ [3]PETSC ERROR: No support for this operation for this object type! 
[3]PETSC ERROR: Matrix format mpiaij does not have a built-in PETSc direct solver! So i guess i should not use the MATAIJ matrix format? I also tried the MATMPIAIJ format, but got the same problem. So how is one supposed to use it? Obviously i am doing something wrong. Any help is appreciated. thanks matt From bsmith at mcs.anl.gov Fri Sep 10 16:04:48 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Fri, 10 Sep 2010 16:04:48 -0500 Subject: [petsc-users] using superlu_dist In-Reply-To: <201009101459.47077.mafunk@nmsu.edu> References: <201009101459.47077.mafunk@nmsu.edu> Message-ID: <0CEAA087-44A4-4A6B-A578-EE63FE0DF867@mcs.anl.gov> This has all changed in the 3.0.0 release. It is much simpler now. Any ways you don't need that crap for differences between 1 or more processors. Just use MATAIJ always and use -pc_type lu -pc_factor_mat_solver_package superlu_dist with 3.0.0 or later Barry On Sep 10, 2010, at 3:59 PM, Matt Funk wrote: > Hi, > > i was wondering on how i need to set the matrix type when i want to use the > superlu_dist solver. > > Right now what i have is: > if (m_preCondType == "LU_SUPERLU") { > if (numProc() > 1) > m_ierr = MatSetType(m_globalMatrix, MATAIJ); > else { > m_ierr = MatSetType(m_globalMatrix, MATSEQAIJ); > } > } > > This i believe is according to the table in the petsc users manual (p.82). > Anyway, things work ok on 1 processor. However, when i try 8 processors (i.e. > it tells me: > [3]PETSC ERROR: --------------------- Error Message > ------------------------------------ > [3]PETSC ERROR: No support for this operation for this object type! > [3]PETSC ERROR: Matrix format mpiaij does not have a built-in PETSc direct > solver! > > > So i guess i should not use the MATAIJ matrix format? I also tried the > MATMPIAIJ format, but got the same problem. > > So how is one supposed to use it? Obviously i am doing something wrong. Any > help is appreciated. > > thanks > matt From B.Sanderse at cwi.nl Fri Sep 10 16:07:02 2010 From: B.Sanderse at cwi.nl (Benjamin Sanderse) Date: Fri, 10 Sep 2010 15:07:02 -0600 Subject: [petsc-users] coupling with Matlab and parallel solution In-Reply-To: References: <1D7DA60F-2616-41F3-BC17-FD797E9A9AF2@cwi.nl> <87zkw1cej4.fsf@59A2.org> <96BE91D1-0EFF-4CF1-B1DA-0C5A2D13BFF3@cwi.nl> <6C3C8604-E079-40DC-ADCC-FACB91A277B2@mcs.anl.gov> <070410A8-7B2E-405E-B65D-0CC8521BB17C@cwi.nl> <209DE1A7-D893-450F-89BC-DFBE7B2F2C5D@cwi.nl> <6A83D0A7-1C42-4A91-909F-6B93509C5BF5@mcs.anl.gov> <61D23634-FF5A-4916-9055-9561F69CC36A@cwi.nl> <30D58A3F-D8D9-4692-8BE9-BB8168201BD3@mcs.anl.gov> <6F879BC3-2E8E-4B30-B4A8-CDB039BD3A1E@cwi.nl> <9587193F-E30F-40F2-BF6A-4B690DA5066B@cwi.nl> <261FD0FD-7085-4C71-B2BD-7DB26ACDA853@cwi.nl> <25591182-B391-4B29-B14F-8CC3B62ABD7F@mcs.anl.gov> <4BDFDB50-ED6D-4D44-A8F8-E14F0ED6C10F@cwi.nl> <8D1D3964-EA53-4C69-AEC B-4763243786ED@mcs.anl.gov> <891D8036-CDCB-45A8-8A65-31F30ADE8258@cwi.nl> Message-ID: <10A31ACB-B940-4C47-8329-DF0062BAA897@cwi.nl> Hi Barry, That was a useful hint. I found out that once I made an error in calling Petsc subsequent (correct) calls to Petsc often fail as well. Restarting the shell then fixes matters. After opening a new shell and using the same command things worked. Ben Op 10 sep 2010, om 14:53 heeft Barry Smith het volgende geschreven: > > I don't know if these means much. Suggest the following, since you cannot see the message returned by the system call. Try doing the mpiexec and redirecting the stdout and stderr into a file in /tmp and then look in that file to see that the error message is. 
> > Barry > > On Sep 10, 2010, at 2:57 PM, Benjamin Sanderse wrote: > >> I tried the following in Matlab: >> >>>> system('printenv PETSC_DIR') >> /net/shareware/src/petsc-3.1-p4/ >> ans = >> 0 >>>> system('which petscmpiexec') >> /net/shareware/src/petsc-3.1-p4/bin//petscmpiexec >> ans = >> 0 >>>> system('printenv PETSC_ARCH') >> linux-gnu-c-debug >> ans = >> 0 >> >> So I guess that this should be ok when running system(). >> >> >> >> >> Op 10 sep 2010, om 12:05 heeft Barry Smith het volgende geschreven: >> >>> >>> On Sep 10, 2010, at 11:27 AM, Benjamin Sanderse wrote: >>> >>>> Hi guys, >>>> >>>> Still some other questions popping up: >>>> >>>> - Until now I have been using the 'two-shell' approach suggested by Barry for debugging purposes. This approach works fine, but in a later stage I would like to include the petsc execution command back in Matlab. I tried the following in Matlab: >>>> system('petscmpiexec -n 2 ./petsc_poisson_par -viewer_socket_port 5600 &); >>>> >>>> unix('petscmpiexec -n 2 ./petsc_poisson_par -viewer_socket_port 5600 &); >>>> >>>> In both cases this doesn't work, while issuing the command in a separate shell works fine. Any ideas? >>> >>> Is petscmpiexec in your path when run inside system() and is PETSC_DIR and PETSC_ARCH set in there? >>> >>> Barry >>> >>>> >>>> - I am using CG to solve a symmetric positive definite matrix. As preconditioner I normally use ICC (incomplete choleski), but apparently this is not implemented in parallel in Petsc. Suggestions on what to take as preconditioner? >>>> >>>> Thanks, >>>> >>>> Ben >>>> >>>> Op 9 sep 2010, om 14:36 heeft Satish Balay het volgende geschreven: >>>> >>>>> I just attempted this on a linux64 box [ubuntu 10.04 with matlab >>>>> version 7.10.0.499 (R2010a) 64-bit (glnxa64)] with petsc-3.1 - and >>>>> the loops ran fine. >>>>> >>>>> -bash: >>>>> for i in {1..50}; do mpiexec -n 2 ./petsc_poisson_par_barry2 -viewer_socket_port 20001; echo $i; done >>>>> >>>>> matlab: >>>>> for i=1:500,;i; test_petsc_par_barry; end >>>>> >>>>> Perhaps you can try a high port number - and the same loops - and see if you get >>>>> stuck on this machine. >>>>> >>>>> Satish >>>>> >>>>> On Thu, 9 Sep 2010, Barry Smith wrote: >>>>> >>>>>> >>>>>> On Sep 9, 2010, at 2:51 PM, Benjamin Sanderse wrote: >>>>>> >>>>>>> I have installed Petsc on a remote 64-bits Linux (Fedora). Might there be an issue with the 64-bits? >>>>>> >>>>>> Shouldn't mater. >>>>>> >>>>>>> >>>>>>> My own computer is a Macbook Pro, but I am not running Petsc on it, because in the past I have had severe problems with getting mex-files to work under 64-bits Matlab. >>>>>> >>>>>> Yes, that is painful, Matlab never respected the Mac >>>>>> >>>>>>> I could try to install it, though. >>>>>>> Another option is that I try it on another remote Linux machine, also 64-bits, running Red Hat. >>>>>> >>>>>> You could try this to see if the same problem exists. >>>>>> >>>>>>> >>>>>>> What do you suggest? >>>>>> >>>>>> Are you sure it is hanging on the sockets not hanging on running MPI jobs one after each other? >>>>>> >>>>>> Barry >>>>>> >>>>>>> >>>>>>> >>>>>>> Op 9 sep 2010, om 13:10 heeft Barry Smith het volgende geschreven: >>>>>>> >>>>>>>> >>>>>>>> What OS are you using. >>>>>>>> >>>>>>>> On my Apple Mac I made a shell script loop calling petsc_poisson_par_barry2 multiple times and a similar loop in Matlab and start then both off (with parallel PETSc runs). It runs flawlessly, opening the socket sending and receiving, dozens of times in a row with several processes. 
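Picking up the preconditioner question above (CG on a symmetric positive definite system, where serial ICC is the usual choice): Jed's suggestion later in the thread is block Jacobi or ASM with ICC on each block. A sketch of that setup, assuming PETSc 3.1 and a matrix A and vectors b, x that already exist; the sub-preconditioner is most easily selected from the options database:

    KSP ksp;
    PC  pc;

    ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
    ierr = KSPSetOperators(ksp,A,A,SAME_NONZERO_PATTERN);CHKERRQ(ierr);
    ierr = KSPSetType(ksp,KSPCG);CHKERRQ(ierr);
    ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
    ierr = PCSetType(pc,PCBJACOBI);CHKERRQ(ierr);        /* or PCASM */
    ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);         /* run with -sub_pc_type icc */
    ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);

Equivalently, everything can be chosen on the command line with -ksp_type cg -pc_type bjacobi -sub_pc_type icc (or -pc_type asm -sub_pc_type icc).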
I think that maybe you are using Linux? >>>>>>>> >>>>>>>> >>>>>>>> Barry >>>>>>>> >>>>>>>> >>>>>>>> On Sep 8, 2010, at 2:32 PM, Benjamin Sanderse wrote: >>>>>>>> >>>>>>>>> That's also what I thought. I checked once again, and I found out that when I use >>>>>>>>> >>>>>>>>> petscmpiexec -n 1 >>>>>>>>> >>>>>>>>> the program works, but if I increase the number of processors to 2 it only works once in a while. >>>>>>>>> >>>>>>>>> I attached my test code. It is extremely simple and does nothing else than passing a vector to petsc and returning it to matlab. >>>>>>>>> >>>>>>>>> I run it as follows: >>>>>>>>> >>>>>>>>> shell #1 >>>>>>>>> -bash-4.0$ make petsc_poisson_par_barry2 >>>>>>>>> >>>>>>>>> shell #2 >>>>>>>>> -bash-4.0$ matlab -nojvm -nodisplay >>>>>>>>>>> test_petsc_par_barry; >>>>>>>>> >>>>>>>>> shell #1 >>>>>>>>> -bash-4.0$ petscmpiexec -n 2 ./petsc_poisson_par_barry2 -viewer_socket_port 5006 -info >>>>>>>>> >>>>>>>>> On lucky days, this works, on unlucky days, petsc will stop here: >>>>>>>>> >>>>>>>>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>>>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>>>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>>>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>>>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>>>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483647 >>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483647 >>>>>>>>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5006 machine borr.mas.cwi.nl >>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483646 >>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483646 >>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483641 >>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483641 >>>>>>>>> [0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs. >>>>>>>>> [0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs. >>>>>>>>> ^C >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> Op 8 sep 2010, om 12:00 heeft Barry Smith het volgende geschreven: >>>>>>>>> >>>>>>>>>> >>>>>>>>>> On Sep 8, 2010, at 10:13 AM, Benjamin Sanderse wrote: >>>>>>>>>> >>>>>>>>>>> Hi Barry, >>>>>>>>>>> >>>>>>>>>>> I am indeed closing the socket in Matlab between the two sets, using close(PS), where PS=PetscOpenSocket. >>>>>>>>>>> I have tried different port numbers, but without guarantee of success. Sometimes it works, sometimes it doesn't. Often times the first time calling the PetscOpenSocket(portnumber) works, but even that is not guaranteed. I think there should be another solution. >>>>>>>>>>> By the way, all these problems do not appear when using serial vectors instead of parallel. >>>>>>>>>> >>>>>>>>>> That is strange. Only the first process ever opens the socket so in theory the fact that the PETSc code is parallel should not matter at all. Please send me your test code that causes trouble again and I'll see if I can reproduce the problem. >>>>>>>>>> >>>>>>>>>> Barry >>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> Ben >>>>>>>>>>> >>>>>>>>>>> Op 7 sep 2010, om 17:27 heeft Barry Smith het volgende geschreven: >>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>>> Are you closing the socket on Matlab between to the two sets? Just checking. 
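As an aside on choosing the port: instead of relying on PETSC_VIEWER_SOCKET_WORLD together with -viewer_socket_port, the PETSc side can open the socket viewer explicitly. A sketch only, as an alternative to what the thread actually runs; the machine name and port number here are purely illustrative:

    PetscViewer fd;
    ierr = PetscViewerSocketOpen(PETSC_COMM_WORLD,"localhost",5600,&fd);CHKERRQ(ierr);
    /* ... VecLoad / VecView through fd as before ... */
    ierr = PetscViewerDestroy(fd);CHKERRQ(ierr);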
>>>>>>>>>>>> >>>>>>>>>>>> You can try running with a different port number each time to see if it is related to trying to reuse the port. Run with PetscOpenSocket(5006) and the PETSc program with -viewer_socket_port 5006 >>>>>>>>>>>> then run both with 5007 then with 5008 etc does this work smoothly? >>>>>>>>>>>> >>>>>>>>>>>> Let me know and the will tell me the next step to try, >>>>>>>>>>>> >>>>>>>>>>>> Barry >>>>>>>>>>>> >>>>>>>>>>>> On Sep 7, 2010, at 10:53 AM, Benjamin Sanderse wrote: >>>>>>>>>>>> >>>>>>>>>>>>> Hi Barry, >>>>>>>>>>>>> >>>>>>>>>>>>> I am still not too happy with the execution in parallel. I am working under Linux (64 bits) and still using your approach with two command windows (since it gives the best debugging possibility). >>>>>>>>>>>>> As I said, sometimes things work, but most of the time not. Here is the output of two successive runs >>>>>>>>>>>>> >>>>>>>>>>>>> -bash-4.0$ petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info >>>>>>>>>>>>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>>>>>>>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>>>>>>>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>>>>>>>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>>>>>>>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>>>>>>>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483647 >>>>>>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483647 >>>>>>>>>>>>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl >>>>>>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483646 >>>>>>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483646 >>>>>>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483641 >>>>>>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483641 >>>>>>>>>>>>> [0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs. >>>>>>>>>>>>> [0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs. 
>>>>>>>>>>>>> [1] PetscFinalize(): PetscFinalize() called >>>>>>>>>>>>> [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850688 >>>>>>>>>>>>> [1] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 >>>>>>>>>>>>> [1] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 >>>>>>>>>>>>> [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374784 >>>>>>>>>>>>> [0] PetscFinalize(): PetscFinalize() called >>>>>>>>>>>>> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850688 >>>>>>>>>>>>> [0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 >>>>>>>>>>>>> [0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 >>>>>>>>>>>>> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374784 >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> -bash-4.0$ netstat | grep 5005 >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> -bash-4.0$ petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info >>>>>>>>>>>>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>>>>>>>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>>>>>>>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>>>>>>>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>>>>>>>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483647 >>>>>>>>>>>>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl >>>>>>>>>>>>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483647 >>>>>>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483646 >>>>>>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483646 >>>>>>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483641 >>>>>>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483641 >>>>>>>>>>>>> [0] VecAssemblyBegin_MPI(): Stash has 0 entries, uses 0 mallocs. >>>>>>>>>>>>> [0] VecAssemblyBegin_MPI(): Block-Stash has 0 entries, uses 0 mallocs. >>>>>>>>>>>>> ^C >>>>>>>>>>>>> -bash-4.0$ [0]0:Return code = 0, signaled with Interrupt >>>>>>>>>>>>> [0]1:Return code = 0, signaled with Interrupt >>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>>> In both cases I first started the Matlab program. I am currently starting Matlab without a GUI, but with a GUI I have the same problems. >>>>>>>>>>>>> As you can see, in the first case everything works fine, and Petsc finalizes and closes. Matlab gives me the correct output. The second case, run just a couple of seconds later, does not reach PetscFinalize and Matlab does not give the correct output. In between the two cases I checked if port 5005 was in use, and it was not. >>>>>>>>>>>>> Do you have any more suggestions on how to get this to work properly? >>>>>>>>>>>>> >>>>>>>>>>>>> Benjamin >>>>>>>>>>>>> >>>>>>>>>>>>> Op 3 sep 2010, om 21:11 heeft Barry Smith het volgende geschreven: >>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>> On Sep 3, 2010, at 4:32 PM, Benjamin Sanderse wrote: >>>>>>>>>>>>>> >>>>>>>>>>>>>>> Hi Barry, >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Thanks for your help! However, there are still some issues left. 
In other to test things, I simplified the program even more and now I am just sending a vector back and forth: matlab->petsc->matlab: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> fd = PETSC_VIEWER_SOCKET_WORLD; >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> // load rhs vector >>>>>>>>>>>>>>> ierr = VecLoad(fd,VECMPI,&b);CHKERRQ(ierr); >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> // send to matlab >>>>>>>>>>>>>>> ierr = VecView(b,fd);CHKERRQ(ierr); >>>>>>>>>>>>>>> ierr = VecDestroy(b);CHKERRQ(ierr); >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> - Your approach with two windows works *sometimes*. I removed the 'launch' statement and I executed my program 10 times, the first 2 times worked, and in all other cases I got this: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> petscmpiexec -n 2 ./petsc_poisson_par_barry2 -info >>>>>>>>>>>>>>> [1] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>>>>>>>>>> [1] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>>>>>>>>>> [0] PetscInitialize(): PETSc successfully started: number of processors = 2 >>>>>>>>>>>>>>> [0] PetscInitialize(): Running on machine: borr.mas.cwi.nl >>>>>>>>>>>>>>> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>>>>>>>>>> [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 >>>>>>>>>>>>>>> [1] PetscCommDuplicate(): returning tag 2147483647 >>>>>>>>>>>>>>> [0] PetscCommDuplicate(): returning tag 2147483647 >>>>>>>>>>>>>>> [0] PetscViewerSocketSetConnection(): Connecting to socket process on port 5005 machine borr.mas.cwi.nl >>>>>>>>>>>>>>> [0] PetscOpenSocket(): Connection refused in attaching socket, trying again[0] PetscOpenSocket(): Connection refused in attaching socket, trying again[0] PetscOpenSocket(): Connection refused in attaching socket, trying again >>>>>>>>>>>>>>> [0] PetscOpenSocket(): Connection refused in attaching socket, trying again^C >>>>>>>>>>>>>>> -bash-4.0$ [0]0:Return code = 0, signaled with Interrupt >>>>>>>>>>>>>>> [0]1:Return code = 0, signaled with Interrupt >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Every time I start the program I use close(socket) and clear all in Matlab, so the socket from the previous run should not be present anymore. It seems that the port gets corrupted after a couple of times? Matlab does not respond and I have to kill it and restart it manually. >>>>>>>>>>>>>> >>>>>>>>>>>>>> Sometimes when you close a socket connection it doesn't close for a very long time so that if you try to open it again it doesn't work. When it appears the socket can not be used try using netstat | grep 5005 to see if the socket is still active. >>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> - If I include the launch statement, or just type >>>>>>>>>>>>>>> system('mpiexec -n 2 ./petsc_poisson_par_barry2 &') >>>>>>>>>>>>>>> the program never works. >>>>>>>>>>>>>> >>>>>>>>>>>>>> Are you sure mpiexec is in the path of system and it is the right one? The problem is that we are kind of cheating with system because we start a new job in the background and have no idea what the output is. Are you using unix and running Matlab on the command line or in a GUI? >>>>>>>>>>>>>> >>>>>>>>>>>>>> Barry >>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Hope you can figure out what is going wrong. 
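Barry's earlier fix, quoted again further down, is worth keeping in view when the test is reduced this far: any vector that is sent back through the parallel viewer has to be created as a parallel vector. A minimal sketch of that corrected test-vector creation (the length 1 matches the snippet Barry quotes; VecSet is just one illustrative way to fill it):

    Vec test;
    ierr = VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,1,&test);CHKERRQ(ierr);
    ierr = VecSet(test,1.0);CHKERRQ(ierr);
    ierr = VecView(test,PETSC_VIEWER_SOCKET_WORLD);CHKERRQ(ierr);
    ierr = VecDestroy(test);CHKERRQ(ierr);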
>>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Ben >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>>> Op 3 sep 2010, om 13:25 heeft Barry Smith het volgende geschreven: >>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Ben >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Ok, I figured out the problem. It is not fundamental and mostly comes from not having a create way to debug this. >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> The test vector you create is sequential then you try to view it back to Matlab with the parallel fd viewer. If you change to >>>>>>>>>>>>>>>> ierr = VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,1,&test);CHKERRQ(ierr); >>>>>>>>>>>>>>>> then the code runs. >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> I've found (just now) that when I use launch all the output from the .c program gets lost which makes it impossible to figure out what has gone wrong. You can debug by running the two parts of the computation in two different windows. So comment out the launch from the matlab script and then in Matlab run the script (it will hang waiting for the socket to work) and in a separate terminal window run the .c program; for example petscmpiexec -n 2 ./ex1 -info Now you see exactly what is happening in the PETSc program. You can even use -start_in_debugger on the PETSc side to run the debugger on crashes. >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> I'll add this to the docs for launch >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> Barry >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> On Sep 2, 2010, at 3:28 PM, Benjamin Sanderse wrote: >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Hi Barry, >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> I attached my matlab file, c file and makefile. First I generate the executable with 'make petsc_poisson_par_barry' and then I run test_petsc_par_barry.m. >>>>>>>>>>>>>>>>> If you change MATMPIAIJ to MATAIJ and VECMPI to VECSEQ the code works fine. >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Thanks a lot, >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Benjamin >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> Op 2 sep 2010, om 13:45 heeft Barry Smith het volgende geschreven: >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Matlab is never aware the vector is parallel. Please send me the code and I'll figure out what is going on. >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> Barry >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> On Sep 2, 2010, at 2:07 PM, Benjamin Sanderse wrote: >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> That sounds great, but there is one issue I am encountering. I switched vector types to VECMPI and matrix type to MATMPIAIJ, but when running Matlab I get the following error: >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Found unrecogonized header 0 in file. If your file contains complex numbers >>>>>>>>>>>>>>>>>>> then call PetscBinaryRead() with "complex" as the second argument >>>>>>>>>>>>>>>>>>> Error in ==> PetscBinaryRead at 27 >>>>>>>>>>>>>>>>>>> if nargin < 2 >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> ??? Output argument "varargout" (and maybe others) not assigned during call to >>>>>>>>>>>>>>>>>>> "/ufs/sanderse/Software/petsc-3.1-p4/bin/matlab/PetscBinaryRead.m>PetscBinaryRead". >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Error in ==> test_petsc_par at 57 >>>>>>>>>>>>>>>>>>> x4 = PetscBinaryReady(PS); >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Could it be that Matlab does not understand the "parallel" vector which is returned by Petsc? Currently I have this done with VecView as follows: >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> fd = PETSC_VIEWER_SOCKET_WORLD; >>>>>>>>>>>>>>>>>>> ... 
>>>>>>>>>>>>>>>>>>> KSPSolve(ksp,b,x); >>>>>>>>>>>>>>>>>>> ... >>>>>>>>>>>>>>>>>>> VecView(fd,x); >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Thanks for the help! >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Ben >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> Op 2 sep 2010, om 10:09 heeft Barry Smith het volgende geschreven: >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> On Sep 2, 2010, at 10:51 AM, Benjamin Sanderse wrote: >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> Hello all, >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> I figured out the coupling with Matlab and I can send back and forth matrices and vectors between Petsc and Matlab. Actually, I send only once a matrix from Matlab to Petsc and then repeatedly send new right hand sides from Matlab->Petsc and the solution vector from Petsc->Matlab. That works great. >>>>>>>>>>>>>>>>>>>>> I know want to see if the matrix that is send from (serial) Matlab to Petsc can be stored as a parallel matrix in Petsc so that subsequent computations with different right hand sides can be performed in parallel by Petsc. Does this simply work by using MatLoad and setting Mattype MPIAIJ? Or is something more fancy required? >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> In theory this can be done using the same code as sequential only with parallel vectors VECMPI and matrices. MATMPIAIJ >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>> Barry >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> Thanks, >>>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>>>> Ben >>>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>>> >>>>>>>>>>>>>>>> >>>>>>>>>>>>>>> >>>>>>>>>>>>>> >>>>>>>>>>>>> >>>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>> >>>>>>>>> >>>>>>>> >>>>>>> >>>>>> >>>>>> >>>>> >>>> >>> >> > From zhaonanavril at gmail.com Fri Sep 10 16:27:23 2010 From: zhaonanavril at gmail.com (NAN ZHAO) Date: Fri, 10 Sep 2010 15:27:23 -0600 Subject: [petsc-users] noknow petsc error after kspsolve Message-ID: Dear all, *An error message from petsc comes after the kspsolve:* [0]PETSC ERROR: --------------------- Error Message ------------------------------------ [*0]PETSC ERROR: Invalid pointer!* *[0]PETSC ERROR: Invalid Pointer: Parameter # 2!* [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: Petsc Release Version 2.3.3, Patch 15, Tue Sep 23 10:02:49 CDT 2008 HG revision: 31306062cd1a6f6a2496fccb4878f485c9b91760 [0]PETSC ERROR: See docs/changes/index.html for recent updates. [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [0]PETSC ERROR: See docs/index.html for manual pages. 
[0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: Unknown Name on a linux-gnu named perc.ices.utah.edu by nan Fri Sep 10 11:06:35 2010 [0]PETSC ERROR: Libraries linked from /usr/local/petsc-2.3.3-p15-nox11/lib/linux-gnu-cxx-debug [0]PETSC ERROR: Configure run at Tue Aug 17 14:47:03 2010 [0]PETSC ERROR: *Configure options --prefix=/usr/local/petsc-2.3.3-p15-nox11 --with-mpi-dir=/usr/local/mpich2-1.2.1-install --with-blas-lib=/usr/lib64/libblas.a --with-lapack-lib=/usr/lib64/liblapack.a --with-fc=0 --with-x11=0 --with-x=0 --with-clanguage=cxx --with-cxx=mpicxx --with-cc=mpicc --with-debugging=1 COPTFLAGS=-O3 --with-shared=0* [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: KSPGetDiagonalScaleFix() line 1748 in src/ksp/ksp/interface/itfunc.c [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: *Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range* [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signal[0]PETSCERROR: or try http://valgrind.org on linux or man libgmalloc on Apple to find memory corruption errors [0]PETSC ERROR: likely location of problem given in stack below [0]PETSC ERROR: --------------------- Stack Frames ------------------------------------ [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, [0]PETSC ERROR: INSTEAD the line number of the start of the function [0]PETSC ERROR: is given. [0]PETSC ERROR: [0] KSPGetDiagonalScaleFix line 1746 src/ksp/ksp/interface/itfunc.c [0]PETSC ERROR: [0] KSPGetDiagonalScale line 1667 src/ksp/ksp/interface/itfunc.c [0]PETSC ERROR: --------------------- Error Message ------------------------------------ [0]PETSC ERROR: Signal received! -------- [0]PETSC ERROR: Petsc Release Version 2.3.3, Patch 15, Tue Sep 23 10:02:49 CDT 2008 HG revision: 31306062cd1a6f6a2496fccb4878f485c9b91760 [0]PETSC ERROR: See docs/changes/index.html for recent updates. [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [0]PETSC ERROR: See docs/index.html for manual pages. [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: Unknown Name on a linux-gnu named perc.ices.utah.edu by nan Fri Sep 10 11:05:07 2010 [0]PETSC ERROR: Libraries linked from /usr/local/petsc-2.3.3-p15-nox11/lib/linux-gnu-cxx-debug [0]PETSC ERROR: Configure run at Tue Aug 17 14:47:03 2010 [0]PETSC ERROR: Configure options --prefix=/usr/local/petsc-2.3.3-p15-nox11 --with-mpi-dir=/usr/local/mpich2-1.2.1-install --with-blas-lib=/usr/lib64/libblas.a --with-lapack-lib=/usr/lib64/liblapack.a --with-fc=0 --with-x11=0 --with-x=0 --with-clanguage=cxx --with-cxx=mpicxx --with-cc=mpicc --with-debugging=1 COPTFLAGS=-O3 --with-shared=0 [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: User provided function() line 0 in unknown directory unknown file application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 *I tried vlgrind, it just no use. I think I put some CHKMEMQ in my code, sometimes it runs. Can someone give a hint on this problem.* Thanks, Nan -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From mafunk at nmsu.edu Fri Sep 10 16:43:25 2010 From: mafunk at nmsu.edu (Matt Funk) Date: Fri, 10 Sep 2010 15:43:25 -0600 Subject: [petsc-users] using superlu_dist In-Reply-To: <0CEAA087-44A4-4A6B-A578-EE63FE0DF867@mcs.anl.gov> References: <201009101459.47077.mafunk@nmsu.edu> <0CEAA087-44A4-4A6B-A578-EE63FE0DF867@mcs.anl.gov> Message-ID: <201009101543.26151.mafunk@nmsu.edu> HI Barry, thanks for the heads up, however, i am not using the command line. So what i did is when i set my pc i do: if(m_preCondType == "LU_SUPERLU") { m_ierr = PCSetType(m_pc, PCLU); if (numProc() > 1) { PCFactorSetMatSolverPackage(m_pc,MAT_SOLVER_SUPERLU_DIST); } else { PCFactorSetMatSolverPackage(m_pc,MAT_SOLVER_SUPERLU); } } and the matrix type is set to MATAIJ. Anyway, but i still need to distingiush between superlu and superlu_dist it seems as specifying superlu for a parallel run throws an error. I suppose that when invoking this from the command line there is some code that test to see whether this is a serial/parallel run and makes similar calls as i did above? thank you matt On Friday, September 10, 2010, Barry Smith wrote: > This has all changed in the 3.0.0 release. It is much simpler now. > > Any ways you don't need that crap for differences between 1 or more > processors. Just use MATAIJ always and use -pc_type lu > -pc_factor_mat_solver_package superlu_dist with 3.0.0 or later > > > Barry > > On Sep 10, 2010, at 3:59 PM, Matt Funk wrote: > > Hi, > > > > i was wondering on how i need to set the matrix type when i want to use > > the superlu_dist solver. > > > > Right now what i have is: > > if (m_preCondType == "LU_SUPERLU") { > > > > if (numProc() > 1) > > > > m_ierr = MatSetType(m_globalMatrix, MATAIJ); > > > > else { > > > > m_ierr = MatSetType(m_globalMatrix, MATSEQAIJ); > > > > } > > > > } > > > > This i believe is according to the table in the petsc users manual > > (p.82). Anyway, things work ok on 1 processor. However, when i try 8 > > processors (i.e. it tells me: > > [3]PETSC ERROR: --------------------- Error Message > > ------------------------------------ > > [3]PETSC ERROR: No support for this operation for this object type! > > [3]PETSC ERROR: Matrix format mpiaij does not have a built-in PETSc > > direct solver! > > > > > > So i guess i should not use the MATAIJ matrix format? I also tried the > > MATMPIAIJ format, but got the same problem. > > > > So how is one supposed to use it? Obviously i am doing something wrong. > > Any help is appreciated. 
> > > > thanks > > matt From B.Sanderse at cwi.nl Fri Sep 10 17:00:02 2010 From: B.Sanderse at cwi.nl (Benjamin Sanderse) Date: Fri, 10 Sep 2010 16:00:02 -0600 Subject: [petsc-users] coupling with Matlab and parallel solution In-Reply-To: <878w39wo3b.fsf@59A2.org> References: <1D7DA60F-2616-41F3-BC17-FD797E9A9AF2@cwi.nl> <87zkw1cej4.fsf@59A2.org> <96BE91D1-0EFF-4CF1-B1DA-0C5A2D13BFF3@cwi.nl> <6C3C8604-E079-40DC-ADCC-FACB91A277B2@mcs.anl.gov> <070410A8-7B2E-405E-B65D-0CC8521BB17C@cwi.nl> <209DE1A7-D893-450F-89BC-DFBE7B2F2C5D@cwi.nl> <6A83D0A7-1C42-4A91-909F-6B93509C5BF5@mcs.anl.gov> <61D23634-FF5A-4916-9055-9561F69CC36A@cwi.nl> <30D58A3F-D8D9-4692-8BE9-BB8168201BD3@mcs.anl.gov> <6F879BC3-2E8E-4B30-B4A8-CDB039BD3A1E@cwi.nl> <9587193F-E30F-40F2-BF6A-4B690DA5066B@cwi.nl> <261FD0FD-7085-4C71-B2BD-7DB26ACDA853@cwi.nl> <25591182-B391-4B29-B14F-8CC3B62ABD7F@mcs.anl.gov> <4BDFDB50-ED6D-4D44-A8F8-E14F0ED6C10F@cwi.nl> <878w39wo3b.fsf@59A2.or g> Message-ID: <72AF3017-8875-429E-AE5A-EF68E2C9BD87@cwi.nl> Hi Jed, I forgot to note that my matrix is *extremely* well structured, because I am working on a (non-uniform) Cartesian mesh. The matrix that I want to solve results from discretizing the Laplacian, so in 3D it consists basically of only 7 diagonals. Do you think the same conclusions hold with regard to preconditioners? Furthermore, I get these messages: [1] Mat_CheckCompressedRow(): Found the ratio (num_zerorows 76032)/(num_localrows 82944) > 0.6. Use CompressedRow routines. Does Petsc give a hint here to use other routines, or does it indicate what it is doing? Ben Op 10 sep 2010, om 10:38 heeft Jed Brown het volgende geschreven: > On Fri, 10 Sep 2010 10:27:17 -0600, Benjamin Sanderse wrote: >> - Until now I have been using the 'two-shell' approach suggested by Barry for debugging purposes. This approach works fine, but in a later stage I would like to include the petsc execution command back in Matlab. I tried the following in Matlab: >> system('petscmpiexec -n 2 ./petsc_poisson_par -viewer_socket_port 5600 &); >> >> unix('petscmpiexec -n 2 ./petsc_poisson_par -viewer_socket_port 5600 &); >> >> In both cases this doesn't work, while issuing the command in a separate shell works fine. Any ideas? > > Please provide the output, "doesn't work" is not much information. > >> - I am using CG to solve a symmetric positive definite matrix. As preconditioner I normally use ICC (incomplete choleski), but apparently this is not implemented in parallel in Petsc. Suggestions on what to take as preconditioner? > > Start with block Jacobi + ICC (-pc_type bjacobi -sub_pc_type icc) or ASM > + ICC (-pc_type asm -sub_pc_type icc). > > Jed -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Fri Sep 10 19:19:32 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Fri, 10 Sep 2010 19:19:32 -0500 Subject: [petsc-users] noknow petsc error after kspsolve In-Reply-To: References: Message-ID: How is valgrind "no use". Use valgrind but also pass the option -malloc_debug no to the PETSc program. Send the output to petsc-maint at mcs.anl.gov Barry On Sep 10, 2010, at 4:27 PM, NAN ZHAO wrote: > Dear all, > > An error message from petsc comes after the kspsolve: > > [0]PETSC ERROR: --------------------- Error Message ------------------------------------ > [0]PETSC ERROR: Invalid pointer! > [0]PETSC ERROR: Invalid Pointer: Parameter # 2! 
> [0]PETSC ERROR: ------------------------------------------------------------------------ > [0]PETSC ERROR: Petsc Release Version 2.3.3, Patch 15, Tue Sep 23 10:02:49 CDT 2008 HG revision: 31306062cd1a6f6a2496fccb4878f485c9b91760 > [0]PETSC ERROR: See docs/changes/index.html for recent updates. > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [0]PETSC ERROR: See docs/index.html for manual pages. > [0]PETSC ERROR: ------------------------------------------------------------------------ > [0]PETSC ERROR: Unknown Name on a linux-gnu named perc.ices.utah.edu by nan Fri Sep 10 11:06:35 2010 > [0]PETSC ERROR: Libraries linked from /usr/local/petsc-2.3.3-p15-nox11/lib/linux-gnu-cxx-debug > [0]PETSC ERROR: Configure run at Tue Aug 17 14:47:03 2010 > [0]PETSC ERROR: Configure options --prefix=/usr/local/petsc-2.3.3-p15-nox11 --with-mpi-dir=/usr/local/mpich2-1.2.1-install --with-blas-lib=/usr/lib64/libblas.a --with-lapack-lib=/usr/lib64/liblapack.a --with-fc=0 --with-x11=0 --with-x=0 --with-clanguage=cxx --with-cxx=mpicxx --with-cc=mpicc --with-debugging=1 COPTFLAGS=-O3 --with-shared=0 > [0]PETSC ERROR: ------------------------------------------------------------------------ > [0]PETSC ERROR: KSPGetDiagonalScaleFix() line 1748 in src/ksp/ksp/interface/itfunc.c > [0]PETSC ERROR: ------------------------------------------------------------------------ > [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range > [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger > [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signal[0]PETSC ERROR: or try http://valgrind.org on linux or man libgmalloc on Apple to find memory corruption errors > [0]PETSC ERROR: likely location of problem given in stack below > [0]PETSC ERROR: --------------------- Stack Frames ------------------------------------ > [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, > [0]PETSC ERROR: INSTEAD the line number of the start of the function > [0]PETSC ERROR: is given. > [0]PETSC ERROR: [0] KSPGetDiagonalScaleFix line 1746 src/ksp/ksp/interface/itfunc.c > [0]PETSC ERROR: [0] KSPGetDiagonalScale line 1667 src/ksp/ksp/interface/itfunc.c > [0]PETSC ERROR: --------------------- Error Message ------------------------------------ > [0]PETSC ERROR: Signal received! > -------- > [0]PETSC ERROR: Petsc Release Version 2.3.3, Patch 15, Tue Sep 23 10:02:49 CDT 2008 HG revision: 31306062cd1a6f6a2496fccb4878f485c9b91760 > [0]PETSC ERROR: See docs/changes/index.html for recent updates. > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [0]PETSC ERROR: See docs/index.html for manual pages. 
> [0]PETSC ERROR: ------------------------------------------------------------------------ > [0]PETSC ERROR: Unknown Name on a linux-gnu named perc.ices.utah.edu by nan Fri Sep 10 11:05:07 2010 > [0]PETSC ERROR: Libraries linked from /usr/local/petsc-2.3.3-p15-nox11/lib/linux-gnu-cxx-debug > [0]PETSC ERROR: Configure run at Tue Aug 17 14:47:03 2010 > [0]PETSC ERROR: Configure options --prefix=/usr/local/petsc-2.3.3-p15-nox11 --with-mpi-dir=/usr/local/mpich2-1.2.1-install --with-blas-lib=/usr/lib64/libblas.a --with-lapack-lib=/usr/lib64/liblapack.a --with-fc=0 --with-x11=0 --with-x=0 --with-clanguage=cxx --with-cxx=mpicxx --with-cc=mpicc --with-debugging=1 COPTFLAGS=-O3 --with-shared=0 > [0]PETSC ERROR: ------------------------------------------------------------------------ > [0]PETSC ERROR: User provided function() line 0 in unknown directory unknown file > application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 > > I tried vlgrind, it just no use. I think I put some CHKMEMQ in my code, sometimes it runs. Can someone give a hint on this problem. > > Thanks, > > Nan -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Fri Sep 10 19:21:55 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Fri, 10 Sep 2010 19:21:55 -0500 Subject: [petsc-users] using superlu_dist In-Reply-To: <201009101543.26151.mafunk@nmsu.edu> References: <201009101459.47077.mafunk@nmsu.edu> <0CEAA087-44A4-4A6B-A578-EE63FE0DF867@mcs.anl.gov> <201009101543.26151.mafunk@nmsu.edu> Message-ID: <527FD798-5AE4-4B22-A4F1-60F9833043B2@mcs.anl.gov> Just always use SUPERLU_DIST for any number of processes including 1. It is much faster and uses less memory the superlu. Barry You should only use superlu when the matrix is super ill-conditioned like condition number 10^10 and superlu_dist don't work On Sep 10, 2010, at 4:43 PM, Matt Funk wrote: > HI Barry, > > thanks for the heads up, however, i am not using the command line. > So what i did is when i set my pc i do: > > if(m_preCondType == "LU_SUPERLU") { > m_ierr = PCSetType(m_pc, PCLU); > if (numProc() > 1) { > PCFactorSetMatSolverPackage(m_pc,MAT_SOLVER_SUPERLU_DIST); > } > else { > PCFactorSetMatSolverPackage(m_pc,MAT_SOLVER_SUPERLU); > } > } > > and the matrix type is set to MATAIJ. Anyway, but i still need to distingiush > between superlu and superlu_dist it seems as specifying superlu for a parallel > run throws an error. > > I suppose that when invoking this from the command line there is some code > that test to see whether this is a serial/parallel run and makes similar calls > as i did above? > > > thank you > matt > > > > On Friday, September 10, 2010, Barry Smith wrote: >> This has all changed in the 3.0.0 release. It is much simpler now. >> >> Any ways you don't need that crap for differences between 1 or more >> processors. Just use MATAIJ always and use -pc_type lu >> -pc_factor_mat_solver_package superlu_dist with 3.0.0 or later >> >> >> Barry >> >> On Sep 10, 2010, at 3:59 PM, Matt Funk wrote: >>> Hi, >>> >>> i was wondering on how i need to set the matrix type when i want to use >>> the superlu_dist solver. >>> >>> Right now what i have is: >>> if (m_preCondType == "LU_SUPERLU") { >>> >>> if (numProc() > 1) >>> >>> m_ierr = MatSetType(m_globalMatrix, MATAIJ); >>> >>> else { >>> >>> m_ierr = MatSetType(m_globalMatrix, MATSEQAIJ); >>> >>> } >>> >>> } >>> >>> This i believe is according to the table in the petsc users manual >>> (p.82). Anyway, things work ok on 1 processor. 
However, when i try 8 >>> processors (i.e. it tells me: >>> [3]PETSC ERROR: --------------------- Error Message >>> ------------------------------------ >>> [3]PETSC ERROR: No support for this operation for this object type! >>> [3]PETSC ERROR: Matrix format mpiaij does not have a built-in PETSc >>> direct solver! >>> >>> >>> So i guess i should not use the MATAIJ matrix format? I also tried the >>> MATMPIAIJ format, but got the same problem. >>> >>> So how is one supposed to use it? Obviously i am doing something wrong. >>> Any help is appreciated. >>> >>> thanks >>> matt > From bsmith at mcs.anl.gov Fri Sep 10 19:25:32 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Fri, 10 Sep 2010 19:25:32 -0500 Subject: [petsc-users] coupling with Matlab and parallel solution In-Reply-To: <72AF3017-8875-429E-AE5A-EF68E2C9BD87@cwi.nl> References: <1D7DA60F-2616-41F3-BC17-FD797E9A9AF2@cwi.nl> <87zkw1cej4.fsf@59A2.org> <96BE91D1-0EFF-4CF1-B1DA-0C5A2D13BFF3@cwi.nl> <6C3C8604-E079-40DC-ADCC-FACB91A277B2@mcs.anl.gov> <070410A8-7B2E-405E-B65D-0CC8521BB17C@cwi.nl> <209DE1A7-D893-450F-89BC-DFBE7B2F2C5D@cwi.nl> <6A83D0A7-1C42-4A91-909F-6B93509C5BF5@mcs.anl.gov> <61D23634-FF5A-4916-9055-9561F69CC36A@cwi.nl> <30D58A3F-D8D9-4692-8BE9-BB8168201BD3@mcs.anl.gov> <6F879BC3-2E8E-4B30-B4A8-CDB039BD3A1E@cwi.nl> <9587193F-E30F-40F2-BF6A-4B690DA5066B@cwi.nl> <261FD0FD-7085-4C71-B2BD-7DB26ACDA853@cwi.nl> <25591182-B391-4B29-B14F-8CC3B62ABD7F@mcs.anl.gov> <4BDFDB50-ED6D-4D44-A8F8-E14F0ED6C10F@cwi.nl> <878w39wo3b.fsf@59A2.or g> <72AF3017-8875-429E-AE5A-EF68E2C9BD87@cwi.nl> Message-ID: <365E880E-F16D-46E1-8339-FF8A780E677E@mcs.anl.gov> On Sep 10, 2010, at 5:00 PM, Benjamin Sanderse wrote: > Hi Jed, > > I forgot to note that my matrix is *extremely* well structured, because I am working on a (non-uniform) Cartesian mesh. The matrix that I want to solve results from discretizing the Laplacian, so in 3D it consists basically of only 7 diagonals. Do you think the same conclusions hold with regard to preconditioners? > Furthermore, I get these messages: > [1] Mat_CheckCompressedRow(): Found the ratio (num_zerorows 76032)/(num_localrows 82944) > 0.6. Use CompressedRow routines. > > Does Petsc give a hint here to use other routines, or does it indicate what it is doing? No you can ignore this message. I recommend you use boomerAMG as your preconditioner -pc_type hyper -pc_hypre_type boomeramg you must first have configured PETSc with --download-hypre BTW: your matrix is so simple it doesn't seem to make sense to be generating it in Matlab and shipping it over to PETSc, you should generate the matrix in PETSc (and likely the vectors also) and just use Matlab for visualization or stuff like that. If you use the PETSc DA to parallelize the PETSc code and generate the matrix in PETSc you can use geometric multigrid to solve the system and it will scream in parallel. > > Ben > > Op 10 sep 2010, om 10:38 heeft Jed Brown het volgende geschreven: > >> On Fri, 10 Sep 2010 10:27:17 -0600, Benjamin Sanderse wrote: >>> - Until now I have been using the 'two-shell' approach suggested by Barry for debugging purposes. This approach works fine, but in a later stage I would like to include the petsc execution command back in Matlab. I tried the following in Matlab: >>> system('petscmpiexec -n 2 ./petsc_poisson_par -viewer_socket_port 5600 &); >>> >>> unix('petscmpiexec -n 2 ./petsc_poisson_par -viewer_socket_port 5600 &); >>> >>> In both cases this doesn't work, while issuing the command in a separate shell works fine. Any ideas? 
>> >> Please provide the output, "doesn't work" is not much information. >> >>> - I am using CG to solve a symmetric positive definite matrix. As preconditioner I normally use ICC (incomplete choleski), but apparently this is not implemented in parallel in Petsc. Suggestions on what to take as preconditioner? >> >> Start with block Jacobi + ICC (-pc_type bjacobi -sub_pc_type icc) or ASM >> + ICC (-pc_type asm -sub_pc_type icc). >> >> Jed > -------------- next part -------------- An HTML attachment was scrubbed... URL: From B.Sanderse at cwi.nl Fri Sep 10 20:13:47 2010 From: B.Sanderse at cwi.nl (Benjamin Sanderse) Date: Fri, 10 Sep 2010 19:13:47 -0600 Subject: [petsc-users] coupling with Matlab and parallel solution In-Reply-To: <365E880E-F16D-46E1-8339-FF8A780E677E@mcs.anl.gov> References: <1D7DA60F-2616-41F3-BC17-FD797E9A9AF2@cwi.nl> <87zkw1cej4.fsf@59A2.org> <96BE91D1-0EFF-4CF1-B1DA-0C5A2D13BFF3@cwi.nl> <6C3C8604-E079-40DC-ADCC-FACB91A277B2@mcs.anl.gov> <070410A8-7B2E-405E-B65D-0CC8521BB17C@cwi.nl> <209DE1A7-D893-450F-89BC-DFBE7B2F2C5D@cwi.nl> <6A83D0A7-1C42-4A91-909F-6B93509C5BF5@mcs.anl.gov> <61D23634-FF5A-4916-9055-9561F69CC36A@cwi.nl> <30D58A3F-D8D9-4692-8BE9-BB8168201BD3@mcs.anl.gov> <6F879BC3-2E8E-4B30-B4A8-CDB039BD3A1E@cwi.nl> <9587193F-E30F-40F2-BF6A-4B690DA5066B@cwi.nl> <261FD0FD-7085-4C71-B2BD-7DB26ACDA853@cwi.nl> <25591182-B391-4B29-B14F-8CC3B62ABD7F@mcs.anl.gov> <4BDFDB50-ED6D-4D44-A8F8-E14F0ED6C10F@cwi.nl> <878w39wo3b.fsf@59A2.or g> <72AF3017-8875-429E-AE5A-EF68E2C9BD87@cwi.nl> <365E880E-F16D-46E1-8339-FF8A780E677E@mcs.anl.gov> Message-ID: <26D7BD78-9879-41A1-B855-9AA1ADE006C9@cwi.nl> Hi Barry, Thanks for your comments. I have thought of some of these options as well. There is one (big) thing: I do not build the Laplacian simply by programming its entries directly, but I generate it as being the product of a divergence and a gradient matrix. These divergence and gradient matrices are also used in other parts of my code. In the end, there are a lot of matrix generation routines which (very) efficiently and elegantly build my entire discretization. I am not sure if I can easily transfer that to C or Fortran, since it heavily relies on the sparse-matrix features of Matlab (like spdiags). Maybe you have suggestions for this? For now I stick to Matlab since I love it for prototyping, but I want to solve the pressure matrix a bit faster and that's why I am looking at this 'quick and dirty' solution. Ben Op 10 sep 2010, om 18:25 heeft Barry Smith het volgende geschreven: > > On Sep 10, 2010, at 5:00 PM, Benjamin Sanderse wrote: > >> Hi Jed, >> >> I forgot to note that my matrix is *extremely* well structured, because I am working on a (non-uniform) Cartesian mesh. The matrix that I want to solve results from discretizing the Laplacian, so in 3D it consists basically of only 7 diagonals. Do you think the same conclusions hold with regard to preconditioners? >> Furthermore, I get these messages: >> [1] Mat_CheckCompressedRow(): Found the ratio (num_zerorows 76032)/(num_localrows 82944) > 0.6. Use CompressedRow routines. >> >> Does Petsc give a hint here to use other routines, or does it indicate what it is doing? > > No you can ignore this message. 
> > I recommend you use boomerAMG as your preconditioner -pc_type hyper -pc_hypre_type boomeramg you must first have configured PETSc with --download-hypre > > BTW: your matrix is so simple it doesn't seem to make sense to be generating it in Matlab and shipping it over to PETSc, you should generate the matrix in PETSc (and likely the vectors also) and just use Matlab for visualization or stuff like that. If you use the PETSc DA to parallelize the PETSc code and generate the matrix in PETSc you can use geometric multigrid to solve the system and it will scream in parallel. > >> >> Ben >> >> Op 10 sep 2010, om 10:38 heeft Jed Brown het volgende geschreven: >> >>> On Fri, 10 Sep 2010 10:27:17 -0600, Benjamin Sanderse wrote: >>>> - Until now I have been using the 'two-shell' approach suggested by Barry for debugging purposes. This approach works fine, but in a later stage I would like to include the petsc execution command back in Matlab. I tried the following in Matlab: >>>> system('petscmpiexec -n 2 ./petsc_poisson_par -viewer_socket_port 5600 &); >>>> >>>> unix('petscmpiexec -n 2 ./petsc_poisson_par -viewer_socket_port 5600 &); >>>> >>>> In both cases this doesn't work, while issuing the command in a separate shell works fine. Any ideas? >>> >>> Please provide the output, "doesn't work" is not much information. >>> >>>> - I am using CG to solve a symmetric positive definite matrix. As preconditioner I normally use ICC (incomplete choleski), but apparently this is not implemented in parallel in Petsc. Suggestions on what to take as preconditioner? >>> >>> Start with block Jacobi + ICC (-pc_type bjacobi -sub_pc_type icc) or ASM >>> + ICC (-pc_type asm -sub_pc_type icc). >>> >>> Jed >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Fri Sep 10 21:18:48 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Fri, 10 Sep 2010 21:18:48 -0500 Subject: [petsc-users] coupling with Matlab and parallel solution In-Reply-To: <26D7BD78-9879-41A1-B855-9AA1ADE006C9@cwi.nl> References: <1D7DA60F-2616-41F3-BC17-FD797E9A9AF2@cwi.nl> <87zkw1cej4.fsf@59A2.org> <96BE91D1-0EFF-4CF1-B1DA-0C5A2D13BFF3@cwi.nl> <6C3C8604-E079-40DC-ADCC-FACB91A277B2@mcs.anl.gov> <070410A8-7B2E-405E-B65D-0CC8521BB17C@cwi.nl> <209DE1A7-D893-450F-89BC-DFBE7B2F2C5D@cwi.nl> <6A83D0A7-1C42-4A91-909F-6B93509C5BF5@mcs.anl.gov> <61D23634-FF5A-4916-9055-9561F69CC36A@cwi.nl> <30D58A3F-D8D9-4692-8BE9-BB8168201BD3@mcs.anl.gov> <6F879BC3-2E8E-4B30-B4A8-CDB039BD3A1E@cwi.nl> <9587193F-E30F-40F2-BF6A-4B690DA5066B@cwi.nl> <261FD0FD-7085-4C71-B2BD-7DB26ACDA853@cwi.nl> <25591182-B391-4B29-B14F-8CC3B62ABD7F@mcs.anl.gov> <4BDFDB50-ED6D-4D44-A8F8-E14F0ED6C10F@cwi.nl> <878w39wo3b.fsf@59A2.or g> <72AF3017-8875-429E-AE5A-EF68E2C9BD87@cwi.nl> <365E880E-F16D-46E1-8339-FF8A780E677E@mcs.anl.gov> <26D7BD78-9879-41A1-B855-9AA1ADE006C9@cwi.nl> Message-ID: Why don't you see how it goes. Where the time seems to be spent for large runs and then if need be you could move more over to the PETSc code. Barry On Sep 10, 2010, at 8:13 PM, Benjamin Sanderse wrote: > Hi Barry, > > Thanks for your comments. I have thought of some of these options as well. There is one (big) thing: > I do not build the Laplacian simply by programming its entries directly, but I generate it as being the product of a divergence and a gradient matrix. These divergence and gradient matrices are also used in other parts of my code. 
In the end, there are a lot of matrix generation routines which (very) efficiently and elegantly build my entire discretization. I am not sure if I can easily transfer that to C or Fortran, since it heavily relies on the sparse-matrix features of Matlab (like spdiags). > > Maybe you have suggestions for this? For now I stick to Matlab since I love it for prototyping, but I want to solve the pressure matrix a bit faster and that's why I am looking at this 'quick and dirty' solution. > > Ben > > > > Op 10 sep 2010, om 18:25 heeft Barry Smith het volgende geschreven: > >> >> On Sep 10, 2010, at 5:00 PM, Benjamin Sanderse wrote: >> >>> Hi Jed, >>> >>> I forgot to note that my matrix is *extremely* well structured, because I am working on a (non-uniform) Cartesian mesh. The matrix that I want to solve results from discretizing the Laplacian, so in 3D it consists basically of only 7 diagonals. Do you think the same conclusions hold with regard to preconditioners? >>> Furthermore, I get these messages: >>> [1] Mat_CheckCompressedRow(): Found the ratio (num_zerorows 76032)/(num_localrows 82944) > 0.6. Use CompressedRow routines. >>> >>> Does Petsc give a hint here to use other routines, or does it indicate what it is doing? >> >> No you can ignore this message. >> >> I recommend you use boomerAMG as your preconditioner -pc_type hyper -pc_hypre_type boomeramg you must first have configured PETSc with --download-hypre >> >> BTW: your matrix is so simple it doesn't seem to make sense to be generating it in Matlab and shipping it over to PETSc, you should generate the matrix in PETSc (and likely the vectors also) and just use Matlab for visualization or stuff like that. If you use the PETSc DA to parallelize the PETSc code and generate the matrix in PETSc you can use geometric multigrid to solve the system and it will scream in parallel. >> >>> >>> Ben >>> >>> Op 10 sep 2010, om 10:38 heeft Jed Brown het volgende geschreven: >>> >>>> On Fri, 10 Sep 2010 10:27:17 -0600, Benjamin Sanderse wrote: >>>>> - Until now I have been using the 'two-shell' approach suggested by Barry for debugging purposes. This approach works fine, but in a later stage I would like to include the petsc execution command back in Matlab. I tried the following in Matlab: >>>>> system('petscmpiexec -n 2 ./petsc_poisson_par -viewer_socket_port 5600 &); >>>>> >>>>> unix('petscmpiexec -n 2 ./petsc_poisson_par -viewer_socket_port 5600 &); >>>>> >>>>> In both cases this doesn't work, while issuing the command in a separate shell works fine. Any ideas? >>>> >>>> Please provide the output, "doesn't work" is not much information. >>>> >>>>> - I am using CG to solve a symmetric positive definite matrix. As preconditioner I normally use ICC (incomplete choleski), but apparently this is not implemented in parallel in Petsc. Suggestions on what to take as preconditioner? >>>> >>>> Start with block Jacobi + ICC (-pc_type bjacobi -sub_pc_type icc) or ASM >>>> + ICC (-pc_type asm -sub_pc_type icc). >>>> >>>> Jed >>> >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From gdiso at ustc.edu Sat Sep 11 02:00:41 2010 From: gdiso at ustc.edu (Gong Ding) Date: Sat, 11 Sep 2010 15:00:41 +0800 Subject: [petsc-users] Change linear solver during SNES iteration Message-ID: <06BF876415154A8D85886403D224DCF4@cogendaeda> Hi, I wonder if it is possible to use different linear solvers as well as PCs in each SNES iteration. For example, I would like to use MUMPS in the first nonlinear iteration. 
And then use the factorized matrix by MUMPS as the preconditioner for i.e. BCGS solver in the second iteration. Or even i can use direct/iterative solver alternately for nonlinear iterations? Gong Ding From bsmith at mcs.anl.gov Sat Sep 11 09:57:52 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Sat, 11 Sep 2010 09:57:52 -0500 Subject: [petsc-users] Change linear solver during SNES iteration In-Reply-To: <06BF876415154A8D85886403D224DCF4@cogendaeda> References: <06BF876415154A8D85886403D224DCF4@cogendaeda> Message-ID: On Sep 11, 2010, at 2:00 AM, Gong Ding wrote: > Hi, > I wonder if it is possible to use different linear solvers as well as PCs > in each SNES iteration. For example, I would like to use MUMPS > in the first nonlinear iteration. And then use the factorized matrix by > MUMPS as the preconditioner for i.e. BCGS solver in the second iteration. -snes_lag_preconditioner 2 and it will use the old factorization once then on the third Newton step it will do the factorization again -snes_lag_preconditioner 3 means reuse factorization for two more iterations etc. If computing the Jacobian is expensive you can use -snes_mf_operator -snes_lag_preconditioner 100 and -snes_lag_jacobian 100 and it will use the initial factorization for all Newton steps and never form the Jacobian again just apply it matrix free for the iterative solver using the initial factorization as the preconditioner. > Or even i can use direct/iterative solver alternately for nonlinear iterations? To do more complicated things. Like using a a direct solver and then SOR or ASM or some other solver you need to write a little code. You can add your own SNESMonitorSet() routine that changes the PC in the SNES based on some criteria. But likely using the lag options is all you need. Barry > > Gong Ding > > From mtp.vtk at gmail.com Tue Sep 14 09:23:17 2010 From: mtp.vtk at gmail.com (Mathieu P) Date: Tue, 14 Sep 2010 16:23:17 +0200 Subject: [petsc-users] Migrating from Petsc 2 to Petsc 3 Message-ID: HI, I am trying to migrate my program from Petsc 2 to Petsc 3. I am using petscsles.h, so the SLES package in my program. Can you confirm that this library doesn't exist anymore in PETSC 3 ? How to manage this lack of library ? Thanks, Cordialement, -- Mathieu P Sent from my phone. -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Tue Sep 14 09:26:52 2010 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 14 Sep 2010 09:26:52 -0500 Subject: [petsc-users] Migrating from Petsc 2 to Petsc 3 In-Reply-To: References: Message-ID: On Tue, Sep 14, 2010 at 9:23 AM, Mathieu P wrote: > HI, > > I am trying to migrate my program from Petsc 2 to Petsc 3. > > I am using petscsles.h, so the SLES package in my program. > Can you confirm that this library doesn't exist anymore in PETSC 3 ? How to > manage this lack of library ? > In release 2.1.6 (2003) we removed this library since the only functionality already existed in KSP and PC. You can find the list of changes in http://www.mcs.anl.gov/petsc/petsc-as/documentation/changes/index.html Matt > Thanks, > > Cordialement, > > -- Mathieu P > Sent from my phone. > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From mtp.vtk at gmail.com Tue Sep 14 10:02:55 2010 From: mtp.vtk at gmail.com (Mathieu P) Date: Tue, 14 Sep 2010 17:02:55 +0200 Subject: [petsc-users] Migrating from Petsc 2 to Petsc 3 In-Reply-To: References: Message-ID: Ok thanks, I have found the remove changes notification in Changes 2.2.0 notes. I will check if i can migrate petsc to v3. Cordialement, -- Mathieu P Sent from my phone. On Tue, Sep 14, 2010 at 4:26 PM, Matthew Knepley wrote: > On Tue, Sep 14, 2010 at 9:23 AM, Mathieu P wrote: > >> HI, >> >> I am trying to migrate my program from Petsc 2 to Petsc 3. >> >> I am using petscsles.h, so the SLES package in my program. >> Can you confirm that this library doesn't exist anymore in PETSC 3 ? How >> to manage this lack of library ? >> > > In release 2.1.6 (2003) we removed this library since the only > functionality already existed in KSP and PC. You > can find the list of changes in > > http://www.mcs.anl.gov/petsc/petsc-as/documentation/changes/index.html > > Matt > > >> Thanks, >> >> Cordialement, >> >> -- Mathieu P >> Sent from my phone. >> >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -------------- next part -------------- An HTML attachment was scrubbed... URL: From mtp.vtk at gmail.com Tue Sep 14 09:21:14 2010 From: mtp.vtk at gmail.com (Mathieu P) Date: Tue, 14 Sep 2010 16:21:14 +0200 Subject: [petsc-users] Migrating from Petsc 2 to Petsc 3 Message-ID: HI, I am trying to migrate my program from Petsc 2 to Petsc 3. I am using petscsles.h, so the SLES package in my program. Can you confirm that this library doesn't exist anymore in PETSC 3 ? How to manage this lack of library ? Thanks, Cordialement, -- Mathieu P -------------- next part -------------- An HTML attachment was scrubbed... URL: From keita at cray.com Thu Sep 16 13:26:10 2010 From: keita at cray.com (Keita Teranishi) Date: Thu, 16 Sep 2010 13:26:10 -0500 Subject: [petsc-users] Makefile options to leave .o files? Message-ID: <5D6E0DF460ACF34C88644E1EA91DCD0D01ABCB0615@CFEXMBX.americas.cray.com> Hi, I have observed that the default option of the makefile at the main directory deletes libpetsc.a file before compiling the source code files. Is there any options that keep libpetsc.a, and only compile those modified after libpetsc.a? Thanks, ================================ Keita Teranishi Scientific Library Group Cray, Inc. keita at cray.com ================================ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at 59A2.org Thu Sep 16 13:41:06 2010 From: jed at 59A2.org (Jed Brown) Date: Thu, 16 Sep 2010 20:41:06 +0200 Subject: [petsc-users] Makefile options to leave .o files? In-Reply-To: <5D6E0DF460ACF34C88644E1EA91DCD0D01ABCB0615@CFEXMBX.americas.cray.com> References: <5D6E0DF460ACF34C88644E1EA91DCD0D01ABCB0615@CFEXMBX.americas.cray.com> Message-ID: On Thu, Sep 16, 2010 at 20:26, Keita Teranishi wrote: > I have observed that the default option of the makefile at the main > directory deletes libpetsc.a file before compiling the source code files. > ??Is there any options that keep libpetsc.a, and only compile those modified > after libpetsc.a? You can run make from subdirectories, but PETSc's normal recursive make system does not do proper dependency analysis. If you are using petsc-dev, there are two options. 
config/builder.py leaves the .o files behind, but will require rebuilding everything each time you switch PETSC_ARCH (this is an implementation issue that could be fixed). If you have CMake (>=2.6.2) installed, you can run make from the build directory (as in make -j5 -C $PETSC_DIR/$PETSC_ARCH, the environment variables are irrelevant for this option). This will only rebuild what is necessary, and the .o files are separate for each PETSC_ARCH. Jed From balay at mcs.anl.gov Thu Sep 16 14:24:48 2010 From: balay at mcs.anl.gov (Satish Balay) Date: Thu, 16 Sep 2010 14:24:48 -0500 (CDT) Subject: [petsc-users] Makefile options to leave .o files? In-Reply-To: <5D6E0DF460ACF34C88644E1EA91DCD0D01ABCB0615@CFEXMBX.americas.cray.com> References: <5D6E0DF460ACF34C88644E1EA91DCD0D01ABCB0615@CFEXMBX.americas.cray.com> Message-ID: make ACTION=lib tree However as Jed mentioned - no dependency analysis - so the above recompiles sources that are changed - but if any includes change - you are better off with 'make all' Satish On Thu, 16 Sep 2010, Keita Teranishi wrote: > Hi, > > I have observed that the default option of the makefile at the main directory deletes libpetsc.a file before compiling the source code files. Is there any options that keep libpetsc.a, and only compile those modified after libpetsc.a? > > Thanks, > ================================ > Keita Teranishi > Scientific Library Group > Cray, Inc. > keita at cray.com > ================================ > > From xy2102 at columbia.edu Thu Sep 16 23:42:20 2010 From: xy2102 at columbia.edu (Rebecca Xuefei Yuan) Date: Fri, 17 Sep 2010 00:42:20 -0400 Subject: [petsc-users] Could we tar all saved files on the machine once the computation is done? Message-ID: <20100917004220.g5volhvr40004ws0@cubmail.cc.columbia.edu> Dear all, I remembered to see some notes about taring all saved results once the computation is done, but I was not able to find the reference. Any suggestions? Thanks a lot! Rebecca Xuefei YUAN Department of Applied Physics and Applied Mathematics Columbia University Tel:917-399-8032 www.columbia.edu/~xy2102 From jed at 59A2.org Fri Sep 17 05:16:23 2010 From: jed at 59A2.org (Jed Brown) Date: Fri, 17 Sep 2010 12:16:23 +0200 Subject: [petsc-users] Could we tar all saved files on the machine once the computation is done? In-Reply-To: <20100917004220.g5volhvr40004ws0@cubmail.cc.columbia.edu> References: <20100917004220.g5volhvr40004ws0@cubmail.cc.columbia.edu> Message-ID: On Fri, Sep 17, 2010 at 06:42, Rebecca Xuefei Yuan wrote: > I remembered to see some notes about taring all saved results once the > computation is done, but I was not able to find the reference. Saved results from what? This doesn't really sound like a PETSc question. Jed From daniel.langr at gmail.com Fri Sep 17 06:15:41 2010 From: daniel.langr at gmail.com (Daniel Langr) Date: Fri, 17 Sep 2010 13:15:41 +0200 Subject: [petsc-users] Storage space for symmetric (SBAIJ) matrix Message-ID: <4C934DDD.2010000@gmail.com> Hi all, I do not understand much how PETSc works with symmetric matrices. I tried some tests with a symmetric matrix, which have nonzeroes only in the main diagonal and in the last column/row, e.g., the following pattern: * 0 0 * 0 * 0 * 0 0 * * * * * * for a 4 x 4 matrix. Since PETSc requires to set values only for the upper triangular part of a matrix, I set two values for every row except of the last one with only one value. 
My code looks like: MatCreate(PETSC_COMM_WORLD, &A); MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n); MatSetType(A, MATMPISBAIJ); MatMPISBAIJSetPreallocation(A, 1, 1, PETSC_NULL, 1, PETSC_NULL); MatGetOwnershipRange(A, &first, &last); last--; for (i = first; i <= last; i++) { MatSetValue(A, i, i, a, INSERT_VALUES); if (i != (n - 1)) MatSetValue(A, i, n - 1, a, INSERT_VALUES); } MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY); MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY); (The value of the variable "a" changes for every nonzero entry. I omitted this code in the above printout together with an error checking for better readability.) Resulting matrix works quite fine but when I check the matrix info (e.g. using -mat_view_info_detailed option) for matrix size n=10000 and 2 MPI processes I get: // First and last row for every MPI process Rank: 0, Fist row: 0, Last row: 4999 Rank: 1, Fist row: 5000, Last row: 9999 Matrix Object: type=mpisbaij, rows=10000, cols=10000 total: nonzeros=19999, allocated nonzeros=69990 [0] Local rows 10000 nz 10000 nz alloced 10000 bs 1 mem 254496 [0] on-diagonal part: nz 5000 [0] off-diagonal part: nz 5000 [1] Local rows 10000 nz 9999 nz alloced 59990 bs 1 mem 264494 [1] on-diagonal part: nz 9999 [1] off-diagonal part: nz 0 1. My problem is with the amount of memory used. For 10000 nonzeroes of the first process I would except memory needs for CSR storage format something approximately like: nz * sizeof(PetscScalar) + nz * sizeof(PetscInt) + n_local_rows * sizeof(PetscInt) = 10000 * 8 + 10000 * 4 + 5000 * 4 = 140000 bytes and matrix info gives 254496 bytes. Similarly for the second process. I would understand some additional space needed because of efficiency but this is more than 180 precent of a space really needed for storing CSR matrix, which is quite unacceptable for large problems. 2. Why there is "Local rows 10000"? Shouldn't be this 5000 for every process? 3. Why there is "alloced 59990 bs" for the second process? Why there is "total: nonzeros=19999, allocated nonzeros=69990"? 4. Why there is 9999 on-diagonal and 0 off-diagonal nonzeroes for the second process? This is not true for my matrix. There is no information about symmetric matrices in PETSc Users Manual. I would really welcome some hints how to works with them. For example, how to effectively construct such matrices. When I have to set values only for the upper triangular part and expect approximately similar fill for every row, then, to give every process the same amount of rows (as MatGetOwnershipRange indicates) would lead to terrible load balancing. At least for the matrix construction process. Thanks, Daniel From knepley at gmail.com Fri Sep 17 06:25:28 2010 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 17 Sep 2010 06:25:28 -0500 Subject: [petsc-users] Could we tar all saved files on the machine once the computation is done? In-Reply-To: <20100917004220.g5volhvr40004ws0@cubmail.cc.columbia.edu> References: <20100917004220.g5volhvr40004ws0@cubmail.cc.columbia.edu> Message-ID: I recommend you use the tools in Python. Look at the 'tarfile' package. It is used in config/configure.py Matt On Thu, Sep 16, 2010 at 11:42 PM, Rebecca Xuefei Yuan wrote: > Dear all, > > I remembered to see some notes about taring all saved results once the > computation is done, but I was not able to find the reference. > > Any suggestions? > > Thanks a lot! 
> > Rebecca Xuefei YUAN > Department of Applied Physics and Applied Mathematics > Columbia University > Tel:917-399-8032 > www.columbia.edu/~xy2102 > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at 59A2.org Fri Sep 17 06:29:12 2010 From: jed at 59A2.org (Jed Brown) Date: Fri, 17 Sep 2010 13:29:12 +0200 Subject: [petsc-users] Storage space for symmetric (SBAIJ) matrix In-Reply-To: <4C934DDD.2010000@gmail.com> References: <4C934DDD.2010000@gmail.com> Message-ID: On Fri, Sep 17, 2010 at 13:15, Daniel Langr wrote: > Hi all, > > I do not understand much how PETSc works with symmetric matrices. I tried > some tests with a symmetric matrix, which have nonzeroes only > in the main diagonal and in the last column/row, e.g., the following > pattern: > > * 0 0 * > 0 * 0 * > 0 0 * * > * * * * > > for a 4 x 4 matrix. Since PETSc requires to set values only for the upper > triangular part of a matrix, I set two values for every row except of the > last one with only one value. My code looks like: > > MatCreate(PETSC_COMM_WORLD, &A); > MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n); > MatSetType(A, MATMPISBAIJ); > MatMPISBAIJSetPreallocation(A, 1, 1, PETSC_NULL, 1, PETSC_NULL); > MatGetOwnershipRange(A, &first, &last); > last--; > > for (i = first; i <= last; i++) { > ?MatSetValue(A, i, i, a, INSERT_VALUES); > ?if (i != (n - 1)) > ? ?MatSetValue(A, i, n - 1, a, INSERT_VALUES); > } > > MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY); > MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY); > > (The value of the variable "a" changes for every nonzero entry. I omitted > this code in the above printout together with an error checking for better > readability.) > > Resulting matrix works quite fine but when I check the matrix info (e.g. > using -mat_view_info_detailed option) for matrix size n=10000 and 2 MPI > processes I get: > > // First and last row for every MPI process > Rank: 0, Fist row: 0, Last row: 4999 > Rank: 1, Fist row: 5000, Last row: 9999 > > Matrix Object: > ?type=mpisbaij, rows=10000, cols=10000 > ?total: nonzeros=19999, allocated nonzeros=69990 > ? ?[0] Local rows 10000 nz 10000 nz alloced 10000 bs 1 mem 254496 > ? ?[0] on-diagonal part: nz 5000 > ? ?[0] off-diagonal part: nz 5000 > ? ?[1] Local rows 10000 nz 9999 nz alloced 59990 bs 1 mem 264494 > ? ?[1] on-diagonal part: nz 9999 > ? ?[1] off-diagonal part: nz 0 How were you preallocating this matrix? Jed From daniel.langr at gmail.com Fri Sep 17 06:33:15 2010 From: daniel.langr at gmail.com (Daniel Langr) Date: Fri, 17 Sep 2010 13:33:15 +0200 Subject: [petsc-users] Storage space for symmetric (SBAIJ) matrix In-Reply-To: References: <4C934DDD.2010000@gmail.com> Message-ID: <4C9351FB.7020608@gmail.com> >> Hi all, >> >> I do not understand much how PETSc works with symmetric matrices. I tried >> some tests with a symmetric matrix, which have nonzeroes only >> in the main diagonal and in the last column/row, e.g., the following >> pattern: >> >> * 0 0 * >> 0 * 0 * >> 0 0 * * >> * * * * >> >> for a 4 x 4 matrix. Since PETSc requires to set values only for the upper >> triangular part of a matrix, I set two values for every row except of the >> last one with only one value. 
My code looks like: >> >> MatCreate(PETSC_COMM_WORLD,&A); >> MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n); >> MatSetType(A, MATMPISBAIJ); >> MatMPISBAIJSetPreallocation(A, 1, 1, PETSC_NULL, 1, PETSC_NULL); >> MatGetOwnershipRange(A,&first,&last); >> last--; >> >> for (i = first; i<= last; i++) { >> MatSetValue(A, i, i, a, INSERT_VALUES); >> if (i != (n - 1)) >> MatSetValue(A, i, n - 1, a, INSERT_VALUES); >> } >> >> MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY); >> MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY); >> >> (The value of the variable "a" changes for every nonzero entry. I omitted >> this code in the above printout together with an error checking for better >> readability.) >> >> Resulting matrix works quite fine but when I check the matrix info (e.g. >> using -mat_view_info_detailed option) for matrix size n=10000 and 2 MPI >> processes I get: >> >> // First and last row for every MPI process >> Rank: 0, Fist row: 0, Last row: 4999 >> Rank: 1, Fist row: 5000, Last row: 9999 >> >> Matrix Object: >> type=mpisbaij, rows=10000, cols=10000 >> total: nonzeros=19999, allocated nonzeros=69990 >> [0] Local rows 10000 nz 10000 nz alloced 10000 bs 1 mem 254496 >> [0] on-diagonal part: nz 5000 >> [0] off-diagonal part: nz 5000 >> [1] Local rows 10000 nz 9999 nz alloced 59990 bs 1 mem 264494 >> [1] on-diagonal part: nz 9999 >> [1] off-diagonal part: nz 0 > > How were you preallocating this matrix? > > Jed There is a preallocation routine call in my printout. Again: MatMPISBAIJSetPreallocation(A, 1, 1, PETSC_NULL, 1, PETSC_NULL); Daniel From xy2102 at columbia.edu Fri Sep 17 06:33:54 2010 From: xy2102 at columbia.edu (Rebecca Xuefei Yuan) Date: Fri, 17 Sep 2010 07:33:54 -0400 Subject: [petsc-users] Could we tar all saved files on the machine once the computation is done? In-Reply-To: References: <20100917004220.g5volhvr40004ws0@cubmail.cc.columbia.edu> Message-ID: <20100917073354.qor8ynryso4gg4c0@cubmail.cc.columbia.edu> Dear Matt and Jed, I will look into that! Thanks a lot! Rebecca Quoting Matthew Knepley : > I recommend you use the tools in Python. Look at the 'tarfile' package. It > is used in config/configure.py > > Matt > > On Thu, Sep 16, 2010 at 11:42 PM, Rebecca Xuefei Yuan > wrote: > >> Dear all, >> >> I remembered to see some notes about taring all saved results once the >> computation is done, but I was not able to find the reference. >> >> Any suggestions? >> >> Thanks a lot! >> >> Rebecca Xuefei YUAN >> Department of Applied Physics and Applied Mathematics >> Columbia University >> Tel:917-399-8032 >> www.columbia.edu/~xy2102 >> >> > > > -- > What most experimenters take for granted before they begin their experiments > is infinitely more interesting than any results to which their experiments > lead. > -- Norbert Wiener > Rebecca Xuefei YUAN Department of Applied Physics and Applied Mathematics Columbia University Tel:917-399-8032 www.columbia.edu/~xy2102 From jed at 59A2.org Fri Sep 17 07:16:14 2010 From: jed at 59A2.org (Jed Brown) Date: Fri, 17 Sep 2010 14:16:14 +0200 Subject: [petsc-users] Storage space for symmetric (SBAIJ) matrix In-Reply-To: <4C9351FB.7020608@gmail.com> References: <4C934DDD.2010000@gmail.com> <4C9351FB.7020608@gmail.com> Message-ID: There is wrong with the accounting here, I'm looking into it. 
Jed From jed at 59A2.org Fri Sep 17 11:51:15 2010 From: jed at 59A2.org (Jed Brown) Date: Fri, 17 Sep 2010 18:51:15 +0200 Subject: [petsc-users] Storage space for symmetric (SBAIJ) matrix In-Reply-To: <4C934DDD.2010000@gmail.com> References: <4C934DDD.2010000@gmail.com> Message-ID: On Fri, Sep 17, 2010 at 13:15, Daniel Langr wrote: > Matrix Object: > ?type=mpisbaij, rows=10000, cols=10000 > ?total: nonzeros=19999, allocated nonzeros=69990 > ? ?[0] Local rows 10000 nz 10000 nz alloced 10000 bs 1 mem 254496 > ? ?[0] on-diagonal part: nz 5000 > ? ?[0] off-diagonal part: nz 5000 > ? ?[1] Local rows 10000 nz 9999 nz alloced 59990 bs 1 mem 264494 > ? ?[1] on-diagonal part: nz 9999 > ? ?[1] off-diagonal part: nz 0 This now with petsc-dev, see src/mat/examples/tests/ex135.c. $ mpiexec -n 2 ./ex135 -n 10000 -mat_view_info_detailed Matrix Object: type=mpisbaij, rows=10000, cols=10000 total: nonzeros=19999, allocated nonzeros=20000 total number of mallocs used during MatSetValues calls =0 [0] Local rows 5000 nz 10000 nz alloced 10000 bs 1 mem 254856 [0] on-diagonal part: nz 5000 [0] off-diagonal part: nz 5000 [1] Local rows 5000 nz 9999 nz alloced 10000 bs 1 mem 264854 [1] on-diagonal part: nz 9999 [1] off-diagonal part: nz 0 Information on VecScatter used in matrix-vector product: [0] Number sends = 0; Number to self = 0 [0] Number receives = 1; Number from self = 0 [0] 0 length 1 from whom 1 Now the indices for all remote receives (in order by process received from) [0] 0 [1] Number sends = 1; Number to self = 0 [1] 0 length = 1 to whom 0 Now the indices for all remote sends (in order by process sent to) [1] 4999 [1] Number receives = 0; Number from self = 0 > 1. My problem is with the amount of memory used. For 10000 nonzeroes of the > first process I would except memory needs for CSR storage format something > approximately like: > > nz * sizeof(PetscScalar) + nz * sizeof(PetscInt) + n_local_rows * > sizeof(PetscInt) > = 10000 * 8 + 10000 * 4 + 5000 * 4 > = 140000 bytes The dynamic assembly process involves two more arrays of length equal to the number of local rows. I don't think there is currently an interface for MPISBAIJ to avoid these arrays. > and matrix info gives 254496 bytes. Similarly for the second process. I > would understand some additional space needed because of efficiency but this > is more than 180 precent of a space really needed for storing CSR matrix, > which is quite unacceptable for large problems. Most matrices have more than two nonzeros per row in which case 2*local_rows*sizeof(PetscInt) is lost in the noise (it costs the same as one extra vector). Note that there are two additional private vectors needed for the parallel multiply (not included in the matrix "mem" field). If the matrices you are really interested in working with have structure Diagonal + Fringe, then you could define your own optimized format (e.g. only storing two vectors). > 2. Why there is "Local rows 10000"? Shouldn't be this 5000 for every > process? Yes, thanks. This is fixed now. > 3. Why there is "alloced 59990 bs" for the second process? Why there is > "total: nonzeros=19999, allocated nonzeros=69990"? You did not preallocate correctly. > 4. Why there is 9999 on-diagonal and 0 off-diagonal nonzeroes for the second > process? This is not true for my matrix. That is referring to the diagonal block, not the diagonal itself. The last process hold both the (scalar) diagonal and the fringe in the diagonal block (it's action is local). 
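For this diagonal-plus-last-column pattern, a preallocation that avoids mallocs during assembly is easy to write down. A sketch only (block size 1, error checking omitted as in the original posting, and using a uniform bound rather than exact per-row counts through the d_nnz/o_nnz array arguments): every stored row of the upper triangle holds at most the diagonal entry plus the last-column entry, and the last-column entry falls in the off-diagonal block except on the process that owns the last row.

  MatCreate(PETSC_COMM_WORLD, &A);
  MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);
  MatSetType(A, MATMPISBAIJ);
  /* 2 slots per row in the diagonal block and 1 in the off-diagonal block
     cover both cases on every process, at the cost of one unused slot per row */
  MatMPISBAIJSetPreallocation(A, 1, 2, PETSC_NULL, 1, PETSC_NULL);
  MatGetOwnershipRange(A, &first, &last);
  for (i = first; i < last; i++) {
    MatSetValue(A, i, i, a, INSERT_VALUES);
    if (i != n - 1) MatSetValue(A, i, n - 1, a, INSERT_VALUES);
  }
  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);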
> There is no information about symmetric matrices in PETSc Users Manual. I > would really welcome some hints how to works with them. For example, how to > effectively construct such matrices. When I have to set values only for the > upper triangular part and expect approximately similar fill for every row, > then, to give every process the same amount of rows (as MatGetOwnershipRange > indicates) would lead to terrible load balancing. At least for the matrix > construction process. It looks like this info is only in the man pages at the moment, we'll put it on the list to add to the manual. An example using (by default) block size 2 MPISBAIJ matrices can be found at src/snes/examples/tutorials/ex48.c. Note that the same assembly code works with AIJ, BAIJ, and SBAIJ (parallel and serial). This (finite element) example computes the upper triangular part of the element stiffness matrix (which may not be upper triangular in the global ordering) and mirrors it (cheap) before calling MatSetValuesBlockedStencil. Most sparse matrices involve some spatial locality and have the majority of entries in the diagonal block. For example, consider a structural mechanics problem with some spatial domain decomposition. Only the nodes lying on subdomain boundaries involve any entries in the off-diagonal block. Symmetric formats involve an asymmetric decision of which side "owns" these interface values, but the entries can still be produced in a symmetric manner. Symmetric formats usually have somewhat slower throughput, although the memory savings are significant. You will have to decide which is better for your application. I hope this helps. Jed From aron.ahmadia at kaust.edu.sa Sat Sep 18 19:01:17 2010 From: aron.ahmadia at kaust.edu.sa (Aron Ahmadia) Date: Sat, 18 Sep 2010 20:01:17 -0400 Subject: [petsc-users] some questions about PETSc In-Reply-To: References: Message-ID: Dear Amal, Thanks for the questions. These are great! I think they show a good fundamental approach, you are thinking about these problems like a PETSc scientist would. I am going to cc petsc-users on the reply in case anybody wants to add or comment: *How to set up a DA for multiple equations in petsc4py or PETSc? (for q)* The PETSc terminology for the number of equations active at a given node is 'degrees of freedom', which as far as I can tell, is a term borrowed from finite element analysis of structures, originally referring (in mechanics) to the number of potential displacements or rotations that specify an element in the system, but in computational science as a way to describe the total number of equations in the discretized system (i.e., the number of elements in the right-hand side vector). After you have called DACreate, you call DASetDof ( http://www.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-dev/docs/manualpages/DA/DASetDof.html) to set the number of degrees of freedom per vertex. *How to deal with boundary conditions where you have special treatment for them? In an example we have seen that is done by simply if statements that checks if this point is a boundary condition cell, should we do it this way. * As far as I know, this is the PETSc way, so you do not need to do anything differently :o) Later on, when we are tuning for performance, we might extract the boundary-handling code from the main for loop, and create index sets to represent each of the sections of the array we would like to iterate over. If we profile the code and this is a non-significant section, then we just leave it. 
*When we start incorporating petsc4py structures in clawpack, output is being duplicate for each process, is the way to avoid this is by (if statements) to determine that everything need to be done once is the responsibility of process 0 for instance, or there are other ways to do it?* PetscPrintf ( http://www.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-dev/docs/manualpages/Sys/PetscPrintf.html) will do what you want. Hope this helps, Aron On Sat, Sep 18, 2010 at 3:06 AM, Amal Alghamdi wrote: > Dear Dr. Aron and Dr. Matt, > > I had a discussion with Dr. David where we had some questions about PETSc. > We would appriciate if you can help in this regard. > > - How to set up a DA for multiple equations in petsc4py or PETSc? (for q) > - How to deal with boundary conditions where you have special treatment for > them? and should they be included in the main vector in the DA explicitly. > In an example we have seen that is done by simply if statements that checks > if this point is a boundary condition cell, should we do it this way. > - When we start incorporating petsc4py structures in clawpack, output is > being duplicate for each process, is the way to avoid this is by (if > statements) to determine that everything need to be done once is the > responsibility of process 0 for instance, or there are other ways to do it? > > Thank you, > Amal > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From dalcinl at gmail.com Sat Sep 18 19:55:02 2010 From: dalcinl at gmail.com (Lisandro Dalcin) Date: Sat, 18 Sep 2010 21:55:02 -0300 Subject: [petsc-users] some questions about PETSc In-Reply-To: References: Message-ID: On 18 September 2010 21:01, Aron Ahmadia wrote: > Dear Amal, > Thanks for the questions. ?These are great! ?I think they show a good > fundamental approach, you are thinking about these problems like a PETSc > scientist would. ?I am going to cc petsc-users on the reply in case anybody > wants to add or comment: > How to set up a DA for multiple equations in petsc4py or PETSc? (for q) > The PETSc terminology for the number of equations active at a given node is > 'degrees of freedom', which as far as I can tell, is a term borrowed from > finite element analysis of structures, originally referring (in mechanics) > to the number of potential displacements or rotations that specify an > element in the system, but in computational science as a way to describe the > total number of equations in the discretized system (i.e., the number of > elements in the right-hand side vector). ?After you have called DACreate, > you call DASetDof > (http://www.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-dev/docs/manualpages/DA/DASetDof.html) > to set the number of degrees of freedom per vertex. In petsc4py, you pass "dof" as a keyword argument to create() method: da = PETSc.DA().create((Nx, Ny, Nz), dof=4, ...) > How to deal with boundary conditions where you have special treatment for > them? ?In an example we have seen that is done by simply if statements that > checks if this point is a boundary condition cell, should we do it this way. > As far as I know, this is the PETSc way, so you do not need to do anything > differently :o) ?Later on, when we are tuning for performance, we might > extract the boundary-handling code from the main for loop, and create index > sets to represent each of the sections of the array we would like to iterate > over. ?If we profile the code and this is a non-significant section, then we > just leave it. 
And with petsc4py running on parallel, this is a mess to implement. I still have to found a way to emulate DA Vec arrays, as NumPy arrays does not support lower bounds in dimensions. > When we start incorporating petsc4py structures in clawpack, output is being > duplicate for each process, is the way to avoid this is by (if statements) > to determine that everything need to be done once is the responsibility of > process 0 for instance, or there are other ways to do it? > PetscPrintf > (http://www.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-dev/docs/manualpages/Sys/PetscPrintf.html) > will do what you want. In petsc4py, you can use PETSc.Sys.Print (also look at PETSc.Sys.syncPrint and PETSc.Sys.syncFlush). PETSc.Sys.Print (and syncPrint) should be used like print function Python 3, but it interprets an extra 'comm' kw argument. -- Lisandro Dalcin --------------- CIMEC (INTEC/CONICET-UNL) Predio CONICET-Santa Fe Colectora RN 168 Km 472, Paraje El Pozo Tel: +54-342-4511594 (ext 1011) Tel/Fax: +54-342-4511169 From aron.ahmadia at kaust.edu.sa Sat Sep 18 22:12:39 2010 From: aron.ahmadia at kaust.edu.sa (Aron Ahmadia) Date: Sat, 18 Sep 2010 23:12:39 -0400 Subject: [petsc-users] some questions about PETSc In-Reply-To: References: Message-ID: Thanks for the extra notes Lisandro, I've migrated some of the results of this discussion to the petclaw development wiki here: https://bitbucket.org/knepley/petclaw/wiki/How_do_I A On Sat, Sep 18, 2010 at 8:55 PM, Lisandro Dalcin wrote: > On 18 September 2010 21:01, Aron Ahmadia > wrote: > > Dear Amal, > > Thanks for the questions. These are great! I think they show a good > > fundamental approach, you are thinking about these problems like a PETSc > > scientist would. I am going to cc petsc-users on the reply in case > anybody > > wants to add or comment: > > How to set up a DA for multiple equations in petsc4py or PETSc? (for q) > > The PETSc terminology for the number of equations active at a given node > is > > 'degrees of freedom', which as far as I can tell, is a term borrowed from > > finite element analysis of structures, originally referring (in > mechanics) > > to the number of potential displacements or rotations that specify an > > element in the system, but in computational science as a way to describe > the > > total number of equations in the discretized system (i.e., the number of > > elements in the right-hand side vector). After you have called DACreate, > > you call DASetDof > > ( > http://www.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-dev/docs/manualpages/DA/DASetDof.html > ) > > to set the number of degrees of freedom per vertex. > > In petsc4py, you pass "dof" as a keyword argument to create() method: > > da = PETSc.DA().create((Nx, Ny, Nz), dof=4, ...) > > > > How to deal with boundary conditions where you have special treatment for > > them? In an example we have seen that is done by simply if statements > that > > checks if this point is a boundary condition cell, should we do it this > way. > > As far as I know, this is the PETSc way, so you do not need to do > anything > > differently :o) Later on, when we are tuning for performance, we might > > extract the boundary-handling code from the main for loop, and create > index > > sets to represent each of the sections of the array we would like to > iterate > > over. If we profile the code and this is a non-significant section, then > we > > just leave it. > > And with petsc4py running on parallel, this is a mess to implement. 
I > still have to found a way to emulate DA Vec arrays, as NumPy arrays > does not support lower bounds in dimensions. > > > When we start incorporating petsc4py structures in clawpack, output is > being > > duplicate for each process, is the way to avoid this is by (if > statements) > > to determine that everything need to be done once is the responsibility > of > > process 0 for instance, or there are other ways to do it? > > PetscPrintf > > ( > http://www.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-dev/docs/manualpages/Sys/PetscPrintf.html > ) > > will do what you want. > > In petsc4py, you can use PETSc.Sys.Print (also look at > PETSc.Sys.syncPrint and PETSc.Sys.syncFlush). PETSc.Sys.Print (and > syncPrint) should be used like print function Python 3, but it > interprets an extra 'comm' kw argument. > > > > -- > Lisandro Dalcin > --------------- > CIMEC (INTEC/CONICET-UNL) > Predio CONICET-Santa Fe > Colectora RN 168 Km 472, Paraje El Pozo > Tel: +54-342-4511594 (ext 1011) > Tel/Fax: +54-342-4511169 > -------------- next part -------------- An HTML attachment was scrubbed... URL: From vedaprakashsubramanian at gmail.com Sun Sep 19 20:05:48 2010 From: vedaprakashsubramanian at gmail.com (vedaprakash subramanian) Date: Sun, 19 Sep 2010 19:05:48 -0600 Subject: [petsc-users] How to pass the parameters for KSP Message-ID: I am converting a MATLAB function into a KSP solver. I am doing it similar to BiCGStab. But I wanted to know how to pass the arguments of the function into KSP solver. Thanks, Vedaprakash -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Sun Sep 19 21:49:47 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Sun, 19 Sep 2010 21:49:47 -0500 Subject: [petsc-users] How to pass the parameters for KSP In-Reply-To: References: Message-ID: <27B820EF-3A6A-424B-8A43-2A43E82089B3@mcs.anl.gov> What arguments? Do you mean the right hand side x and the matrix? Or do you mean parameters like the relative tolerance in convergence? See src/ksp/ksp/examples/tutorials/ex1.c for a simple example. Barry On Sep 19, 2010, at 8:05 PM, vedaprakash subramanian wrote: > I am converting a MATLAB function into a KSP solver. I am doing it similar to BiCGStab. But I wanted to know how to pass the arguments of the function into KSP solver. > > Thanks, > Vedaprakash From labarba at bu.edu Mon Sep 20 06:49:57 2010 From: labarba at bu.edu (Lorena Barba) Date: Mon, 20 Sep 2010 07:49:57 -0400 Subject: [petsc-users] Advanced studies institute in Chile Message-ID: Dear PETSc friends, Please note the announcement below, and distribute to interested parties. Many thanks, Lorena Barba Boston University ================== "Pan-American Advanced Studies Institute ? Scientific Computing in the Americas: the challenge of massive parallelism" (3-14 Jan. 2011) http://www.bu.edu/pasi/ Thanks to NSF and DOE funding, there is travel support for up to 30 graduate students and postdoctoral fellows to attend, from the US and the rest of the Americas. Apply online to be a part of this exciting educational event! 
Keynote lectures by: David Keyes, Columbia University and KAUST Takayuki Aoki, Tokyo Institute of Technology Satoshi Matsuoka, Tokyo Institute of Technology Guest courses by: Tsuyoshi Hamada, Nagasaki Advanced Computing Center (NACC) and Felipe A Cruz, NACC Lecturers (confirmed): Lorena Barba, Boston University Jaydeep Bardhan, Rush University Medical Center Nathan Bell, NVIDIA Richard Brower, Boston University Luis Miguel de la Cruz, Universidad Nacional Aut?noma de M?xico lisandro Dalcin, Centro Internacional de M?todos Computacionales en Ingenier?a, Argentina Susana G?mez, Universidad Nacional Aut?noma de M?xixo Andreas Klockner, New York University Matthew Knepley, University of Chicago Marc Spiegelman, Columbia University Rio Yokota, Boston Universiy To be confirmed: Wen-mei Hwu, University of Illinois -------------- next part -------------- An HTML attachment was scrubbed... URL: From mafunk at nmsu.edu Mon Sep 20 12:24:20 2010 From: mafunk at nmsu.edu (Matt Funk) Date: Mon, 20 Sep 2010 11:24:20 -0600 Subject: [petsc-users] using superlu_dist In-Reply-To: <527FD798-5AE4-4B22-A4F1-60F9833043B2@mcs.anl.gov> References: <201009101459.47077.mafunk@nmsu.edu> <201009101543.26151.mafunk@nmsu.edu> <527FD798-5AE4-4B22-A4F1-60F9833043B2@mcs.anl.gov> Message-ID: <201009201124.20536.mafunk@nmsu.edu> Hi Barry, thanks for the advice. I will do what you suggested. matt On Friday, September 10, 2010, Barry Smith wrote: > Just always use SUPERLU_DIST for any number of processes including 1. It > is much faster and uses less memory the superlu. > > Barry > > You should only use superlu when the matrix is super ill-conditioned like > condition number 10^10 and superlu_dist don't work > > On Sep 10, 2010, at 4:43 PM, Matt Funk wrote: > > HI Barry, > > > > thanks for the heads up, however, i am not using the command line. > > So what i did is when i set my pc i do: > > > > if(m_preCondType == "LU_SUPERLU") { > > > > m_ierr = PCSetType(m_pc, PCLU); > > if (numProc() > 1) { > > > > PCFactorSetMatSolverPackage(m_pc,MAT_SOLVER_SUPERLU_DIST); > > > > } > > > > else { > > > > PCFactorSetMatSolverPackage(m_pc,MAT_SOLVER_SUPERLU); > > > > } > > > > } > > > > and the matrix type is set to MATAIJ. Anyway, but i still need to > > distingiush between superlu and superlu_dist it seems as specifying > > superlu for a parallel run throws an error. > > > > I suppose that when invoking this from the command line there is some > > code that test to see whether this is a serial/parallel run and makes > > similar calls as i did above? > > > > > > thank you > > matt > > > > On Friday, September 10, 2010, Barry Smith wrote: > >> This has all changed in the 3.0.0 release. It is much simpler now. > >> > >> Any ways you don't need that crap for differences between 1 or more > >> > >> processors. Just use MATAIJ always and use -pc_type lu > >> -pc_factor_mat_solver_package superlu_dist with 3.0.0 or later > >> > >> Barry > >> > >> On Sep 10, 2010, at 3:59 PM, Matt Funk wrote: > >>> Hi, > >>> > >>> i was wondering on how i need to set the matrix type when i want to use > >>> the superlu_dist solver. > >>> > >>> Right now what i have is: > >>> if (m_preCondType == "LU_SUPERLU") { > >>> > >>> if (numProc() > 1) > >>> > >>> m_ierr = MatSetType(m_globalMatrix, MATAIJ); > >>> > >>> else { > >>> > >>> m_ierr = MatSetType(m_globalMatrix, MATSEQAIJ); > >>> > >>> } > >>> > >>> } > >>> > >>> This i believe is according to the table in the petsc users manual > >>> (p.82). Anyway, things work ok on 1 processor. 
However, when i try 8 > >>> processors (i.e. it tells me: > >>> [3]PETSC ERROR: --------------------- Error Message > >>> ------------------------------------ > >>> [3]PETSC ERROR: No support for this operation for this object type! > >>> [3]PETSC ERROR: Matrix format mpiaij does not have a built-in PETSc > >>> direct solver! > >>> > >>> > >>> So i guess i should not use the MATAIJ matrix format? I also tried the > >>> MATMPIAIJ format, but got the same problem. > >>> > >>> So how is one supposed to use it? Obviously i am doing something wrong. > >>> Any help is appreciated. > >>> > >>> thanks > >>> matt From u.tabak at tudelft.nl Mon Sep 20 13:17:03 2010 From: u.tabak at tudelft.nl (Umut Tabak) Date: Mon, 20 Sep 2010 20:17:03 +0200 Subject: [petsc-users] Advanced studies institute in Chile In-Reply-To: References: Message-ID: <4C97A51F.8060604@tudelft.nl> Lorena Barba wrote: > > Thanks to NSF and DOE funding, there is travel support for up to 30 > graduate students and postdoctoral fellows to attend, from the US and > the rest of the Americas. Too bad that it is restricted to 'Americas' :(, why not include Europe as well... From lvankampenhout at gmail.com Tue Sep 21 03:41:43 2010 From: lvankampenhout at gmail.com (Leo van Kampenhout) Date: Tue, 21 Sep 2010 10:41:43 +0200 Subject: [petsc-users] profiling question Message-ID: Dear all, in order to calculate speedup (Sp = T1/Tp) I need an accurate measurement of T1, the time to solve on 1 processor. I will be using the parallel algorithm for that, but there seems to be a hick-up. At the cluster I am currently working on, each node is made up by 12 PEs and have shared memory. When I would just reserve 1 PE for my job, the other 11 processors are given to other users, therefore giving dynamic load on the memory system resulting into inaccurate timings. The solve-times I get are ranging between 1 and 5 minutes. For me, this is not very scientific either. The second idea was to reserve all 12 PEs on the node and just let 1 PE run the job. However, in this way the single CPU gets all the memory bandwidth and has no waiting time, therefore giving very fast results. When I would calculate speedup from here, the algorithm does not scale very well. Another idea would be to spawn 12 identical jobs on 12 PEs and take the average runtime. Unfortunately, there is only one PETSC_COMM_WORLD, so I think this is impossible to do from within one program (MPI_COMM_WORLD). Do you fellow PETSc-users have any ideas on the subject? It would be much appreciated. regards, Leo van Kampenhout -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at 59A2.org Tue Sep 21 04:36:16 2010 From: jed at 59A2.org (Jed Brown) Date: Tue, 21 Sep 2010 11:36:16 +0200 Subject: [petsc-users] profiling question In-Reply-To: References: Message-ID: On Tue, Sep 21, 2010 at 10:41, Leo van Kampenhout wrote: > At the cluster I am currently working on, each node is made up by 12 PEs and > have shared memory. When I would just reserve 1 PE for my job, the other 11 > processors are given to other users, therefore giving dynamic load on the > memory system resulting into inaccurate timings. The solve-times I get are > ranging between 1 and 5 minutes. For me, this is not very scientific either. First, shared memory and especially NUMA architectures are very difficult to draw meaningful intra-node scalability conclusions on. 
If at all possible, try to compare inter-node scalability instead since it is a far more reliable estimate and less architecture-dependent (provided the network is decent). That said, you should be looking for reproducibility much more than "good" scaling. It's well known that intra-node memory contention is a major issue, the STREAM benchmarks actually show _lower_ total bandwidth when running on all 6 cores per socket with Istanbul than when using only 4 (and 2 cores is within a few percent). > The second idea was to reserve all 12 PEs on the node and just let 1 PE run > the job. However, in this way the single CPU gets all the memory bandwidth > and has no waiting time, therefore giving very fast results. When I would > calculate speedup from here, the algorithm does not scale very well. I say just do this and live with the poor intra-node scaling numbers. Some architectures actually scale memory within the node (e.g. BlueGene), but most don't. People expect to see the memory bottleneck in these results, it's nothing to be ashamed of. > Another idea would be to spawn 12 identical jobs on 12 PEs and take the > average runtime. Unfortunately, there is only one PETSC_COMM_WORLD, so I > think this is impossible to do from within one program (MPI_COMM_WORLD). You could split MPI_COMM_WORLD and run a separate PETSC_COMM_WORLD on each group, but I think this option will not be reproducible (the instances will slightly out of sync, so memory and communication bottlenecks will be loaded in different ways on subsequent runs) and is a bit disingenuous because this is not a configuration that you would ever run in practice. Jed From lvankampenhout at gmail.com Tue Sep 21 06:49:42 2010 From: lvankampenhout at gmail.com (Leo van Kampenhout) Date: Tue, 21 Sep 2010 13:49:42 +0200 Subject: [petsc-users] profiling question In-Reply-To: References: Message-ID: Thanks for the helpful response Jed. I was not aware of the possibility to run seperate PETSC_COMM_WORLDS in the same program, at least this is not clear from the documentation (e.g. http://www.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-dev/docs/manualpages/Sys/PetscInitialize.html) I'll probably still try this out just out of curiosity. About presenting scaling results, the most appealing to me seems to show two graphs, one with intra-node scaling (1-12) and the other going upwards from there (12, 24, 36, ...) Leo 2010/9/21 Jed Brown > On Tue, Sep 21, 2010 at 10:41, Leo van Kampenhout > wrote: > > At the cluster I am currently working on, each node is made up by 12 PEs > and > > have shared memory. When I would just reserve 1 PE for my job, the other > 11 > > processors are given to other users, therefore giving dynamic load on the > > memory system resulting into inaccurate timings. The solve-times I get > are > > ranging between 1 and 5 minutes. For me, this is not very scientific > either. > > First, shared memory and especially NUMA architectures are very > difficult to draw meaningful intra-node scalability conclusions on. > If at all possible, try to compare inter-node scalability instead > since it is a far more reliable estimate and less > architecture-dependent (provided the network is decent). That said, > you should be looking for reproducibility much more than "good" > scaling. It's well known that intra-node memory contention is a major > issue, the STREAM benchmarks actually show _lower_ total bandwidth > when running on all 6 cores per socket with Istanbul than when using > only 4 (and 2 cores is within a few percent). 
> > > The second idea was to reserve all 12 PEs on the node and just let 1 PE > run > > the job. However, in this way the single CPU gets all the memory > bandwidth > > and has no waiting time, therefore giving very fast results. When I would > > calculate speedup from here, the algorithm does not scale very well. > > I say just do this and live with the poor intra-node scaling numbers. > Some architectures actually scale memory within the node (e.g. > BlueGene), but most don't. People expect to see the memory bottleneck > in these results, it's nothing to be ashamed of. > > > Another idea would be to spawn 12 identical jobs on 12 PEs and take the > > average runtime. Unfortunately, there is only one PETSC_COMM_WORLD, so I > > think this is impossible to do from within one program (MPI_COMM_WORLD). > > You could split MPI_COMM_WORLD and run a separate PETSC_COMM_WORLD on > each group, but I think this option will not be reproducible (the > instances will slightly out of sync, so memory and communication > bottlenecks will be loaded in different ways on subsequent runs) and > is a bit disingenuous because this is not a configuration that you > would ever run in practice. > > Jed > -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at 59A2.org Tue Sep 21 07:19:31 2010 From: jed at 59A2.org (Jed Brown) Date: Tue, 21 Sep 2010 14:19:31 +0200 Subject: [petsc-users] profiling question In-Reply-To: References: Message-ID: On Tue, Sep 21, 2010 at 13:49, Leo van Kampenhout wrote: > Thanks for the helpful response Jed. I was not aware of the possibility to > run seperate PETSC_COMM_WORLDS in the same program,? at least this is not > clear from the documentation This is really an MPI thing, you are using a different PETSC_COMM_WORLD in different processes. The way it works is that you call MPI_Init, then MPI_Comm_split or whatever to create new communicators for each group of processes, then PETSC_COMM_WORLD = new_comm; PetscInitialize(...); The instances of PETSc with different PETSC_COMM_WORLDs are entirely independent. > About presenting scaling results, the most appealing to me seems to show two > graphs, one with intra-node scaling (1-12) and the other going upwards from > there (12, 24, 36, ...) Yes, this is good. Jed From daniel.langr at gmail.com Tue Sep 21 09:20:54 2010 From: daniel.langr at gmail.com (Daniel Langr) Date: Tue, 21 Sep 2010 16:20:54 +0200 Subject: [petsc-users] Storage space for symmetric (SBAIJ) matrix In-Reply-To: References: <4C934DDD.2010000@gmail.com> Message-ID: <4C98BF46.1090009@gmail.com> Jed, thanks much for your comprehensive answer, it will certainly help. I will look at the example codes. As for matrix assembly process, I would prefer constructing a matrix from arrays (to avoid dynamic assembly and additional memory costs) but there is nothing like MatCreateMPISBAIJWithArrays() or better MatCreateMPISBAIJWithSplitArrays() for symmetric matrices as for unsymmetric ones in PETSc. Regards, Daniel From jed at 59A2.org Tue Sep 21 09:24:06 2010 From: jed at 59A2.org (Jed Brown) Date: Tue, 21 Sep 2010 16:24:06 +0200 Subject: [petsc-users] Storage space for symmetric (SBAIJ) matrix In-Reply-To: <4C98BF46.1090009@gmail.com> References: <4C934DDD.2010000@gmail.com> <4C98BF46.1090009@gmail.com> Message-ID: On Tue, Sep 21, 2010 at 16:20, Daniel Langr wrote: > thanks much for your comprehensive answer, it will certainly help. I will > look at the example codes. 
As for matrix assembly process, I would prefer > constructing a matrix from arrays (to avoid dynamic assembly and additional > memory costs) but there is nothing like MatCreateMPISBAIJWithArrays() or > better MatCreateMPISBAIJWithSplitArrays() for symmetric matrices as for > unsymmetric ones in PETSc. This would be easy to add, but how would you go about building the arrays yourself? What sort of problems are you solving? Jed From kenway at utias.utoronto.ca Tue Sep 21 09:30:21 2010 From: kenway at utias.utoronto.ca (Gaetan Kenway) Date: Tue, 21 Sep 2010 10:30:21 -0400 Subject: [petsc-users] MatView with relatively large matrix in Parallel Message-ID: <4C98C17D.2050706@utias.utoronto.ca> Hello I am a PETSc user and have run into a problem using MatView. I am trying to output a matrix to file so I can load it instead of computing it for faster debugging. The matrix I'm trying to output is drdwt. It is a parallel block aij matrix with block size of 5. The matrix is assembled and the following code works when I run it in serial: call PetscViewerBinaryOpen(sumb_petsc_comm_world,drdw_name,FILE_MODE_WRITE,bin_viewer,ierr) call MatView(drdwt,bin_viewer,ierr) call PetscViewerDestroy(bin_viewer,ierr) The matrix size is approximately 300k by 300k and I get an output file that is approximately 245MB in size which is expected. However, when I run the same code in parallel on 3 processors it hangs at the MatView call until I am forced to kill the processes. I've let it go for 20 minutes with no sign of stopping. I am not sure what is causing this. I'm using openmpi-1.4.1 and petsc3.1 on 32 bit Ubuntu 10.10. Thank you, Gaetan Kenway From daniel.langr at gmail.com Tue Sep 21 09:35:13 2010 From: daniel.langr at gmail.com (Daniel Langr) Date: Tue, 21 Sep 2010 16:35:13 +0200 Subject: [petsc-users] Storage space for symmetric (SBAIJ) matrix In-Reply-To: References: <4C934DDD.2010000@gmail.com> <4C98BF46.1090009@gmail.com> Message-ID: <4C98C2A1.7030507@gmail.com> Our preliminary idea is to construct a matrix with some legacy code, store matrix into a file (as we would do anyway for checkpointing purposes) and then load it into a solver. We are free to choose matrix storage scheme for a file, so we could prepare data to be in the format of arrays to be loaded into PETSc. For binary I/O we are experimenting with parallel HDF5 capabilities using MPI-I/O underneath. (PETSc has a HDF5 viewer, but if I am not wrong, it does not use parallel I/O). For really big problems parallel I/O is a must for us. We are solving a nuclear structure problem, particularly a symmetry-adapted no-core shell model computations of a nuclei. (I do not understand much that kind of physics, my part is the eigensolver :). Daniel Dne 21.9.2010 16:24, Jed Brown napsal(a): > On Tue, Sep 21, 2010 at 16:20, Daniel Langr wrote: >> thanks much for your comprehensive answer, it will certainly help. I will >> look at the example codes. As for matrix assembly process, I would prefer >> constructing a matrix from arrays (to avoid dynamic assembly and additional >> memory costs) but there is nothing like MatCreateMPISBAIJWithArrays() or >> better MatCreateMPISBAIJWithSplitArrays() for symmetric matrices as for >> unsymmetric ones in PETSc. > > This would be easy to add, but how would you go about building the > arrays yourself? What sort of problems are you solving? 
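For reference, a minimal sketch of the assembly route being discussed: preallocate an SBAIJ matrix and fill only its upper triangle with MatSetValue(). The size N, block size 1, and the tridiagonal pattern are illustrative assumptions (petsc-3.1-style calls), not the actual Hamiltonian structure from this thread:

  Mat            A;
  PetscInt       i,rstart,rend,N = 1000;
  PetscErrorCode ierr;

  ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr);
  ierr = MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,N,N);CHKERRQ(ierr);
  ierr = MatSetType(A,MATSBAIJ);CHKERRQ(ierr);
  /* at most 2 stored (upper-triangular) entries per row: the diagonal and one superdiagonal */
  ierr = MatSeqSBAIJSetPreallocation(A,1,2,PETSC_NULL);CHKERRQ(ierr);
  ierr = MatMPISBAIJSetPreallocation(A,1,2,PETSC_NULL,1,PETSC_NULL);CHKERRQ(ierr);
  ierr = MatGetOwnershipRange(A,&rstart,&rend);CHKERRQ(ierr);
  for (i=rstart; i<rend; i++) {
    ierr = MatSetValue(A,i,i,2.0,INSERT_VALUES);CHKERRQ(ierr);
    if (i+1 < N) {ierr = MatSetValue(A,i,i+1,-1.0,INSERT_VALUES);CHKERRQ(ierr);}
  }
  ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

With correct preallocation, the only memory overhead relative to a from-arrays constructor is the small per-row bookkeeping referred to later in this thread.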
> > Jed From jed at 59A2.org Tue Sep 21 09:44:56 2010 From: jed at 59A2.org (Jed Brown) Date: Tue, 21 Sep 2010 16:44:56 +0200 Subject: [petsc-users] Storage space for symmetric (SBAIJ) matrix In-Reply-To: <4C98C2A1.7030507@gmail.com> References: <4C934DDD.2010000@gmail.com> <4C98BF46.1090009@gmail.com> <4C98C2A1.7030507@gmail.com> Message-ID: On Tue, Sep 21, 2010 at 16:35, Daniel Langr wrote: > Our preliminary idea is to construct a matrix with some legacy code, store > matrix into a file (as we would do anyway for checkpointing purposes) and > then load it into a solver. We are free to choose matrix storage scheme for > a file, so we could prepare data to be in the format of arrays to be loaded > into PETSc. For binary I/O we are experimenting with parallel HDF5 > capabilities using MPI-I/O underneath. (PETSc has a HDF5 viewer, but if I am > not wrong, it does not use parallel I/O). For really big problems parallel > I/O is a must for us. PETSc's HDF5 viewer does not currently write matrices. Any overhead in PETSc will be dwarfed by the cost of writing to the file (even using MPI-IO). I recommend against writing the actual matrix to a file, just write the state (much less data, but an equivalent amount of information because it's enough to produce the matrix). Also, if you can choose the file format, then just build an SBAIJ in memory, VecView it if necessary (this uses collective MPI-IO), and use the SBAIJ. > We are solving a nuclear structure problem, particularly a symmetry-adapted > no-core shell model computations of a nuclei. (I do not understand much that > kind of physics, my part is the eigensolver :). These problems usually have a few nonzeros per row, so don't worry about the two auxiliary arrays for dynamic assembly, they will be trivially small compared to the rest. Jed From jed at 59A2.org Tue Sep 21 09:51:46 2010 From: jed at 59A2.org (Jed Brown) Date: Tue, 21 Sep 2010 16:51:46 +0200 Subject: [petsc-users] MatView with relatively large matrix in Parallel In-Reply-To: <4C98C17D.2050706@utias.utoronto.ca> References: <4C98C17D.2050706@utias.utoronto.ca> Message-ID: On Tue, Sep 21, 2010 at 16:30, Gaetan Kenway wrote: > I am trying to output a matrix to file so I can load it instead of computing > it for faster debugging. It is almost certainly faster to assemble it using the physics module than to read it from file. > The matrix I'm trying to output is drdwt. It is a > parallel block aij matrix with block size of 5. The matrix is assembled and > the following code works when I run it in serial: Are you using a DA? If so, then try setting the format to PETSC_VIEWER_NATIVE. This will use the "PETSc" ordering instead of transforming the system to the "natural" ordering which is an expensive operation. Note that the matrix will have a different ordering on a different number of processors if you do this. > I've let it go for 20 minutes with no sign of stopping. Is the system swapping? Have you tried attaching a debugger to see where it was hung ('pgrep yourapp' to see process IDs, then gdb -pid , backtrace)? 
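For reference, a minimal petsc-3.1-style C sketch of the native-format variant suggested just above; the file name is made up, and drdwt stands for the already assembled matrix from the original post:

  PetscViewer    viewer;
  PetscErrorCode ierr;

  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"drdwt.bin",FILE_MODE_WRITE,&viewer);CHKERRQ(ierr);
  ierr = PetscViewerSetFormat(viewer,PETSC_VIEWER_NATIVE);CHKERRQ(ierr);  /* keep the per-process PETSc ordering */
  ierr = MatView(drdwt,viewer);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(viewer);CHKERRQ(ierr);

As noted above, a file written this way is tied to the ordering of that particular run, so it should be read back with the same number of processes and the same layout.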
Jed From bsmith at mcs.anl.gov Tue Sep 21 10:58:31 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Tue, 21 Sep 2010 10:58:31 -0500 Subject: [petsc-users] MatView with relatively large matrix in Parallel In-Reply-To: <4C98C17D.2050706@utias.utoronto.ca> References: <4C98C17D.2050706@utias.utoronto.ca> Message-ID: There is currently no fast binary view for parallel BAIJ matrices (when we wrote this stuff originally we didn't think people would ever be saving big sparse matrices). You can do MatConvert() to MPIAIJ and then MatView(). Don't worry the conversion won't be a very noticable time sink. Barry On Sep 21, 2010, at 9:30 AM, Gaetan Kenway wrote: > Hello > > I am a PETSc user and have run into a problem using MatView. I am trying to output a matrix to file so I can load it instead of computing it for faster debugging. The matrix I'm trying to output is drdwt. It is a parallel block aij matrix with block size of 5. The matrix is assembled and the following code works when I run it in serial: > > call PetscViewerBinaryOpen(sumb_petsc_comm_world,drdw_name,FILE_MODE_WRITE,bin_viewer,ierr) > call MatView(drdwt,bin_viewer,ierr) > call PetscViewerDestroy(bin_viewer,ierr) > > The matrix size is approximately 300k by 300k and I get an output file that is approximately 245MB in size which is expected. However, when I run the same code in parallel on 3 processors it hangs at the MatView call until I am forced to kill the processes. I've let it go for 20 minutes with no sign of stopping. > > I am not sure what is causing this. I'm using openmpi-1.4.1 and petsc3.1 on 32 bit Ubuntu 10.10. > > Thank you, > > Gaetan Kenway > > From bsmith at mcs.anl.gov Tue Sep 21 11:01:58 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Tue, 21 Sep 2010 11:01:58 -0500 Subject: [petsc-users] Storage space for symmetric (SBAIJ) matrix In-Reply-To: <4C98C2A1.7030507@gmail.com> References: <4C934DDD.2010000@gmail.com> <4C98BF46.1090009@gmail.com> <4C98C2A1.7030507@gmail.com> Message-ID: <123A62E1-A8EB-41CF-A387-5DB46C9FB71F@mcs.anl.gov> On Sep 21, 2010, at 9:35 AM, Daniel Langr wrote: > Our preliminary idea is to construct a matrix with some legacy code, store matrix into a file (as we would do anyway for checkpointing purposes) Is this legacy code parallel? If not when you could only create SeqSBAIJ matrices anyways, correct? So you don't need parallel create from arrays just sequential from arrays? All the MatView() and MatLoad() would be handled by PETSc so you don't need to worry about reading in the matrix in parallel (we do it for you). Barry > and then load it into a solver. We are free to choose matrix storage scheme for a file, so we could prepare data to be in the format of arrays to be loaded into PETSc. For binary I/O we are experimenting with parallel HDF5 capabilities using MPI-I/O underneath. (PETSc has a HDF5 viewer, but if I am not wrong, it does not use parallel I/O). For really big problems parallel I/O is a must for us. > > We are solving a nuclear structure problem, particularly a symmetry-adapted no-core shell model computations of a nuclei. (I do not understand much that kind of physics, my part is the eigensolver :). > > Daniel > > > > > Dne 21.9.2010 16:24, Jed Brown napsal(a): >> On Tue, Sep 21, 2010 at 16:20, Daniel Langr wrote: >>> thanks much for your comprehensive answer, it will certainly help. I will >>> look at the example codes. 
As for matrix assembly process, I would prefer >>> constructing a matrix from arrays (to avoid dynamic assembly and additional >>> memory costs) but there is nothing like MatCreateMPISBAIJWithArrays() or >>> better MatCreateMPISBAIJWithSplitArrays() for symmetric matrices as for >>> unsymmetric ones in PETSc. >> >> This would be easy to add, but how would you go about building the >> arrays yourself? What sort of problems are you solving? >> >> Jed > From daniel.langr at gmail.com Tue Sep 21 13:44:12 2010 From: daniel.langr at gmail.com (Daniel Langr) Date: Tue, 21 Sep 2010 20:44:12 +0200 Subject: [petsc-users] Storage space for symmetric (SBAIJ) matrix In-Reply-To: <123A62E1-A8EB-41CF-A387-5DB46C9FB71F@mcs.anl.gov> References: <4C934DDD.2010000@gmail.com> <4C98BF46.1090009@gmail.com> <4C98C2A1.7030507@gmail.com> <123A62E1-A8EB-41CF-A387-5DB46C9FB71F@mcs.anl.gov> Message-ID: <4C98FCFC.4030608@gmail.com> Barry, we do not need and do not want to use PETSc for writing a matrix into a file. Such file should be independent of any particular solver. That's why we want do use HDF5 library with parallel I/O capabilities. I can simply store CSR (or COO or any other scheme) arrays for the upper triangular part of a matrix to the file and some supporting information such as number of rows and nonzeroes. Then, if I want to solve a problem with PETSc/SLEPc, I need to effectively load the matrix into it. Supposing hundreds or thousands of nodes (maybe not the same as for matrix construction procedure) the parallel I/O again would be essential. Anyway, the matrix data is supposed to be much bigger than the memory of one node. We (or our physics) need to exploit all the memory available and there exist no upper bounds for them :). As Jed mentioned, we can write only the state to the file. But parallel I/O is also part of our project and research, that's why we bother :). Also, when we want to compare different methods for solution, we would need to construct the matrix multiple times instead of just read it from the file, which can by quicker. Daniel > On Sep 21, 2010, at 9:35 AM, Daniel Langr wrote: > >> Our preliminary idea is to construct a matrix with some legacy code, store matrix into a file (as we would do anyway for checkpointing purposes) > > Is this legacy code parallel? If not when you could only create SeqSBAIJ matrices anyways, correct? So you don't need parallel create from arrays just sequential from arrays? > > All the MatView() and MatLoad() would be handled by PETSc so you don't need to worry about reading in the matrix in parallel (we do it for you). > > Barry > > >> and then load it into a solver. We are free to choose matrix storage scheme for a file, so we could prepare data to be in the format of arrays to be loaded into PETSc. For binary I/O we are experimenting with parallel HDF5 capabilities using MPI-I/O underneath. (PETSc has a HDF5 viewer, but if I am not wrong, it does not use parallel I/O). For really big problems parallel I/O is a must for us. >> >> We are solving a nuclear structure problem, particularly a symmetry-adapted no-core shell model computations of a nuclei. (I do not understand much that kind of physics, my part is the eigensolver :). >> >> Daniel >> >> >> >> >> Dne 21.9.2010 16:24, Jed Brown napsal(a): >>> On Tue, Sep 21, 2010 at 16:20, Daniel Langr wrote: >>>> thanks much for your comprehensive answer, it will certainly help. I will >>>> look at the example codes. 
As for matrix assembly process, I would prefer >>>> constructing a matrix from arrays (to avoid dynamic assembly and additional >>>> memory costs) but there is nothing like MatCreateMPISBAIJWithArrays() or >>>> better MatCreateMPISBAIJWithSplitArrays() for symmetric matrices as for >>>> unsymmetric ones in PETSc. >>> >>> This would be easy to add, but how would you go about building the >>> arrays yourself? What sort of problems are you solving? >>> >>> Jed From jed at 59A2.org Tue Sep 21 13:53:28 2010 From: jed at 59A2.org (Jed Brown) Date: Tue, 21 Sep 2010 20:53:28 +0200 Subject: [petsc-users] Storage space for symmetric (SBAIJ) matrix In-Reply-To: <4C98FCFC.4030608@gmail.com> References: <4C934DDD.2010000@gmail.com> <4C98BF46.1090009@gmail.com> <4C98C2A1.7030507@gmail.com> <123A62E1-A8EB-41CF-A387-5DB46C9FB71F@mcs.anl.gov> <4C98FCFC.4030608@gmail.com> Message-ID: On Tue, Sep 21, 2010 at 20:44, Daniel Langr wrote: > we do not need and do not want to use PETSc for writing a matrix into a > file. Such file should be independent of any particular solver. That's why > we want do use HDF5 library with parallel I/O capabilities. I can simply > store CSR (or COO or any other scheme) arrays for the upper triangular part > of a matrix to the file and some supporting information such as number of > rows and nonzeroes. You are tying yourself much closer to the solver if you write the matrix out in partitioned split arrays (a PETSc-specific format, plus a very specific decomposition). > As Jed mentioned, we can write only the state to the file. But parallel I/O > is also part of our project and research, that's why we bother :). Also, > when we want to compare different methods for solution, we would need to > construct the matrix multiple times instead of just read it from the file, > which can by quicker. Benchmark it, reading the matrix in will always be slower than assembling it in parallel (a factor >100 would not be surprising). How long does it take to write 100 TB (any file system in the world, any number of IO nodes)? Compare that to the couple of seconds it would take to assemble the 100 TB matrix with 100k procs. Jed From daniel.langr at gmail.com Tue Sep 21 14:20:49 2010 From: daniel.langr at gmail.com (Daniel Langr) Date: Tue, 21 Sep 2010 21:20:49 +0200 Subject: [petsc-users] Storage space for symmetric (SBAIJ) matrix In-Reply-To: References: <4C934DDD.2010000@gmail.com> <4C98BF46.1090009@gmail.com> <4C98C2A1.7030507@gmail.com> <123A62E1-A8EB-41CF-A387-5DB46C9FB71F@mcs.anl.gov> <4C98FCFC.4030608@gmail.com> Message-ID: <4C990591.1060809@gmail.com> Dne 21.9.2010 20:53, Jed Brown napsal(a): > On Tue, Sep 21, 2010 at 20:44, Daniel Langr wrote: >> we do not need and do not want to use PETSc for writing a matrix into a >> file. Such file should be independent of any particular solver. That's why >> we want do use HDF5 library with parallel I/O capabilities. I can simply >> store CSR (or COO or any other scheme) arrays for the upper triangular part >> of a matrix to the file and some supporting information such as number of >> rows and nonzeroes. > > You are tying yourself much closer to the solver if you write the > matrix out in partitioned split arrays (a PETSc-specific format, plus > a very specific decomposition). That's right, thanks. Very firstly I thought that partitioned split arrays are simply the CSR arrays of the partitioned submatrix. >> As Jed mentioned, we can write only the state to the file. But parallel I/O >> is also part of our project and research, that's why we bother :). 
Also, >> when we want to compare different methods for solution, we would need to >> construct the matrix multiple times instead of just read it from the file, >> which can by quicker. > > Benchmark it, reading the matrix in will always be slower than > assembling it in parallel (a factor>100 would not be surprising). > How long does it take to write 100 TB (any file system in the world, > any number of IO nodes)? Compare that to the couple of seconds it > would take to assemble the 100 TB matrix with 100k procs. > > Jed Certainly true, but the time for computing matrix elements will take much more time than the assembly process. To say truth we don't know yet how much, because new method for computing Hamiltonian are still being developed. Within competing methods this takes about 20 to 30 percent of the time of the iterative eigensolver run, which is not neglectable. If not stored, the elements would need to be evaluated multiple times. Daniel From jed at 59A2.org Tue Sep 21 14:26:03 2010 From: jed at 59A2.org (Jed Brown) Date: Tue, 21 Sep 2010 21:26:03 +0200 Subject: [petsc-users] Storage space for symmetric (SBAIJ) matrix In-Reply-To: <4C990591.1060809@gmail.com> References: <4C934DDD.2010000@gmail.com> <4C98BF46.1090009@gmail.com> <4C98C2A1.7030507@gmail.com> <123A62E1-A8EB-41CF-A387-5DB46C9FB71F@mcs.anl.gov> <4C98FCFC.4030608@gmail.com> <4C990591.1060809@gmail.com> Message-ID: On Tue, Sep 21, 2010 at 21:20, Daniel Langr wrote: > Certainly true, but the time for computing matrix elements will take much > more time than the assembly process. To say truth we don't know yet how > much, because new method for computing Hamiltonian are still being > developed. Within competing methods this takes about 20 to 30 percent of the > time of the iterative eigensolver run, which is not neglectable. If not > stored, the elements would need to be evaluated multiple times. I can't guarantee it for your particular problem, but it is exceedingly rare that loading the matrix from disk is cheaper than recomputing it. Jed From bsmith at mcs.anl.gov Tue Sep 21 16:23:21 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Tue, 21 Sep 2010 16:23:21 -0500 Subject: [petsc-users] MatView with relatively large matrix in Parallel In-Reply-To: <4C98C17D.2050706@utias.utoronto.ca> References: <4C98C17D.2050706@utias.utoronto.ca> Message-ID: <8EDE3059-78A2-4C6A-9B69-43F8BD40AEC1@mcs.anl.gov> I have implemented the efficient MatView() for MPIBAIJ matrices in parallel for binary storage in petsc-dev http://www.mcs.anl.gov/petsc/petsc-as/developers/index.html with this you can save parallel MPIBAIJ matrices without first converting to AIJ format. Barry On Sep 21, 2010, at 9:30 AM, Gaetan Kenway wrote: > Hello > > I am a PETSc user and have run into a problem using MatView. I am trying to output a matrix to file so I can load it instead of computing it for faster debugging. The matrix I'm trying to output is drdwt. It is a parallel block aij matrix with block size of 5. The matrix is assembled and the following code works when I run it in serial: > > call PetscViewerBinaryOpen(sumb_petsc_comm_world,drdw_name,FILE_MODE_WRITE,bin_viewer,ierr) > call MatView(drdwt,bin_viewer,ierr) > call PetscViewerDestroy(bin_viewer,ierr) > > The matrix size is approximately 300k by 300k and I get an output file that is approximately 245MB in size which is expected. However, when I run the same code in parallel on 3 processors it hangs at the MatView call until I am forced to kill the processes. 
I've let it go for 20 minutes with no sign of stopping. > > I am not sure what is causing this. I'm using openmpi-1.4.1 and petsc3.1 on 32 bit Ubuntu 10.10. > > Thank you, > > Gaetan Kenway > > From bsmith at mcs.anl.gov Tue Sep 21 21:26:03 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Tue, 21 Sep 2010 21:26:03 -0500 Subject: [petsc-users] Storage space for symmetric (SBAIJ) matrix In-Reply-To: <4C98FCFC.4030608@gmail.com> References: <4C934DDD.2010000@gmail.com> <4C98BF46.1090009@gmail.com> <4C98C2A1.7030507@gmail.com> <123A62E1-A8EB-41CF-A387-5DB46C9FB71F@mcs.anl.gov> <4C98FCFC.4030608@gmail.com> Message-ID: <14976726-A895-4583-888C-4C10FC4D834D@mcs.anl.gov> I have added MatCreateMPISBAIJWithArrays() and MatCreateMPIBAIJWithArrays() to petsc-dev http://www.mcs.anl.gov/petsc/petsc-as/developers/index.html if you have any problem with it please report the bug to petsc-maint at mcs.anl.gov Barry On Sep 21, 2010, at 1:44 PM, Daniel Langr wrote: > Barry, > > we do not need and do not want to use PETSc for writing a matrix into a file. Such file should be independent of any particular solver. That's why we want do use HDF5 library with parallel I/O capabilities. I can simply store CSR (or COO or any other scheme) arrays for the upper triangular part of a matrix to the file and some supporting information such as number of rows and nonzeroes. Then, if I want to solve a problem with PETSc/SLEPc, I need to effectively load the matrix into it. Supposing hundreds or thousands of nodes (maybe not the same as for matrix construction procedure) the parallel I/O again would be essential. Anyway, the matrix data is supposed to be much bigger than the memory of one node. We (or our physics) need to exploit all the memory available and there exist no upper bounds for them :). > > As Jed mentioned, we can write only the state to the file. But parallel I/O is also part of our project and research, that's why we bother :). Also, when we want to compare different methods for solution, we would need to construct the matrix multiple times instead of just read it from the file, which can by quicker. > > Daniel > > > >> On Sep 21, 2010, at 9:35 AM, Daniel Langr wrote: >> >>> Our preliminary idea is to construct a matrix with some legacy code, store matrix into a file (as we would do anyway for checkpointing purposes) >> >> Is this legacy code parallel? If not when you could only create SeqSBAIJ matrices anyways, correct? So you don't need parallel create from arrays just sequential from arrays? >> >> All the MatView() and MatLoad() would be handled by PETSc so you don't need to worry about reading in the matrix in parallel (we do it for you). >> >> Barry >> >> >>> and then load it into a solver. We are free to choose matrix storage scheme for a file, so we could prepare data to be in the format of arrays to be loaded into PETSc. For binary I/O we are experimenting with parallel HDF5 capabilities using MPI-I/O underneath. (PETSc has a HDF5 viewer, but if I am not wrong, it does not use parallel I/O). For really big problems parallel I/O is a must for us. >>> >>> We are solving a nuclear structure problem, particularly a symmetry-adapted no-core shell model computations of a nuclei. (I do not understand much that kind of physics, my part is the eigensolver :). >>> >>> Daniel >>> >>> >>> >>> >>> Dne 21.9.2010 16:24, Jed Brown napsal(a): >>>> On Tue, Sep 21, 2010 at 16:20, Daniel Langr wrote: >>>>> thanks much for your comprehensive answer, it will certainly help. I will >>>>> look at the example codes. 
As for matrix assembly process, I would prefer >>>>> constructing a matrix from arrays (to avoid dynamic assembly and additional >>>>> memory costs) but there is nothing like MatCreateMPISBAIJWithArrays() or >>>>> better MatCreateMPISBAIJWithSplitArrays() for symmetric matrices as for >>>>> unsymmetric ones in PETSc. >>>> >>>> This would be easy to add, but how would you go about building the >>>> arrays yourself? What sort of problems are you solving? >>>> >>>> Jed From daniel.langr at gmail.com Wed Sep 22 02:39:14 2010 From: daniel.langr at gmail.com (Daniel Langr) Date: Wed, 22 Sep 2010 09:39:14 +0200 Subject: [petsc-users] Storage space for symmetric (SBAIJ) matrix In-Reply-To: <14976726-A895-4583-888C-4C10FC4D834D@mcs.anl.gov> References: <4C934DDD.2010000@gmail.com> <4C98BF46.1090009@gmail.com> <4C98C2A1.7030507@gmail.com> <123A62E1-A8EB-41CF-A387-5DB46C9FB71F@mcs.anl.gov> <4C98FCFC.4030608@gmail.com> <14976726-A895-4583-888C-4C10FC4D834D@mcs.anl.gov> Message-ID: <4C99B2A2.5020702@gmail.com> Ok, thank you both for your help and hints :) Daniel Dne 22.9.2010 4:26, Barry Smith napsal(a): > > I have added MatCreateMPISBAIJWithArrays() and MatCreateMPIBAIJWithArrays() to petsc-dev http://www.mcs.anl.gov/petsc/petsc-as/developers/index.html if you have any problem with it please report the bug to petsc-maint at mcs.anl.gov > > Barry > > > On Sep 21, 2010, at 1:44 PM, Daniel Langr wrote: > >> Barry, >> >> we do not need and do not want to use PETSc for writing a matrix into a file. Such file should be independent of any particular solver. That's why we want do use HDF5 library with parallel I/O capabilities. I can simply store CSR (or COO or any other scheme) arrays for the upper triangular part of a matrix to the file and some supporting information such as number of rows and nonzeroes. Then, if I want to solve a problem with PETSc/SLEPc, I need to effectively load the matrix into it. Supposing hundreds or thousands of nodes (maybe not the same as for matrix construction procedure) the parallel I/O again would be essential. Anyway, the matrix data is supposed to be much bigger than the memory of one node. We (or our physics) need to exploit all the memory available and there exist no upper bounds for them :). >> >> As Jed mentioned, we can write only the state to the file. But parallel I/O is also part of our project and research, that's why we bother :). Also, when we want to compare different methods for solution, we would need to construct the matrix multiple times instead of just read it from the file, which can by quicker. >> >> Daniel >> >> >> >>> On Sep 21, 2010, at 9:35 AM, Daniel Langr wrote: >>> >>>> Our preliminary idea is to construct a matrix with some legacy code, store matrix into a file (as we would do anyway for checkpointing purposes) >>> >>> Is this legacy code parallel? If not when you could only create SeqSBAIJ matrices anyways, correct? So you don't need parallel create from arrays just sequential from arrays? >>> >>> All the MatView() and MatLoad() would be handled by PETSc so you don't need to worry about reading in the matrix in parallel (we do it for you). >>> >>> Barry >>> >>> >>>> and then load it into a solver. We are free to choose matrix storage scheme for a file, so we could prepare data to be in the format of arrays to be loaded into PETSc. For binary I/O we are experimenting with parallel HDF5 capabilities using MPI-I/O underneath. (PETSc has a HDF5 viewer, but if I am not wrong, it does not use parallel I/O). 
For really big problems parallel I/O is a must for us. >>>> >>>> We are solving a nuclear structure problem, particularly a symmetry-adapted no-core shell model computations of a nuclei. (I do not understand much that kind of physics, my part is the eigensolver :). >>>> >>>> Daniel >>>> >>>> >>>> >>>> >>>> Dne 21.9.2010 16:24, Jed Brown napsal(a): >>>>> On Tue, Sep 21, 2010 at 16:20, Daniel Langr wrote: >>>>>> thanks much for your comprehensive answer, it will certainly help. I will >>>>>> look at the example codes. As for matrix assembly process, I would prefer >>>>>> constructing a matrix from arrays (to avoid dynamic assembly and additional >>>>>> memory costs) but there is nothing like MatCreateMPISBAIJWithArrays() or >>>>>> better MatCreateMPISBAIJWithSplitArrays() for symmetric matrices as for >>>>>> unsymmetric ones in PETSc. >>>>> >>>>> This would be easy to add, but how would you go about building the >>>>> arrays yourself? What sort of problems are you solving? >>>>> >>>>> Jed > From kenway at utias.utoronto.ca Wed Sep 22 12:45:23 2010 From: kenway at utias.utoronto.ca (Gaetan Kenway) Date: Wed, 22 Sep 2010 13:45:23 -0400 Subject: [petsc-users] MatConvert using mpi block aij matrix In-Reply-To: References: Message-ID: <4C9A40B3.2060800@utias.utoronto.ca> Hello I'm still trying to write out my mpiblockaij matrix. I see the development version now supports the baij format writing, but I was just trying to do the matconvert fix to write out the matrix. The code I'm trying to run to convert the matrix is: call MatConvert(drdwt,'mpiaij',MAT_INITIAL_MATRIX,drdwt_copy,ierr) drdwt was created using: call MatCreateMPIBAIJ(SUMB_PETSC_COMM_WORLD, nw, nDimW, nDimW, ETSC_DETERMINE, PETSC_DETERMINE, & nzDiagonalW, nnzDiagonal, nzOffDiag, nnzOffDiag, dRdWT, PETScIerr) However, the code hangs on the conversion. The drdwt is assembled and drdwt_copy matrix has nothing done to it at this point. Any suggestions other than to build the dev version? Gaetan From bsmith at mcs.anl.gov Wed Sep 22 13:09:23 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Wed, 22 Sep 2010 13:09:23 -0500 Subject: [petsc-users] MatConvert using mpi block aij matrix In-Reply-To: <4C9A40B3.2060800@utias.utoronto.ca> References: <4C9A40B3.2060800@utias.utoronto.ca> Message-ID: <6A86097D-8E2E-4048-8928-950619428BBC@mcs.anl.gov> Same basic problem. It is using a slow converter that does not do preallocation. Since the writer is now available for you, you should just switch to petsc-dev and you won't need this. Barry On Sep 22, 2010, at 12:45 PM, Gaetan Kenway wrote: > Hello > > I'm still trying to write out my mpiblockaij matrix. I see the development version now supports the baij format writing, but I was just trying to do the matconvert fix to write out the matrix. The code I'm trying to run to convert the matrix is: > > call MatConvert(drdwt,'mpiaij',MAT_INITIAL_MATRIX,drdwt_copy,ierr) > > drdwt was created using: > > call MatCreateMPIBAIJ(SUMB_PETSC_COMM_WORLD, nw, nDimW, nDimW, ETSC_DETERMINE, PETSC_DETERMINE, & > nzDiagonalW, nnzDiagonal, nzOffDiag, nnzOffDiag, dRdWT, PETScIerr) > > However, the code hangs on the conversion. The drdwt is assembled and drdwt_copy matrix has nothing done to it at this point. > > Any suggestions other than to build the dev version? 
> > Gaetan From lvankampenhout at gmail.com Thu Sep 23 06:51:55 2010 From: lvankampenhout at gmail.com (Leo van Kampenhout) Date: Thu, 23 Sep 2010 13:51:55 +0200 Subject: [petsc-users] question about PC_BJACOBI Message-ID: Hi all, With p number of processors in the communicator, the block preconditioner PC_BJACOBI will by default use p blocks. So far, so good. However, in order to compare this algorithmic efficiency decrease (since the bigger p, the less efficient the preconditioner), i ran the commands mpirun -n 1 ./program -pc_bjacobi_blocks 8 mpirun -n 8 ./program -pc_bjacobi_blocks 8 I expected the preconditioning to be equally efficient in this case. However, GMRES makes more iterations in the first case (30 against 28) which I cannot explain. Are there more subtle differences about the preconditioner or the KSP that i'm overlooking here? regards, Leo -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Thu Sep 23 06:55:59 2010 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 23 Sep 2010 07:55:59 -0400 Subject: [petsc-users] question about PC_BJACOBI In-Reply-To: References: Message-ID: Exact numbers of iterations are sensitive to details of computations, so you need to check a) that the solvers are exactly the same using -ksp_view b) the convergence history, which can be different in parallel due to non-commutativity of floating point arithmetic Matt On Thu, Sep 23, 2010 at 7:51 AM, Leo van Kampenhout < lvankampenhout at gmail.com> wrote: > Hi all, > > With p number of processors in the communicator, the block preconditioner > PC_BJACOBI will by default use p blocks. So far, so good. However, in order > to compare this algorithmic efficiency decrease (since the bigger p, the > less efficient the preconditioner), i ran the commands > > mpirun -n 1 ./program -pc_bjacobi_blocks 8 > mpirun -n 8 ./program -pc_bjacobi_blocks 8 > > I expected the preconditioning to be equally efficient in this case. > However, GMRES makes more iterations in the first case (30 against 28) which > I cannot explain. Are there more subtle differences about the preconditioner > or the KSP that i'm overlooking here? > > regards, > > Leo > > > > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at 59A2.org Thu Sep 23 08:28:55 2010 From: jed at 59A2.org (Jed Brown) Date: Thu, 23 Sep 2010 15:28:55 +0200 Subject: [petsc-users] question about PC_BJACOBI In-Reply-To: References: Message-ID: The latter is using the partition provided by the DA (or user) which looks to be better than the one computed in the serial run. If you have Parmetis, then it will be used by PCBJACOBI, otherwise the partition is naive. You can specify subdomains manually if you want. Jed On Sep 23, 2010 1:51 PM, "Leo van Kampenhout" wrote: Hi all, With p number of processors in the communicator, the block preconditioner PC_BJACOBI will by default use p blocks. So far, so good. However, in order to compare this algorithmic efficiency decrease (since the bigger p, the less efficient the preconditioner), i ran the commands mpirun -n 1 ./program -pc_bjacobi_blocks 8 mpirun -n 8 ./program -pc_bjacobi_blocks 8 I expected the preconditioning to be equally efficient in this case. However, GMRES makes more iterations in the first case (30 against 28) which I cannot explain. 
Are there more subtle differences about the preconditioner or the KSP that i'm overlooking here? regards, Leo -------------- next part -------------- An HTML attachment was scrubbed... URL: From lvankampenhout at gmail.com Thu Sep 23 09:41:04 2010 From: lvankampenhout at gmail.com (Leo van Kampenhout) Date: Thu, 23 Sep 2010 16:41:04 +0200 Subject: [petsc-users] question about PC_BJACOBI In-Reply-To: References: Message-ID: Thank you both. The solvers are the same, I double checked that. It could be the case that the type of partitioning plays a role here, since i'm indeed using a DA. However, why is it that for example a run on 2 processors the number of iterations is higher than on 8? Both use DA-partitioning in this case. To specify subdomains manually, where do i start? Leo 2010/9/23 Jed Brown > The latter is using the partition provided by the DA (or user) which looks > to be better than the one computed in the serial run. If you have Parmetis, > then it will be used by PCBJACOBI, otherwise the partition is naive. You can > specify subdomains manually if you want. > > Jed > > On Sep 23, 2010 1:51 PM, "Leo van Kampenhout" > wrote: > > Hi all, > > With p number of processors in the communicator, the block preconditioner > PC_BJACOBI will by default use p blocks. So far, so good. However, in order > to compare this algorithmic efficiency decrease (since the bigger p, the > less efficient the preconditioner), i ran the commands > > mpirun -n 1 ./program -pc_bjacobi_blocks 8 > mpirun -n 8 ./program -pc_bjacobi_blocks 8 > > I expected the preconditioning to be equally efficient in this case. > However, GMRES makes more iterations in the first case (30 against 28) which > I cannot explain. Are there more subtle differences about the preconditioner > or the KSP that i'm overlooking here? > > regards, > > Leo > > > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at 59A2.org Thu Sep 23 09:51:36 2010 From: jed at 59A2.org (Jed Brown) Date: Thu, 23 Sep 2010 16:51:36 +0200 Subject: [petsc-users] question about PC_BJACOBI In-Reply-To: References: Message-ID: On Thu, Sep 23, 2010 at 16:41, Leo van Kampenhout wrote: > > Thank you both. The solvers are the same, I double checked that. It could be > the case that the type of partitioning plays a role here, since i'm indeed > using a DA. However, why is it that for example a run on 2 processors the > number of iterations is higher than on 8? Does this also happen with direct subdomain solves? -sub_pc_type lu > Both use DA-partitioning in this > case. To specify subdomains manually, where do i start? PCASMSetLocalSubdomains (use ASM with overlap 0 for this: -pc_type asm -pc_asm_overlap 0). Jed From knepley at gmail.com Thu Sep 23 10:07:06 2010 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 23 Sep 2010 11:07:06 -0400 Subject: [petsc-users] question about PC_BJACOBI In-Reply-To: References: Message-ID: On Thu, Sep 23, 2010 at 10:41 AM, Leo van Kampenhout < lvankampenhout at gmail.com> wrote: > > Thank you both. The solvers are the same, I double checked that. It could > be the case that the type of partitioning plays a role here, since i'm > indeed using a DA. However, why is it that for example a run on 2 processors > the number of iterations is higher than on 8? Both use DA-partitioning in > this case. To specify subdomains manually, where do i start? 
> It is an open secret that Krylov methods are incredibly sensitive to orderings, especially when combined with incomplete factorization preconditioners. Since the ordering depends on the division (see tutorials for a picture of "petsc" orderings which are just contiguous per process), you can get non-intuitive effects. Matt > Leo > > > 2010/9/23 Jed Brown > >> The latter is using the partition provided by the DA (or user) which looks >> to be better than the one computed in the serial run. If you have Parmetis, >> then it will be used by PCBJACOBI, otherwise the partition is naive. You can >> specify subdomains manually if you want. >> >> Jed >> >> On Sep 23, 2010 1:51 PM, "Leo van Kampenhout" >> wrote: >> >> Hi all, >> >> With p number of processors in the communicator, the block preconditioner >> PC_BJACOBI will by default use p blocks. So far, so good. However, in order >> to compare this algorithmic efficiency decrease (since the bigger p, the >> less efficient the preconditioner), i ran the commands >> >> mpirun -n 1 ./program -pc_bjacobi_blocks 8 >> mpirun -n 8 ./program -pc_bjacobi_blocks 8 >> >> I expected the preconditioning to be equally efficient in this case. >> However, GMRES makes more iterations in the first case (30 against 28) which >> I cannot explain. Are there more subtle differences about the preconditioner >> or the KSP that i'm overlooking here? >> >> regards, >> >> Leo >> >> >> >> >> >> > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at 59A2.org Thu Sep 23 10:13:59 2010 From: jed at 59A2.org (Jed Brown) Date: Thu, 23 Sep 2010 17:13:59 +0200 Subject: [petsc-users] question about PC_BJACOBI In-Reply-To: References: Message-ID: On Thu, Sep 23, 2010 at 17:07, Matthew Knepley wrote: > It is an open secret that Krylov methods are incredibly sensitive to > orderings, especially when combined with incomplete factorization > preconditioners. The ordering only affects the Krylov method in the form of rounding for dot products and such. But as Matt says, almost every preconditioner is significantly order-dependent. Jed From rlmackie862 at gmail.com Fri Sep 24 15:12:08 2010 From: rlmackie862 at gmail.com (Randall Mackie) Date: Fri, 24 Sep 2010 13:12:08 -0700 Subject: [petsc-users] question about PCMG Message-ID: I am interested in exploring whether or not the PCMG would be beneficial for my problem which I currently solve using a KSP of bcgs and an ILU PC on a DA. So, as a first step, I just wanted to see what happens if I switched my PC from ILU to PCMG using the Galerkin option. 
So my command line options are: mpiexec -np 1 /home/rmackie/d3fwd/V3_2_PETSc_DA_PCMG/d3fwd \ -ksp_monitor_true_residual \ -ksp_type bcgs \ -pc_type mg \ -pc_mg_levels 1 \ -pc_mg_cycles v \ -pc_mg_galerkin \ -help \ << EOF However, I keep getting this error message: Preconditioner (PC) Options ------------------------------------------------- -pc_type : Preconditioner (one of) none jacobi pbjacobi bjacobi sor lu shell mg eisenstat ilu icc cholesky asm ksp composite redundant nn mat fieldsplit galerkin exotic openmp asa cp bfbt lsc redistribute (PCSetType) [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signal[0]PETSCERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors [0]PETSC ERROR: likely location of problem given in stack below [0]PETSC ERROR: --------------------- Stack Frames ------------------------------------ Multigrid options [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, [0]PETSC ERROR: INSTEAD the line number of the start of the function [0]PETSC ERROR: is given. [0]PETSC ERROR: [0] PCSetFromOptions_MG line 318 src/ksp/pc/impls/mg/mg.c [0]PETSC ERROR: [0] PCSetFromOptions line 166 src/ksp/pc/interface/pcset.c [0]PETSC ERROR: [0] KSPSetFromOptions line 320 src/ksp/ksp/interface/itcl.c [0]PETSC ERROR: --------------------- Error Message ------------------------------------ [0]PETSC ERROR: Signal received! [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: Petsc Release Version 3.1.0, Patch 4, Fri Jul 30 14:42:02 CDT 2010 [0]PETSC ERROR: See docs/changes/index.html for recent updates. [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [0]PETSC ERROR: See docs/index.html for manual pages. [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: /home/rmackie/d3fwd/V3_2_PETSc_DA_PCMG/d3fwd on a linux-int named he999.prod.houston.nam.slb.com by rmackie Fri Sep 24 13:06:12 2010 [0]PETSC ERROR: Libraries linked from /home/rmackie/SPARSE/PETsc/petsc-3.1-p4/linux-intel-debug/lib [0]PETSC ERROR: Configure run at Fri Sep 24 13:02:26 2010 [0]PETSC ERROR: Configure options --with-fortran --with-fortran-kernels=1 --with-blas-lapack-dir=/opt/intel/cmkl/10.2.2.025/lib/em64t-with-scalar-type=complex --with-debugging=1 --with-cc=mpiicc --with-cxx=mpiicpc --with-fc=mpiifort [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: User provided function() line 0 in unknown directory unknown file application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 I am running the latest version of Petsc. Any help on getting past this error message would be appreciated. Randy M. -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Fri Sep 24 15:17:32 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Fri, 24 Sep 2010 15:17:32 -0500 Subject: [petsc-users] question about PCMG In-Reply-To: References: Message-ID: <77F89206-C155-4670-AEFF-B347BECE6D0F@mcs.anl.gov> Randy, In general if the number of levels is larger than 1 you cannot just "turn on" MG with a command line flag. 
You need to add to your code the computation of interpolation/restriction between levels and set with PCMGSetInterpolation(). That said, if you only use 1 level then it does not need any interpolation/restriction and should in theory run without crashing (of course it is just using one level of multigrid and hence is not any different then your previous solver). So why does it crash? You need to use -start_in_debugger and see exactly why it crashes? Barry On Sep 24, 2010, at 3:12 PM, Randall Mackie wrote: > I am interested in exploring whether or not the PCMG would be beneficial for my problem which I currently > solve using a KSP of bcgs and an ILU PC on a DA. > > So, as a first step, I just wanted to see what happens if I switched my PC from ILU to PCMG using the > Galerkin option. > > So my command line options are: > > mpiexec -np 1 /home/rmackie/d3fwd/V3_2_PETSc_DA_PCMG/d3fwd \ > -ksp_monitor_true_residual \ > -ksp_type bcgs \ > -pc_type mg \ > -pc_mg_levels 1 \ > -pc_mg_cycles v \ > -pc_mg_galerkin \ > -help \ > << EOF > > > However, I keep getting this error message: > > Preconditioner (PC) Options ------------------------------------------------- > -pc_type : Preconditioner (one of) none jacobi pbjacobi bjacobi sor lu shell mg > eisenstat ilu icc cholesky asm ksp composite redundant nn mat fieldsplit galerkin exotic openmp asa cp bfbt lsc redistribute (PCSetType) > [0]PETSC ERROR: ------------------------------------------------------------------------ > [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range > [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger > [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signal[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors > [0]PETSC ERROR: likely location of problem given in stack below > [0]PETSC ERROR: --------------------- Stack Frames ------------------------------------ > Multigrid options > [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, > [0]PETSC ERROR: INSTEAD the line number of the start of the function > [0]PETSC ERROR: is given. > [0]PETSC ERROR: [0] PCSetFromOptions_MG line 318 src/ksp/pc/impls/mg/mg.c > [0]PETSC ERROR: [0] PCSetFromOptions line 166 src/ksp/pc/interface/pcset.c > [0]PETSC ERROR: [0] KSPSetFromOptions line 320 src/ksp/ksp/interface/itcl.c > [0]PETSC ERROR: --------------------- Error Message ------------------------------------ > [0]PETSC ERROR: Signal received! > [0]PETSC ERROR: ------------------------------------------------------------------------ > [0]PETSC ERROR: Petsc Release Version 3.1.0, Patch 4, Fri Jul 30 14:42:02 CDT 2010 > [0]PETSC ERROR: See docs/changes/index.html for recent updates. > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [0]PETSC ERROR: See docs/index.html for manual pages. 
> [0]PETSC ERROR: ------------------------------------------------------------------------ > [0]PETSC ERROR: /home/rmackie/d3fwd/V3_2_PETSc_DA_PCMG/d3fwd on a linux-int named he999.prod.houston.nam.slb.com by rmackie Fri Sep 24 13:06:12 2010 > [0]PETSC ERROR: Libraries linked from /home/rmackie/SPARSE/PETsc/petsc-3.1-p4/linux-intel-debug/lib > [0]PETSC ERROR: Configure run at Fri Sep 24 13:02:26 2010 > [0]PETSC ERROR: Configure options --with-fortran --with-fortran-kernels=1 --with-blas-lapack-dir=/opt/intel/cmkl/10.2.2.025/lib/em64t -with-scalar-type=complex --with-debugging=1 --with-cc=mpiicc --with-cxx=mpiicpc --with-fc=mpiifort > [0]PETSC ERROR: ------------------------------------------------------------------------ > [0]PETSC ERROR: User provided function() line 0 in unknown directory unknown file > application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 > > > I am running the latest version of Petsc. Any help on getting past this error message would be appreciated. > > Randy M. > From knepley at gmail.com Fri Sep 24 15:19:16 2010 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 24 Sep 2010 16:19:16 -0400 Subject: [petsc-users] question about PCMG In-Reply-To: References: Message-ID: On Fri, Sep 24, 2010 at 4:12 PM, Randall Mackie wrote: > I am interested in exploring whether or not the PCMG would be beneficial > for my problem which I currently > solve using a KSP of bcgs and an ILU PC on a DA. > > So, as a first step, I just wanted to see what happens if I switched my PC > from ILU to PCMG using the > Galerkin option. > PCMG is not really a preconditioner. It is the Platonic Ideal for a family of preconditioners. Without knowledge of the grid hierarchy, we cannot automatically determine interpolation and restriction operators (and really we need to know about the discretization here as well). That is why DMMG is in PETSc. If we know you are on a DA, we can do a reasonable job of computing these for you. Thanks, Matt > So my command line options are: > > mpiexec -np 1 /home/rmackie/d3fwd/V3_2_PETSc_DA_PCMG/d3fwd \ > -ksp_monitor_true_residual \ > -ksp_type bcgs \ > -pc_type mg \ > -pc_mg_levels 1 \ > -pc_mg_cycles v \ > -pc_mg_galerkin \ > -help \ > << EOF > > > However, I keep getting this error message: > > Preconditioner (PC) Options > ------------------------------------------------- > -pc_type : Preconditioner (one of) none jacobi pbjacobi bjacobi sor > lu shell mg > eisenstat ilu icc cholesky asm ksp composite redundant nn mat > fieldsplit galerkin exotic openmp asa cp bfbt lsc redistribute (PCSetType) > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, > probably memory access out of range > [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger > [0]PETSC ERROR: or see > http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signal[0]PETSCERROR: or try > http://valgrind.org on GNU/linux and Apple Mac OS X to find memory > corruption errors > [0]PETSC ERROR: likely location of problem given in stack below > [0]PETSC ERROR: --------------------- Stack Frames > ------------------------------------ > Multigrid options > [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not > available, > [0]PETSC ERROR: INSTEAD the line number of the start of the function > [0]PETSC ERROR: is given. 
> [0]PETSC ERROR: [0] PCSetFromOptions_MG line 318 src/ksp/pc/impls/mg/mg.c > [0]PETSC ERROR: [0] PCSetFromOptions line 166 src/ksp/pc/interface/pcset.c > [0]PETSC ERROR: [0] KSPSetFromOptions line 320 src/ksp/ksp/interface/itcl.c > [0]PETSC ERROR: --------------------- Error Message > ------------------------------------ > [0]PETSC ERROR: Signal received! > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: Petsc Release Version 3.1.0, Patch 4, Fri Jul 30 14:42:02 > CDT 2010 > [0]PETSC ERROR: See docs/changes/index.html for recent updates. > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [0]PETSC ERROR: See docs/index.html for manual pages. > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: /home/rmackie/d3fwd/V3_2_PETSc_DA_PCMG/d3fwd on a linux-int > named he999.prod.houston.nam.slb.com by rmackie Fri Sep 24 13:06:12 2010 > [0]PETSC ERROR: Libraries linked from > /home/rmackie/SPARSE/PETsc/petsc-3.1-p4/linux-intel-debug/lib > [0]PETSC ERROR: Configure run at Fri Sep 24 13:02:26 2010 > [0]PETSC ERROR: Configure options --with-fortran --with-fortran-kernels=1 > --with-blas-lapack-dir=/opt/intel/cmkl/10.2.2.025/lib/em64t-with-scalar-type=complex --with-debugging=1 --with-cc=mpiicc > --with-cxx=mpiicpc --with-fc=mpiifort > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: User provided function() line 0 in unknown directory > unknown file > application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 > > > I am running the latest version of Petsc. Any help on getting past this > error message would be appreciated. > > Randy M. > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From kutuzovnp at gmail.com Sun Sep 26 05:11:37 2010 From: kutuzovnp at gmail.com (=?KOI8-R?B?7snLz8zByiDr1dTV2s/X?=) Date: Sun, 26 Sep 2010 14:11:37 +0400 Subject: [petsc-users] petsc4py Message-ID: 1) First of all, can you describe in a bit more detailed way the usage of AppCtx class of matfree.py module to solve ODE systems (determined as in rober.py), without jacobian initialisation, in other words how can change rober.py to solve this issue? 2) Does THETA integration implement time step adaptation? 3) Suppose i have a large ODE system, how can i implement multiprocessor (parallel) integration in a way similar with those (function definition and plotting) in rober.py? -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at 59A2.org Sun Sep 26 05:54:46 2010 From: jed at 59A2.org (Jed Brown) Date: Sun, 26 Sep 2010 12:54:46 +0200 Subject: [petsc-users] petsc4py In-Reply-To: References: Message-ID: 2010/9/26 ??????? ??????? : > 1) First of all, can you describe in a? bit more detailed way the usage of > AppCtx class of matfree.py module to solve ODE systems > (determined as in rober.py), without jacobian initialisation, in other words > how can change rober.py to solve this issue? You can just skip the ts.setIJacobian call and run with -snes_mf. Or set an approximate Jacobian that way, but use it only for preconditioning, with -snes_mf_operator. Run with -ts_view to confirm that you are running what you think you are. > 2) Does THETA integration implement time step adaptation? 
No, and it doesn't come with a built-in error estimate. TSGL does adaptation, but the controller for adaptive order (-ts_adapt_type both) is not at all robust, so I recommend using -ts_adapt_type step or writing your own controller (see src/ts/impls/implicit/gl/gladapt.c for examples). > 3) Suppose i have a large ODE system, how can i implement multiprocessor > (parallel) integration in a way similar with those (function definition and > plotting) in rober.py? Lisandro might have other suggestions, but src/ts/examples/tutorials/ex8.py solves a transient Bratu problem in parallel. Get it from dev, the copy in 3.1 does not work correctly in parallel for superficial indexing reasons: http://petsc.cs.iit.edu/petsc/petsc-dev/file/c03db8f211dd/src/ts/examples/tutorials/ex8.py You can run it like (it uses TSGL by default) mpiexec -n 4 python ex8.py -M 40 -ts_monitor -ts_monitor_solution -ts_max_steps 1000 -ts_adapt_type size Note that theta=0.5 is highly oscillatory for this problem, use something like -ts_type theta -ts_theta_theta 0.8 for a more stable solution. You could of course plot the solution using Matplotlib (as in, e.g. bratu2d.py) instead, but you would have to gather it to process 0 because Matplotlib is not parallel. Other options include just writing out the state at the time steps you are interested in, or (much more effort) using libsim from VisIt to get live visualization and interaction with your running parallel program. Jed From griffith at cims.nyu.edu Mon Sep 27 00:56:03 2010 From: griffith at cims.nyu.edu (Boyce Griffith) Date: Mon, 27 Sep 2010 01:56:03 -0400 Subject: [petsc-users] matrix-matrix addition Message-ID: <4CA031F3.8070002@cims.nyu.edu> I am trying to compute a matrix A that is of the form: A = B + P^t C P using MatPtAP and MatAXPY. The call to MatAXPY is very slow --- presumably because I haven't allocated the correct nonzero structure. I can easily determine the nonzero structures of B, C, and P, but I do not think that I can easily compute the nonzero structure of P^t C P (although I can compute its nonzero structure with some difficulty). Is it possible to add extra non-zero locations to an existing matrix? Or should I try to extract the nonzero structures of B and P^t C P to pre-allocate A correctly? (Or am I overlooking some functionality in PETSc that will do most of this for me?) A tangentially related question is: is there any way to find out the actual fill (or fill ratio) following a call to MatPtAP? Thanks in advance for any suggestions! -- Boyce From knepley at gmail.com Mon Sep 27 07:57:22 2010 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 27 Sep 2010 08:57:22 -0400 Subject: [petsc-users] matrix-matrix addition In-Reply-To: <4CA031F3.8070002@cims.nyu.edu> References: <4CA031F3.8070002@cims.nyu.edu> Message-ID: The easiest way I can think of to handle this is to make P^T C P, then extract the nonzero structure with Mat with GetIJ() and create B. You would have to explicitly put zeros in B too or else AssemblyEnd() will shrink it. Maybe someone else can think of an easier way. Matt On Mon, Sep 27, 2010 at 1:56 AM, Boyce Griffith wrote: > I am trying to compute a matrix A that is of the form: > > A = B + P^t C P > > using MatPtAP and MatAXPY. The call to MatAXPY is very slow --- presumably > because I haven't allocated the correct nonzero structure. 
I can easily > determine the nonzero structures of B, C, and P, but I do not think that I > can easily compute the nonzero structure of P^t C P (although I can compute > its nonzero structure with some difficulty). > > Is it possible to add extra non-zero locations to an existing matrix? Or > should I try to extract the nonzero structures of B and P^t C P to > pre-allocate A correctly? (Or am I overlooking some functionality in PETSc > that will do most of this for me?) > > A tangentially related question is: is there any way to find out the actual > fill (or fill ratio) following a call to MatPtAP? > > Thanks in advance for any suggestions! > > -- Boyce > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From hzhang at mcs.anl.gov Mon Sep 27 09:47:34 2010 From: hzhang at mcs.anl.gov (Hong Zhang) Date: Mon, 27 Sep 2010 09:47:34 -0500 Subject: [petsc-users] matrix-matrix addition In-Reply-To: References: <4CA031F3.8070002@cims.nyu.edu> Message-ID: Boyce, > The easiest way I can think of to handle this is to make P^T C P, then > extract the nonzero > structure with Mat with GetIJ() and create B. You would have to explicitly > put zeros in B > too or else AssemblyEnd() will shrink it. Maybe someone else can think of an > easier way. This is what I would suggest as well. >> A tangentially related question is: is there any way to find out the >> actual fill (or fill ratio) following a call to MatPtAP? You may run your code with option '-info' and grep MatPtAP to get actual fill, e.g., src/mat/examples/tests>./ex93 -info |grep MatPtAP [0] MatPtAPSymbolic_SeqAIJ_SeqAIJ(): Reallocs 0; Fill ratio: given 4 needed 1.6. [0] MatPtAPSymbolic_SeqAIJ_SeqAIJ(): Use MatPtAP(A,P,MatReuse,1.6,&C) for best performance. Hong >> >> Thanks in advance for any suggestions! >> >> -- Boyce > > > > -- > What most experimenters take for granted before they begin their experiments > is infinitely more interesting than any results to which their experiments > lead. > -- Norbert Wiener > From amal.ghamdi at kaust.edu.sa Mon Sep 27 11:12:01 2010 From: amal.ghamdi at kaust.edu.sa (Amal Alghamdi) Date: Mon, 27 Sep 2010 19:12:01 +0300 Subject: [petsc-users] petsc4py, plotting DA and writing to file Message-ID: Dear All, I'm using the DA structure in petsc4py. I'd like to know please what is the right way to: plot the DA vector (the global vector). write the global vector to a file. Is that each process writes or draws its own part? or I should communicate all the data to one process? or none of these?!! Thank you very much Amal -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Mon Sep 27 11:14:09 2010 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 27 Sep 2010 12:14:09 -0400 Subject: [petsc-users] petsc4py, plotting DA and writing to file In-Reply-To: References: Message-ID: The right way is to use a PetscViewer. Matt On Mon, Sep 27, 2010 at 12:12 PM, Amal Alghamdi wrote: > Dear All, > > I'm using the DA structure in petsc4py. I'd like to know please what is the > right way to: > plot the DA vector (the global vector). > write the global vector to a file. > Is that each process writes or draws its own part? or I should communicate > all the data to one process? or none of these?!! 
> > Thank you very much > Amal > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From mhender at us.ibm.com Mon Sep 27 11:20:58 2010 From: mhender at us.ibm.com (Michael E Henderson) Date: Mon, 27 Sep 2010 12:20:58 -0400 Subject: [petsc-users] installation of 3.1-p4 under cygwin Message-ID: I'm trying to install petsc-3.1-p4 under cygwin. I'm getting errors in the make related to PetscMap that I can't figure out. They're like: vhyp.c: In function `VecHYPRE_IJVectorCreate': vhyp.c:19: error: invalid type argument of `->' (have `PetscMap') vhyp.c:19: error: invalid type argument of `->' (have `PetscMap') vhyp.c: In function `VecHYPRE_IJVectorCopy': vhyp.c:36: error: invalid type argument of `->' (have `PetscMap') vhyp.c: In function `VecHYPRE_IJVectorCopyFrom': vhyp.c:51: error: invalid type argument of `->' (have `PetscMap') Thanks, Mike Henderson ------------------------------------------------------------------------------------------------------------------------------------ Mathematical Sciences, TJ Watson Research Center mhender at watson.ibm.com http://www.research.ibm.com/people/h/henderson/ http://multifario.sourceforge.net/ -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: configure.log Type: application/octet-stream Size: 80 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: make.log Type: application/octet-stream Size: 164941 bytes Desc: not available URL: From bsmith at mcs.anl.gov Mon Sep 27 11:24:42 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Mon, 27 Sep 2010 11:24:42 -0500 Subject: [petsc-users] matrix-matrix addition In-Reply-To: <4CA031F3.8070002@cims.nyu.edu> References: <4CA031F3.8070002@cims.nyu.edu> Message-ID: <0EB5C255-BBCA-47F2-9384-C9FECC611E34@mcs.anl.gov> We need an efficient MatAXPY() that works well for any combination of nonzero patterns. This is not terribly difficult to write. Basically for each row create a linked list of the nonzeros in it from the first matrix and then merge in the nonzeros from the second matrix for that row, much like in the LU symbolic factorizations. This will give you row counts, then preallocate the correct nonzero matrix and do the MatSetValues() for the first matrix first row then the second matrix first row etc. Finally swap in the new matrix body into the current matrix. If this is done for SeqAIJ then MPIAIJ simply needs to call this for the two submatrices. Similar beasty can be done for BAIJ. Do you need AIJ or BAIJ? Shri, Could you please start on this (drop the VI for a couple of days to get this done)? Let me know if you have any questions. Thanks Barry On Sep 27, 2010, at 12:56 AM, Boyce Griffith wrote: > I am trying to compute a matrix A that is of the form: > > A = B + P^t C P > > using MatPtAP and MatAXPY. The call to MatAXPY is very slow --- presumably because I haven't allocated the correct nonzero structure. I can easily determine the nonzero structures of B, C, and P, but I do not think that I can easily compute the nonzero structure of P^t C P (although I can compute its nonzero structure with some difficulty). > > Is it possible to add extra non-zero locations to an existing matrix? 
Or should I try to extract the nonzero structures of B and P^t C P to pre-allocate A correctly? (Or am I overlooking some functionality in PETSc that will do most of this for me?) > > A tangentially related question is: is there any way to find out the actual fill (or fill ratio) following a call to MatPtAP? > > Thanks in advance for any suggestions! > > -- Boyce From dalcinl at gmail.com Mon Sep 27 11:27:06 2010 From: dalcinl at gmail.com (Lisandro Dalcin) Date: Mon, 27 Sep 2010 13:27:06 -0300 Subject: [petsc-users] petsc4py, plotting DA and writing to file In-Reply-To: References: Message-ID: On 27 September 2010 13:12, Amal Alghamdi wrote: > Dear All, > I'm using the DA structure in petsc4py. I'd like to know please what is the > right way to: > plot the DA vector (the global vector). > write the global vector to a file. > Is that each process writes or draws its own part? or I should communicate > all the data to one process? or none of these?!! > Thank you very much > Amal Use a PETSc.Viewer().createBinary() to save the global vector. Each process will save its own part. In order to plot it, I think you should use numpy.fromfile() to load the data, and then plot the array. Other way would be to use a PETSc.Scatter.toZero() to get the data at process 0, and then plot it scatter, seqvec = PETSc.Scatter.toZero(globalvec) im = PETSc.InsertMode.INSERT_VALUES sm = PETSc.ScatterMode.FORWARD scatter.scatter(globalvec, seqvec, im, sm) if globalvec.comm.rank == 0: plot(seqvec.array) -- Lisandro Dalcin --------------- CIMEC (INTEC/CONICET-UNL) Predio CONICET-Santa Fe Colectora RN 168 Km 472, Paraje El Pozo Tel: +54-342-4511594 (ext 1011) Tel/Fax: +54-342-4511169 From griffith at cims.nyu.edu Mon Sep 27 11:30:18 2010 From: griffith at cims.nyu.edu (Boyce Griffith) Date: Mon, 27 Sep 2010 12:30:18 -0400 Subject: [petsc-users] matrix-matrix addition In-Reply-To: References: <4CA031F3.8070002@cims.nyu.edu> Message-ID: <4CA0C69A.9050407@cims.nyu.edu> On 9/27/10 10:47 AM, Hong Zhang wrote: > Boyce, > >> The easiest way I can think of to handle this is to make P^T C P, then >> extract the nonzero >> structure with Mat with GetIJ() and create B. You would have to explicitly >> put zeros in B >> too or else AssemblyEnd() will shrink it. Maybe someone else can think of an >> easier way. > > This is what I would suggest as well. It looks like MatGetRowIJ will do the trick for SeqAIJ matrices, but it appears that similar functionality does not exist for MPIAIJ matrices. Is there an approach that you would recommend for the parallel case? Thanks! -- Boyce From jed at 59A2.org Mon Sep 27 11:30:03 2010 From: jed at 59A2.org (Jed Brown) Date: Mon, 27 Sep 2010 18:30:03 +0200 Subject: [petsc-users] matrix-matrix addition In-Reply-To: <0EB5C255-BBCA-47F2-9384-C9FECC611E34@mcs.anl.gov> References: <4CA031F3.8070002@cims.nyu.edu> <0EB5C255-BBCA-47F2-9384-C9FECC611E34@mcs.anl.gov> Message-ID: On Mon, Sep 27, 2010 at 18:24, Barry Smith wrote: > > ?We need an efficient MatAXPY() that works well for any combination of nonzero patterns. This is not terribly difficult to write. ?Basically for each row create a linked list of the nonzeros in it from the first matrix and then merge in the nonzeros from the second matrix for that row, much like in the LU symbolic factorizations. This will give you row counts, then preallocate the correct nonzero matrix and do the MatSetValues() for the first matrix first row then the second matrix first row etc. Finally swap in the new matrix body into the current matrix. 
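For illustration only, here is the per-row count that such a preallocation needs, sketched in plain Python rather than in the SeqAIJ internals it would really live in (the function name and the list inputs are made up for the example): the corresponding row of A + B has as many nonzeros as the union of the two sorted column-index lists, and a single pass over both lists gives that number.

def count_row_union(aj, bj):
    # aj, bj: sorted, duplicate-free column indices of one row of A and one row of B
    i = j = count = 0
    while i < len(aj) and j < len(bj):
        if aj[i] == bj[j]:      # column present in both rows: count it once
            i += 1
            j += 1
        elif aj[i] < bj[j]:     # column only in this row of A
            i += 1
        else:                   # column only in this row of B
            j += 1
        count += 1
    # whatever is left over in either row cannot overlap the other
    return count + (len(aj) - i) + (len(bj) - j)

# e.g. a row of A with columns [0, 3, 7] and a row of B with [3, 5, 7, 9]
# needs 5 nonzeros preallocated in that row of A + B
assert count_row_union([0, 3, 7], [3, 5, 7, 9]) == 5

The same pass, run a second time per row, can also write the merged column indices into the preallocated structure.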
I don't see why you'd need a linked list for this, you have two sorted arrays (one for each row) and just need to count the number of unique elements. It's one loop. Jed From griffith at cims.nyu.edu Mon Sep 27 11:32:03 2010 From: griffith at cims.nyu.edu (Boyce Griffith) Date: Mon, 27 Sep 2010 12:32:03 -0400 Subject: [petsc-users] matrix-matrix addition In-Reply-To: <0EB5C255-BBCA-47F2-9384-C9FECC611E34@mcs.anl.gov> References: <4CA031F3.8070002@cims.nyu.edu> <0EB5C255-BBCA-47F2-9384-C9FECC611E34@mcs.anl.gov> Message-ID: <4CA0C703.4030500@cims.nyu.edu> Hi, Barry -- I am working with MPIAIJ matrices. -- Boyce On 9/27/10 12:24 PM, Barry Smith wrote: > > We need an efficient MatAXPY() that works well for any combination of nonzero patterns. This is not terribly difficult to write. Basically for each row create a linked list of the nonzeros in it from the first matrix and then merge in the nonzeros from the second matrix for that row, much like in the LU symbolic factorizations. This will give you row counts, then preallocate the correct nonzero matrix and do the MatSetValues() for the first matrix first row then the second matrix first row etc. Finally swap in the new matrix body into the current matrix. > > If this is done for SeqAIJ then MPIAIJ simply needs to call this for the two submatrices. Similar beasty can be done for BAIJ. Do you need AIJ or BAIJ? > > Shri, > > Could you please start on this (drop the VI for a couple of days to get this done)? Let me know if you have any questions. > > Thanks > > Barry > > > > On Sep 27, 2010, at 12:56 AM, Boyce Griffith wrote: > >> I am trying to compute a matrix A that is of the form: >> >> A = B + P^t C P >> >> using MatPtAP and MatAXPY. The call to MatAXPY is very slow --- presumably because I haven't allocated the correct nonzero structure. I can easily determine the nonzero structures of B, C, and P, but I do not think that I can easily compute the nonzero structure of P^t C P (although I can compute its nonzero structure with some difficulty). >> >> Is it possible to add extra non-zero locations to an existing matrix? Or should I try to extract the nonzero structures of B and P^t C P to pre-allocate A correctly? (Or am I overlooking some functionality in PETSc that will do most of this for me?) >> >> A tangentially related question is: is there any way to find out the actual fill (or fill ratio) following a call to MatPtAP? >> >> Thanks in advance for any suggestions! >> >> -- Boyce > > > From balay at mcs.anl.gov Mon Sep 27 11:30:23 2010 From: balay at mcs.anl.gov (Satish Balay) Date: Mon, 27 Sep 2010 11:30:23 -0500 (CDT) Subject: [petsc-users] installation of 3.1-p4 under cygwin In-Reply-To: References: Message-ID: > --prefix=/home/mhender/InstalledSoftware > Using include paths: > -I/home/mhender/InstalledSoftware/petsc-3.1-p4/cygwin-c-debug/include > -I/home/mhender/InstalledSoftware/petsc-3.1-p4/include > -I/home/mhender/InstalledSoftware/petsc-3.1-p4/cygwin-c-debug/include > -I/home/mhender/include -I/home/mhender/InstalledSoftware/include Do you have old install of petsc lurking in this prefix location? What do you have for: ls /home/mhender/include ls /home/mhender/InstalledSoftware/include Also - the attached configure.log was a link-file - not the actual logfile. [if needed - you can resend configure.log to petsc-maint - and not the mailing list] Satish On Mon, 27 Sep 2010, Michael E Henderson wrote: > I'm trying to install petsc-3.1-p4 under cygwin. > > I'm getting errors in the make related to PetscMap that I can't figure > out. 
They're like: > > vhyp.c: In function `VecHYPRE_IJVectorCreate': > vhyp.c:19: error: invalid type argument of `->' (have `PetscMap') > vhyp.c:19: error: invalid type argument of `->' (have `PetscMap') > vhyp.c: In function `VecHYPRE_IJVectorCopy': > vhyp.c:36: error: invalid type argument of `->' (have `PetscMap') > vhyp.c: In function `VecHYPRE_IJVectorCopyFrom': > vhyp.c:51: error: invalid type argument of `->' (have `PetscMap') > > > > Thanks, > > Mike Henderson > ------------------------------------------------------------------------------------------------------------------------------------ > Mathematical Sciences, TJ Watson Research Center > mhender at watson.ibm.com > http://www.research.ibm.com/people/h/henderson/ > http://multifario.sourceforge.net/ > From bsmith at mcs.anl.gov Mon Sep 27 11:31:13 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Mon, 27 Sep 2010 11:31:13 -0500 Subject: [petsc-users] installation of 3.1-p4 under cygwin In-Reply-To: References: Message-ID: PetscMap does not exist in 3.1 you must have some environmental variable pointing to an old PETSc or some old files mixed with new files to get this problem. Barry On Sep 27, 2010, at 11:20 AM, Michael E Henderson wrote: > I'm trying to install petsc-3.1-p4 under cygwin. > > I'm getting errors in the make related to PetscMap that I can't figure out. They're like: > > vhyp.c: In function `VecHYPRE_IJVectorCreate': > vhyp.c:19: error: invalid type argument of `->' (have `PetscMap') > vhyp.c:19: error: invalid type argument of `->' (have `PetscMap') > vhyp.c: In function `VecHYPRE_IJVectorCopy': > vhyp.c:36: error: invalid type argument of `->' (have `PetscMap') > vhyp.c: In function `VecHYPRE_IJVectorCopyFrom': > vhyp.c:51: error: invalid type argument of `->' (have `PetscMap') > > > > Thanks, > > Mike Henderson > ------------------------------------------------------------------------------------------------------------------------------------ > Mathematical Sciences, TJ Watson Research Center > mhender at watson.ibm.com > http://www.research.ibm.com/people/h/henderson/ > http://multifario.sourceforge.net/ > From bsmith at mcs.anl.gov Mon Sep 27 11:33:03 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Mon, 27 Sep 2010 11:33:03 -0500 Subject: [petsc-users] petsc4py, plotting DA and writing to file In-Reply-To: References: Message-ID: <32FDC599-5386-426E-82BF-FF6B6BFE3B05@mcs.anl.gov> Lisandro In C he could do VecView(vec,PETSC_VIEWER_DRAW_WORLD); can he not do the equivalent in Python? What syntax would he use? Perhaps the 2d bratu python example could demonstrate this since it is simple and there is no monkeying with copying to one process etc. Barry On Sep 27, 2010, at 11:27 AM, Lisandro Dalcin wrote: > On 27 September 2010 13:12, Amal Alghamdi wrote: >> Dear All, >> I'm using the DA structure in petsc4py. I'd like to know please what is the >> right way to: >> plot the DA vector (the global vector). >> write the global vector to a file. >> Is that each process writes or draws its own part? or I should communicate >> all the data to one process? or none of these?!! >> Thank you very much >> Amal > > Use a PETSc.Viewer().createBinary() to save the global vector. Each > process will save its own part. > > In order to plot it, I think you should use numpy.fromfile() to load > the data, and then plot the array. 
> > Other way would be to use a PETSc.Scatter.toZero() to get the data at > process 0, and then plot it > > scatter, seqvec = PETSc.Scatter.toZero(globalvec) > im = PETSc.InsertMode.INSERT_VALUES > sm = PETSc.ScatterMode.FORWARD > scatter.scatter(globalvec, seqvec, im, sm) > if globalvec.comm.rank == 0: > plot(seqvec.array) > > > -- > Lisandro Dalcin > --------------- > CIMEC (INTEC/CONICET-UNL) > Predio CONICET-Santa Fe > Colectora RN 168 Km 472, Paraje El Pozo > Tel: +54-342-4511594 (ext 1011) > Tel/Fax: +54-342-4511169 From bsmith at mcs.anl.gov Mon Sep 27 11:34:35 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Mon, 27 Sep 2010 11:34:35 -0500 Subject: [petsc-users] matrix-matrix addition In-Reply-To: <4CA0C69A.9050407@cims.nyu.edu> References: <4CA031F3.8070002@cims.nyu.edu> <4CA0C69A.9050407@cims.nyu.edu> Message-ID: <47B2D926-3DA1-4DF4-B772-187FCFF9AE77@mcs.anl.gov> Boyce, Don't mess with this bad advice. It is just making your live complicated. Use a fast MatAXPY() either write it yourself or wait for use. Barry On Sep 27, 2010, at 11:30 AM, Boyce Griffith wrote: > > > On 9/27/10 10:47 AM, Hong Zhang wrote: >> Boyce, >> >>> The easiest way I can think of to handle this is to make P^T C P, then >>> extract the nonzero >>> structure with Mat with GetIJ() and create B. You would have to explicitly >>> put zeros in B >>> too or else AssemblyEnd() will shrink it. Maybe someone else can think of an >>> easier way. >> >> This is what I would suggest as well. > > It looks like MatGetRowIJ will do the trick for SeqAIJ matrices, but it appears that similar functionality does not exist for MPIAIJ matrices. Is there an approach that you would recommend for the parallel case? > > Thanks! > > -- Boyce From mhender at us.ibm.com Mon Sep 27 11:33:28 2010 From: mhender at us.ibm.com (Michael E Henderson) Date: Mon, 27 Sep 2010 12:33:28 -0400 Subject: [petsc-users] installation of 3.1-p4 under cygwin In-Reply-To: References: Message-ID: Yes! I've made this mistake before. I did have older petsc include files from the days when I "installed" petsc. I don't do that anymore by the way. Thanks, Mike ------------------------------------------------------------------------------------------------------------------------------------ Mathematical Sciences, TJ Watson Research Center mhender at watson.ibm.com http://www.research.ibm.com/people/h/henderson/ http://multifario.sourceforge.net/ From: Satish Balay To: PETSc users list Date: 09/27/2010 12:30 PM Subject: Re: [petsc-users] installation of 3.1-p4 under cygwin Sent by: petsc-users-bounces at mcs.anl.gov > --prefix=/home/mhender/InstalledSoftware > Using include paths: > -I/home/mhender/InstalledSoftware/petsc-3.1-p4/cygwin-c-debug/include > -I/home/mhender/InstalledSoftware/petsc-3.1-p4/include > -I/home/mhender/InstalledSoftware/petsc-3.1-p4/cygwin-c-debug/include > -I/home/mhender/include -I/home/mhender/InstalledSoftware/include Do you have old install of petsc lurking in this prefix location? What do you have for: ls /home/mhender/include ls /home/mhender/InstalledSoftware/include Also - the attached configure.log was a link-file - not the actual logfile. [if needed - you can resend configure.log to petsc-maint - and not the mailing list] Satish On Mon, 27 Sep 2010, Michael E Henderson wrote: > I'm trying to install petsc-3.1-p4 under cygwin. > > I'm getting errors in the make related to PetscMap that I can't figure > out. 
They're like: > > vhyp.c: In function `VecHYPRE_IJVectorCreate': > vhyp.c:19: error: invalid type argument of `->' (have `PetscMap') > vhyp.c:19: error: invalid type argument of `->' (have `PetscMap') > vhyp.c: In function `VecHYPRE_IJVectorCopy': > vhyp.c:36: error: invalid type argument of `->' (have `PetscMap') > vhyp.c: In function `VecHYPRE_IJVectorCopyFrom': > vhyp.c:51: error: invalid type argument of `->' (have `PetscMap') > > > > Thanks, > > Mike Henderson > ------------------------------------------------------------------------------------------------------------------------------------ > Mathematical Sciences, TJ Watson Research Center > mhender at watson.ibm.com > http://www.research.ibm.com/people/h/henderson/ > http://multifario.sourceforge.net/ > -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Mon Sep 27 11:36:11 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Mon, 27 Sep 2010 11:36:11 -0500 Subject: [petsc-users] matrix-matrix addition In-Reply-To: References: <4CA031F3.8070002@cims.nyu.edu> <0EB5C255-BBCA-47F2-9384-C9FECC611E34@mcs.anl.gov> Message-ID: <3720FE21-58F0-4688-9BA9-89795A1FD1F5@mcs.anl.gov> On Sep 27, 2010, at 11:30 AM, Jed Brown wrote: > On Mon, Sep 27, 2010 at 18:24, Barry Smith wrote: >> >> We need an efficient MatAXPY() that works well for any combination of nonzero patterns. This is not terribly difficult to write. Basically for each row create a linked list of the nonzeros in it from the first matrix and then merge in the nonzeros from the second matrix for that row, much like in the LU symbolic factorizations. This will give you row counts, then preallocate the correct nonzero matrix and do the MatSetValues() for the first matrix first row then the second matrix first row etc. Finally swap in the new matrix body into the current matrix. > > I don't see why you'd need a linked list for this, you have two sorted > arrays (one for each row) and just need to count the number of unique > elements. It's one loop. You need to merge the two arrays, don't you? I was suggested the linked list to do the merge, but I am sure you are correct there are other better ways to do merges. Barry > > Jed From griffith at cims.nyu.edu Mon Sep 27 11:38:58 2010 From: griffith at cims.nyu.edu (Boyce Griffith) Date: Mon, 27 Sep 2010 12:38:58 -0400 Subject: [petsc-users] matrix-matrix addition In-Reply-To: <47B2D926-3DA1-4DF4-B772-187FCFF9AE77@mcs.anl.gov> References: <4CA031F3.8070002@cims.nyu.edu> <4CA0C69A.9050407@cims.nyu.edu> <47B2D926-3DA1-4DF4-B772-187FCFF9AE77@mcs.anl.gov> Message-ID: <4CA0C8A2.1010201@cims.nyu.edu> OK, well if I get something working before you all do, I will send along the code. Thanks, -- Boyce On 9/27/10 12:34 PM, Barry Smith wrote: > > Boyce, > > Don't mess with this bad advice. It is just making your live complicated. Use a fast MatAXPY() either write it yourself or wait for use. > > Barry > > On Sep 27, 2010, at 11:30 AM, Boyce Griffith wrote: > >> >> >> On 9/27/10 10:47 AM, Hong Zhang wrote: >>> Boyce, >>> >>>> The easiest way I can think of to handle this is to make P^T C P, then >>>> extract the nonzero >>>> structure with Mat with GetIJ() and create B. You would have to explicitly >>>> put zeros in B >>>> too or else AssemblyEnd() will shrink it. Maybe someone else can think of an >>>> easier way. >>> >>> This is what I would suggest as well. 
>> >> It looks like MatGetRowIJ will do the trick for SeqAIJ matrices, but it appears that similar functionality does not exist for MPIAIJ matrices. Is there an approach that you would recommend for the parallel case? >> >> Thanks! >> >> -- Boyce > > > From u.tabak at tudelft.nl Mon Sep 27 11:38:12 2010 From: u.tabak at tudelft.nl (Umut Tabak) Date: Mon, 27 Sep 2010 18:38:12 +0200 Subject: [petsc-users] reorder a sparse matrix Message-ID: <4CA0C874.2000404@tudelft.nl> Dear all, How can I reorder a sparse matrix in Petsc? Sth like in a dirty MATLAB like pseudo-code: A = diag([1 3 4 5]); neworder = [2 4 1 3]; B = A(neworder, neworder) I know I will destroy the sparsity pattern, but for my application there is no other way, the matrices should be reordered for some partitioning. Best regards, Umut From jed at 59A2.org Mon Sep 27 11:39:30 2010 From: jed at 59A2.org (Jed Brown) Date: Mon, 27 Sep 2010 18:39:30 +0200 Subject: [petsc-users] matrix-matrix addition In-Reply-To: <3720FE21-58F0-4688-9BA9-89795A1FD1F5@mcs.anl.gov> References: <4CA031F3.8070002@cims.nyu.edu> <0EB5C255-BBCA-47F2-9384-C9FECC611E34@mcs.anl.gov> <3720FE21-58F0-4688-9BA9-89795A1FD1F5@mcs.anl.gov> Message-ID: On Mon, Sep 27, 2010 at 18:36, Barry Smith wrote: > ? You need to merge the two arrays, don't you? I was suggested the linked list to do the merge, but I am sure you are correct there are other better ways to do merges. Why do you need to actually merge them? You need to count the number in the diagonal and off-diagonal part, but then aren't you going to build a matrix using MatSetValues(...,ADD_VALUES)? You could of course merge them in a temporary buffer (trivial as long as it has been allocated large enough) to get one MatSetValues for the whole row, instead of one for the row from A and another for the row from B. The rows are both sorted, so they can be merged into a buffer in a single pass. Jed From balay at mcs.anl.gov Mon Sep 27 11:41:18 2010 From: balay at mcs.anl.gov (Satish Balay) Date: Mon, 27 Sep 2010 11:41:18 -0500 (CDT) Subject: [petsc-users] installation of 3.1-p4 under cygwin In-Reply-To: References: Message-ID: When using --prefix - I normally install each package/version in separate locations. i.e mpich configure --prefix=$HOME/soft/mpich2-1.2.1p1 etc.. petsc configure --with-mpi-dir=$HOME/soft/mpich2-1.2.1p1 ... Satish On Mon, 27 Sep 2010, Michael E Henderson wrote: > Yes! I've made this mistake before. I did have older petsc include files > from the days when I "installed" petsc. I don't do that anymore by the > way. > > Thanks, > > Mike > ------------------------------------------------------------------------------------------------------------------------------------ > Mathematical Sciences, TJ Watson Research Center > mhender at watson.ibm.com > http://www.research.ibm.com/people/h/henderson/ > http://multifario.sourceforge.net/ > > > > > From: > Satish Balay > To: > PETSc users list > Date: > 09/27/2010 12:30 PM > Subject: > Re: [petsc-users] installation of 3.1-p4 under cygwin > Sent by: > petsc-users-bounces at mcs.anl.gov > > > > > --prefix=/home/mhender/InstalledSoftware > > > Using include paths: > > -I/home/mhender/InstalledSoftware/petsc-3.1-p4/cygwin-c-debug/include > > -I/home/mhender/InstalledSoftware/petsc-3.1-p4/include > > -I/home/mhender/InstalledSoftware/petsc-3.1-p4/cygwin-c-debug/include > > -I/home/mhender/include -I/home/mhender/InstalledSoftware/include > > > Do you have old install of petsc lurking in this prefix location? 
What > do you have for: > > ls /home/mhender/include > ls /home/mhender/InstalledSoftware/include > > Also - the attached configure.log was a link-file - not the actual > logfile. [if needed - you can resend configure.log to petsc-maint - > and not the mailing list] > > Satish > > > On Mon, 27 Sep 2010, Michael E Henderson wrote: > > > I'm trying to install petsc-3.1-p4 under cygwin. > > > > I'm getting errors in the make related to PetscMap that I can't figure > > out. They're like: > > > > vhyp.c: In function `VecHYPRE_IJVectorCreate': > > vhyp.c:19: error: invalid type argument of `->' (have `PetscMap') > > vhyp.c:19: error: invalid type argument of `->' (have `PetscMap') > > vhyp.c: In function `VecHYPRE_IJVectorCopy': > > vhyp.c:36: error: invalid type argument of `->' (have `PetscMap') > > vhyp.c: In function `VecHYPRE_IJVectorCopyFrom': > > vhyp.c:51: error: invalid type argument of `->' (have `PetscMap') > > > > > > > > Thanks, > > > > Mike Henderson > > > ------------------------------------------------------------------------------------------------------------------------------------ > > Mathematical Sciences, TJ Watson Research Center > > mhender at watson.ibm.com > > http://www.research.ibm.com/people/h/henderson/ > > http://multifario.sourceforge.net/ > > > > > > From knepley at gmail.com Mon Sep 27 11:45:53 2010 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 27 Sep 2010 12:45:53 -0400 Subject: [petsc-users] reorder a sparse matrix In-Reply-To: <4CA0C874.2000404@tudelft.nl> References: <4CA0C874.2000404@tudelft.nl> Message-ID: http://www.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-current/docs/manualpages/Mat/MatPermute.html Matt On Mon, Sep 27, 2010 at 12:38 PM, Umut Tabak wrote: > Dear all, > > How can I reorder a sparse matrix in Petsc? > > Sth like in a dirty MATLAB like pseudo-code: > A = diag([1 3 4 5]); > neworder = [2 4 1 3]; > B = A(neworder, neworder) > > I know I will destroy the sparsity pattern, but for my application there is > no other way, the matrices should be reordered for some partitioning. > > Best regards, > Umut > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at 59A2.org Mon Sep 27 11:46:42 2010 From: jed at 59A2.org (Jed Brown) Date: Mon, 27 Sep 2010 18:46:42 +0200 Subject: [petsc-users] reorder a sparse matrix In-Reply-To: <4CA0C874.2000404@tudelft.nl> References: <4CA0C874.2000404@tudelft.nl> Message-ID: On Mon, Sep 27, 2010 at 18:38, Umut Tabak wrote: > Dear all, > > How can I reorder a sparse matrix in Petsc? > > Sth like in a dirty MATLAB like pseudo-code: > A = diag([1 3 4 5]); > neworder = [2 4 1 3]; > B = A(neworder, neworder) Use MatPermute or (less storage) put the permutation in a wrapper to MatSetValues (like the localtoglobalmapping with MatSetValuesLocal), so that you assemble the matrix you want to begin with. Jed From dalcinl at gmail.com Mon Sep 27 11:47:41 2010 From: dalcinl at gmail.com (Lisandro Dalcin) Date: Mon, 27 Sep 2010 13:47:41 -0300 Subject: [petsc-users] petsc4py, plotting DA and writing to file In-Reply-To: <32FDC599-5386-426E-82BF-FF6B6BFE3B05@mcs.anl.gov> References: <32FDC599-5386-426E-82BF-FF6B6BFE3B05@mcs.anl.gov> Message-ID: On 27 September 2010 13:33, Barry Smith wrote: > > ?Lisandro > > ? ?In C he could do VecView(vec,PETSC_VIEWER_DRAW_WORLD); can he not do the equivalent in Python? 
> What syntax would he use? viewer = PETSc.Viewer.DRAW(globalvec.comm) globalvec.view(viewer) # or viewer(globalvec) > > Perhaps the 2d bratu python example could demonstrate this since it is simple and there is no monkeying with copying to one process etc. > Well, that example does not even use a DA, the plot is going to be a bit ugly. > ? Barry > > On Sep 27, 2010, at 11:27 AM, Lisandro Dalcin wrote: > >> On 27 September 2010 13:12, Amal Alghamdi wrote: >>> Dear All, >>> I'm using the DA structure in petsc4py. I'd like to know please what is the >>> right way to: >>> plot the DA vector (the global vector). >>> write the global vector to a file. >>> Is that each process writes or draws its own part? or I should communicate >>> all the data to one process? or none of these?!! >>> Thank you very much >>> Amal >> >> Use a PETSc.Viewer().createBinary() to save the global vector. Each >> process will save its own part. >> >> In order to plot it, I think you should use numpy.fromfile() to load >> the data, and then plot the array. >> >> Other way would be to use a PETSc.Scatter.toZero() to get the data at >> process 0, and then plot it >> >> scatter, seqvec = PETSc.Scatter.toZero(globalvec) >> im = PETSc.InsertMode.INSERT_VALUES >> sm = PETSc.ScatterMode.FORWARD >> scatter.scatter(globalvec, seqvec, im, sm) >> if globalvec.comm.rank == 0: >> ? ?plot(seqvec.array) >> >> >> -- >> Lisandro Dalcin >> --------------- >> CIMEC (INTEC/CONICET-UNL) >> Predio CONICET-Santa Fe >> Colectora RN 168 Km 472, Paraje El Pozo >> Tel: +54-342-4511594 (ext 1011) >> Tel/Fax: +54-342-4511169 > > -- Lisandro Dalcin --------------- CIMEC (INTEC/CONICET-UNL) Predio CONICET-Santa Fe Colectora RN 168 Km 472, Paraje El Pozo Tel: +54-342-4511594 (ext 1011) Tel/Fax: +54-342-4511169 From jed at 59A2.org Mon Sep 27 11:54:34 2010 From: jed at 59A2.org (Jed Brown) Date: Mon, 27 Sep 2010 18:54:34 +0200 Subject: [petsc-users] petsc4py, plotting DA and writing to file In-Reply-To: References: <32FDC599-5386-426E-82BF-FF6B6BFE3B05@mcs.anl.gov> Message-ID: On Mon, Sep 27, 2010 at 18:47, Lisandro Dalcin wrote: > viewer = PETSc.Viewer.DRAW(globalvec.comm) > globalvec.view(viewer) # or viewer(globalvec) What do you think about having a top-level matplotlib viewer? Ideally it would have an option to drop you into an interactive python session after the initial view. I know this brings up the multiple dispatch problem, since the viewer would naturally be distributed with petsc4p, but then you don't get to modify the case statement in VecView_XX. Jed From bsmith at mcs.anl.gov Mon Sep 27 12:01:07 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Mon, 27 Sep 2010 12:01:07 -0500 Subject: [petsc-users] matrix-matrix addition In-Reply-To: References: <4CA031F3.8070002@cims.nyu.edu> <0EB5C255-BBCA-47F2-9384-C9FECC611E34@mcs.anl.gov> <3720FE21-58F0-4688-9BA9-89795A1FD1F5@mcs.anl.gov> Message-ID: <911B11BA-C7F7-4308-870E-97BEE42359EA@mcs.anl.gov> On Sep 27, 2010, at 11:39 AM, Jed Brown wrote: > On Mon, Sep 27, 2010 at 18:36, Barry Smith wrote: >> You need to merge the two arrays, don't you? I was suggested the linked list to do the merge, but I am sure you are correct there are other better ways to do merges. > > Why do you need to actually merge them? You need to count the number > in the diagonal and off-diagonal part, but then aren't you going to > build a matrix using MatSetValues(...,ADD_VALUES)? 
You could of > course merge them in a temporary buffer (trivial as long as it has > been allocated large enough) to get one MatSetValues for the whole > row, instead of one for the row from A and another for the row from B. > The rows are both sorted, so they can be merged into a buffer in a > single pass. > > Jed Jed, (1) this has nothing to do with parallel matrices and diagonal and off diagonal parts (2) this is about preallocation, not setting the values. If you do not merge the column indices from A and B but simply count them all you will get over allocation, at worst a factor of two, though usually it would just be a little. Barry From bsmith at mcs.anl.gov Mon Sep 27 12:02:16 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Mon, 27 Sep 2010 12:02:16 -0500 Subject: [petsc-users] petsc4py, plotting DA and writing to file In-Reply-To: References: <32FDC599-5386-426E-82BF-FF6B6BFE3B05@mcs.anl.gov> Message-ID: <14B3B436-A578-4745-9783-B6F86E20DC01@mcs.anl.gov> On Sep 27, 2010, at 11:47 AM, Lisandro Dalcin wrote: > On 27 September 2010 13:33, Barry Smith wrote: >> >> Lisandro >> >> In C he could do VecView(vec,PETSC_VIEWER_DRAW_WORLD); can he not do the equivalent in Python? >> What syntax would he use? > > viewer = PETSc.Viewer.DRAW(globalvec.comm) > globalvec.view(viewer) # or viewer(globalvec) > >> >> Perhaps the 2d bratu python example could demonstrate this since it is simple and there is no monkeying with copying to one process etc. >> > > Well, that example does not even use a DA, the plot is going to be a bit ugly. Sorry. Barry > >> Barry >> >> On Sep 27, 2010, at 11:27 AM, Lisandro Dalcin wrote: >> >>> On 27 September 2010 13:12, Amal Alghamdi wrote: >>>> Dear All, >>>> I'm using the DA structure in petsc4py. I'd like to know please what is the >>>> right way to: >>>> plot the DA vector (the global vector). >>>> write the global vector to a file. >>>> Is that each process writes or draws its own part? or I should communicate >>>> all the data to one process? or none of these?!! >>>> Thank you very much >>>> Amal >>> >>> Use a PETSc.Viewer().createBinary() to save the global vector. Each >>> process will save its own part. >>> >>> In order to plot it, I think you should use numpy.fromfile() to load >>> the data, and then plot the array. 
>>> >>> Other way would be to use a PETSc.Scatter.toZero() to get the data at >>> process 0, and then plot it >>> >>> scatter, seqvec = PETSc.Scatter.toZero(globalvec) >>> im = PETSc.InsertMode.INSERT_VALUES >>> sm = PETSc.ScatterMode.FORWARD >>> scatter.scatter(globalvec, seqvec, im, sm) >>> if globalvec.comm.rank == 0: >>> plot(seqvec.array) >>> >>> >>> -- >>> Lisandro Dalcin >>> --------------- >>> CIMEC (INTEC/CONICET-UNL) >>> Predio CONICET-Santa Fe >>> Colectora RN 168 Km 472, Paraje El Pozo >>> Tel: +54-342-4511594 (ext 1011) >>> Tel/Fax: +54-342-4511169 >> >> > > > > -- > Lisandro Dalcin > --------------- > CIMEC (INTEC/CONICET-UNL) > Predio CONICET-Santa Fe > Colectora RN 168 Km 472, Paraje El Pozo > Tel: +54-342-4511594 (ext 1011) > Tel/Fax: +54-342-4511169 From dalcinl at gmail.com Mon Sep 27 12:03:34 2010 From: dalcinl at gmail.com (Lisandro Dalcin) Date: Mon, 27 Sep 2010 14:03:34 -0300 Subject: [petsc-users] petsc4py, plotting DA and writing to file In-Reply-To: <14B3B436-A578-4745-9783-B6F86E20DC01@mcs.anl.gov> References: <32FDC599-5386-426E-82BF-FF6B6BFE3B05@mcs.anl.gov> <14B3B436-A578-4745-9783-B6F86E20DC01@mcs.anl.gov> Message-ID: On 27 September 2010 14:02, Barry Smith wrote: > > On Sep 27, 2010, at 11:47 AM, Lisandro Dalcin wrote: > >> On 27 September 2010 13:33, Barry Smith wrote: >>> >>> ?Lisandro >>> >>> ? ?In C he could do VecView(vec,PETSC_VIEWER_DRAW_WORLD); can he not do the equivalent in Python? >>> What syntax would he use? >> >> viewer = PETSc.Viewer.DRAW(globalvec.comm) >> globalvec.view(viewer) # or viewer(globalvec) >> >>> >>> Perhaps the 2d bratu python example could demonstrate this since it is simple and there is no monkeying with copying to one process etc. >>> >> >> Well, that example does not even use a DA, the plot is going to be a bit ugly. > > ?Sorry. Barry > I could add a poisson2d implemented with DA and working in parallel. >> >>> ? Barry >>> >>> On Sep 27, 2010, at 11:27 AM, Lisandro Dalcin wrote: >>> >>>> On 27 September 2010 13:12, Amal Alghamdi wrote: >>>>> Dear All, >>>>> I'm using the DA structure in petsc4py. I'd like to know please what is the >>>>> right way to: >>>>> plot the DA vector (the global vector). >>>>> write the global vector to a file. >>>>> Is that each process writes or draws its own part? or I should communicate >>>>> all the data to one process? or none of these?!! >>>>> Thank you very much >>>>> Amal >>>> >>>> Use a PETSc.Viewer().createBinary() to save the global vector. Each >>>> process will save its own part. >>>> >>>> In order to plot it, I think you should use numpy.fromfile() to load >>>> the data, and then plot the array. >>>> >>>> Other way would be to use a PETSc.Scatter.toZero() to get the data at >>>> process 0, and then plot it >>>> >>>> scatter, seqvec = PETSc.Scatter.toZero(globalvec) >>>> im = PETSc.InsertMode.INSERT_VALUES >>>> sm = PETSc.ScatterMode.FORWARD >>>> scatter.scatter(globalvec, seqvec, im, sm) >>>> if globalvec.comm.rank == 0: >>>> ? 
?plot(seqvec.array) >>>> >>>> >>>> -- >>>> Lisandro Dalcin >>>> --------------- >>>> CIMEC (INTEC/CONICET-UNL) >>>> Predio CONICET-Santa Fe >>>> Colectora RN 168 Km 472, Paraje El Pozo >>>> Tel: +54-342-4511594 (ext 1011) >>>> Tel/Fax: +54-342-4511169 >>> >>> >> >> >> >> -- >> Lisandro Dalcin >> --------------- >> CIMEC (INTEC/CONICET-UNL) >> Predio CONICET-Santa Fe >> Colectora RN 168 Km 472, Paraje El Pozo >> Tel: +54-342-4511594 (ext 1011) >> Tel/Fax: +54-342-4511169 > > -- Lisandro Dalcin --------------- CIMEC (INTEC/CONICET-UNL) Predio CONICET-Santa Fe Colectora RN 168 Km 472, Paraje El Pozo Tel: +54-342-4511594 (ext 1011) Tel/Fax: +54-342-4511169 From jed at 59A2.org Mon Sep 27 12:04:22 2010 From: jed at 59A2.org (Jed Brown) Date: Mon, 27 Sep 2010 19:04:22 +0200 Subject: [petsc-users] matrix-matrix addition In-Reply-To: <911B11BA-C7F7-4308-870E-97BEE42359EA@mcs.anl.gov> References: <4CA031F3.8070002@cims.nyu.edu> <0EB5C255-BBCA-47F2-9384-C9FECC611E34@mcs.anl.gov> <3720FE21-58F0-4688-9BA9-89795A1FD1F5@mcs.anl.gov> <911B11BA-C7F7-4308-870E-97BEE42359EA@mcs.anl.gov> Message-ID: On Mon, Sep 27, 2010 at 19:01, Barry Smith wrote: > If you do not merge the column indices from A and B but simply count them all you will get over allocation, at worst a factor of two, though usually it would just be a little. You count the number of *unique* entries. You have two indices that run through the arrays aj and bj, counting duplicates only once. This is easy to do since they are both sorted. Jed From dalcinl at gmail.com Mon Sep 27 12:06:44 2010 From: dalcinl at gmail.com (Lisandro Dalcin) Date: Mon, 27 Sep 2010 14:06:44 -0300 Subject: [petsc-users] petsc4py, plotting DA and writing to file In-Reply-To: References: <32FDC599-5386-426E-82BF-FF6B6BFE3B05@mcs.anl.gov> Message-ID: On 27 September 2010 13:54, Jed Brown wrote: > On Mon, Sep 27, 2010 at 18:47, Lisandro Dalcin wrote: >> viewer = PETSc.Viewer.DRAW(globalvec.comm) >> globalvec.view(viewer) # or viewer(globalvec) > > What do you think about having a top-level matplotlib viewer? Yes, we could have it, and a VisIt plugin, and a ParaView one, and what about Mayavi, and ... This could go to a separate petsc4py.plotting module (I prefer to not contaminate the petsc4py.PETSc namespace with stuff that is not pure-PETSc) >?Ideally > it would have an option to drop you into an interactive python session > after the initial view. > Could you elaborate? > I know this brings up the multiple dispatch problem, since the viewer > would naturally be distributed with petsc4p, but then you don't get to > modify the case statement in VecView_XX. > Sorry, now I'm confused. Are you suggesting this for petsc4py, or core PETSc? 
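A user-level helper along those lines can already be written from the Scatter.toZero recipe quoted earlier in this thread. The sketch below is only that, a sketch: matplotlib is assumed to be available on process 0, the function name is invented for the example, and only the scatter calls shown above are existing petsc4py API.

from petsc4py import PETSc
import matplotlib.pyplot as plt

def plot_on_rank0(globalvec, title=None):
    # gather the distributed vector onto process 0, then plot it there
    scatter, seqvec = PETSc.Scatter.toZero(globalvec)
    scatter.scatter(globalvec, seqvec,
                    PETSc.InsertMode.INSERT_VALUES,
                    PETSc.ScatterMode.FORWARD)
    if globalvec.comm.rank == 0:
        # for a 2d DA vector you would first move it to the DA's natural
        # ordering before reshaping; here the raw entries are plotted as-is
        plt.plot(seqvec.array)
        if title is not None:
            plt.title(title)
        plt.show()
    scatter.destroy()
    seqvec.destroy()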
-- Lisandro Dalcin --------------- CIMEC (INTEC/CONICET-UNL) Predio CONICET-Santa Fe Colectora RN 168 Km 472, Paraje El Pozo Tel: +54-342-4511594 (ext 1011) Tel/Fax: +54-342-4511169 From bsmith at mcs.anl.gov Mon Sep 27 12:06:58 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Mon, 27 Sep 2010 12:06:58 -0500 Subject: [petsc-users] matrix-matrix addition In-Reply-To: References: <4CA031F3.8070002@cims.nyu.edu> <0EB5C255-BBCA-47F2-9384-C9FECC611E34@mcs.anl.gov> <3720FE21-58F0-4688-9BA9-89795A1FD1F5@mcs.anl.gov> <911B11BA-C7F7-4308-870E-97BEE42359EA@mcs.anl.gov> Message-ID: On Sep 27, 2010, at 12:04 PM, Jed Brown wrote: > On Mon, Sep 27, 2010 at 19:01, Barry Smith wrote: >> If you do not merge the column indices from A and B but simply count them all you will get over allocation, at worst a factor of two, though usually it would just be a little. > > You count the number of *unique* entries. You have two indices that > run through the arrays aj and bj, counting duplicates only once. This > is easy to do since they are both sorted. Yes, sounds like it could work and definitely better than my suggestion. Shri, can you try this? Barry > > Jed From bsmith at mcs.anl.gov Mon Sep 27 12:07:14 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Mon, 27 Sep 2010 12:07:14 -0500 Subject: [petsc-users] petsc4py, plotting DA and writing to file In-Reply-To: References: <32FDC599-5386-426E-82BF-FF6B6BFE3B05@mcs.anl.gov> Message-ID: On Sep 27, 2010, at 11:54 AM, Jed Brown wrote: > On Mon, Sep 27, 2010 at 18:47, Lisandro Dalcin wrote: >> viewer = PETSc.Viewer.DRAW(globalvec.comm) >> globalvec.view(viewer) # or viewer(globalvec) > > What do you think about having a top-level matplotlib viewer? Ideally > it would have an option to drop you into an interactive python session > after the initial view. > > I know this brings up the multiple dispatch problem, since the viewer > would naturally be distributed with petsc4p, but then you don't get to > modify the case statement in VecView_XX. This could be done by putting into the C code the usual "case" handling for the new viewer and then have it dispatch back to the python code. Barry > > Jed From jed at 59A2.org Mon Sep 27 12:09:36 2010 From: jed at 59A2.org (Jed Brown) Date: Mon, 27 Sep 2010 19:09:36 +0200 Subject: [petsc-users] petsc4py, plotting DA and writing to file In-Reply-To: References: <32FDC599-5386-426E-82BF-FF6B6BFE3B05@mcs.anl.gov> Message-ID: On Mon, Sep 27, 2010 at 19:07, Barry Smith wrote: > This could be done by putting into the C code the usual "case" handling for the new viewer and then have > it dispatch back to the python code. Right, but then it sounds like you're distributing the viewer with PETSc (instead of petsc4py of some third-party plugin). We had a long thread a while ago about making multiple dispatch runtime-extensible in both arguments. Jed From bsmith at mcs.anl.gov Mon Sep 27 12:13:55 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Mon, 27 Sep 2010 12:13:55 -0500 Subject: [petsc-users] petsc4py, plotting DA and writing to file In-Reply-To: References: <32FDC599-5386-426E-82BF-FF6B6BFE3B05@mcs.anl.gov> Message-ID: <1DF41E16-5149-4B8F-811C-B0F7A6815DE2@mcs.anl.gov> On Sep 27, 2010, at 12:09 PM, Jed Brown wrote: > On Mon, Sep 27, 2010 at 19:07, Barry Smith wrote: >> This could be done by putting into the C code the usual "case" handling for the new viewer and then have >> it dispatch back to the python code. 
> > Right, but then it sounds like you're distributing the viewer with > PETSc (instead of petsc4py of some third-party plugin). No, the viewer is distributed with petsc4py or some other package, but yes the PETSc source code is augmented also. In fact one could organize it so one extra "dispatcher" could support many different python viewers; essentially a shell Viewer :-) > We had a long > thread a while ago about making multiple dispatch runtime-extensible > in both arguments. I was just pointing out that something "quick and dirty" can be done now without the multiple dispatch system. I am not opposed to a multiple dispatch system to handle this; but no one has proposed specifics for such a system that pass the "good enough for PETSc" test. Barry > > Jed From jed at 59A2.org Mon Sep 27 12:14:37 2010 From: jed at 59A2.org (Jed Brown) Date: Mon, 27 Sep 2010 19:14:37 +0200 Subject: [petsc-users] petsc4py, plotting DA and writing to file In-Reply-To: References: <32FDC599-5386-426E-82BF-FF6B6BFE3B05@mcs.anl.gov> Message-ID: On Mon, Sep 27, 2010 at 19:06, Lisandro Dalcin wrote: > On 27 September 2010 13:54, Jed Brown wrote: >> On Mon, Sep 27, 2010 at 18:47, Lisandro Dalcin wrote: >>> viewer = PETSc.Viewer.DRAW(globalvec.comm) >>> globalvec.view(viewer) # or viewer(globalvec) >> >> What do you think about having a top-level matplotlib viewer? > > Yes, we could have it, and a VisIt plugin, and a ParaView one, and > what about Mayavi, and ... This could > go to a separate petsc4py.plotting module (I prefer to not contaminate > the petsc4py.PETSc namespace with stuff that is not pure-PETSc) > >>?Ideally >> it would have an option to drop you into an interactive python session >> after the initial view. >> > > Could you elaborate? Usually you need a bit of annotation to produce a figure for a talk or publication. I don't find it reasonable to expose an API on the PETSc side that is capable of everything the package can do (i suppose you could shuttle strings, but that's lame for debugging in the event of an API change, for example). So another option would be to get an interactive Python session that the user could use to manipulate the figure and write it out in whatever formats they like (probably by passing it through a filter that they have written). >> I know this brings up the multiple dispatch problem, since the viewer >> would naturally be distributed with petsc4p, but then you don't get to >> modify the case statement in VecView_XX. >> > > Sorry, now I'm confused. Are you suggesting this for petsc4py, or core PETSc? petsc4py or any other non-core PETSc. Jed From jed at 59A2.org Mon Sep 27 12:15:38 2010 From: jed at 59A2.org (Jed Brown) Date: Mon, 27 Sep 2010 19:15:38 +0200 Subject: [petsc-users] petsc4py, plotting DA and writing to file In-Reply-To: <1DF41E16-5149-4B8F-811C-B0F7A6815DE2@mcs.anl.gov> References: <32FDC599-5386-426E-82BF-FF6B6BFE3B05@mcs.anl.gov> <1DF41E16-5149-4B8F-811C-B0F7A6815DE2@mcs.anl.gov> Message-ID: Agree on both points. On Mon, Sep 27, 2010 at 19:13, Barry Smith wrote: > > On Sep 27, 2010, at 12:09 PM, Jed Brown wrote: > >> On Mon, Sep 27, 2010 at 19:07, Barry Smith wrote: >>> This could be done by putting into the C code the usual "case" handling for the new viewer and then have >>> it dispatch back to the python code. >> >> Right, but then it sounds like you're distributing the viewer with >> PETSc (instead of petsc4py of some third-party plugin). 
> > ?No, the viewer is distributed with petsc4py or some other package, but yes the PETSc source code is augmented also. In fact one could organize it so one extra "dispatcher" could support many different python viewers; essentially a shell Viewer :-) > > >> ?We had a long >> thread a while ago about making multiple dispatch runtime-extensible >> in both arguments. > > ? I was just pointing out that something "quick and dirty" can be done now without the multiple dispatch system. > > ? I am not opposed to a multiple dispatch system to handle this; but no one has proposed specifics for such a system that pass the "good enough for PETSc" test. > > ?Barry > >> >> Jed > > From abhyshr at mcs.anl.gov Mon Sep 27 12:24:03 2010 From: abhyshr at mcs.anl.gov (Shri) Date: Mon, 27 Sep 2010 11:24:03 -0600 (GMT-06:00) Subject: [petsc-users] matrix-matrix addition In-Reply-To: Message-ID: <1043392716.275931285608243167.JavaMail.root@zimbra.anl.gov> Ok. I'll work on it this afternoon. ----- Barry Smith wrote: > > On Sep 27, 2010, at 12:04 PM, Jed Brown wrote: > > > On Mon, Sep 27, 2010 at 19:01, Barry Smith wrote: > >> If you do not merge the column indices from A and B but simply count them all you will get over allocation, at worst a factor of two, though usually it would just be a little. > > > > You count the number of *unique* entries. You have two indices that > > run through the arrays aj and bj, counting duplicates only once. This > > is easy to do since they are both sorted. > > Yes, sounds like it could work and definitely better than my suggestion. > > Shri, can you try this? > > Barry > > > > > Jed > From amal.ghamdi at kaust.edu.sa Mon Sep 27 12:34:40 2010 From: amal.ghamdi at kaust.edu.sa (Amal Alghamdi) Date: Mon, 27 Sep 2010 20:34:40 +0300 Subject: [petsc-users] petsc4py, plotting DA and writing to file In-Reply-To: References: <32FDC599-5386-426E-82BF-FF6B6BFE3B05@mcs.anl.gov> <1DF41E16-5149-4B8F-811C-B0F7A6815DE2@mcs.anl.gov> Message-ID: Thank you very much everybody for the prompt help. Amal On Mon, Sep 27, 2010 at 8:15 PM, Jed Brown wrote: > Agree on both points. > > On Mon, Sep 27, 2010 at 19:13, Barry Smith wrote: > > > > On Sep 27, 2010, at 12:09 PM, Jed Brown wrote: > > > >> On Mon, Sep 27, 2010 at 19:07, Barry Smith wrote: > >>> This could be done by putting into the C code the usual "case" handling > for the new viewer and then have > >>> it dispatch back to the python code. > >> > >> Right, but then it sounds like you're distributing the viewer with > >> PETSc (instead of petsc4py of some third-party plugin). > > > > No, the viewer is distributed with petsc4py or some other package, but > yes the PETSc source code is augmented also. In fact one could organize it > so one extra "dispatcher" could support many different python viewers; > essentially a shell Viewer :-) > > > > > >> We had a long > >> thread a while ago about making multiple dispatch runtime-extensible > >> in both arguments. > > > > I was just pointing out that something "quick and dirty" can be done > now without the multiple dispatch system. > > > > I am not opposed to a multiple dispatch system to handle this; but no > one has proposed specifics for such a system that pass the "good enough for > PETSc" test. > > > > Barry > > > >> > >> Jed > > > > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From dalcinl at gmail.com Mon Sep 27 14:57:37 2010 From: dalcinl at gmail.com (Lisandro Dalcin) Date: Mon, 27 Sep 2010 16:57:37 -0300 Subject: [petsc-users] petsc4py, plotting DA and writing to file In-Reply-To: References: <32FDC599-5386-426E-82BF-FF6B6BFE3B05@mcs.anl.gov> <1DF41E16-5149-4B8F-811C-B0F7A6815DE2@mcs.anl.gov> Message-ID: On 27 September 2010 14:34, Amal Alghamdi wrote: > Thank you very much everybody for the prompt help. > Amal > I've just pushed a new demo: matrix-free Poisson 2D http://code.google.com/p/petsc4py/source/browse/demo/poisson2d/poisson2d.py At the very end you have: draw = PETSc.Viewer.DRAW(x.comm) OptDB['draw_pause'] = 1 draw(x) Additionally, I've updated other demos. Take a look here for matplotlib usage gathering values to processor 0: http://code.google.com/p/petsc4py/source/browse/demo/bratu3d/bratu3d.py (note that this demo is 3D, I'm plotting the solution at plane Z=0.5) -- Lisandro Dalcin --------------- CIMEC (INTEC/CONICET-UNL) Predio CONICET-Santa Fe Colectora RN 168 Km 472, Paraje El Pozo Tel: +54-342-4511594 (ext 1011) Tel/Fax: +54-342-4511169 From amorris at mtroyal.ca Mon Sep 27 17:10:56 2010 From: amorris at mtroyal.ca (Alexis Morris) Date: Mon, 27 Sep 2010 16:10:56 -0600 Subject: [petsc-users] Getting a matrix into PETSc Message-ID: <4CA11670.4010806@mtroyal.ca> Hi everyone, I am new to PETSc, and am really excited about using it in combination with SLEPc to find the eigenvalues of a large hermitian sparse matrix. I have read through the user manual, and one thing is really confusing me: how do I get my matrix into PETSc? My code is organized into separate modules (I am using Fortran). One of the modules calculates my matrix H and stores it in a compressed sparse row format. I would like to then call from my main program an eigenvalue subroutine and pass the matrix H to it. In this eigenvalue subroutine, I would use PETSc/SLEPc. How do I get it to use my pre-calculated matrix? For example, here is some code for my eigenvalue subroutine. In this particular problem, my input matrix is just stored in the regular full matrix format so that I could more easily test the routine. subroutine eigen(A) #include "finclude/petscdef.h" #include "finclude/slepcepsdef.h" use slepceps use kinds_module !Contains data type definitions, like r8=double precision real. implicit none real (kind=r8), dimension(:,:), intent(in) :: A ! local variables PetscErrorCode :: ierr PetscMPIInt :: N ! Size of A PetscMPIInt :: nconv ! number of converged eigenpairs Mat :: A_PETSC ! The PETSc equivalent of my input A ! Get size of A N = size(A, dim=1) ! Get my matrix into SLEPc. call SlepcInitialize(PETSC_NULL_CHARACTER,ierr); CHKERRQ(ierr) < This is where I don't know what to do > < How to I do A_PETSC = A> call MatAssemblyBegin(A_PETSC,MAT_FINAL_ASSEMBLY,ierr) call MatAssemblyEnd(A_PETSC,MAT_FINAL_ASSEMBLY,ierr) ! Find eigenpairs with SLEPc EPSCreate( PETSC_COMM_WORLD, eps, ierr) EPSSetOperators( eps, A_PETSC, PETSC_NULL ) EPSSetProblemType( eps, EPS_HERMITIAN ); EPSSetFromOptions( eps ); EPSSolve( eps ); EPSGetConverged( eps, nconv ); ... EPSDestroy( eps ); call SlepcFinalize(ierr) end subroutine eigen Another thing I am confused about: my main program uses the usual fortran data types. Should I worry about these clashing with the PETSc ones like PetscMPIInt, Vec, Mat, etc.? I apologize if these things have already been discussed at length. I have been searching the internet and this mailing list, and can't seem to find anything about getting a matrix into PETSc. 
I appreciate any help. Best regards, Alexis -- Dr Alexis Morris Assistant Professor Department of Mathematics, Physics and Engineering Mount Royal University 4825 Mount Royal Gate SW Calgary, Alberta, Canada T3E 6K6 Phone: (403) 440-8507 Fax: (403) 440-6505 From u.tabak at tudelft.nl Mon Sep 27 18:10:25 2010 From: u.tabak at tudelft.nl (Umut Tabak) Date: Tue, 28 Sep 2010 01:10:25 +0200 Subject: [petsc-users] Getting a matrix into PETSc In-Reply-To: <4CA11670.4010806@mtroyal.ca> References: <4CA11670.4010806@mtroyal.ca> Message-ID: <4CA12461.6000702@tudelft.nl> Alexis Morris wrote: > Hi everyone, > > I am new to PETSc, and am really excited about using it in > combination with SLEPc to find the eigenvalues of a large hermitian > sparse matrix. I have read through the user manual, and one thing is > really confusing me: how do I get my matrix into PETSc? > > My code is organized into separate modules (I am using Fortran). One > of the modules calculates my matrix H and stores it in a compressed > sparse row format. I would like to then call from my main program an > eigenvalue subroutine and pass the matrix H to it. In this > eigenvalue subroutine, I would use PETSc/SLEPc. How do I get it to > use my pre-calculated matrix? Assuming the matrices are sparse(otherwise using SLEPc does not mean anything.), you should write some routines to form the matrices in Petsc format. For this operation, you should start playing with MatSetValues. I guess this should give you a starting point. HTH, Umut From knepley at gmail.com Mon Sep 27 18:10:33 2010 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 27 Sep 2010 19:10:33 -0400 Subject: [petsc-users] Getting a matrix into PETSc In-Reply-To: <4CA11670.4010806@mtroyal.ca> References: <4CA11670.4010806@mtroyal.ca> Message-ID: On Mon, Sep 27, 2010 at 6:10 PM, Alexis Morris wrote: > Hi everyone, > > I am new to PETSc, and am really excited about using it in combination > with SLEPc to find the eigenvalues of a large hermitian sparse matrix. I > have read through the user manual, and one thing is really confusing me: how > do I get my matrix into PETSc? > > My code is organized into separate modules (I am using Fortran). One of > the modules calculates my matrix H and stores it in a compressed sparse row > format. I would like to then call from my main program an eigenvalue > subroutine and pass the matrix H to it. In this eigenvalue subroutine, I > would use PETSc/SLEPc. How do I get it to use my pre-calculated matrix? > The best thing to do is replace your data structure with the PETSc Mat object. I suppose that you construct your CSR matrix one row at a time. You can do this with PETSc using http://www.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-current/docs/manualpages/Mat/MatSetValues.html so it should be simple translation. To be efficient, there is the one additional step of telling PETSc the length of each row, but you must do this to assemble your CSR structure, so again it should be no extra work. For example, here is some code for my eigenvalue subroutine. In this > particular problem, my input matrix is just stored in the regular full > matrix format so that I could more easily test the routine. > > subroutine eigen(A) > #include "finclude/petscdef.h" > #include "finclude/slepcepsdef.h" > use slepceps > use kinds_module !Contains data type definitions, like r8=double > precision real. > implicit none > real (kind=r8), dimension(:,:), intent(in) :: A > > ! local variables > > PetscErrorCode :: ierr > PetscMPIInt :: N ! 
Size of A > PetscMPIInt :: nconv ! number of converged eigenpairs > Mat :: A_PETSC ! The PETSc equivalent of my input A > > > ! Get size of A > > N = size(A, dim=1) > > ! Get my matrix into SLEPc. > > call SlepcInitialize(PETSC_NULL_CHARACTER,ierr); CHKERRQ(ierr) > > < This is where I don't know what to do > > < How to I do A_PETSC = A> > > call MatAssemblyBegin(A_PETSC,MAT_FINAL_ASSEMBLY,ierr) > call MatAssemblyEnd(A_PETSC,MAT_FINAL_ASSEMBLY,ierr) > > ! Find eigenpairs with SLEPc > > EPSCreate( PETSC_COMM_WORLD, eps, ierr) > EPSSetOperators( eps, A_PETSC, PETSC_NULL ) > EPSSetProblemType( eps, EPS_HERMITIAN ); > EPSSetFromOptions( eps ); > EPSSolve( eps ); > EPSGetConverged( eps, nconv ); > ... > EPSDestroy( eps ); > > call SlepcFinalize(ierr) > end subroutine eigen > > > Another thing I am confused about: my main program uses the usual fortran > data types. Should I worry about these clashing with the PETSc ones like > PetscMPIInt, Vec, Mat, etc.? > When calling PETSc functions, it is best to use the PETSc types, such as PetscInt. Thanks, Matt > I apologize if these things have already been discussed at length. I have > been searching the internet and this mailing list, and can't seem to find > anything about getting a matrix into PETSc. I appreciate any help. > > Best regards, > Alexis > > > -- > Dr Alexis Morris > Assistant Professor > Department of Mathematics, Physics and Engineering > Mount Royal University > 4825 Mount Royal Gate SW > Calgary, Alberta, Canada > T3E 6K6 > Phone: (403) 440-8507 > Fax: (403) 440-6505 > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Mon Sep 27 19:33:10 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Mon, 27 Sep 2010 19:33:10 -0500 Subject: [petsc-users] Getting a matrix into PETSc In-Reply-To: <4CA11670.4010806@mtroyal.ca> References: <4CA11670.4010806@mtroyal.ca> Message-ID: <5D12734E-9B84-4454-A5BD-925A255310B0@mcs.anl.gov> On Sep 27, 2010, at 5:10 PM, Alexis Morris wrote: > Hi everyone, > > I am new to PETSc, and am really excited about using it in combination with SLEPc to find the eigenvalues of a large hermitian sparse matrix. I have read through the user manual, and one thing is really confusing me: how do I get my matrix into PETSc? > > My code is organized into separate modules (I am using Fortran). One of the modules calculates my matrix H and stores it in a compressed sparse row format. What do you mean calculates and stores in compressed sparse row format? Are you calculating it in parallel? Do you want to calculate in parallel? Do you want to solve the eigenvalue problem in parallel? If you just want a sequential code you can use MatCreateSeqAIJWithArrays() to directly use the CSR matrix you created this will be simplier than reorganizing your code to use MatSetValues(). If your code is not in parallel but you want it to be in parallel then you will need to do a lot of reorganization and use MatSetValues. Barry > I would like to then call from my main program an eigenvalue subroutine and pass the matrix H to it. In this eigenvalue subroutine, I would use PETSc/SLEPc. How do I get it to use my pre-calculated matrix? > > For example, here is some code for my eigenvalue subroutine. 
In this particular problem, my input matrix is just stored in the regular full matrix format so that I could more easily test the routine. > > subroutine eigen(A) > #include "finclude/petscdef.h" > #include "finclude/slepcepsdef.h" > use slepceps > use kinds_module !Contains data type definitions, like r8=double precision real. > implicit none > real (kind=r8), dimension(:,:), intent(in) :: A > > ! local variables > > PetscErrorCode :: ierr > PetscMPIInt :: N ! Size of A > PetscMPIInt :: nconv ! number of converged eigenpairs > Mat :: A_PETSC ! The PETSc equivalent of my input A > > > ! Get size of A > > N = size(A, dim=1) > > ! Get my matrix into SLEPc. > > call SlepcInitialize(PETSC_NULL_CHARACTER,ierr); CHKERRQ(ierr) > > < This is where I don't know what to do > > < How to I do A_PETSC = A> > > call MatAssemblyBegin(A_PETSC,MAT_FINAL_ASSEMBLY,ierr) > call MatAssemblyEnd(A_PETSC,MAT_FINAL_ASSEMBLY,ierr) > > ! Find eigenpairs with SLEPc > > EPSCreate( PETSC_COMM_WORLD, eps, ierr) > EPSSetOperators( eps, A_PETSC, PETSC_NULL ) > EPSSetProblemType( eps, EPS_HERMITIAN ); > EPSSetFromOptions( eps ); > EPSSolve( eps ); > EPSGetConverged( eps, nconv ); > ... > EPSDestroy( eps ); > > call SlepcFinalize(ierr) > end subroutine eigen > > > Another thing I am confused about: my main program uses the usual fortran data types. Should I worry about these clashing with the PETSc ones like PetscMPIInt, Vec, Mat, etc.? > > I apologize if these things have already been discussed at length. I have been searching the internet and this mailing list, and can't seem to find anything about getting a matrix into PETSc. I appreciate any help. > > Best regards, > Alexis > > > -- > Dr Alexis Morris > Assistant Professor > Department of Mathematics, Physics and Engineering > Mount Royal University > 4825 Mount Royal Gate SW > Calgary, Alberta, Canada > T3E 6K6 > Phone: (403) 440-8507 > Fax: (403) 440-6505 > From amal.ghamdi at kaust.edu.sa Mon Sep 27 19:35:38 2010 From: amal.ghamdi at kaust.edu.sa (Amal Alghamdi) Date: Tue, 28 Sep 2010 03:35:38 +0300 Subject: [petsc-users] petsc4py, plotting DA and writing to file In-Reply-To: References: <32FDC599-5386-426E-82BF-FF6B6BFE3B05@mcs.anl.gov> <1DF41E16-5149-4B8F-811C-B0F7A6815DE2@mcs.anl.gov> Message-ID: Thank you very much Amal On Mon, Sep 27, 2010 at 10:57 PM, Lisandro Dalcin wrote: > On 27 September 2010 14:34, Amal Alghamdi > wrote: > > Thank you very much everybody for the prompt help. > > Amal > > > > I've just pushed a new demo: matrix-free Poisson 2D > > http://code.google.com/p/petsc4py/source/browse/demo/poisson2d/poisson2d.py > > > At the very end you have: > > draw = PETSc.Viewer.DRAW(x.comm) > OptDB['draw_pause'] = 1 > draw(x) > > Additionally, I've updated other demos. Take a look here for > matplotlib usage gathering values to processor 0: > > http://code.google.com/p/petsc4py/source/browse/demo/bratu3d/bratu3d.py > > (note that this demo is 3D, I'm plotting the solution at plane Z=0.5) > > > > -- > Lisandro Dalcin > --------------- > CIMEC (INTEC/CONICET-UNL) > Predio CONICET-Santa Fe > Colectora RN 168 Km 472, Paraje El Pozo > Tel: +54-342-4511594 (ext 1011) > Tel/Fax: +54-342-4511169 > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From Chun.SUN at 3ds.com Tue Sep 28 08:48:00 2010 From: Chun.SUN at 3ds.com (SUN Chun) Date: Tue, 28 Sep 2010 13:48:00 +0000 Subject: [petsc-users] memory management of external libraries In-Reply-To: References: <32FDC599-5386-426E-82BF-FF6B6BFE3B05@mcs.anl.gov><1DF41E16-5149-4B8F-811C-B0F7A6815DE2@mcs.anl.gov> Message-ID: Hi PETSc developers, Sorry if this might be related to ML rather than PETSc. I have used PetscSetMalloc to override memory management of PETSc with my own code. Everything looks fine with PETSc. However I'm noticing that when using external library such as ML, we can't pass down our own allocator down. I believe this is a limitation from the interface of ML, though I'm not 100% sure. Could you please comment on this? If this is true, is there any way around to replace the allocation from ML? Thanks a lot, Chun This email and any attachments are intended solely for the use of the individual or entity to whom it is addressed and may be confidential and/or privileged. If you are not one of the named recipients or have received this email in error, (i) you should not read, disclose, or copy it, (ii) please notify sender of your receipt by reply email and delete this email and all attachments, (iii) Dassault Systemes does not accept or assume any liability or responsibility for any use of or reliance on this email.For other languages, go to http://www.3ds.com/terms/email-disclaimer. -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at 59A2.org Tue Sep 28 08:56:37 2010 From: jed at 59A2.org (Jed Brown) Date: Tue, 28 Sep 2010 15:56:37 +0200 Subject: [petsc-users] memory management of external libraries In-Reply-To: References: <32FDC599-5386-426E-82BF-FF6B6BFE3B05@mcs.anl.gov> <1DF41E16-5149-4B8F-811C-B0F7A6815DE2@mcs.anl.gov> Message-ID: On Tue, Sep 28, 2010 at 15:48, SUN Chun wrote: > Sorry if this might be related to ML rather than PETSc. I have used > PetscSetMalloc to override memory management of PETSc with my own code. > Everything looks fine with PETSc. However I'm noticing that when using > external library such as ML, we can't pass down our own allocator down. I > believe this is a limitation from the interface of ML, though I'm not 100% > sure. ML just calls an unwrapped malloc. You can link in a custom allocator (with LD_PRELOAD or otherwise, see docs for any third-party malloc such as tcmalloc), in which case everything that calls malloc will use it (including C++ new on most systems). Jed From fpacull at fluorem.com Tue Sep 28 12:08:16 2010 From: fpacull at fluorem.com (francois pacull) Date: Tue, 28 Sep 2010 19:08:16 +0200 Subject: [petsc-users] parmetis weights Message-ID: <4CA22100.9040103@fluorem.com> Dear PETSc team, I just have a small and naive question about graph weights in Parmetis. When looking at mat/partition/impls/pmetis/pmetis.c, it seems that weight arrays (for vertices or edges) can be given to parmetis, however the wgtflag variable is always set to 0 from what I understand, which means that no weights are used in ParMETIS_V3_PartKway. So does this mean that the parmetis routine uses the weight arrays whenever they are not NULL, even if wgtflag is 0? I didn't try to use any weight yet, but will probably try it soon... Thank you for any help you can provide, francois pacull. 
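A minimal sketch, not from the original messages, of how vertex weights are normally attached from application code through the MatPartitioning interface rather than by editing pmetis.c. It assumes an existing parallel adjacency matrix adj and unit weights; routine names are taken from the MatPartitioning man pages of the PETSc 3.1 era as best I recall, and the fragment is untested:

   MatPartitioning part;
   IS              newproc;
   PetscInt        i,mlocal,nlocal,*vwgts;

   ierr = MatGetLocalSize(adj,&mlocal,&nlocal);CHKERRQ(ierr);
   ierr = PetscMalloc(mlocal*sizeof(PetscInt),&vwgts);CHKERRQ(ierr);
   for (i=0; i<mlocal; i++) vwgts[i] = 1;   /* per-vertex work estimate goes here */
   ierr = MatPartitioningCreate(PETSC_COMM_WORLD,&part);CHKERRQ(ierr);
   ierr = MatPartitioningSetAdjacency(part,adj);CHKERRQ(ierr);
   ierr = MatPartitioningSetType(part,MATPARTITIONINGPARMETIS);CHKERRQ(ierr);
   ierr = MatPartitioningSetVertexWeights(part,vwgts);CHKERRQ(ierr); /* weights array is handed off to the MatPartitioning object */
   ierr = MatPartitioningApply(part,&newproc);CHKERRQ(ierr);
   /* ... use the index set newproc, then clean up ... */
   ierr = ISDestroy(newproc);CHKERRQ(ierr);
   ierr = MatPartitioningDestroy(part);CHKERRQ(ierr);

Whether ParMETIS then actually uses the weights it is given is exactly the wgtflag question raised above for pmetis.c.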
From bsmith at mcs.anl.gov Tue Sep 28 13:41:40 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Tue, 28 Sep 2010 13:41:40 -0500 Subject: [petsc-users] parmetis weights In-Reply-To: <4CA22100.9040103@fluorem.com> References: <4CA22100.9040103@fluorem.com> Message-ID: <872EB3A7-16AD-49ED-8CFB-6F96A64E543C@mcs.anl.gov> It is possible that the code is not tested in that mode. If you determine that something is wrong when you get to this let us know. Perhaps you need to set that flag and see if it works and then tell us what change you made. Barry On Sep 28, 2010, at 12:08 PM, francois pacull wrote: > Dear PETSc team, > I just have a small and naive question about graph weights in Parmetis. When looking at mat/partition/impls/pmetis/pmetis.c, it seems that weight arrays (for vertices or edges) can be given to parmetis, however the wgtflag variable is always set to 0 from what I understand, which means that no weights are used in ParMETIS_V3_PartKway. So does this mean that the parmetis routine uses the weight arrays whenever they are not NULL, even if wgtflag is 0? I didn't try to use any weight yet, but will probably try it soon... > Thank you for any help you can provide, > francois pacull. > > From rlmackie862 at gmail.com Tue Sep 28 13:59:26 2010 From: rlmackie862 at gmail.com (Randall Mackie) Date: Tue, 28 Sep 2010 11:59:26 -0700 Subject: [petsc-users] DASetCoordinates Message-ID: I'm trying to use the DMMG structure to do multi-grid as a preconditioner. My grids have non-uniform spacing. I can compute the non-uniform spacing at each level, and I assume I can get that information to the multigrid solver using DASetCoordinates. I also assume that this is needed to correctly compute the matrices that allow you to go from one level to another. My questions are: 1) Where exactly should DASetCoordinates be called? 2) Is it called at each level? 3) Or is it just called for the finest grid? Thanks, Randy -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at 59A2.org Tue Sep 28 16:22:29 2010 From: jed at 59A2.org (Jed Brown) Date: Tue, 28 Sep 2010 23:22:29 +0200 Subject: [petsc-users] DASetCoordinates In-Reply-To: References: Message-ID: On Tue, Sep 28, 2010 at 20:59, Randall Mackie wrote: > I'm trying to use the DMMG structure to do multi-grid as a preconditioner. > My grids have non-uniform > spacing. I can compute the non-uniform spacing at each level, and I assume I > can get that information > to the multigrid solver using DASetCoordinates. I also assume that this is > needed to correctly compute > the matrices that allow you to go from one level to another. My questions > are: > > 1) Where exactly should DASetCoordinates be called? You will have to call it at every level, DARefine and DACoarsen do not currently interpolate coordinates. Maybe they should? Jed From bsmith at mcs.anl.gov Tue Sep 28 16:29:57 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Tue, 28 Sep 2010 16:29:57 -0500 Subject: [petsc-users] DASetCoordinates In-Reply-To: References: Message-ID: On Sep 28, 2010, at 4:22 PM, Jed Brown wrote: > On Tue, Sep 28, 2010 at 20:59, Randall Mackie wrote: >> I'm trying to use the DMMG structure to do multi-grid as a preconditioner. >> My grids have non-uniform >> spacing. I can compute the non-uniform spacing at each level, and I assume I >> can get that information >> to the multigrid solver using DASetCoordinates. I also assume that this is >> needed to correctly compute >> the matrices that allow you to go from one level to another. 
My questions >> are: >> >> 1) Where exactly should DASetCoordinates be called? > > You will have to call it at every level, DARefine and DACoarsen do not > currently interpolate coordinates. Maybe they should? Sounds reasonable for them to automatically handle the coordinates if they exist. Barry The reason the coordinates are treated like second cousins is that they are second cousins and were added later and never fully integrated with the DA experience. Probably someone with energy and time on their hands could go through and integrate the coordinates more fully into the DA infrastructure. > > Jed From rlmackie862 at gmail.com Tue Sep 28 17:14:12 2010 From: rlmackie862 at gmail.com (Randall Mackie) Date: Tue, 28 Sep 2010 15:14:12 -0700 Subject: Re: [petsc-users] DASetCoordinates In-Reply-To: References: Message-ID: <5AFDEDBA-9F3F-4F46-8997-162BA44DEB4E@gmail.com> Okay, thanks for that answer. Maybe I'm just being dense, but I don't see how or where I set these at each level within the DMMG framework. Isn't all that buried within the DMMG routines, or is there some way for me to specify this? Thanks, Randy On Sep 28, 2010, at 2:22 PM, Jed Brown wrote: > On Tue, Sep 28, 2010 at 20:59, Randall Mackie wrote: >> I'm trying to use the DMMG structure to do multi-grid as a preconditioner. >> My grids have non-uniform >> spacing. I can compute the non-uniform spacing at each level, and I assume I >> can get that information >> to the multigrid solver using DASetCoordinates. I also assume that this is >> needed to correctly compute >> the matrices that allow you to go from one level to another. My questions >> are: >> >> 1) Where exactly should DASetCoordinates be called? > > You will have to call it at every level, DARefine and DACoarsen do not > currently interpolate coordinates. Maybe they should? > > Jed From jed at 59A2.org Tue Sep 28 17:19:10 2010 From: jed at 59A2.org (Jed Brown) Date: Wed, 29 Sep 2010 00:19:10 +0200 Subject: Re: [petsc-users] DASetCoordinates In-Reply-To: <5AFDEDBA-9F3F-4F46-8997-162BA44DEB4E@gmail.com> References: <5AFDEDBA-9F3F-4F46-8997-162BA44DEB4E@gmail.com> Message-ID: On Wed, Sep 29, 2010 at 00:14, Randall Mackie wrote: > Okay, thanks for that answer. > > Maybe I'm just being dense, but I don't see how or where I set these at each level within > the DMMG framework. Isn't all that buried within the DMMG routines, or is there some > way for me to specify this? It's kinda raw: for (i=0; i<nlevels; i++) { DA da = (DA)dmmg[i]->dm; /* Set coordinates */ } From rlmackie862 at gmail.com Tue Sep 28 18:32:53 2010 From: rlmackie862 at gmail.com (Randall Mackie) Date: Tue, 28 Sep 2010 16:32:53 -0700 Subject: Re: [petsc-users] DASetCoordinates In-Reply-To: References: <5AFDEDBA-9F3F-4F46-8997-162BA44DEB4E@gmail.com> Message-ID: <88BBB8CB-9E31-417C-933F-F9A73ED1E1C3@gmail.com> On Sep 28, 2010, at 3:19 PM, Jed Brown wrote: > On Wed, Sep 29, 2010 at 00:14, Randall Mackie wrote: >> Okay, thanks for that answer. >> >> Maybe I'm just being dense, but I don't see how or where I set these at each level within >> the DMMG framework. Isn't all that buried within the DMMG routines, or is there some >> way for me to specify this? > > It's kinda raw: > > for (i=0; i<nlevels; i++) { > DA da = (DA)dmmg[i]->dm; > /* Set coordinates */ > } Thanks Jed. One more question: how the heck do I get the (DA)dmmg[i]-> if my code is in Fortran?
Randy From jed at 59A2.org Tue Sep 28 19:18:35 2010 From: jed at 59A2.org (Jed Brown) Date: Wed, 29 Sep 2010 02:18:35 +0200 Subject: [petsc-users] DASetCoordinates In-Reply-To: References: <5AFDEDBA-9F3F-4F46-8997-162BA44DEB4E@gmail.com> <88BBB8CB-9E31-417C-933F-F9A73ED1E1C3@gmail.com> Message-ID: I don't think there is a Fortran API for that currently, but the DM its the first slot in the DMMG so you can probably do DASetCoordinates(dmmg(i),cvec,ierr) (completely untested) Jed On Sep 29, 2010 1:33 AM, "Randall Mackie" wrote: On Sep 28, 2010, at 3:19 PM, Jed Brown wrote: > On Wed, Sep 29, 2010 at 00:14, Randall Mackie if my code is in Fortran? Randy -------------- next part -------------- An HTML attachment was scrubbed... URL: From huangsc at gmail.com Tue Sep 28 20:19:06 2010 From: huangsc at gmail.com (Shao-Ching Huang) Date: Tue, 28 Sep 2010 18:19:06 -0700 Subject: [petsc-users] DMMG preconditioner used in KSP Message-ID: Hi I can trying to solve a linear system (A+B)x=b where A is the standard finite difference Laplacian matrix and B has only a few elements (i.e. B's number of non-zero entries is much much smaller than A's). Is it possible to precondition by applying DMMG on A (like in ex34.c) in a KSP solver? Is there an example that I can follow to set this up? Thanks, Shao-Ching From bsmith at mcs.anl.gov Tue Sep 28 21:13:07 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Tue, 28 Sep 2010 21:13:07 -0500 Subject: [petsc-users] DMMG preconditioner used in KSP In-Reply-To: References: Message-ID: <90A91BA8-A817-4760-B1FE-96B3500F0C01@mcs.anl.gov> If you are using DMMGSetKSP() then just have the third argument (func) compute A+B as the first matrix and A as the second matrix. Then the smoothing and coarse grid solve will only use the A entries. That is A+B will be preconditioned with geometric multigrid applied to A. But, I suspect that you will do just as well using A+B. Barry On Sep 28, 2010, at 8:19 PM, Shao-Ching Huang wrote: > Hi > > I can trying to solve a linear system > > (A+B)x=b > > where A is the standard finite difference Laplacian matrix and B has > only a few elements (i.e. B's number of non-zero entries is much much > smaller than A's). > Is it possible to precondition by applying DMMG on A (like in ex34.c) > in a KSP solver? Is there an example that I can follow to set this up? > > Thanks, > > Shao-Ching From huangsc at gmail.com Tue Sep 28 21:34:58 2010 From: huangsc at gmail.com (Shao-Ching Huang) Date: Tue, 28 Sep 2010 19:34:58 -0700 Subject: [petsc-users] DMMG preconditioner used in KSP In-Reply-To: <90A91BA8-A817-4760-B1FE-96B3500F0C01@mcs.anl.gov> References: <90A91BA8-A817-4760-B1FE-96B3500F0C01@mcs.anl.gov> Message-ID: Thanks Barry. I will look into DMMGSetKSP and your suggestions. Shao-Ching On Tue, Sep 28, 2010 at 7:13 PM, Barry Smith wrote: > > ?If you are using DMMGSetKSP() then just have the third argument (func) compute A+B as the first matrix and A as the second matrix. Then the smoothing and coarse grid solve will > only use the A entries. That is A+B will be preconditioned with geometric multigrid applied to A. > > ? But, I suspect that you will do just as well using A+B. > > ? Barry > > On Sep 28, 2010, at 8:19 PM, Shao-Ching Huang wrote: > >> Hi >> >> I can trying to solve a linear system >> >> (A+B)x=b >> >> where A is the standard finite difference Laplacian matrix and B has >> only a few elements (i.e. B's number of non-zero entries is much much >> smaller than A's). 
>> Is it possible to precondition by applying DMMG on A (like in ex34.c) >> in a KSP solver? Is there an example that I can follow to set this up? >> >> Thanks, >> >> Shao-Ching > > From bsmith at mcs.anl.gov Tue Sep 28 21:51:13 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Tue, 28 Sep 2010 21:51:13 -0500 Subject: [petsc-users] DMMG preconditioner used in KSP In-Reply-To: References: <90A91BA8-A817-4760-B1FE-96B3500F0C01@mcs.anl.gov> Message-ID: If you are using DMMGSetSNES you can do a similar thing. That is why there are two matrix arguments. Barry On Sep 28, 2010, at 9:34 PM, Shao-Ching Huang wrote: > Thanks Barry. I will look into DMMGSetKSP and your suggestions. > > Shao-Ching > > On Tue, Sep 28, 2010 at 7:13 PM, Barry Smith wrote: >> >> If you are using DMMGSetKSP() then just have the third argument (func) compute A+B as the first matrix and A as the second matrix. Then the smoothing and coarse grid solve will >> only use the A entries. That is A+B will be preconditioned with geometric multigrid applied to A. >> >> But, I suspect that you will do just as well using A+B. >> >> Barry >> >> On Sep 28, 2010, at 8:19 PM, Shao-Ching Huang wrote: >> >>> Hi >>> >>> I can trying to solve a linear system >>> >>> (A+B)x=b >>> >>> where A is the standard finite difference Laplacian matrix and B has >>> only a few elements (i.e. B's number of non-zero entries is much much >>> smaller than A's). >>> Is it possible to precondition by applying DMMG on A (like in ex34.c) >>> in a KSP solver? Is there an example that I can follow to set this up? >>> >>> Thanks, >>> >>> Shao-Ching >> >> From Pierre.Moinier at baesystems.com Wed Sep 29 04:46:14 2010 From: Pierre.Moinier at baesystems.com (Moinier, Pierre (UK)) Date: Wed, 29 Sep 2010 10:46:14 +0100 Subject: [petsc-users] Read in sequential, solve in parallel Message-ID: <32845768EC63B04EB132BC2C4351B22699B8FF@GLKMS2114.GREENLNK.NET> Dear Developers, I have a Matrix and a Right Hand Side (RHS) in an ASCII format stored in 2 different files. I have written the code that reads the data and solve the system in sequential. I would like to solve the same problem in parallel. I read the FAQ section that says: Never read or write in parallel an ASCII matrix file, instead for reading: read in sequentially then save the matrix with the binary viewer PetscBinaryViewerOpen() and load the matrix in parallel with MatLoad(). So far, I did not manage to implement this. Could any one help me? My matrix is in the Matrix Market format. I have put below my current implementation. Kind regards, -Pierre. PetscInitialize(&argc,&argv,(char *)0,help); ierr = MPI_Comm_size(PETSC_COMM_WORLD,&size);CHKERRQ(ierr); if (size != 1) SETERRQ(1,"This is a uniprocessor example only!"); /* Read in matrix */ ierr = PetscOptionsGetString(PETSC_NULL,"-fin",filein,127,PETSC_NULL);CHKERRQ(i err); ierr = PetscFOpen(PETSC_COMM_SELF,filein,"r",&file);CHKERRQ(ierr); /* find out size of sparse matrix .... 
*/ fgets(buf,PETSC_MAX_PATH_LEN-1,file); printf("%s",buf); fscanf(file,"%d %d %d\n",&M,&N,&nz); ierr = PetscPrintf(PETSC_COMM_SELF,"M: %d, N: %d, nz: %d\n",M,N,nz);CHKERRQ(ierr); MatCreate(PETSC_COMM_WORLD,&A); MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,M,N); MatSetType(A,MATAIJ); MatSetUp(A); for (i=0; i From jed at 59A2.org Wed Sep 29 05:18:13 2010 From: jed at 59A2.org (Jed Brown) Date: Wed, 29 Sep 2010 12:18:13 +0200 Subject: [petsc-users] Read in sequential, solve in parallel In-Reply-To: <32845768EC63B04EB132BC2C4351B22699B8FF@GLKMS2114.GREENLNK.NET> References: <32845768EC63B04EB132BC2C4351B22699B8FF@GLKMS2114.GREENLNK.NET> Message-ID: On Wed, Sep 29, 2010 at 11:46, Moinier, Pierre (UK) wrote: > I have a Matrix and a Right Hand Side (RHS) in an ASCII format stored in?2 > different files. I have written the code that reads the data and solve the > system in sequential. I would like to solve the same problem in parallel. I > read the FAQ section that says: > > Never read or write in parallel an ASCII matrix file, instead for reading: > read in sequentially then save the matrix with the binary viewer > PetscBinaryViewerOpen() and load the matrix in parallel with MatLoad(). > > So far, I did not manage to implement this. Could any one help me? With your matrix assembled in serial: ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"A.dat",FILE_MODE_WRITE,&viewer);CHKERRQ(ierr); ierr = MatView(A,viewer);CHKERRQ(ierr); ierr = PetscViewerDestroy(viewer);CHKERRQ(ierr); In parallel: ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"A.dat",FILE_MODE_READ,&viewer);CHKERRQ(ierr); ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr); ierr = MatLoad(A,viewer);CHKERRQ(ierr); ierr = PetscViewerDestroy(viewer);CHKERRQ(ierr); ierr = KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN);CHKERRQ(ierr); Jed From balay at mcs.anl.gov Wed Sep 29 05:58:58 2010 From: balay at mcs.anl.gov (Satish Balay) Date: Wed, 29 Sep 2010 05:58:58 -0500 (CDT) Subject: [petsc-users] Read in sequential, solve in parallel In-Reply-To: References: <32845768EC63B04EB132BC2C4351B22699B8FF@GLKMS2114.GREENLNK.NET> Message-ID: check mat/examples/tests/ex72.c,ex50.c for examples. Also ksp/ksp/examples/tutorials/ex10.c - for loading up this matrix and solving parallely. Satish On Wed, 29 Sep 2010, Jed Brown wrote: > On Wed, Sep 29, 2010 at 11:46, Moinier, Pierre (UK) > wrote: > > I have a Matrix and a Right Hand Side (RHS) in an ASCII format stored in?2 > > different files. I have written the code that reads the data and solve the > > system in sequential. I would like to solve the same problem in parallel. I > > read the FAQ section that says: > > > > Never read or write in parallel an ASCII matrix file, instead for reading: > > read in sequentially then save the matrix with the binary viewer > > PetscBinaryViewerOpen() and load the matrix in parallel with MatLoad(). > > > > So far, I did not manage to implement this. Could any one help me? 
> > With your matrix assembled in serial: > > ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"A.dat",FILE_MODE_WRITE,&viewer);CHKERRQ(ierr); > ierr = MatView(A,viewer);CHKERRQ(ierr); > ierr = PetscViewerDestroy(viewer);CHKERRQ(ierr); > > In parallel: > > ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"A.dat",FILE_MODE_READ,&viewer);CHKERRQ(ierr); > ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr); > ierr = MatLoad(A,viewer);CHKERRQ(ierr); > ierr = PetscViewerDestroy(viewer);CHKERRQ(ierr); > ierr = KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN);CHKERRQ(ierr); > > > Jed > From lvankampenhout at gmail.com Wed Sep 29 07:01:48 2010 From: lvankampenhout at gmail.com (Leo van Kampenhout) Date: Wed, 29 Sep 2010 14:01:48 +0200 Subject: [petsc-users] Read in sequential, solve in parallel In-Reply-To: References: <32845768EC63B04EB132BC2C4351B22699B8FF@GLKMS2114.GREENLNK.NET> Message-ID: I have run into this same problem some weeks ago. The idea as described by Jed is that you read in the matrix only once, sequentially, from the original file. After doing this, you can store the Mat file in Petsc binary format with MatView. Now for all subsequent runs you can use this binary file to read in the matrix in parallel, with MatLoad. Good luck, Leo 2010/9/29 Satish Balay > check mat/examples/tests/ex72.c,ex50.c for examples. > > Also ksp/ksp/examples/tutorials/ex10.c - for loading up this matrix > and solving parallely. > > Satish > > On Wed, 29 Sep 2010, Jed Brown wrote: > > > On Wed, Sep 29, 2010 at 11:46, Moinier, Pierre (UK) > > wrote: > > > I have a Matrix and a Right Hand Side (RHS) in an ASCII format stored > in 2 > > > different files. I have written the code that reads the data and solve > the > > > system in sequential. I would like to solve the same problem in > parallel. I > > > read the FAQ section that says: > > > > > > Never read or write in parallel an ASCII matrix file, instead for > reading: > > > read in sequentially then save the matrix with the binary viewer > > > PetscBinaryViewerOpen() and load the matrix in parallel with MatLoad(). > > > > > > So far, I did not manage to implement this. Could any one help me? > > > > With your matrix assembled in serial: > > > > ierr = > PetscViewerBinaryOpen(PETSC_COMM_WORLD,"A.dat",FILE_MODE_WRITE,&viewer);CHKERRQ(ierr); > > ierr = MatView(A,viewer);CHKERRQ(ierr); > > ierr = PetscViewerDestroy(viewer);CHKERRQ(ierr); > > > > In parallel: > > > > ierr = > PetscViewerBinaryOpen(PETSC_COMM_WORLD,"A.dat",FILE_MODE_READ,&viewer);CHKERRQ(ierr); > > ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr); > > ierr = MatLoad(A,viewer);CHKERRQ(ierr); > > ierr = PetscViewerDestroy(viewer);CHKERRQ(ierr); > > ierr = > KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN);CHKERRQ(ierr); > > > > > > Jed > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From Pierre.Moinier at baesystems.com Wed Sep 29 07:34:19 2010 From: Pierre.Moinier at baesystems.com (Moinier, Pierre (UK)) Date: Wed, 29 Sep 2010 13:34:19 +0100 Subject: [petsc-users] Read in sequential, solve in parallel In-Reply-To: References: <32845768EC63B04EB132BC2C4351B22699B8FF@GLKMS2114.GREENLNK.NET> Message-ID: <32845768EC63B04EB132BC2C4351B22699B99A@GLKMS2114.GREENLNK.NET> Jed, Thanks for your help and thanks also to all of the others who have replied!. I made some progress and wrote a new code that runs in parallel. However the results seems to show that the time requires to solve the linear systems is the same whether I use 1, 2 or 4 processors... 
Surely I am missing something. I copied the code below. For info, I run the executable as: ./test -ksp_type cg -ksp_rtol 1.e-6 -pc_type none The code is: PetscInitialize(&argc,&argv,(char *)0,help); ierr = MPI_Comm_size(PETSC_COMM_WORLD,&size);CHKERRQ(ierr); /* Read in matrix */ ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"A.dat",FILE_MODE_READ,&view);CHKERRQ(ierr); ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr); ierr = MatLoad(view,MATMPIAIJ,&A);CHKERRQ(ierr); ierr = PetscViewerDestroy(view);CHKERRQ(ierr); ierr = PetscPrintf(PETSC_COMM_SELF,"Reading matrix completed.\n");CHKERRQ(ierr); /* Read in RHS */ ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"RHS.dat",FILE_MODE_READ,&view);CHKERRQ(ierr); ierr = VecCreate(PETSC_COMM_WORLD,&b);CHKERRQ(ierr); ierr = VecLoad(view,VECMPI,&b);CHKERRQ(ierr); ierr = PetscViewerDestroy(view);CHKERRQ(ierr); ierr = PetscPrintf(PETSC_COMM_SELF,"Reading RHS completed.\n");CHKERRQ(ierr); VecCreate(PETSC_COMM_WORLD,&x); VecDuplicate(b,&x); /* Create linear solver context */ ierr = PetscPrintf(PETSC_COMM_SELF,"Solving ... \n");CHKERRQ(ierr); KSPCreate(PETSC_COMM_WORLD,&ksp); KSPSetOperators(ksp, A, A, SAME_NONZERO_PATTERN); KSPSetFromOptions(ksp); timeval tim; gettimeofday(&tim, NULL); double t1=tim.tv_sec+(tim.tv_usec/1000000.0); KSPSolve(ksp, b, x); gettimeofday(&tim, NULL); double t2=tim.tv_sec+(tim.tv_usec/1000000.0); printf("%.6lf seconds elapsed\n", t2-t1); ierr = MatDestroy(A);CHKERRQ(ierr); ierr = VecDestroy(b);CHKERRQ(ierr); ierr = PetscFinalize();CHKERRQ(ierr); ----- Dr Pierre Moinier Principal Research Scientist Office ': +44 (0)117 302 8223 > pierre.moinier at baesystems.com | ? www.baesystems.com BAE Systems ? Advanced Technology Centre ? Sowerby Building (20R) ? FPC 267 ? PO Box 5 ? Filton ? Bristol ? BS34 7QW BAE Systems (Operations) Limited Registered Office: Warwick House, PO Box 87, Farnborough Aerospace Centre, Farnborough, Hants, GU14 6YU, UK Registered in England & Wales No: 1996687 -----Original Message----- From: petsc-users-bounces at mcs.anl.gov [mailto:petsc-users-bounces at mcs.anl.gov] On Behalf Of Jed Brown Sent: 29 September 2010 11:18 To: PETSc users list Subject: Re: [petsc-users] Read in sequential, solve in parallel *** WARNING *** This message has originated outside your organisation, either from an external partner or the Global Internet. Keep this in mind if you answer this message. On Wed, Sep 29, 2010 at 11:46, Moinier, Pierre (UK) wrote: > I have a Matrix and a Right Hand Side (RHS) in an ASCII format stored > in?2 different files. I have written the code that reads the data and > solve the system in sequential. I would like to solve the same problem > in parallel. I read the FAQ section that says: > > Never read or write in parallel an ASCII matrix file, instead for reading: > read in sequentially then save the matrix with the binary viewer > PetscBinaryViewerOpen() and load the matrix in parallel with MatLoad(). > > So far, I did not manage to implement this. Could any one help me? 
With your matrix assembled in serial: ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"A.dat",FILE_MODE_WRITE,&viewer);CHKERRQ(ierr); ierr = MatView(A,viewer);CHKERRQ(ierr); ierr = PetscViewerDestroy(viewer);CHKERRQ(ierr); In parallel: ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"A.dat",FILE_MODE_READ,&viewer);CHKERRQ(ierr); ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr); ierr = MatLoad(A,viewer);CHKERRQ(ierr); ierr = PetscViewerDestroy(viewer);CHKERRQ(ierr); ierr = KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN);CHKERRQ(ierr); Jed ******************************************************************** This email and any attachments are confidential to the intended recipient and may also be privileged. If you are not the intended recipient please delete it from your system and notify the sender. You should not copy it or use it for any purpose nor disclose or distribute its contents to any other person. ******************************************************************** From jed at 59A2.org Wed Sep 29 07:39:40 2010 From: jed at 59A2.org (Jed Brown) Date: Wed, 29 Sep 2010 14:39:40 +0200 Subject: [petsc-users] Read in sequential, solve in parallel In-Reply-To: <32845768EC63B04EB132BC2C4351B22699B99A@GLKMS2114.GREENLNK.NET> References: <32845768EC63B04EB132BC2C4351B22699B8FF@GLKMS2114.GREENLNK.NET> <32845768EC63B04EB132BC2C4351B22699B99A@GLKMS2114.GREENLNK.NET> Message-ID: On Wed, Sep 29, 2010 at 14:34, Moinier, Pierre (UK) wrote: > Jed, > > Thanks for your help and thanks also to all of the others who have replied!. I made some progress and wrote a new code that runs in parallel. However the results seems to show that the time requires to solve the linear systems is the same whether I use 1, 2 or 4 processors... Surely I am missing something. I copied the code below. For info, I run the executable as: ./test -ksp_type cg -ksp_rtol 1.e-6 -pc_type none How big is the matrix (dimensions and number of nonzeros)? Run with -log_summary and send the output. This problem is mostly memory bandwidth limited and a single core can saturate most of the memory bus for a whole socket on most architectures. If you are interested in time to solution, you almost certainly want to use a preconditioner. Sometimes these do more work per byte so you may be able to see more speedup without adding sockets. Jed From Pierre.Moinier at baesystems.com Wed Sep 29 07:51:55 2010 From: Pierre.Moinier at baesystems.com (Moinier, Pierre (UK)) Date: Wed, 29 Sep 2010 13:51:55 +0100 Subject: [petsc-users] Read in sequential, solve in parallel In-Reply-To: References: <32845768EC63B04EB132BC2C4351B22699B8FF@GLKMS2114.GREENLNK.NET><32845768EC63B04EB132BC2C4351B22699B99A@GLKMS2114.GREENLNK.NET> Message-ID: <32845768EC63B04EB132BC2C4351B22699B9B6@GLKMS2114.GREENLNK.NET> Jed, The matrix is 1000000x1000000 and I have 4996000 non zeros Here is the output for a single proc: -bash-3.2$ cat petsc.sub.o5134 Reading matrix completed. Reading RHS completed. Solving ... 44.827695 seconds elapsed ************************************************************************ ************************************************ *** WIDEN YOUR WINDOW TO 120 CHARACTERS. 
Use 'enscript -r -fCourier9' to print this document *** ************************************************************************ ************************************************ ---------------------------------------------- PETSc Performance Summary: ---------------------------------------------- ./mm2petsc2 on a linux-gnu named comp01 with 1 processor, by moinier Wed Sep 29 13:44:03 2010 Using Petsc Release Version 3.1.0, Patch 3, Fri Jun 4 15:34:52 CDT 2010 Max Max/Min Avg Total Time (sec): 4.571e+01 1.00000 4.571e+01 Objects: 2.400e+01 1.00000 2.400e+01 Flops: 3.428e+10 1.00000 3.428e+10 3.428e+10 Flops/sec: 7.499e+08 1.00000 7.499e+08 7.499e+08 MPI Messages: 0.000e+00 0.00000 0.000e+00 0.000e+00 MPI Message Lengths: 0.000e+00 0.00000 0.000e+00 0.000e+00 MPI Reductions: 2.700e+01 1.00000 Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract) e.g., VecAXPY() for real vectors of length N --> 2N flops and VecAXPY() for complex vectors of length N --> 8N flops Summary of Stages: ----- Time ------ ----- Flops ----- --- Messages --- -- Message Lengths -- -- Reductions -- Avg %Total Avg %Total counts %Total Avg %Total counts %Total 0: Main Stage: 4.5715e+01 100.0% 3.4280e+10 100.0% 0.000e+00 0.0% 0.000e+00 0.0% 6.000e+00 22.2% ------------------------------------------------------------------------ ------------------------------------------------ See the 'Profiling' chapter of the users' manual for details on interpreting output. Phase summary info: Count: number of times phase was executed Time and Flops: Max - maximum over all processors Ratio - ratio of maximum to minimum over all processors Mess: number of messages sent Avg. len: average message length Reduct: number of global reductions Global: entire computation Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop(). 
%T - percent time in this phase %F - percent flops in this phase %M - percent messages in this phase %L - percent message lengths in this phase %R - percent reductions in this phase Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors) ------------------------------------------------------------------------ ------------------------------------------------ Event Count Time (sec) Flops --- Global --- --- Stage --- Total Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s ------------------------------------------------------------------------ ------------------------------------------------ --- Event Stage 0: Main Stage MatMult 1633 1.0 1.6247e+01 1.0 1.47e+10 1.0 0.0e+00 0.0e+00 0.0e+00 36 43 0 0 0 36 43 0 0 0 904 MatAssemblyBegin 1 1.0 1.3158e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 2.0e+00 0 0 0 0 7 0 0 0 0 33 0 MatAssemblyEnd 1 1.0 5.1668e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+00 0 0 0 0 11 0 0 0 0 50 0 MatLoad 1 1.0 7.9792e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 5.0e+00 2 0 0 0 19 2 0 0 0 83 0 VecDot 3266 1.0 4.4834e+00 1.0 6.53e+09 1.0 0.0e+00 0.0e+00 0.0e+00 10 19 0 0 0 10 19 0 0 0 1457 VecNorm 1634 1.0 1.2968e+01 1.0 3.27e+09 1.0 0.0e+00 0.0e+00 0.0e+00 28 10 0 0 0 28 10 0 0 0 252 VecCopy 1636 1.0 2.9524e+00 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 6 0 0 0 0 6 0 0 0 0 0 VecSet 1 1.0 1.7080e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAXPY 3266 1.0 5.5580e+00 1.0 6.53e+09 1.0 0.0e+00 0.0e+00 0.0e+00 12 19 0 0 0 12 19 0 0 0 1175 VecAYPX 1632 1.0 2.5961e+00 1.0 3.26e+09 1.0 0.0e+00 0.0e+00 0.0e+00 6 10 0 0 0 6 10 0 0 0 1257 VecAssemblyBegin 1 1.0 9.5367e-07 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAssemblyEnd 1 1.0 0.0000e+00 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecLoad 1 1.0 7.8766e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.0e+00 0 0 0 0 4 0 0 0 0 17 0 VecScatterBegin 1633 1.0 1.3146e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 KSPSetup 1 1.0 8.7240e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 KSPSolve 1 1.0 4.4828e+01 1.0 3.43e+10 1.0 0.0e+00 0.0e+00 0.0e+00 98100 0 0 0 98100 0 0 0 765 PCSetUp 1 1.0 0.0000e+00 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 PCApply 1634 1.0 2.9503e+00 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 6 0 0 0 0 6 0 0 0 0 0 ------------------------------------------------------------------------ ------------------------------------------------ Memory usage is given in bytes: Object Type Creations Destructions Memory Descendants' Mem. Reports information only for process 0. 
--- Event Stage 0: Main Stage Viewer 2 2 1136 0 Matrix 7 3 6560 0 Vec 10 2 2696 0 Vec Scatter 1 0 0 0 Index Set 2 2 1056 0 Krylov Solver 1 0 0 0 Preconditioner 1 0 0 0 ======================================================================== ================================================ Average time to get PetscTime(): 1.19209e-07 #PETSc Option Table entries: -ksp_rtol 1.e-6 -ksp_type cg -log_summary -pc_type none #End of PETSc Option Table entries Compiled without FORTRAN kernels Compiled with full precision matrices (default) sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 Configure run at: Wed Sep 1 13:08:57 2010 Configure options: --known-level1-dcache-size=65536 --known-level1-dcache-linesize=64 --known-level1-dcache-assoc=2 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=8 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=8 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=8 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=8 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-mpi-dir=/apps/utils/linux64/openmpi-1.4.1 --with-batch --with-fc=0 --known-mpi-shared=1 --with-shared --with-debugging=0 COPTFLAGS=-O3 CXXOPTFLAGS=-O3 --with-clanguage=cxx --download-c-blas-lapack=yes ----------------------------------------- Libraries compiled on Wed Sep 1 13:09:50 BST 2010 on lnx102 Machine characteristics: Linux lnx102 2.6.31.12-0.2-default #1 SMP 2010-03-16 21:25:39 +0100 x86_64 x86_64 x86_64 GNU/Linux Using PETSc directory: /home/atc/neilson/opt/petsc-3.1-p3 Using PETSc arch: linux-gnu-cxx-opt-lnx102 ----------------------------------------- Using C compiler: /apps/utils/linux64/openmpi-1.4.1/bin/mpicxx -Wall -Wwrite-strings -Wno-strict-aliasing -O3 -fPIC Using Fortran compiler: ----------------------------------------- Using include paths: -I/home/atc/neilson/opt/petsc-3.1-p3/linux-gnu-cxx-opt-lnx102/include -I/home/atc/neilson/opt/petsc-3.1-p3/include -I/apps/utils/linux64/openmpi-1.4.1/include ------------------------------------------ Using C linker: /apps/utils/linux64/openmpi-1.4.1/bin/mpicxx -Wall -Wwrite-strings -Wno-strict-aliasing -O3 Using Fortran linker: Using libraries: -Wl,-rpath,/home/atc/neilson/opt/petsc-3.1-p3/linux-gnu-cxx-opt-lnx102/l ib -L/home/atc/neilson/opt/petsc-3.1-p3/linux-gnu-cxx-opt-lnx102/lib -lpetsc -lX11 -Wl,-rpath,/home/atc/neilson/opt/petsc-3.1-p3/linux-gnu-cxx-opt-lnx102/l ib -L/home/atc/neilson/opt/petsc-3.1-p3/linux-gnu-cxx-opt-lnx102/lib -lf2clapack -lf2cblas -lmpi_cxx -lstdc++ -ldl ------------------------------------------ -----Original Message----- From: petsc-users-bounces at mcs.anl.gov [mailto:petsc-users-bounces at mcs.anl.gov] On Behalf Of Jed Brown Sent: 29 September 2010 13:40 To: PETSc users list Subject: Re: [petsc-users] Read in sequential, solve in parallel *** WARNING *** This message has originated outside your organisation, either from an external partner or the Global Internet. Keep this in mind if you answer this message. On Wed, Sep 29, 2010 at 14:34, Moinier, Pierre (UK) wrote: > Jed, > > Thanks for your help and thanks also to all of the others who have > replied!. I made some progress and wrote a new code that runs in > parallel. However the results seems to show that the time requires to > solve the linear systems is the same whether I use 1, 2 or 4 > processors... Surely I am missing something. I copied the code below. 
> For info, I run the executable as: ./test -ksp_type cg -ksp_rtol 1.e-6 > -pc_type none How big is the matrix (dimensions and number of nonzeros)? Run with -log_summary and send the output. This problem is mostly memory bandwidth limited and a single core can saturate most of the memory bus for a whole socket on most architectures. If you are interested in time to solution, you almost certainly want to use a preconditioner. Sometimes these do more work per byte so you may be able to see more speedup without adding sockets. Jed ******************************************************************** This email and any attachments are confidential to the intended recipient and may also be privileged. If you are not the intended recipient please delete it from your system and notify the sender. You should not copy it or use it for any purpose nor disclose or distribute its contents to any other person. ******************************************************************** From jed at 59A2.org Wed Sep 29 08:16:01 2010 From: jed at 59A2.org (Jed Brown) Date: Wed, 29 Sep 2010 15:16:01 +0200 Subject: [petsc-users] Read in sequential, solve in parallel In-Reply-To: <32845768EC63B04EB132BC2C4351B22699B9B6@GLKMS2114.GREENLNK.NET> References: <32845768EC63B04EB132BC2C4351B22699B8FF@GLKMS2114.GREENLNK.NET> <32845768EC63B04EB132BC2C4351B22699B99A@GLKMS2114.GREENLNK.NET> <32845768EC63B04EB132BC2C4351B22699B9B6@GLKMS2114.GREENLNK.NET> Message-ID: On Wed, Sep 29, 2010 at 14:51, Moinier, Pierre (UK) wrote: > Jed, > > The matrix is 1000000x1000000 and I have 4996000 non zeros These statistics look like perhaps this matrix comes from a 5-point discretization of an elliptic operator. Is that true? > Here is the output for a single proc: You'll want to compare the time in each event when run in parallel: > MatMult ? ? ? ? ? ? 1633 1.0 1.6247e+01 1.0 1.47e+10 1.0 0.0e+00 0.0e+00 > 0.0e+00 36 43 ?0 ?0 ?0 ?36 43 ?0 ?0 ?0 ? 904 16 seconds in this event, 904 Mflop/s is good for a problem like this on a single core. > VecDot ? ? ? ? ? ? ?3266 1.0 4.4834e+00 1.0 6.53e+09 1.0 0.0e+00 0.0e+00 > 0.0e+00 10 19 ?0 ?0 ?0 ?10 19 ?0 ?0 ?0 ?1457 > VecNorm ? ? ? ? ? ? 1634 1.0 1.2968e+01 1.0 3.27e+09 1.0 0.0e+00 0.0e+00 > 0.0e+00 28 10 ?0 ?0 ?0 ?28 10 ?0 ?0 ?0 ? 252 This is confusing, the norms should take about a tenth of this (they should have almost double the Mflop/s of VecDot). Is there something else running on this machine? Anyone have other ideas? > VecAXPY ? ? ? ? ? ? 3266 1.0 5.5580e+00 1.0 6.53e+09 1.0 0.0e+00 0.0e+00 > 0.0e+00 12 19 ?0 ?0 ?0 ?12 19 ?0 ?0 ?0 ?1175 > VecAYPX ? ? ? ? ? ? 1632 1.0 2.5961e+00 1.0 3.26e+09 1.0 0.0e+00 0.0e+00 > 0.0e+00 ?6 10 ?0 ?0 ?0 ? 6 10 ?0 ?0 ?0 ?1257 These look normal. > KSPSolve ? ? ? ? ? ? ? 1 1.0 4.4828e+01 1.0 3.43e+10 1.0 0.0e+00 0.0e+00 > 0.0e+00 98100 ?0 ?0 ?0 ?98100 ?0 ?0 ?0 ? 765 And here's your solve total, the aggregate numbers look fine. Jed From Pierre.Moinier at baesystems.com Wed Sep 29 09:34:01 2010 From: Pierre.Moinier at baesystems.com (Moinier, Pierre (UK)) Date: Wed, 29 Sep 2010 15:34:01 +0100 Subject: [petsc-users] Read in sequential, solve in parallel In-Reply-To: References: <32845768EC63B04EB132BC2C4351B22699B8FF@GLKMS2114.GREENLNK.NET><32845768EC63B04EB132BC2C4351B22699B99A@GLKMS2114.GREENLNK.NET><32845768EC63B04EB132BC2C4351B22699B9B6@GLKMS2114.GREENLNK.NET> Message-ID: <32845768EC63B04EB132BC2C4351B2269EF098@GLKMS2114.GREENLNK.NET> Jed, You are right I built the matrix from a Poisson problem using a 5pts discretization. 
I have now found out why I wasn't getting the correct scaling. That was due to a silly mistake in submitting my executable. With 4 cores, I get:

--- Event Stage 0: Main Stage

MatMult 1633 1.0 6.9578e+00 1.2 3.67e+09 1.0 9.8e+03 8.0e+03 0.0e+00 41 43100 59 0 41 43100 59 0 2110
MatAssemblyBegin 1 1.0 1.8351e-01182.2 0.00e+00 0.0 0.0e+00 0.0e+00 2.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatAssemblyEnd 1 1.0 1.6289e-02 1.0 0.00e+00 0.0 1.2e+01 2.0e+03 7.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatLoad 1 1.0 6.6239e-01 1.0 0.00e+00 0.0 2.1e+01 2.3e+06 9.0e+00 4 0 0 36 0 4 0 0 36 0 0
VecDot 3266 1.0 2.3861e+00 1.6 1.63e+09 1.0 0.0e+00 0.0e+00 3.3e+03 11 19 0 0 66 11 19 0 0 66 2737
VecNorm 1634 1.0 3.8494e+00 1.2 8.17e+08 1.0 0.0e+00 0.0e+00 1.6e+03 23 10 0 0 33 23 10 0 0 33 849
VecCopy 1636 1.0 1.0704e+00 1.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 5 0 0 0 0 5 0 0 0 0 0
VecSet 1 1.0 6.0201e-04 2.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
VecAXPY 3266 1.0 2.0010e+00 1.2 1.63e+09 1.0 0.0e+00 0.0e+00 0.0e+00 11 19 0 0 0 11 19 0 0 0 3264
VecAYPX 1632 1.0 8.4769e-01 1.4 8.16e+08 1.0 0.0e+00 0.0e+00 0.0e+00 4 10 0 0 0 4 10 0 0 0 3850
VecAssemblyBegin 1 1.0 3.3454e-02477.3 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+00 0 0 0 0 0 0 0 0 0 0 0
VecAssemblyEnd 1 1.0 3.0994e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
VecLoad 1 1.0 7.2319e-02 1.0 0.00e+00 0.0 3.0e+00 2.0e+06 4.0e+00 0 0 0 5 0 0 0 0 5 0 0
VecScatterBegin 1633 1.0 2.4417e-02 2.3 0.00e+00 0.0 9.8e+03 8.0e+03 0.0e+00 0 0100 59 0 0 0100 59 0 0
VecScatterEnd 1633 1.0 1.0537e+0024.4 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 3 0 0 0 0 3 0 0 0 0 0
KSPSetup 1 1.0 2.6400e-03 1.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
KSPSolve 1 1.0 1.5425e+01 1.0 8.57e+09 1.0 9.8e+03 8.0e+03 4.9e+03 95100100 59 99 95100100 59100 2222
PCSetUp 1 1.0 0.0000e+00 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
PCApply 1634 1.0 1.0700e+00 1.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 5 0 0 0 0 5 0 0 0 0 0

For a single core, I was getting 4.4828e+01 for KSPSolve. Am I correct to assume that what is listed as "Event Stage 0: Main Stage" is common to each core? Finally, what is the meaning of "Event Stage 0: Main Stage"? Cheers, -Pierre. ******************************************************************** This email and any attachments are confidential to the intended recipient and may also be privileged. If you are not the intended recipient please delete it from your system and notify the sender. You should not copy it or use it for any purpose nor disclose or distribute its contents to any other person. ******************************************************************** From jed at 59A2.org Wed Sep 29 10:04:00 2010 From: jed at 59A2.org (Jed Brown) Date: Wed, 29 Sep 2010 17:04:00 +0200 Subject: [petsc-users] Read in sequential, solve in parallel In-Reply-To: <32845768EC63B04EB132BC2C4351B2269EF098@GLKMS2114.GREENLNK.NET> References: <32845768EC63B04EB132BC2C4351B22699B8FF@GLKMS2114.GREENLNK.NET> <32845768EC63B04EB132BC2C4351B22699B99A@GLKMS2114.GREENLNK.NET> <32845768EC63B04EB132BC2C4351B22699B9B6@GLKMS2114.GREENLNK.NET> <32845768EC63B04EB132BC2C4351B2269EF098@GLKMS2114.GREENLNK.NET> Message-ID: The stage totals are aggregate. More info in the users manual. You can add stages to distinguish between different phases of your program. Your results look good except for the dot/norm timing but that won't make a big difference.
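For illustration, an untested C sketch of registering an extra stage; the stage name and the ksp, b, x objects are placeholders for whatever the program already sets up:

#include "petscksp.h"

/* Fragment: assumes ksp, b, and x were created and configured earlier.
   Events logged between Push and Pop are charged to the "Solve" stage,
   so -log_summary reports them in their own "Event Stage 1: Solve"
   section in addition to the Main Stage. */
PetscLogStage  solveStage;
PetscErrorCode ierr;

ierr = PetscLogStageRegister("Solve", &solveStage);CHKERRQ(ierr);
ierr = PetscLogStagePush(solveStage);CHKERRQ(ierr);
ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);
ierr = PetscLogStagePop();CHKERRQ(ierr);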
Jed On Sep 29, 2010 4:34 PM, "Moinier, Pierre (UK)" < Pierre.Moinier at baesystems.com> wrote: Jed, You are right I built the matrix from a Poisson problem using a 5pts discretization. I have now found out why I wasn't getting the correct scaling. That was due to a silly mistake in submitting my executable. With 4 cores, I get: --- Event Stage 0: Main Stage MatMult 1633 1.0 6.9578e+00 1.2 3.67e+09 1.0 9.8e+03 8.0e+03 0.0e+00 41 43100 59 0 41 43100 59 0 2110 MatAssemblyBegin 1 1.0 1.8351e-01182.2 0.00e+00 0.0 0.0e+00 0.0e+00 2.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatAssemblyEnd 1 1.0 1.6289e-02 1.0 0.00e+00 0.0 1.2e+01 2.0e+03 7.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatLoad 1 1.0 6.6239e-01 1.0 0.00e+00 0.0 2.1e+01 2.3e+06 9.0e+00 4 0 0 36 0 4 0 0 36 0 0 VecDot 3266 1.0 2.3861e+00 1.6 1.63e+09 1.0 0.0e+00 0.0e+00 3.3e+03 11 19 0 0 66 11 19 0 0 66 2737 VecNorm 1634 1.0 3.8494e+00 1.2 8.17e+08 1.0 0.0e+00 0.0e+00 1.6e+03 23 10 0 0 33 23 10 0 0 33 849 VecCopy 1636 1.0 1.0704e+00 1.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 5 0 0 0 0 5 0 0 0 0 0 VecSet 1 1.0 6.0201e-04 2.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAXPY 3266 1.0 2.0010e+00 1.2 1.63e+09 1.0 0.0e+00 0.0e+00 0.0e+00 11 19 0 0 0 11 19 0 0 0 3264 VecAYPX 1632 1.0 8.4769e-01 1.4 8.16e+08 1.0 0.0e+00 0.0e+00 0.0e+00 4 10 0 0 0 4 10 0 0 0 3850 VecAssemblyBegin 1 1.0 3.3454e-02477.3 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAssemblyEnd 1 1.0 3.0994e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecLoad 1 1.0 7.2319e-02 1.0 0.00e+00 0.0 3.0e+00 2.0e+06 4.0e+00 0 0 0 5 0 0 0 0 5 0 0 VecScatterBegin 1633 1.0 2.4417e-02 2.3 0.00e+00 0.0 9.8e+03 8.0e+03 0.0e+00 0 0100 59 0 0 0100 59 0 0 VecScatterEnd 1633 1.0 1.0537e+0024.4 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 3 0 0 0 0 3 0 0 0 0 0 KSPSetup 1 1.0 2.6400e-03 1.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 KSPSolve 1 1.0 1.5425e+01 1.0 8.57e+09 1.0 9.8e+03 8.0e+03 4.9e+03 95100100 59 99 95100100 59100 2222 PCSetUp 1 1.0 0.0000e+00 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 ... PCApply 1634 1.0 1.0700e+00 1.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 5 0 0 0 0 5 0 0 0 0 0 For a single core, I was getting 4.4828e+01 for KSPSolve. I am correct to assume that what is listed as "Event Stage 0: Main Stage" is common to each core? Finally, what is the meaning of "Event Stage 0: Main Stage" Cheers, -Pierre. ******************************************************************** This email and any attachmen... -------------- next part -------------- An HTML attachment was scrubbed... URL: From Pierre.Moinier at baesystems.com Wed Sep 29 10:05:07 2010 From: Pierre.Moinier at baesystems.com (Moinier, Pierre (UK)) Date: Wed, 29 Sep 2010 16:05:07 +0100 Subject: [petsc-users] Read in sequential, solve in parallel In-Reply-To: References: <32845768EC63B04EB132BC2C4351B22699B8FF@GLKMS2114.GREENLNK.NET><32845768EC63B04EB132BC2C4351B22699B99A@GLKMS2114.GREENLNK.NET><32845768EC63B04EB132BC2C4351B22699B9B6@GLKMS2114.GREENLNK.NET><32845768EC63B04EB132BC2C4351B2269EF098@GLKMS2114.GREENLNK.NET> Message-ID: <32845768EC63B04EB132BC2C4351B2269EF0B7@GLKMS2114.GREENLNK.NET> Thanks for your help! ----- Dr Pierre Moinier Principal Research Scientist Office ': +44 (0)117 302 8223 > pierre.moinier at baesystems.com | ? www.baesystems.com BAE Systems ? Advanced Technology Centre ? Sowerby Building (20R) ? FPC 267 ? PO Box 5 ? Filton ? Bristol ? 
BS34 7QW BAE Systems (Operations) Limited Registered Office: Warwick House, PO Box 87, Farnborough Aerospace Centre, Farnborough, Hants, GU14 6YU, UK Registered in England & Wales No: 1996687 ________________________________ From: petsc-users-bounces at mcs.anl.gov [mailto:petsc-users-bounces at mcs.anl.gov] On Behalf Of Jed Brown Sent: 29 September 2010 16:04 To: PETSc users list Subject: Re: [petsc-users] Read in sequential, solve in parallel *** WARNING *** This message has originated outside your organisation, either from an external partner or the Global Internet. Keep this in mind if you answer this message. The stage totals are aggregate. More info in the users manual. You can add stages to distinguish between different phases of your program. Your results look good except for the dot/norm timing but that won't make s big difference. Jed On Sep 29, 2010 4:34 PM, "Moinier, Pierre (UK)" wrote: Jed, You are right I built the matrix from a Poisson problem using a 5pts discretization. I have now found out why I wasn't getting the correct scaling. That was due to a silly mistake in submitting my executable. With 4 cores, I get: --- Event Stage 0: Main Stage MatMult 1633 1.0 6.9578e+00 1.2 3.67e+09 1.0 9.8e+03 8.0e+03 0.0e+00 41 43100 59 0 41 43100 59 0 2110 MatAssemblyBegin 1 1.0 1.8351e-01182.2 0.00e+00 0.0 0.0e+00 0.0e+00 2.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatAssemblyEnd 1 1.0 1.6289e-02 1.0 0.00e+00 0.0 1.2e+01 2.0e+03 7.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatLoad 1 1.0 6.6239e-01 1.0 0.00e+00 0.0 2.1e+01 2.3e+06 9.0e+00 4 0 0 36 0 4 0 0 36 0 0 VecDot 3266 1.0 2.3861e+00 1.6 1.63e+09 1.0 0.0e+00 0.0e+00 3.3e+03 11 19 0 0 66 11 19 0 0 66 2737 VecNorm 1634 1.0 3.8494e+00 1.2 8.17e+08 1.0 0.0e+00 0.0e+00 1.6e+03 23 10 0 0 33 23 10 0 0 33 849 VecCopy 1636 1.0 1.0704e+00 1.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 5 0 0 0 0 5 0 0 0 0 0 VecSet 1 1.0 6.0201e-04 2.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAXPY 3266 1.0 2.0010e+00 1.2 1.63e+09 1.0 0.0e+00 0.0e+00 0.0e+00 11 19 0 0 0 11 19 0 0 0 3264 VecAYPX 1632 1.0 8.4769e-01 1.4 8.16e+08 1.0 0.0e+00 0.0e+00 0.0e+00 4 10 0 0 0 4 10 0 0 0 3850 VecAssemblyBegin 1 1.0 3.3454e-02477.3 0.00e+00 0.0 0.0e+00 0.0e+00 3.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAssemblyEnd 1 1.0 3.0994e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecLoad 1 1.0 7.2319e-02 1.0 0.00e+00 0.0 3.0e+00 2.0e+06 4.0e+00 0 0 0 5 0 0 0 0 5 0 0 VecScatterBegin 1633 1.0 2.4417e-02 2.3 0.00e+00 0.0 9.8e+03 8.0e+03 0.0e+00 0 0100 59 0 0 0100 59 0 0 VecScatterEnd 1633 1.0 1.0537e+0024.4 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 3 0 0 0 0 3 0 0 0 0 0 KSPSetup 1 1.0 2.6400e-03 1.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 KSPSolve 1 1.0 1.5425e+01 1.0 8.57e+09 1.0 9.8e+03 8.0e+03 4.9e+03 95100100 59 99 95100100 59100 2222 PCSetUp 1 1.0 0.0000e+00 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 ... PCApply 1634 1.0 1.0700e+00 1.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 5 0 0 0 0 5 0 0 0 0 0 For a single core, I was getting 4.4828e+01 for KSPSolve. I am correct to assume that what is listed as "Event Stage 0: Main Stage" is common to each core? Finally, what is the meaning of "Event Stage 0: Main Stage" Cheers, -Pierre. ******************************************************************** This email and any attachmen... ******************************************************************** This email and any attachments are confidential to the intended recipient and may also be privileged. 
If you are not the intended recipient please delete it from your system and notify the sender. You should not copy it or use it for any purpose nor disclose or distribute its contents to any other person. ******************************************************************** -------------- next part -------------- An HTML attachment was scrubbed... URL: From kutuzovnp at gmail.com Wed Sep 29 14:16:10 2010 From: kutuzovnp at gmail.com (=?KOI8-R?B?7snLz8zByiDr1dTV2s/X?=) Date: Wed, 29 Sep 2010 23:16:10 +0400 Subject: [petsc-users] petsc4py In-Reply-To: References: Message-ID: i've tried to run rober.py with ts.setIJacobian line skipped, and with --( python rober2.py -snes_mf ) got the following --- Traceback (most recent call last): File "rober2.py", line 59, in ts.setFromOptions() File "TS.pyx", line 83, in petsc4py.PETSc.TS.setFromOptions (src/petsc4py.PETSc.c:90385) petsc4py.PETSc.Error: error code 73 [0] TSSetFromOptions() line 136 in src/ts/interface/ts.c [0] SNESSetFromOptions() line 420 in src/snes/interface/snes.c [0] SNESSetUpMatrixFree_Private() line 190 in src/snes/interface/snes.c [0] MatCreateSNESMF() line 131 in src/snes/mf/snesmfj.c [0] Object is in wrong state [0] SNESSetFunction() must be called first what is SNESSetFunction()? 2010/9/26 Jed Brown > 2010/9/26 ??????? ??????? : > > 1) First of all, can you describe in a bit more detailed way the usage > of > > AppCtx class of matfree.py module to solve ODE systems > > (determined as in rober.py), without jacobian initialisation, in other > words > > how can change rober.py to solve this issue? > > You can just skip the ts.setIJacobian call and run with -snes_mf. Or > set an approximate Jacobian that way, but use it only for > preconditioning, with -snes_mf_operator. Run with -ts_view to confirm > that you are running what you think you are. > > > 2) Does THETA integration implement time step adaptation? > > No, and it doesn't come with a built-in error estimate. TSGL does > adaptation, but the controller for adaptive order (-ts_adapt_type > both) is not at all robust, so I recommend using -ts_adapt_type step > or writing your own controller (see src/ts/impls/implicit/gl/gladapt.c > for examples). > > > 3) Suppose i have a large ODE system, how can i implement multiprocessor > > (parallel) integration in a way similar with those (function definition > and > > plotting) in rober.py? > > Lisandro might have other suggestions, but > src/ts/examples/tutorials/ex8.py solves a transient Bratu problem in > parallel. Get it from dev, the copy in 3.1 does not work correctly in > parallel for superficial indexing reasons: > > > http://petsc.cs.iit.edu/petsc/petsc-dev/file/c03db8f211dd/src/ts/examples/tutorials/ex8.py > > You can run it like (it uses TSGL by default) > > mpiexec -n 4 python ex8.py -M 40 -ts_monitor -ts_monitor_solution > -ts_max_steps 1000 -ts_adapt_type size > > Note that theta=0.5 is highly oscillatory for this problem, use > something like -ts_type theta -ts_theta_theta 0.8 for a more stable > solution. > > You could of course plot the solution using Matplotlib (as in, e.g. > bratu2d.py) instead, but you would have to gather it to process 0 > because Matplotlib is not parallel. Other options include just > writing out the state at the time steps you are interested in, or > (much more effort) using libsim from VisIt to get live visualization > and interaction with your running parallel program. > > Jed > -------------- next part -------------- An HTML attachment was scrubbed... 
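For background on the "SNESSetFunction() must be called first" message above: SNESSetFunction() is the C-level call that attaches the nonlinear residual routine, together with a work vector of the right size, to a SNES; when a TS drives the solve it normally makes this call internally once a usable IFunction has been registered. A rough, untested C sketch of the call (FormFunction, snes, and r are placeholder names, not from this thread):

#include "petscsnes.h"

/* Residual callback: fill F with F(x) for the nonlinear system F(x) = 0. */
PetscErrorCode FormFunction(SNES snes, Vec x, Vec F, void *ctx)
{
  /* ... compute F from x ... */
  return 0;
}

/* Later, after SNESCreate(); r is a work vector with the same layout as the solution. */
ierr = SNESSetFunction(snes, r, FormFunction, PETSC_NULL);CHKERRQ(ierr);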
URL: From rlmackie862 at gmail.com Wed Sep 29 16:07:42 2010 From: rlmackie862 at gmail.com (Randall Mackie) Date: Wed, 29 Sep 2010 14:07:42 -0700 Subject: [petsc-users] DASetCoordinates In-Reply-To: References: <5AFDEDBA-9F3F-4F46-8997-162BA44DEB4E@gmail.com> <88BBB8CB-9E31-417C-933F-F9A73ED1E1C3@gmail.com> Message-ID: Unfortunately this does not work, as Fortran complains that dmmg has not been declared an array or a function. I guess I could write a c wrapper to do this, perhaps following the wrappers I've found in the petsc src directory. Can someone tell me if the DMMGArray and DMMGArrayGetDMMG might be useful here, and just what exactly is a DMMGArray and how can it be used in Fortran. Randy On Tue, Sep 28, 2010 at 5:18 PM, Jed Brown wrote: > I don't think there is a Fortran API for that currently, but the DM its the > first slot in the DMMG so you can probably do > > DASetCoordinates(dmmg(i),cvec,ierr) > > (completely untested) > > Jed > > On Sep 29, 2010 1:33 AM, "Randall Mackie" wrote: > > > On Sep 28, 2010, at 3:19 PM, Jed Brown wrote: > > > On Wed, Sep 29, 2010 at 00:14, Randall Mackie Thanks Jed. One more question: how the heck do I get the (DA)dmmg[i]-> if > my code is in Fortran? > > > Randy > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at 59A2.org Wed Sep 29 17:19:59 2010 From: jed at 59A2.org (Jed Brown) Date: Thu, 30 Sep 2010 00:19:59 +0200 Subject: [petsc-users] petsc4py In-Reply-To: References: Message-ID: 2010/9/29 ??????? ??????? : > > i've tried to run rober.py with ts.setIJacobian line skipped, and with --( > python rober2.py -snes_mf? ) > got the following --- You must have also commented out ts.setIFunction. I just ran this example without ts.setIJacobian and with -snes_mf. Jed From kutuzovnp at gmail.com Wed Sep 29 17:59:56 2010 From: kutuzovnp at gmail.com (=?KOI8-R?B?7snLz8zByiDr1dTV2s/X?=) Date: Thu, 30 Sep 2010 02:59:56 +0400 Subject: [petsc-users] petsc4py In-Reply-To: References: Message-ID: Commenting out ts.setIFunction line has no effect -- got the same error... 2010/9/30 Jed Brown > 2010/9/29 ??????? ??????? : > > > > i've tried to run rober.py with ts.setIJacobian line skipped, and with > --( > > python rober2.py -snes_mf ) > > got the following --- > > You must have also commented out ts.setIFunction. I just ran this > example without ts.setIJacobian and with -snes_mf. > > Jed > -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at 59A2.org Wed Sep 29 18:05:41 2010 From: jed at 59A2.org (Jed Brown) Date: Thu, 30 Sep 2010 01:05:41 +0200 Subject: [petsc-users] petsc4py In-Reply-To: References: Message-ID: You must have made some other change. Grab a clean version and only comment out the ts.setIJacobian line. I confirmed that this works with -dev, maybe something is different with 3.1, but I don't have it here on my phone to try. Jed On Sep 30, 2010 1:00 AM, "??????? ???????" wrote: Commenting out ts.setIFunction line has no effect -- got the same error... 2010/9/30 Jed Brown > > 2010/9/29 ??????? ??????? : > > > > i've tried to run rober.py with ts.set... -------------- next part -------------- An HTML attachment was scrubbed... URL: From kutuzovnp at gmail.com Wed Sep 29 18:27:41 2010 From: kutuzovnp at gmail.com (=?KOI8-R?B?7snLz8zByiDr1dTV2s/X?=) Date: Thu, 30 Sep 2010 03:27:41 +0400 Subject: [petsc-users] petsc4py In-Reply-To: References: Message-ID: Where can i get it? 
I've tried this one -- http://code.google.com/p/petsc4py/source/browse/demo/ode/rober.py 2010/9/30 Jed Brown > You must have made some other change. Grab a clean version and only comment > out the ts.setIJacobian line. I confirmed that this works with -dev, maybe > something is different with 3.1, but I don't have it here on my phone to > try. > > Jed > > On Sep 30, 2010 1:00 AM, "??????? ???????" wrote: > > Commenting out ts.setIFunction line has no effect -- got the same error... > > 2010/9/30 Jed Brown > > > > > > 2010/9/29 ??????? ??????? : > > > > > > i've tried to run rober.py with ts.set... > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From dalcinl at gmail.com Wed Sep 29 21:24:58 2010 From: dalcinl at gmail.com (Lisandro Dalcin) Date: Wed, 29 Sep 2010 23:24:58 -0300 Subject: [petsc-users] petsc4py In-Reply-To: References: Message-ID: 2010/9/29 ??????? ??????? : > Where can i get it? I've tried this one -- > http://code.google.com/p/petsc4py/source/browse/demo/ode/rober.py > This is from branch release-1.1, a new release 1.1.2 is planned for the weekend. As you can see from the output at the very end, SNES is using matrix free. Note that I've not changed he code code, just passed -snes_mf What petsc4py and PETSc versions are you using? [dalcinl at trantor petsc4py-release-1.1]$ python demo/ode/rober.py -ts_view -snes_mf TS Object: type: theta Theta=0.5 Extrapolation=no maximum steps=100 maximum time=1e+30 total number of nonlinear solver iterations=203 total number of linear solver iterations=311 SNES Object: type: ls line search variant: SNESLineSearchCubic alpha=0.0001, maxstep=1e+08, minlambda=1e-12 maximum iterations=50, maximum function evaluations=10000 tolerances: relative=1e-08, absolute=1e-50, solution=1e-08 total number of linear solver iterations=3 total number of function evaluations=8 KSP Object: type: gmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using PRECONDITIONED norm type for convergence test PC Object: type: none linear system matrix = precond matrix: Matrix Object: type=mffd, rows=3, cols=3 matrix-free approximation: err=1e-07 (relative error in function evaluation) Using wp compute h routine Computes normA Does not compute normU -- Lisandro Dalcin --------------- CIMEC (INTEC/CONICET-UNL) Predio CONICET-Santa Fe Colectora RN 168 Km 472, Paraje El Pozo Tel: +54-342-4511594 (ext 1011) Tel/Fax: +54-342-4511169 From jed at 59A2.org Thu Sep 30 00:28:23 2010 From: jed at 59A2.org (Jed Brown) Date: Thu, 30 Sep 2010 07:28:23 +0200 Subject: [petsc-users] petsc4py In-Reply-To: References: Message-ID: And I get the same output as Lisandro if I comment the ts.setIJaconian line (because that Jacobian is not being used). Jed On Sep 30, 2010 4:25 AM, "Lisandro Dalcin" wrote: 2010/9/29 ??????? ??????? : > Where can i get it? I've tried this one -- > http://code.google.com/p/petsc4py/source/browse/demo/... This is from branch release-1.1, a new release 1.1.2 is planned for the weekend. As you can see from the output at the very end, SNES is using matrix free. Note that I've not changed he code code, just passed -snes_mf What petsc4py and PETSc versions are you using? 
[dalcinl at trantor petsc4py-release-1.1]$ python demo/ode/rober.py -ts_view -snes_mf TS Object: type: theta Theta=0.5 Extrapolation=no maximum steps=100 maximum time=1e+30 total number of nonlinear solver iterations=203 total number of linear solver iterations=311 SNES Object: type: ls line search variant: SNESLineSearchCubic alpha=0.0001, maxstep=1e+08, minlambda=1e-12 maximum iterations=50, maximum function evaluations=10000 tolerances: relative=1e-08, absolute=1e-50, solution=1e-08 total number of linear solver iterations=3 total number of function evaluations=8 KSP Object: type: gmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000 left preconditioning using PRECONDITIONED norm type for convergence test PC Object: type: none linear system matrix = precond matrix: Matrix Object: type=mffd, rows=3, cols=3 matrix-free approximation: err=1e-07 (relative error in function evaluation) Using wp compute h routine Computes normA Does not compute normU -- Lisandro Dalcin --------------- CIMEC (INTEC/CONICET-UNL) Predio CONICET-Santa Fe Colectora RN 168 Km 472, Paraje El Pozo Tel: +54-342-4511594 (ext 1011) Tel/Fax: +54-342-4511169 -------------- next part -------------- An HTML attachment was scrubbed... URL: From lvankampenhout at gmail.com Thu Sep 30 09:31:49 2010 From: lvankampenhout at gmail.com (Leo van Kampenhout) Date: Thu, 30 Sep 2010 16:31:49 +0200 Subject: [petsc-users] [Fortran] subroutines inside modules? Message-ID: Hi all, since it is mandatory to declare all subroutines as "external" in Fortran, is it possible for Modules to have subroutines? I'm unable to declare the subroutine external inside the module itself, nor in the program which is using it. Not declaring it external at all results in the following compilation error: /net/users/csg/csg4035/master/workdir/src/main.F:97: undefined reference to `__grid_MOD_readgrid' (the module is here is named "grid", the subroutine "readgrid" ) Thanks, Leo -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Thu Sep 30 10:01:26 2010 From: bsmith at mcs.anl.gov (Barry Smith) Date: Thu, 30 Sep 2010 10:01:26 -0500 Subject: [petsc-users] [Fortran] subroutines inside modules? In-Reply-To: References: Message-ID: <022FE245-A8A9-4E70-8741-FF8D51A2E2C3@mcs.anl.gov> On Sep 30, 2010, at 9:31 AM, Leo van Kampenhout wrote: > Hi all, > > since it is mandatory to declare all subroutines as "external" in Fortran, is it possible for Modules to have subroutines? I'm unable to declare the subroutine external inside the module itself, nor in the program which is using it. What happens when you try to declare it external in the "program which is using it" (I assume you mean subroutine that is using it). 
Barry > Not declaring it external at all results in the following compilation error: > > /net/users/csg/csg4035/master/workdir/src/main.F:97: undefined reference to `__grid_MOD_readgrid' > > (the module is here is named "grid", the subroutine "readgrid" ) > > Thanks, > Leo > From kutuzovnp at gmail.com Thu Sep 30 11:46:11 2010 From: kutuzovnp at gmail.com (=?KOI8-R?B?7snLz8zByiDr1dTV2s/X?=) Date: Thu, 30 Sep 2010 20:46:11 +0400 Subject: [petsc-users] petsc4py In-Reply-To: References: Message-ID: ----> petsc-3.1-p4 and petsc4py-1.1.1 I did only this change in http://code.google.com/p/petsc4py/source/browse/demo/ode/rober.py --- --- (42) ts.setIFunction(ode.evalFunction, f) ---> ts.setIFunction(ode.evalFunction) 2010/9/30 Jed Brown > And I get the same output as Lisandro if I comment the ts.setIJaconian line > (because that Jacobian is not being used). > > Jed > > On Sep 30, 2010 4:25 AM, "Lisandro Dalcin" wrote: > > 2010/9/29 ??????? ??????? : > > > Where can i get it? I've tried this one -- > > http://code.google.com/p/petsc4py/source/browse/demo/... > This is from branch release-1.1, a new release 1.1.2 is planned for > the weekend. As you can see from the output at the very end, SNES is > using matrix free. Note that I've not changed he code code, just > passed -snes_mf > > What petsc4py and PETSc versions are you using? > > [dalcinl at trantor petsc4py-release-1.1]$ python demo/ode/rober.py > -ts_view -snes_mf > TS Object: > type: theta > Theta=0.5 > Extrapolation=no > maximum steps=100 > maximum time=1e+30 > total number of nonlinear solver iterations=203 > total number of linear solver iterations=311 > SNES Object: > type: ls > line search variant: SNESLineSearchCubic > alpha=0.0001, maxstep=1e+08, minlambda=1e-12 > maximum iterations=50, maximum function evaluations=10000 > tolerances: relative=1e-08, absolute=1e-50, solution=1e-08 > total number of linear solver iterations=3 > total number of function evaluations=8 > KSP Object: > type: gmres > GMRES: restart=30, using Classical (unmodified) Gram-Schmidt > Orthogonalization with no iterative refinement > GMRES: happy breakdown tolerance 1e-30 > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000 > left preconditioning > using PRECONDITIONED norm type for convergence test > PC Object: > type: none > linear system matrix = precond matrix: > Matrix Object: > type=mffd, rows=3, cols=3 > matrix-free approximation: > err=1e-07 (relative error in function evaluation) > Using wp compute h routine > Computes normA > Does not compute normU > > > -- > Lisandro Dalcin > --------------- > CIMEC (INTEC/CONICET-UNL) > Predio CONICET-Santa Fe > Colectora RN 168 Km 472, Paraje El Pozo > Tel: +54-342-4511594 (ext 1011) > Tel/Fax: +54-342-4511169 > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at 59A2.org Thu Sep 30 12:35:05 2010 From: jed at 59A2.org (Jed Brown) Date: Thu, 30 Sep 2010 19:35:05 +0200 Subject: [petsc-users] petsc4py In-Reply-To: References: Message-ID: 2010/9/30 ??????? ??????? : > ts.setIFunction(ode.evalFunction, f) ---> ts.setIFunction(ode.evalFunction) This won't work, you need to provide a residual vector, otherwise PETSc does not know how large your vector is. Your earlier emails said that you removed the line that reads ts.setIJacobian(ode.evalJacobian, J) but said nothing about modifying the line ts.setIFunction(ode.evalFunction, f) That was why we couldn't understand what was going wrong for you. 
You *can* skip the ts.setIJacobian line if you use the option -snes_mf, it will do exactly the same algorithm as if you leave rober.py exactly as it is and run with -snes_mf. Jed From s.kramer at imperial.ac.uk Thu Sep 30 12:52:24 2010 From: s.kramer at imperial.ac.uk (Stephan Kramer) Date: Thu, 30 Sep 2010 18:52:24 +0100 Subject: [petsc-users] [Fortran] subroutines inside modules? In-Reply-To: References: Message-ID: <4CA4CE58.3040004@imperial.ac.uk> On 30/09/10 15:31, Leo van Kampenhout wrote: > Hi all, > > since it is mandatory to declare all subroutines as "external" in > Fortran, is it possible for Modules to have subroutines? I'm unable to > declare the subroutine external inside the module itself, nor in the > program which is using it. Not declaring it external at all results in > the following compilation error: > > /net/users/csg/csg4035/master/workdir/src/main.F:97: undefined reference > to `__grid_MOD_readgrid' > > (the module is here is named "grid", the subroutine "readgrid" ) > > Thanks, > Leo > If you put your subroutine in a module, it should not be declared external. You can directly call it from within the module itself. When calling it inside any other module/program you need to add "use grid" before the "implicit none". Putting subroutines inside a module is highly recommended as it automatically provides an explicit interface so that the compiler can check the arguments in your subroutine call. Cheers Stephan From jed at 59A2.org Thu Sep 30 13:07:47 2010 From: jed at 59A2.org (Jed Brown) Date: Thu, 30 Sep 2010 20:07:47 +0200 Subject: [petsc-users] petsc4py In-Reply-To: References: Message-ID: On Thu, Sep 30, 2010 at 19:35, Jed Brown wrote: >> ts.setIFunction(ode.evalFunction, f) ---> ts.setIFunction(ode.evalFunction) > > This won't work, you need to provide a residual vector, otherwise > PETSc does not know how large your vector is Actually, you have to set the initial vector somewhere, and the residual vector can be obtained by duplicating it. PETSc works this way in C. Lisandro, why does TS.setIFunction require the residual vector to be provided? If you agree that it's not necessary, please apply the attached patch. Jed -------------- next part -------------- A non-text attachment was scrubbed... Name: jed-setifunction.patch Type: text/x-patch Size: 1172 bytes Desc: not available URL: From kutuzovnp at gmail.com Thu Sep 30 13:26:20 2010 From: kutuzovnp at gmail.com (=?KOI8-R?B?7snLz8zByiDr1dTV2s/X?=) Date: Thu, 30 Sep 2010 22:26:20 +0400 Subject: [petsc-users] petsc4py In-Reply-To: References: Message-ID: how should i use that patch? 2010/9/30 Jed Brown > On Thu, Sep 30, 2010 at 19:35, Jed Brown wrote: > >> ts.setIFunction(ode.evalFunction, f) ---> > ts.setIFunction(ode.evalFunction) > > > > This won't work, you need to provide a residual vector, otherwise > > PETSc does not know how large your vector is > > Actually, you have to set the initial vector somewhere, and the > residual vector can be obtained by duplicating it. PETSc works this > way in C. Lisandro, why does TS.setIFunction require the residual > vector to be provided? If you agree that it's not necessary, please > apply the attached patch. > > Jed > -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at 59A2.org Thu Sep 30 13:28:35 2010 From: jed at 59A2.org (Jed Brown) Date: Thu, 30 Sep 2010 20:28:35 +0200 Subject: [petsc-users] petsc4py In-Reply-To: References: Message-ID: 2010/9/30 ??????? ??????? : > how should i use that patch? 
It was for Lisandro, but if you have checked out petsc4py trunk, you can hg import jed-setifunction.patch But don't bother, just provide the residual vector, and skip the ts.setIJacobian() call if you don't have a Jacobian. Jed From lvankampenhout at gmail.com Thu Sep 30 17:09:53 2010 From: lvankampenhout at gmail.com (Leo van Kampenhout) Date: Fri, 1 Oct 2010 00:09:53 +0200 Subject: [petsc-users] [Fortran] subroutines inside modules? In-Reply-To: <4CA4CE58.3040004@imperial.ac.uk> References: <4CA4CE58.3040004@imperial.ac.uk> Message-ID: Declaring it external in the program/subroutine that is using the module results in main.F:65.43: external gridtest Error: Cannot change attributes of USE-associated symbol at (1) Thanks, Leo 2010/9/30 Stephan Kramer > On 30/09/10 15:31, Leo van Kampenhout wrote: > >> Hi all, >> >> since it is mandatory to declare all subroutines as "external" in >> Fortran, is it possible for Modules to have subroutines? I'm unable to >> declare the subroutine external inside the module itself, nor in the >> program which is using it. Not declaring it external at all results in >> the following compilation error: >> >> /net/users/csg/csg4035/master/workdir/src/main.F:97: undefined reference >> to `__grid_MOD_readgrid' >> >> (the module is here is named "grid", the subroutine "readgrid" ) >> >> Thanks, >> Leo >> >> > If you put your subroutine in a module, it should not be declared > external. You can directly call it from within the module itself. When > calling it inside any other module/program you need to add "use grid" > before > the "implicit none". > > Putting subroutines inside a module is highly recommended as it > automatically > provides an explicit interface so that the compiler can check the arguments > in > your subroutine call. > > Cheers > Stephan > -------------- next part -------------- An HTML attachment was scrubbed... URL: From s.kramer at imperial.ac.uk Thu Sep 30 17:30:48 2010 From: s.kramer at imperial.ac.uk (Stephan Kramer) Date: Thu, 30 Sep 2010 23:30:48 +0100 Subject: [petsc-users] [Fortran] subroutines inside modules? In-Reply-To: References: <4CA4CE58.3040004@imperial.ac.uk> Message-ID: <4CA50F98.9000902@imperial.ac.uk> On 30/09/10 23:09, Leo van Kampenhout wrote: > Declaring it external in the program/subroutine that is using the module > results in > > main.F:65.43: > external gridtest > Error: Cannot change attributes of USE-associated symbol at (1) > > Thanks, Leo Yes, as I said before :) - module subroutines should *not* be declared external. You do not need that line. Cheers Stephan > > > 2010/9/30 Stephan Kramer > > > On 30/09/10 15:31, Leo van Kampenhout wrote: > > Hi all, > > since it is mandatory to declare all subroutines as "external" in > Fortran, is it possible for Modules to have subroutines? I'm > unable to > declare the subroutine external inside the module itself, nor in the > program which is using it. Not declaring it external at all > results in > the following compilation error: > > /net/users/csg/csg4035/master/workdir/src/main.F:97: undefined > reference > to `__grid_MOD_readgrid' > > (the module is here is named "grid", the subroutine "readgrid" ) > > Thanks, > Leo > > > If you put your subroutine in a module, it should not be declared > external. You can directly call it from within the module itself. When > calling it inside any other module/program you need to add "use > grid" before > the "implicit none". 
> > Putting subroutines inside a module is highly recommended as it > automatically > provides an explicit interface so that the compiler can check the > arguments in > your subroutine call. > > Cheers > Stephan > >