From gdiso at ustc.edu Sun Jan 2 06:56:58 2011 From: gdiso at ustc.edu (Gong Ding) Date: Sun, 2 Jan 2011 20:56:58 +0800 Subject: [petsc-users] pastix solver break at pastix_checkMatrix References: <54B6E01485354315BF5C9040319E4F0C@cogendaeda><2DF65225-7B22-43A8-A2D3-DE2DD4B75BF6@mcs.anl.gov><76538ECE086E482FB1366C668A1EA28A@cogendaeda> Message-ID: <38A78A34266249418C15433E6AB61305@cogendaeda> ----- Original Message ----- From: "Barry Smith" To: "PETSc users list" Sent: Saturday, January 01, 2011 2:12 AM Subject: Re: [petsc-users] pastix solver break at pastix_checkMatrix Sorry. Yes there was another bug. What caused both of these problems is that Pastix requires a symmetric nonzero structure and the interface made some assumptions that the PETSc matrix had a symmetric nonzero structure which is usually true, hence it did not crash for most matrices people use. I've attached another copy of pastix.c follow the same procedure again. Barry Thank you, Barry. The new version of pastix.c works. And the previous patch of memory allocation seems useless, you can reset it back. Further more, valgrind reported serious memory leak. However, I think it is nothing to do with the petsc interface. Maybe update pastix to the latest version can solve this problem. From pengxwang at hotmail.com Sun Jan 2 12:43:18 2011 From: pengxwang at hotmail.com (Peter Wang) Date: Sun, 2 Jan 2011 12:43:18 -0600 Subject: [petsc-users] Matrix output more than 1024 rows for multi-process Message-ID: A matrix with more than 1024 rows was output by calling MatView(). If only one process was used, then no error came out. However, if multi-process were used, the error information like 'ASCII matrix output not allowed for matrices with more than 1024 rows' came out. If the rule of 1024 rows is only for multi-process? To my understand, there should be two approaches: 1 use binary format, 2 use option -mat_ascii_output_large. If I need to check the matrix on the monitor, I cannot use MatView() to do this, right? Whether the binary format can be used to output the matrix to the monitor? In FAQ, the answer suggests to use the examples of ex72.c, ex78.c and ex32.c. Is there any example for this issue in Fortran? Thanks. The solution in FAQ: How can I read in or write out a sparse matrix in Matrix Market, Harwell-Boeing, SLAPC or other ASCII format?See the examples in src/mat/examples/tests, specifically ex72.c, ex78.c, and ex32.c. You will likely need to modify the code slightly to match your required ASCII format. Note: Never read or write in parallel an ASCII matrix file, instead for reading: read in sequentially with a standalone code based on ex72.c, ex78.c, or ex32.c then save the matrix with the binary viewer PetscBinaryViewerOpen() and load the matrix in parallel in your "real" PETSc program with MatLoad(); for writing save with the binary viewer and then load with the sequential code to store it as ASCII. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From bsmith at mcs.anl.gov Sun Jan 2 16:15:03 2011 From: bsmith at mcs.anl.gov (Barry Smith) Date: Sun, 2 Jan 2011 16:15:03 -0600 Subject: [petsc-users] Matrix output more than 1024 rows for multi-process In-Reply-To: References: Message-ID: <6D398715-AC05-414E-8016-A84019E4EC09@mcs.anl.gov> You can print any size matrix to the screen "for checking" by using an ASCII viewer with MatView() with the option -mat_ascii_output_large If you want to save a matrix to load later for running in another program you should ALWAYS use a binary viewer (ASCII just doesn't make sense for this purpose). On Jan 2, 2011, at 12:43 PM, Peter Wang wrote: > A matrix with more than 1024 rows was output by calling MatView(). If only one process was used, then no error came out. However, if multi-process were used, the error information like 'ASCII matrix output not allowed for matrices with more than 1024 rows' came out. If the rule of 1024 rows is only for multi-process? Apparently, but so what. Just use the flag. > > To my understand, there should be two approaches: 1 use binary format, 2 use option -mat_ascii_output_large. > If I need to check the matrix on the monitor, I cannot use MatView() to do this, right? You can use MatView() with an ASCII viewer to see it on the monitor. > Whether the binary format can be used to output the matrix to the monitor? Binary format is unreadable to the human eye so makes no sense to send it to the monitor. > > In FAQ, the answer suggests to use the examples of ex72.c, ex78.c and ex32.c. Is there any example for this issue in Fortran? You could write similar code in Fortran (no we don't have any examples of it), but why bother? This is just a preprocessing step in a tiny standalone program to change the matrix format, just use the C code. Barry Why do we have the option -mat_ascii_output_large? Because outputting a large sparse matrix in ASCII format is just plain silly so we force the user to acknowledge they really want to do it by using that flag. > Thanks. > > The solution in FAQ: > How can I read in or write out a sparse matrix in Matrix Market, Harwell-Boeing, SLAPC or other ASCII format?See the examples in src/mat/examples/tests, specifically ex72.c, ex78.c, and ex32.c. You will likely need to modify the code slightly to match your required ASCII format. Note: Never read or write in parallel an ASCII matrix file, instead for reading: read in sequentially with a standalone code based on ex72.c, ex78.c, or ex32.c then save the matrix with the binary viewer PetscBinaryViewerOpen() and load the matrix in parallel in your "real" PETSc program with MatLoad(); for writing save with the binary viewer and then load with the sequential code to store it as ASCII. > From pengxwang at hotmail.com Sun Jan 2 21:32:27 2011 From: pengxwang at hotmail.com (Peter Wang) Date: Sun, 2 Jan 2011 21:32:27 -0600 Subject: [petsc-users] result of ex2f.F in petsc-3.1-p5\src\ksp\ksp\examples\tutorials Message-ID: I added VecView() in the ex2f.F to check the three vectors (u,b, and x) in the code. -----The vectors of u and b are same. BUt, the value of vector x is different with different processes. For example, with only 1 process, the vector x is: Process [0] 2.72322e-07 3.81437e-07 1.58922e-07 3.81437e-07 2.38878e-07 -6.65645e-07 1.58922e-07 -6.65645e-07 -2.51219e-07 with 2 processes, the vector is: Process [0] -1.11022e-16 0 2.22045e-16 2.22045e-16 0 Process [1] 2.22045e-16 2.22045e-16 0 2.22045e-16 The example is supposed to get a vector x similar to u. 
Why the result is different with differnt number of processes used? ----Also, if the runtime option -my_ksp_monitor is used, there is a error showing: [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range It seems there is something wrong with calling 'call KSPBuildSolution(ksp,PETSC_NULL_OBJECT,x,ierr)' in user defined function MyKSPMonitor(). Any hints for this error? Thanks a lots. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gdiso at ustc.edu Sun Jan 2 21:42:01 2011 From: gdiso at ustc.edu (Gong Ding) Date: Mon, 3 Jan 2011 11:42:01 +0800 Subject: [petsc-users] pastix solver break at pastix_checkMatrix References: <54B6E01485354315BF5C9040319E4F0C@cogendaeda><2DF65225-7B22-43A8-A2D3-DE2DD4B75BF6@mcs.anl.gov><76538ECE086E482FB1366C668A1EA28A@cogendaeda> Message-ID: <2EE18AB532EB4A1EA114F736BED1E8C5@cogendaeda> Dear Barry, It seems the patch will cause memory leak. when valOnly is false, memory will be allocated for colptr, row and values. 99: if (!valOnly){ 100: ierr = PetscMalloc(((*n)+1) *sizeof(PetscInt) ,colptr );CHKERRQ(ierr); 101: ierr = PetscMalloc( nnz *sizeof(PetscInt) ,row);CHKERRQ(ierr); 102: ierr = PetscMalloc( nnz *sizeof(PetscScalar),values);CHKERRQ(ierr); But at the end of function MatConvertToCSC 186: ierr = PetscMemcpy(*colptr,tmpcolptr,(*n+1)*sizeof(PetscInt));CHKERRQ(ierr); 187: ierr = PetscMalloc(((*colptr)[*n]-1)*sizeof(PetscInt),row);CHKERRQ(ierr); 188: ierr = PetscMemcpy(*row,tmprows,((*colptr)[*n]-1)*sizeof(PetscInt));CHKERRQ(ierr); 189: ierr = PetscMalloc(((*colptr)[*n]-1)*sizeof(PetscScalar),values);CHKERRQ(ierr); 190: ierr = PetscMemcpy(*values,tmpvalues,((*colptr)[*n]-1)*sizeof(PetscScalar));CHKERRQ(ierr); memory will be allocated again. Which means when code call MatConvertToCSC next time, the memory allocated at line 100-102 will be lost. Will you please fix this problem? i.e. check the *colptr, *row and *values, when they are not empty, skip the memory allocation at line 100-102. Yours Gong Ding From bsmith at mcs.anl.gov Sun Jan 2 22:55:00 2011 From: bsmith at mcs.anl.gov (Barry Smith) Date: Sun, 2 Jan 2011 22:55:00 -0600 Subject: [petsc-users] result of ex2f.F in petsc-3.1-p5\src\ksp\ksp\examples\tutorials In-Reply-To: References: Message-ID: <1F3FC2CF-4191-4FB8-A744-6C7D3FDCEF03@mcs.anl.gov> On Jan 2, 2011, at 9:32 PM, Peter Wang wrote: > I added VecView() in the ex2f.F to check the three vectors (u,b, and x) in the code. > > -----The vectors of u and b are same. BUt, the value of vector x is different with different processes. 
For example, > > with only 1 process, the vector x is: > > Process [0] > 2.72322e-07 > 3.81437e-07 > 1.58922e-07 > 3.81437e-07 > 2.38878e-07 > -6.65645e-07 > 1.58922e-07 > -6.65645e-07 > -2.51219e-07 > > with 2 processes, the vector is: > > Process [0] > -1.11022e-16 > 0 > 2.22045e-16 > 2.22045e-16 > 0 > > Process [1] > 2.22045e-16 > 2.22045e-16 > 0 > 2.22045e-16 void PETSC_STDCALL kspbuildsolution_(KSP *ksp,Vec *v,Vec *V, int *ierr ) { Vec vp = 0; CHKFORTRANNULLOBJECT(v); CHKFORTRANNULLOBJECT(V); if (v) vp = *v; *ierr = KSPBuildSolution(*ksp,vp,V); } void PETSC_STDCALL kspbuildresidual_(KSP *ksp,Vec *t,Vec *v,Vec *V, int *ierr ) { Vec tp = 0,vp = 0; CHKFORTRANNULLOBJECT(t); CHKFORTRANNULLOBJECT(v); CHKFORTRANNULLOBJECT(V); if (t) tp = *t; if (v) vp = *v; *ierr = KSPBuildResidual(*ksp,tp,vp,V); } Barry x is computed in the code as the difference between the "exact" solution and the "computed" solution. Since this example uses iterative solvers, which by default do not compute the solution to full accuracy, the "error" will be different for different number of processes. It is just a fluke that the error is smaller with two processes instead of one. > > The example is supposed to get a vector x similar to u. Why the result is different with differnt number of processes used? > > ----Also, if the runtime option -my_ksp_monitor is used, there is a error showing: > > [0]PETSC ERROR: ------------------------------------------------------------------------ > [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range > > It seems there is something wrong with calling 'call KSPBuildSolution(ksp,PETSC_NULL_OBJECT,x,ierr)' in user defined function MyKSPMonitor(). > > Any hints for this error? Thanks a lots. Bug in our Fortran Interface for KSPBuildSolution() fortran interface in that case. If you replace the two functions in src/ksp/ksp/interface/ftn-custom/zitclf.c with the ones below and run make in that directory this monitor routine will work. > > > > > > > From bsmith at mcs.anl.gov Sun Jan 2 23:10:47 2011 From: bsmith at mcs.anl.gov (Barry Smith) Date: Sun, 2 Jan 2011 23:10:47 -0600 Subject: [petsc-users] pastix solver break at pastix_checkMatrix In-Reply-To: <2EE18AB532EB4A1EA114F736BED1E8C5@cogendaeda> References: <54B6E01485354315BF5C9040319E4F0C@cogendaeda><2DF65225-7B22-43A8-A2D3-DE2DD4B75BF6@mcs.anl.gov><76538ECE086E482FB1366C668A1EA28A@cogendaeda> <2EE18AB532EB4A1EA114F736BED1E8C5@cogendaeda> Message-ID: I just two distinct memory leaks in pastix.c (new file attached) for 3.1 (also fixed in petsc-dev). Barry -------------- next part -------------- A non-text attachment was scrubbed... Name: pastix.c Type: application/octet-stream Size: 28203 bytes Desc: not available URL: -------------- next part -------------- On Jan 2, 2011, at 9:42 PM, Gong Ding wrote: > Dear Barry, > > It seems the patch will cause memory leak. > > when valOnly is false, memory will be allocated for colptr, row and values. 
> > 99: if (!valOnly){ > 100: ierr = PetscMalloc(((*n)+1) *sizeof(PetscInt) ,colptr );CHKERRQ(ierr); > 101: ierr = PetscMalloc( nnz *sizeof(PetscInt) ,row);CHKERRQ(ierr); > 102: ierr = PetscMalloc( nnz *sizeof(PetscScalar),values);CHKERRQ(ierr); > > > But at the end of function MatConvertToCSC > > 186: ierr = PetscMemcpy(*colptr,tmpcolptr,(*n+1)*sizeof(PetscInt));CHKERRQ(ierr); > 187: ierr = PetscMalloc(((*colptr)[*n]-1)*sizeof(PetscInt),row);CHKERRQ(ierr); > 188: ierr = PetscMemcpy(*row,tmprows,((*colptr)[*n]-1)*sizeof(PetscInt));CHKERRQ(ierr); > 189: ierr = PetscMalloc(((*colptr)[*n]-1)*sizeof(PetscScalar),values);CHKERRQ(ierr); > 190: ierr = PetscMemcpy(*values,tmpvalues,((*colptr)[*n]-1)*sizeof(PetscScalar));CHKERRQ(ierr); > > memory will be allocated again. > Which means when code call MatConvertToCSC next time, the memory allocated at line 100-102 will be lost. > > Will you please fix this problem? i.e. check the *colptr, *row and *values, when they are not empty, skip the memory allocation > at line 100-102. > > Yours > Gong Ding From pengxwang at hotmail.com Sun Jan 2 23:35:20 2011 From: pengxwang at hotmail.com (Peter Wang) Date: Sun, 2 Jan 2011 23:35:20 -0600 Subject: [petsc-users] result of ex2f.F in petsc-3.1-p5\src\ksp\ksp\examples\tutorials In-Reply-To: <1F3FC2CF-4191-4FB8-A744-6C7D3FDCEF03@mcs.anl.gov> References: , <1F3FC2CF-4191-4FB8-A744-6C7D3FDCEF03@mcs.anl.gov> Message-ID: Thanks, Barry, when you said, > x is computed in the code as the difference between the "exact" solution and the "computed" solution. Does it mean x is not the solution? In the webpage document for KSPSolve, the x is defined as : Parameter ksp - iterative context obtained from KSPCreate() b - the right hand side vector x - the solution Why is x the difference in this example code? what is the solution,then? > From: bsmith at mcs.anl.gov > Date: Sun, 2 Jan 2011 22:55:00 -0600 > To: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] result of ex2f.F in petsc-3.1-p5\src\ksp\ksp\examples\tutorials > > > On Jan 2, 2011, at 9:32 PM, Peter Wang wrote: > > > I added VecView() in the ex2f.F to check the three vectors (u,b, and x) in the code. > > > > -----The vectors of u and b are same. BUt, the value of vector x is different with different processes. For example, > > > > with only 1 process, the vector x is: > > > > Process [0] > > 2.72322e-07 > > 3.81437e-07 > > 1.58922e-07 > > 3.81437e-07 > > 2.38878e-07 > > -6.65645e-07 > > 1.58922e-07 > > -6.65645e-07 > > -2.51219e-07 > > > > with 2 processes, the vector is: > > > > Process [0] > > -1.11022e-16 > > 0 > > 2.22045e-16 > > 2.22045e-16 > > 0 > > > > Process [1] > > 2.22045e-16 > > 2.22045e-16 > > 0 > > 2.22045e-16 > void PETSC_STDCALL kspbuildsolution_(KSP *ksp,Vec *v,Vec *V, int *ierr ) > { > Vec vp = 0; > CHKFORTRANNULLOBJECT(v); > CHKFORTRANNULLOBJECT(V); > if (v) vp = *v; > *ierr = KSPBuildSolution(*ksp,vp,V); > } > > void PETSC_STDCALL kspbuildresidual_(KSP *ksp,Vec *t,Vec *v,Vec *V, int *ierr ) > { > Vec tp = 0,vp = 0; > CHKFORTRANNULLOBJECT(t); > CHKFORTRANNULLOBJECT(v); > CHKFORTRANNULLOBJECT(V); > if (t) tp = *t; > if (v) vp = *v; > *ierr = KSPBuildResidual(*ksp,tp,vp,V); > } > > Barry > > x is computed in the code as the difference between the "exact" solution and the "computed" solution. Since this example uses iterative solvers, which by default do not compute the solution to full accuracy, the "error" will be different for different number of processes. It is just a fluke that the error is smaller with two processes instead of one. 
> > > > > The example is supposed to get a vector x similar to u. Why the result is different with differnt number of processes used? > > > > ----Also, if the runtime option -my_ksp_monitor is used, there is a error showing: > > > > [0]PETSC ERROR: ------------------------------------------------------------------------ > > [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range > > > > It seems there is something wrong with calling 'call KSPBuildSolution(ksp,PETSC_NULL_OBJECT,x,ierr)' in user defined function MyKSPMonitor(). > > > > Any hints for this error? Thanks a lots. > > Bug in our Fortran Interface for KSPBuildSolution() fortran interface in that case. If you replace the two functions in src/ksp/ksp/interface/ftn-custom/zitclf.c with the ones below and run make in that directory this monitor routine will work. > > > > > > > > > > > > > > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From gdiso at ustc.edu Mon Jan 3 02:26:37 2011 From: gdiso at ustc.edu (Gong Ding) Date: Mon, 3 Jan 2011 16:26:37 +0800 Subject: [petsc-users] superlu_dist has some problem for SAME_NONZERO_PATTERN Message-ID: <4317F5ADFA0343868E75F2B898BC7B17@cogendaeda> Dear Barry, Trouble you again. I found that my nonlinear solver failed to convergence with superlu_dist (even broken some times), but works well with mumps and pastix. When I change SAME_NONZERO_PATTERN to DIFFERENT_NONZERO_PATTERN, everything goes well. Will you please try to fix this? Gong Ding From gianmail at gmail.com Mon Jan 3 05:22:11 2011 From: gianmail at gmail.com (Gianluca Meneghello) Date: Mon, 3 Jan 2011 12:22:11 +0100 Subject: [petsc-users] VecGetSubVector Message-ID: Hi, I'm new to PETSc, so that this can be a very simple question: I'm looking for something like VecGetSubVector, which I've seen it exists in the dev version but not in the released one. I need to write a smoother for a multigrid algorithm (something like a block Gauss Seidel) which can be written in matlab as for j = 1:ny ?P = ; ?du(P) = L(P,P) \ ( ?rhs(P) - L(P,:)*du + D2(P,P)*du(P) ); end where L is a matrix (in my case the linearized Navier Stokes). I was thinking about using IS for declaring P, so that D2(P,P) can be obtained using MatGetSubMatrix. I would need the same for the vector du. Is there a way to do that without using the developer version? (I really don't feel like being "experienced with building, using and debugging PETSc). Thanks in advance Gianluca From hzhang at mcs.anl.gov Mon Jan 3 10:09:16 2011 From: hzhang at mcs.anl.gov (Hong Zhang) Date: Mon, 3 Jan 2011 10:09:16 -0600 Subject: [petsc-users] superlu_dist has some problem for SAME_NONZERO_PATTERN In-Reply-To: <4317F5ADFA0343868E75F2B898BC7B17@cogendaeda> References: <4317F5ADFA0343868E75F2B898BC7B17@cogendaeda> Message-ID: Gong: Can you give us a simple example of your calling procedural that reproduces this behavior? I tested petsc-dev/src/ksp/ksp/examples/tutorials/ex10.c and did not get any error (this is a linear solver though). Hong > Dear Barry, > Trouble you again. > I found that my nonlinear solver failed to convergence with superlu_dist (even broken some times), but works well with mumps and pastix. > > When I change SAME_NONZERO_PATTERN to DIFFERENT_NONZERO_PATTERN, everything goes well. > > Will you please try to fix this? 
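One way to test whether the nonzero pattern really is unchanged between solves (the check Barry suggests further down the thread) is to flag any new nonzero insertion as an error after the first assembly. A minimal Fortran sketch; the matrix name J is purely illustrative and the usual finclude headers are assumed:

      Mat            J
      PetscErrorCode ierr

!     ... first assembly of the Jacobian ...
      call MatAssemblyBegin(J,MAT_FINAL_ASSEMBLY,ierr)
      call MatAssemblyEnd(J,MAT_FINAL_ASSEMBLY,ierr)

!     from now on, inserting into a location that is not already a
!     stored nonzero raises an error, so a pattern that silently
!     changes (invalid with SAME_NONZERO_PATTERN) is caught at once
      call MatSetOption(J,MAT_NEW_NONZERO_LOCATION_ERR,PETSC_TRUE,ierr)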
> > Gong Ding > From bsmith at mcs.anl.gov Mon Jan 3 10:32:17 2011 From: bsmith at mcs.anl.gov (Barry Smith) Date: Mon, 3 Jan 2011 10:32:17 -0600 Subject: [petsc-users] result of ex2f.F in petsc-3.1-p5\src\ksp\ksp\examples\tutorials In-Reply-To: References: , <1F3FC2CF-4191-4FB8-A744-6C7D3FDCEF03@mcs.anl.gov> Message-ID: <6278672B-F8A9-4943-BEBC-03537882B5DB@mcs.anl.gov> call KSPSolve(ksp,b,x,ierr) ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - ! Check solution and clean up ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - ! Check the error call VecAXPY(x,neg_one,u,ierr) ^^^^^^^^^^^^^^^^^^^ x = x - 1.0* u; and hence changes x On Jan 2, 2011, at 11:35 PM, Peter Wang wrote: > Thanks, Barry, > when you said, > > x is computed in the code as the difference between the "exact" solution and the "computed" solution. > Does it mean x is not the solution? > > In the webpage document for KSPSolve, the x is defined as : > Parameter > ksp - iterative context obtained from KSPCreate() > b - the right hand side vector > x - the solution > > Why is x the difference in this example code? what is the solution,then? > > > > From: bsmith at mcs.anl.gov > > Date: Sun, 2 Jan 2011 22:55:00 -0600 > > To: petsc-users at mcs.anl.gov > > Subject: Re: [petsc-users] result of ex2f.F in petsc-3.1-p5\src\ksp\ksp\examples\tutorials > > > > > > On Jan 2, 2011, at 9:32 PM, Peter Wang wrote: > > > > > I added VecView() in the ex2f.F to check the three vectors (u,b, and x) in the code. > > > > > > -----The vectors of u and b are same. BUt, the value of vector x is different with different processes. For example, > > > > > > with only 1 process, the vector x is: > > > > > > Process [0] > > > 2.72322e-07 > > > 3.81437e-07 > > > 1.58922e-07 > > > 3.81437e-07 > > > 2.38878e-07 > > > -6.65645e-07 > > > 1.58922e-07 > > > -6.65645e-07 > > > -2.51219e-07 > > > > > > with 2 processes, the vector is: > > > > > > Process [0] > > > -1.11022e-16 > > > 0 > > > 2.22045e-16 > > > 2.22045e-16 > > > 0 > > > > > > Process [1] > > > 2.22045e-16 > > > 2.22045e-16 > > > 0 > > > 2.22045e-16 > > void PETSC_STDCALL kspbuildsolution_(KSP *ksp,Vec *v,Vec *V, int *ierr ) > > { > > Vec vp = 0; > > CHKFORTRANNULLOBJECT(v); > > CHKFORTRANNULLOBJECT(V); > > if (v) vp = *v; > > *ierr = KSPBuildSolution(*ksp,vp,V); > > } > > > > void PETSC_STDCALL kspbuildresidual_(KSP *ksp,Vec *t,Vec *v,Vec *V, int *ierr ) > > { > > Vec tp = 0,vp = 0; > > CHKFORTRANNULLOBJECT(t); > > CHKFORTRANNULLOBJECT(v); > > CHKFORTRANNULLOBJECT(V); > > if (t) tp = *t; > > if (v) vp = *v; > > *ierr = KSPBuildResidual(*ksp,tp,vp,V); > > } > > > > Barry > > > > x is computed in the code as the difference between the "exact" solution and the "computed" solution. Since this example uses iterative solvers, which by default do not compute the solution to full accuracy, the "error" will be different for different number of processes. It is just a fluke that the error is smaller with two processes instead of one. > > > > > > > > The example is supposed to get a vector x similar to u. Why the result is different with differnt number of processes used? 
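In other words, after the VecAXPY() call above, x holds the error x - u rather than the computed solution. If the solution itself is still needed afterwards, one option is to form the error in a separate work vector. A hedged Fortran sketch: ksp, b, x, u and neg_one follow ex2f.F, while the work vector e is introduced here purely for illustration:

      Vec            e
      PetscScalar    neg_one
      PetscReal      norm
      PetscErrorCode ierr

      neg_one = -1.0
      call KSPSolve(ksp,b,x,ierr)

!     leave the solution x untouched; put the error x - u into e
      call VecDuplicate(x,e,ierr)
      call VecWAXPY(e,neg_one,u,x,ierr)
      call VecNorm(e,NORM_2,norm,ierr)
      call VecDestroy(e,ierr)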
> > > > > > ----Also, if the runtime option -my_ksp_monitor is used, there is a error showing: > > > > > > [0]PETSC ERROR: ------------------------------------------------------------------------ > > > [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range > > > > > > It seems there is something wrong with calling 'call KSPBuildSolution(ksp,PETSC_NULL_OBJECT,x,ierr)' in user defined function MyKSPMonitor(). > > > > > > Any hints for this error? Thanks a lots. > > > > Bug in our Fortran Interface for KSPBuildSolution() fortran interface in that case. If you replace the two functions in src/ksp/ksp/interface/ftn-custom/zitclf.c with the ones below and run make in that directory this monitor routine will work. > > > > > > > > > > > > > > > > > > > > > > > > > > > From pengxwang at hotmail.com Mon Jan 3 10:41:27 2011 From: pengxwang at hotmail.com (Peter Wang) Date: Mon, 3 Jan 2011 10:41:27 -0600 Subject: [petsc-users] result of ex2f.F in petsc-3.1-p5\src\ksp\ksp\examples\tutorials In-Reply-To: <6278672B-F8A9-4943-BEBC-03537882B5DB@mcs.anl.gov> References: , , <1F3FC2CF-4191-4FB8-A744-6C7D3FDCEF03@mcs.anl.gov>, , <6278672B-F8A9-4943-BEBC-03537882B5DB@mcs.anl.gov> Message-ID: Thanks Barry, sorry for careless reading the code. > From: bsmith at mcs.anl.gov > Date: Mon, 3 Jan 2011 10:32:17 -0600 > To: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] result of ex2f.F in petsc-3.1-p5\src\ksp\ksp\examples\tutorials > > > call KSPSolve(ksp,b,x,ierr) > > ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - > ! Check solution and clean up > ! - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - > > ! Check the error > call VecAXPY(x,neg_one,u,ierr) > ^^^^^^^^^^^^^^^^^^^ > x = x - 1.0* u; and hence changes x > > > > On Jan 2, 2011, at 11:35 PM, Peter Wang wrote: > > > Thanks, Barry, > > when you said, > > > x is computed in the code as the difference between the "exact" solution and the "computed" solution. > > Does it mean x is not the solution? > > > > In the webpage document for KSPSolve, the x is defined as : > > Parameter > > ksp - iterative context obtained from KSPCreate() > > b - the right hand side vector > > x - the solution > > > > Why is x the difference in this example code? what is the solution,then? > > > > > > > From: bsmith at mcs.anl.gov > > > Date: Sun, 2 Jan 2011 22:55:00 -0600 > > > To: petsc-users at mcs.anl.gov > > > Subject: Re: [petsc-users] result of ex2f.F in petsc-3.1-p5\src\ksp\ksp\examples\tutorials > > > > > > > > > On Jan 2, 2011, at 9:32 PM, Peter Wang wrote: > > > > > > > I added VecView() in the ex2f.F to check the three vectors (u,b, and x) in the code. > > > > > > > > -----The vectors of u and b are same. BUt, the value of vector x is different with different processes. 
For example, > > > > > > > > with only 1 process, the vector x is: > > > > > > > > Process [0] > > > > 2.72322e-07 > > > > 3.81437e-07 > > > > 1.58922e-07 > > > > 3.81437e-07 > > > > 2.38878e-07 > > > > -6.65645e-07 > > > > 1.58922e-07 > > > > -6.65645e-07 > > > > -2.51219e-07 > > > > > > > > with 2 processes, the vector is: > > > > > > > > Process [0] > > > > -1.11022e-16 > > > > 0 > > > > 2.22045e-16 > > > > 2.22045e-16 > > > > 0 > > > > > > > > Process [1] > > > > 2.22045e-16 > > > > 2.22045e-16 > > > > 0 > > > > 2.22045e-16 > > > void PETSC_STDCALL kspbuildsolution_(KSP *ksp,Vec *v,Vec *V, int *ierr ) > > > { > > > Vec vp = 0; > > > CHKFORTRANNULLOBJECT(v); > > > CHKFORTRANNULLOBJECT(V); > > > if (v) vp = *v; > > > *ierr = KSPBuildSolution(*ksp,vp,V); > > > } > > > > > > void PETSC_STDCALL kspbuildresidual_(KSP *ksp,Vec *t,Vec *v,Vec *V, int *ierr ) > > > { > > > Vec tp = 0,vp = 0; > > > CHKFORTRANNULLOBJECT(t); > > > CHKFORTRANNULLOBJECT(v); > > > CHKFORTRANNULLOBJECT(V); > > > if (t) tp = *t; > > > if (v) vp = *v; > > > *ierr = KSPBuildResidual(*ksp,tp,vp,V); > > > } > > > > > > Barry > > > > > > x is computed in the code as the difference between the "exact" solution and the "computed" solution. Since this example uses iterative solvers, which by default do not compute the solution to full accuracy, the "error" will be different for different number of processes. It is just a fluke that the error is smaller with two processes instead of one. > > > > > > > > > > > The example is supposed to get a vector x similar to u. Why the result is different with differnt number of processes used? > > > > > > > > ----Also, if the runtime option -my_ksp_monitor is used, there is a error showing: > > > > > > > > [0]PETSC ERROR: ------------------------------------------------------------------------ > > > > [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range > > > > > > > > It seems there is something wrong with calling 'call KSPBuildSolution(ksp,PETSC_NULL_OBJECT,x,ierr)' in user defined function MyKSPMonitor(). > > > > > > > > Any hints for this error? Thanks a lots. > > > > > > Bug in our Fortran Interface for KSPBuildSolution() fortran interface in that case. If you replace the two functions in src/ksp/ksp/interface/ftn-custom/zitclf.c with the ones below and run make in that directory this monitor routine will work. > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Mon Jan 3 10:43:41 2011 From: bsmith at mcs.anl.gov (Barry Smith) Date: Mon, 3 Jan 2011 10:43:41 -0600 Subject: [petsc-users] VecGetSubVector In-Reply-To: References: Message-ID: Gianluca, The expected use is with the VecScatter object. First you create a VecScatter object with VecScatterCreate() then each time you need the "subvector" you call VecScatterBegin() followed by VecScatterEnd() Note that usually the VecScatter object is retained and used many times. Barry On Jan 3, 2011, at 5:22 AM, Gianluca Meneghello wrote: > Hi, > > I'm new to PETSc, so that this can be a very simple question: > > I'm looking for something like VecGetSubVector, which I've seen it > exists in the dev version but not in the released one. 
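A hedged Fortran sketch of the VecScatter approach Barry describes above; the index array idx, its length nsub, and the vector names are illustrative, and note that ISCreateGeneral() gains an extra PetscCopyMode argument in releases after 3.1:

      IS             isp, isloc
      Vec            du, dusub
      VecScatter     ctx
      PetscInt       nsub
      PetscInt       idx(1000)
      PetscErrorCode ierr

!     ... fill idx(1:nsub) with the global indices of the block P ...
      call ISCreateGeneral(PETSC_COMM_WORLD,nsub,idx,isp,ierr)
      call ISCreateStride(PETSC_COMM_SELF,nsub,0,1,isloc,ierr)
      call VecCreateSeq(PETSC_COMM_SELF,nsub,dusub,ierr)

!     gather du(P) into the small sequential vector dusub;
!     the same isp can also be handed to MatGetSubMatrix() for L(P,P)
      call VecScatterCreate(du,isp,dusub,isloc,ctx,ierr)
      call VecScatterBegin(ctx,du,dusub,INSERT_VALUES,                  &
     &                     SCATTER_FORWARD,ierr)
      call VecScatterEnd(ctx,du,dusub,INSERT_VALUES,                    &
     &                   SCATTER_FORWARD,ierr)

!     reuse ctx for every sweep, then clean up
      call VecScatterDestroy(ctx,ierr)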
> > I need to write a smoother for a multigrid algorithm (something like a > block Gauss Seidel) which can be written in matlab as > > for j = 1:ny > P = ; > du(P) = L(P,P) \ ( rhs(P) - L(P,:)*du + D2(P,P)*du(P) ); > end > > where L is a matrix (in my case the linearized Navier Stokes). > > I was thinking about using IS for declaring P, so that D2(P,P) can be > obtained using MatGetSubMatrix. I would need the same for the vector > du. > > Is there a way to do that without using the developer version? (I > really don't feel like being "experienced with building, using and > debugging PETSc). > > Thanks in advance > > Gianluca From bsmith at mcs.anl.gov Mon Jan 3 10:47:35 2011 From: bsmith at mcs.anl.gov (Barry Smith) Date: Mon, 3 Jan 2011 10:47:35 -0600 Subject: [petsc-users] superlu_dist has some problem for SAME_NONZERO_PATTERN In-Reply-To: <4317F5ADFA0343868E75F2B898BC7B17@cogendaeda> References: <4317F5ADFA0343868E75F2B898BC7B17@cogendaeda> Message-ID: On Jan 3, 2011, at 2:26 AM, Gong Ding wrote: > Dear Barry, > Trouble you again. > I found that my nonlinear solver failed to convergence with superlu_dist (even broken some times), but works well with mumps and pastix. > > When I change SAME_NONZERO_PATTERN to DIFFERENT_NONZERO_PATTERN, everything goes well. Are you sure that non-zero pattern isn't changing? After the first solve call MatSetOption(mat, MAT_NEW_NONZERO_LOCATION_ERR,PETSC_TRUE and it will stop if the code tries to add a new nonzero location. Also please switch to petsc-dev http://www.mcs.anl.gov/petsc/petsc-as/developers/index.html Sherry Li has fixed a bunch of things in SuperLU_dist that you can only use from petsc-dev. Barry > > Will you please try to fix this? > > Gong Ding From enjoywm at cs.wm.edu Mon Jan 3 12:10:17 2011 From: enjoywm at cs.wm.edu (enjoywm at cs.wm.edu) Date: Mon, 3 Jan 2011 13:10:17 -0500 Subject: [petsc-users] PETSc libs Message-ID: <80c013e5de929bc2c997eeff8915039c.squirrel@mail.cs.wm.edu> Hi, My application works well on old version petsc with libs: petscsnes, petscksp, petsccdm, petscmat and petscvec and lapack. Now I use PETSc-3.1-p6 and installed it successfully. However, I cannot find these libs except libpetsc.a and libflapack.a. Do I need to replace petscsnes, petscksp, petsccdm, petscmat and petscvec with libpetsc and replace lapack with flapack? Thanks. Yixun From bsmith at mcs.anl.gov Mon Jan 3 12:22:47 2011 From: bsmith at mcs.anl.gov (Barry Smith) Date: Mon, 3 Jan 2011 12:22:47 -0600 Subject: [petsc-users] PETSc libs In-Reply-To: <80c013e5de929bc2c997eeff8915039c.squirrel@mail.cs.wm.edu> References: <80c013e5de929bc2c997eeff8915039c.squirrel@mail.cs.wm.edu> Message-ID: <8CE2E417-3023-4A34-B3DE-3F1DB8BB9604@mcs.anl.gov> On Jan 3, 2011, at 12:10 PM, enjoywm at cs.wm.edu wrote: > Hi, > My application works well on old version petsc with libs: petscsnes, > petscksp, petsccdm, petscmat and petscvec and lapack. > Now I use PETSc-3.1-p6 and installed it successfully. > However, I cannot find these libs except libpetsc.a and libflapack.a. > Do I need to replace petscsnes, petscksp, petsccdm, petscmat and petscvec > with libpetsc and replace lapack with flapack? > > Thanks. > > Yixun The 3.1 release defaults now to a single library libpetsc The names of the lapack and blas libraries depends on what Blas and LAPACK you are using, so if you hardwired the name then yes you need to change them. 
Note if you used the PETSc makefiles as a starting point for your makefile you would have a completely portable makefile that you would not need to change for different systems and configurations. See for example src/snes/examples/tutorials/makefile and the rule for building ex1 Barry From gaurish108 at gmail.com Tue Jan 4 00:30:22 2011 From: gaurish108 at gmail.com (Gaurish Telang) Date: Tue, 4 Jan 2011 01:30:22 -0500 Subject: [petsc-users] Question on some PETSc functions. Message-ID: I have 2 semi sparse matrices (see sparsity plots) attached and I wanted to get to know some basic information about them. 1. How does one calculate the rank of a matrix in PETSc. Is the answer returned approximate or exact for matrices of dimension say 2500X1200 2. How good are the sparse sovlers for Ax=b that PETSc employs when A is a dense full rank square matrix of size 2000x2000. 3. In my case the matrices are mostly sparse but they tend to get dense towards the bottom. What matrix format is most efficent for handling such matrices? -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Tue Jan 4 06:17:53 2011 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 4 Jan 2011 06:17:53 -0600 Subject: [petsc-users] Question on some PETSc functions. In-Reply-To: References: Message-ID: On Tue, Jan 4, 2011 at 12:30 AM, Gaurish Telang wrote: > I have 2 semi sparse matrices (see sparsity plots) attached and I wanted to > get to know some basic information about them. There are no attachments. > 1. How does one calculate the rank of a matrix in PETSc. Is the answer > returned approximate or exact for matrices of dimension say 2500X1200 > You need a rank revealing factorization. I know of no sparse packages for this. > 2. How good are the sparse sovlers for Ax=b that PETSc employs when A is a > dense full rank square matrix of size 2000x2000. > I think you mean, "how well do Krylov solvers work for dense matrices?". Solver performance depends heavily on the characteristics of the matrix. There are no general statements that can be made about Krylov solvers. > 3. In my case the matrices are mostly sparse but they tend to get dense > towards the bottom. What matrix format is most efficent for handling such > matrices? > You might be able to handle this with a sparse matrix A + an outer product of vectors. However, it depends on your problem. Matt -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From pengxwang at hotmail.com Tue Jan 4 17:38:53 2011 From: pengxwang at hotmail.com (Peter Wang) Date: Tue, 4 Jan 2011 17:38:53 -0600 Subject: [petsc-users] error in calling VecGetArrayf90() Message-ID: I am trying to obtain the value of each element of a solution Vector by KSPsolve(). The variables are defined according the example of ex4f90.F in \petsc-3.1-p5\src\snes\examples\tutorials\ as following, PetscScalar, pointer :: xx_v(:) ... call KSPSolve(ksp,b,x,ierr) call VecView(x,PETSC_VIEWER_STDOUT_WORLD,ierr) call VecGetArrayF90(x,xx_v,ierr) call VecRestoreArrayF90(x,xx_v,ierr) ... But, the error keeps coming out when call VecGetArrayF90(x,xx_v,ierr) and call VecRestoreArrayF90(x,xx_v,ierr) are not commented off. 
The error information shows: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range [0]PETSC ERROR: --------------------- Stack Frames ------------------------------------ [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, [0]PETSC ERROR: INSTEAD the line number of the start of the function [0]PETSC ERROR: is given. [0]PETSC ERROR: [0] F90Array1dCreate line 52 src/sys/f90-src/f90_cwrap.c [0]PETSC ERROR: --------------------- Error Message ------------------------------------ I checked the code according the example, but cannot see any difference to that. Just don't know why the pointer array xx_v doesn't work here? Thanks. -------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Tue Jan 4 17:50:11 2011 From: balay at mcs.anl.gov (Satish Balay) Date: Tue, 4 Jan 2011 17:50:11 -0600 (CST) Subject: [petsc-users] error in calling VecGetArrayf90() In-Reply-To: References: Message-ID: Did you included "finclude/petscvec.h90" in your code - as the example did? satish On Tue, 4 Jan 2011, Peter Wang wrote: > > I am trying to obtain the value of each element of a solution Vector by KSPsolve(). > > The variables are defined according the example of ex4f90.F in \petsc-3.1-p5\src\snes\examples\tutorials\ as following, > > PetscScalar, pointer :: xx_v(:) > > ... > call KSPSolve(ksp,b,x,ierr) > call VecView(x,PETSC_VIEWER_STDOUT_WORLD,ierr) > > call VecGetArrayF90(x,xx_v,ierr) > call VecRestoreArrayF90(x,xx_v,ierr) > > ... > > But, the error keeps coming out when call VecGetArrayF90(x,xx_v,ierr) and call VecRestoreArrayF90(x,xx_v,ierr) are not commented off. > > > The error information shows: > Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range > > [0]PETSC ERROR: --------------------- Stack Frames ------------------------------------ > [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, > [0]PETSC ERROR: INSTEAD the line number of the start of the function > [0]PETSC ERROR: is given. > [0]PETSC ERROR: [0] F90Array1dCreate line 52 src/sys/f90-src/f90_cwrap.c > [0]PETSC ERROR: --------------------- Error Message ------------------------------------ > > I checked the code according the example, but cannot see any difference to that. Just don't know why the pointer array xx_v doesn't work here? Thanks. > > From pengxwang at hotmail.com Tue Jan 4 20:43:28 2011 From: pengxwang at hotmail.com (Peter Wang) Date: Tue, 4 Jan 2011 20:43:28 -0600 Subject: [petsc-users] error in calling VecGetArrayf90() In-Reply-To: References: , Message-ID: Thanks a million. Yes, you are right, it works after included "finclude/petscvec.h90". The previouse code just with petscvec.h, but without petscvec.h90. Therefore, there is not a compiling error, but the runtime error. Thanks again. I had been stuck with this trouble the whole day. No, it is over !! :) > Date: Tue, 4 Jan 2011 17:50:11 -0600 > From: balay at mcs.anl.gov > To: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] error in calling VecGetArrayf90() > > Did you included "finclude/petscvec.h90" in your code - as the example did? > > satish > > On Tue, 4 Jan 2011, Peter Wang wrote: > > > > > I am trying to obtain the value of each element of a solution Vector by KSPsolve(). > > > > The variables are defined according the example of ex4f90.F in \petsc-3.1-p5\src\snes\examples\tutorials\ as following, > > > > PetscScalar, pointer :: xx_v(:) > > > > ... 
> > call KSPSolve(ksp,b,x,ierr) > > call VecView(x,PETSC_VIEWER_STDOUT_WORLD,ierr) > > > > call VecGetArrayF90(x,xx_v,ierr) > > call VecRestoreArrayF90(x,xx_v,ierr) > > > > ... > > > > But, the error keeps coming out when call VecGetArrayF90(x,xx_v,ierr) and call VecRestoreArrayF90(x,xx_v,ierr) are not commented off. > > > > > > The error information shows: > > Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range > > > > [0]PETSC ERROR: --------------------- Stack Frames ------------------------------------ > > [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, > > [0]PETSC ERROR: INSTEAD the line number of the start of the function > > [0]PETSC ERROR: is given. > > [0]PETSC ERROR: [0] F90Array1dCreate line 52 src/sys/f90-src/f90_cwrap.c > > [0]PETSC ERROR: --------------------- Error Message ------------------------------------ > > > > I checked the code according the example, but cannot see any difference to that. Just don't know why the pointer array xx_v doesn't work here? Thanks. > > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From pengxwang at hotmail.com Tue Jan 4 22:10:43 2011 From: pengxwang at hotmail.com (Peter Wang) Date: Tue, 4 Jan 2011 22:10:43 -0600 Subject: [petsc-users] error in calling VecGetArrayf90() In-Reply-To: References: , Message-ID: In last question, the pointer xx_v is local data. However, if write them to the monitor or assign them to another array, the value is incorrect. The protion of the code to display them on the monitor is like as following: call MatGetOwnershipRange(A,Istart,Iend,ierr) call VecGetArrayF90(x,xx_v,ierr) ! Vector x is matched with Matrix A in the same communicator write(*,*)xx_v,myid ! write the poiner array together do i=Istart,Iend-1 write(6,*)'check xx_v',i,xx_v(i),myid !write the element of the array one by one with local range (Istart to Iend-1) enddo =========The result is as following: ( the values of the elements from 7 to 20 are not correct !!) 
3999.9999999999982 3999.9999999999982 3999.9999999999982 3999.9999999999982 3999.9999999999982 3000.0000000000005 3000.0000000000005 0 3000.0000000000009 3000.0000000000009 3000.0000000000009 2000.0000000000011 2000.0000000000011 2000.0000000000000 1 2000.0000000000009 2000.0000000000009 1000.0000000000003 1000.0000000000003 1000.0000000000003 999.99999999999989 2 1000.0000000000000 0.0000000000000000 0.0000000000000000 0.0000000000000000 0.0000000000000000 0.0000000000000000 3 check xx_v 0 0.0000000000000000 0.0000000000000000 0 check xx_v 1 3999.9999999999982 3999.9999999999982 0 check xx_v 2 3999.9999999999982 3999.9999999999982 0 check xx_v 3 3999.9999999999982 3999.9999999999982 0 check xx_v 4 3999.9999999999982 3999.9999999999982 0 check xx_v 5 3999.9999999999982 3999.9999999999982 0 check xx_v 6 3000.0000000000005 3000.0000000000005 0 check xx_v 7 1.99665037664579820E-314 1.99665037664579820E-314 1 check xx_v 8 2.61360726650019422E-321 2.61360726650019422E-321 1 check xx_v 9 7.90505033345994471E-323 7.90505033345994471E-323 1 check xx_v 10 1.69759663277221785E-312 1.69759663277221785E-312 1 check xx_v 11 6.16846344148335980E-317 6.16846344148335980E-317 1 check xx_v 12 6.16846640587723485E-317 6.16846640587723485E-317 1 check xx_v 13 6.16838982570212945E-317 6.16838982570212945E-317 2 check xx_v 14 1.99665037664579820E-314 1.99665037664579820E-314 2 check xx_v 15 6.19790333112210216E-317 6.19790333112210216E-317 2 check xx_v 16 6.20255545324334334E-317 6.20255545324334334E-317 2 check xx_v 17 6.20225061473985929E-317 6.20225061473985929E-317 2 check xx_v 18 6.18242376037225006E-317 6.18242376037225006E-317 2 check xx_v 19 6.16846640587723485E-317 6.16846640587723485E-317 3 check xx_v 20 6.18113523716789609E-317 6.18113523716789609E-317 3 check xx_v 21 0.0000000000000000 0.0000000000000000 3 check xx_v 22 0.0000000000000000 0.0000000000000000 3 check xx_v 23 0.0000000000000000 0.0000000000000000 3 check xx_v 24 0.0000000000000000 0.0000000000000000 3 ======The vector x is : Process [0] 4000 4000 4000 4000 4000 3000 3000 Process [1] 3000 3000 3000 2000 2000 2000 Process [2] 2000 2000 1000 1000 1000 1000 Process [3] 1000 0 0 0 0 0 > Date: Tue, 4 Jan 2011 17:50:11 -0600 > From: balay at mcs.anl.gov > To: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] error in calling VecGetArrayf90() > > Did you included "finclude/petscvec.h90" in your code - as the example did? > > satish > > On Tue, 4 Jan 2011, Peter Wang wrote: > > > > > I am trying to obtain the value of each element of a solution Vector by KSPsolve(). > > > > The variables are defined according the example of ex4f90.F in \petsc-3.1-p5\src\snes\examples\tutorials\ as following, > > > > PetscScalar, pointer :: xx_v(:) > > > > ... > > call KSPSolve(ksp,b,x,ierr) > > call VecView(x,PETSC_VIEWER_STDOUT_WORLD,ierr) > > > > call VecGetArrayF90(x,xx_v,ierr) > > call VecRestoreArrayF90(x,xx_v,ierr) > > > > ... > > > > But, the error keeps coming out when call VecGetArrayF90(x,xx_v,ierr) and call VecRestoreArrayF90(x,xx_v,ierr) are not commented off. > > > > > > The error information shows: > > Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range > > > > [0]PETSC ERROR: --------------------- Stack Frames ------------------------------------ > > [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, > > [0]PETSC ERROR: INSTEAD the line number of the start of the function > > [0]PETSC ERROR: is given. 
> > [0]PETSC ERROR: [0] F90Array1dCreate line 52 src/sys/f90-src/f90_cwrap.c > > [0]PETSC ERROR: --------------------- Error Message ------------------------------------ > > > > I checked the code according the example, but cannot see any difference to that. Just don't know why the pointer array xx_v doesn't work here? Thanks. > > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Tue Jan 4 22:49:50 2011 From: balay at mcs.anl.gov (Satish Balay) Date: Tue, 4 Jan 2011 22:49:50 -0600 (CST) Subject: [petsc-users] error in calling VecGetArrayf90() In-Reply-To: References: , Message-ID: The global index starts at Istart - but the array index starts at 1 [for fortran array] Satish On Tue, 4 Jan 2011, Peter Wang wrote: > > In last question, the pointer xx_v is local data. However, if write them to the monitor or assign them to another array, the value is incorrect. > > The protion of the code to display them on the monitor is like as following: > call MatGetOwnershipRange(A,Istart,Iend,ierr) > call VecGetArrayF90(x,xx_v,ierr) ! Vector x is matched with Matrix A in the same communicator > > write(*,*)xx_v,myid ! write the poiner array together > > do i=Istart,Iend-1 > write(6,*)'check xx_v',i,xx_v(i),myid !write the element of the array one by one with local range (Istart to Iend-1) > enddo > > > =========The result is as following: ( the values of the elements from 7 to 20 are not correct !!) > > 3999.9999999999982 3999.9999999999982 3999.9999999999982 3999.9999999999982 3999.9999999999982 3000.0000000000005 3000.0000000000005 0 > > 3000.0000000000009 3000.0000000000009 3000.0000000000009 2000.0000000000011 2000.0000000000011 2000.0000000000000 1 > > 2000.0000000000009 2000.0000000000009 1000.0000000000003 1000.0000000000003 1000.0000000000003 999.99999999999989 2 > > 1000.0000000000000 0.0000000000000000 0.0000000000000000 0.0000000000000000 0.0000000000000000 0.0000000000000000 3 > > > check xx_v 0 0.0000000000000000 0.0000000000000000 0 > check xx_v 1 3999.9999999999982 3999.9999999999982 0 > check xx_v 2 3999.9999999999982 3999.9999999999982 0 > check xx_v 3 3999.9999999999982 3999.9999999999982 0 > check xx_v 4 3999.9999999999982 3999.9999999999982 0 > check xx_v 5 3999.9999999999982 3999.9999999999982 0 > check xx_v 6 3000.0000000000005 3000.0000000000005 0 > check xx_v 7 1.99665037664579820E-314 1.99665037664579820E-314 1 > check xx_v 8 2.61360726650019422E-321 2.61360726650019422E-321 1 > check xx_v 9 7.90505033345994471E-323 7.90505033345994471E-323 1 > check xx_v 10 1.69759663277221785E-312 1.69759663277221785E-312 1 > check xx_v 11 6.16846344148335980E-317 6.16846344148335980E-317 1 > check xx_v 12 6.16846640587723485E-317 6.16846640587723485E-317 1 > check xx_v 13 6.16838982570212945E-317 6.16838982570212945E-317 2 > check xx_v 14 1.99665037664579820E-314 1.99665037664579820E-314 2 > check xx_v 15 6.19790333112210216E-317 6.19790333112210216E-317 2 > check xx_v 16 6.20255545324334334E-317 6.20255545324334334E-317 2 > check xx_v 17 6.20225061473985929E-317 6.20225061473985929E-317 2 > check xx_v 18 6.18242376037225006E-317 6.18242376037225006E-317 2 > check xx_v 19 6.16846640587723485E-317 6.16846640587723485E-317 3 > check xx_v 20 6.18113523716789609E-317 6.18113523716789609E-317 3 > check xx_v 21 0.0000000000000000 0.0000000000000000 3 > check xx_v 22 0.0000000000000000 0.0000000000000000 3 > check xx_v 23 0.0000000000000000 0.0000000000000000 3 > check xx_v 24 0.0000000000000000 0.0000000000000000 3 > > ======The vector x 
is : > Process [0] > 4000 > 4000 > 4000 > 4000 > 4000 > 3000 > 3000 > Process [1] > 3000 > 3000 > 3000 > 2000 > 2000 > 2000 > Process [2] > 2000 > 2000 > 1000 > 1000 > 1000 > 1000 > Process [3] > 1000 > 0 > 0 > 0 > 0 > 0 > > > > > > Date: Tue, 4 Jan 2011 17:50:11 -0600 > > From: balay at mcs.anl.gov > > To: petsc-users at mcs.anl.gov > > Subject: Re: [petsc-users] error in calling VecGetArrayf90() > > > > Did you included "finclude/petscvec.h90" in your code - as the example did? > > > > satish > > > > On Tue, 4 Jan 2011, Peter Wang wrote: > > > > > > > > I am trying to obtain the value of each element of a solution Vector by KSPsolve(). > > > > > > The variables are defined according the example of ex4f90.F in \petsc-3.1-p5\src\snes\examples\tutorials\ as following, > > > > > > PetscScalar, pointer :: xx_v(:) > > > > > > ... > > > call KSPSolve(ksp,b,x,ierr) > > > call VecView(x,PETSC_VIEWER_STDOUT_WORLD,ierr) > > > > > > call VecGetArrayF90(x,xx_v,ierr) > > > call VecRestoreArrayF90(x,xx_v,ierr) > > > > > > ... > > > > > > But, the error keeps coming out when call VecGetArrayF90(x,xx_v,ierr) and call VecRestoreArrayF90(x,xx_v,ierr) are not commented off. > > > > > > > > > The error information shows: > > > Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range > > > > > > [0]PETSC ERROR: --------------------- Stack Frames ------------------------------------ > > > [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, > > > [0]PETSC ERROR: INSTEAD the line number of the start of the function > > > [0]PETSC ERROR: is given. > > > [0]PETSC ERROR: [0] F90Array1dCreate line 52 src/sys/f90-src/f90_cwrap.c > > > [0]PETSC ERROR: --------------------- Error Message ------------------------------------ > > > > > > I checked the code according the example, but cannot see any difference to that. Just don't know why the pointer array xx_v doesn't work here? Thanks. > > > > > > > > > From C.Klaij at marin.nl Wed Jan 5 09:55:08 2011 From: C.Klaij at marin.nl (Klaij, Christiaan) Date: Wed, 5 Jan 2011 15:55:08 +0000 Subject: [petsc-users] which version of openmpi for petsc-3.1? Message-ID: The manual states: "If one had to download a compatible external package manually, then the URL for this package is listed in configure source for this package. For example, check config/PETSc/packages/SuperLU.py for the url for download this package." But I cannot find anything like OpenMPI.py... Which version should I download? Chris dr. ir. Christiaan Klaij CFD Researcher Research & Development E mailto:C.Klaij at marin.nl T +31 317 49 33 44 MARIN 2, Haagsteeg, P.O. Box 28, 6700 AA Wageningen, The Netherlands T +31 317 49 39 11, F +31 317 49 32 45, I www.marin.nl From dave.mayhem23 at gmail.com Wed Jan 5 10:08:44 2011 From: dave.mayhem23 at gmail.com (Dave May) Date: Wed, 5 Jan 2011 17:08:44 +0100 Subject: [petsc-users] which version of openmpi for petsc-3.1? In-Reply-To: References: Message-ID: If you dig through BuildSystem/config/packages/MPI.py you will find where the tarballs come from which are used with --download-{mpich,openmpi}=yes Cheers, Dave On 5 January 2011 16:55, Klaij, Christiaan wrote: > The manual states: > > "If one had to download a compatible external package manually, then the URL for this package is listed in configure source for this package. For example, check config/PETSc/packages/SuperLU.py for the url for download this package." > > But I cannot find anything like OpenMPI.py... Which version should I download? > > Chris > > > dr. 
ir. Christiaan Klaij > CFD Researcher > Research & Development > E mailto:C.Klaij at marin.nl > T +31 317 49 33 44 > > MARIN > 2, Haagsteeg, P.O. Box 28, 6700 AA Wageningen, The Netherlands > T +31 317 49 39 11, F +31 317 49 32 45, I www.marin.nl > > From jed at 59A2.org Wed Jan 5 10:11:19 2011 From: jed at 59A2.org (Jed Brown) Date: Wed, 5 Jan 2011 08:11:19 -0800 Subject: [petsc-users] which version of openmpi for petsc-3.1? In-Reply-To: References: Message-ID: Just use the most recent version unless you have a reason not to. PETSc does not depend on any specific versions (or implementations) of MPI. On Jan 5, 2011 7:55 AM, "Klaij, Christiaan" wrote: The manual states: "If one had to download a compatible external package manually, then the URL for this package is listed in configure source for this package. For example, check config/PETSc/packages/SuperLU.py for the url for download this package." But I cannot find anything like OpenMPI.py... Which version should I download? Chris dr. ir. Christiaan Klaij CFD Researcher Research & Development E mailto:C.Klaij at marin.nl T +31 317 49 33 44 MARIN 2, Haagsteeg, P.O. Box 28, 6700 AA Wageningen, The Netherlands T +31 317 49 39 11, F +31 317 49 32 45, I www.marin.nl -------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Wed Jan 5 10:15:06 2011 From: balay at mcs.anl.gov (Satish Balay) Date: Wed, 5 Jan 2011 10:15:06 -0600 (CST) Subject: [petsc-users] which version of openmpi for petsc-3.1? In-Reply-To: References: Message-ID: --download-PACKAGE=PACKAGE.tar.gz usually works only with the specified version with the URL listed in the corresponding PACKAGE.py However with MPI - you can install any impl/version separately with your desired compilers [and not use --download-mpich or --download-openmpi], and then configure PETSc with mpicc/mpif90 etc from this install. Satish On Wed, 5 Jan 2011, Jed Brown wrote: > Just use the most recent version unless you have a reason not to. PETSc does > not depend on any specific versions (or implementations) of MPI. > > On Jan 5, 2011 7:55 AM, "Klaij, Christiaan" wrote: > > The manual states: > > "If one had to download a compatible external package manually, then the URL > for this package is listed in configure source for this package. For > example, check config/PETSc/packages/SuperLU.py for the url for download > this package." > > But I cannot find anything like OpenMPI.py... Which version should I > download? > > Chris > > > dr. ir. Christiaan Klaij > CFD Researcher > Research & Development > E mailto:C.Klaij at marin.nl > T +31 317 49 33 44 > > MARIN > 2, Haagsteeg, P.O. Box 28, 6700 AA Wageningen, The Netherlands > T +31 317 49 39 11, F +31 317 49 32 45, I www.marin.nl > From balay at mcs.anl.gov Wed Jan 5 10:18:05 2011 From: balay at mcs.anl.gov (Satish Balay) Date: Wed, 5 Jan 2011 10:18:05 -0600 (CST) Subject: [petsc-users] which version of openmpi for petsc-3.1? In-Reply-To: References: Message-ID: One additional note with OpenMPI install: OpenMPI defaults to installing shared libraries - but does not use --rpath option within the mpicc wrapper - and requires user to set LD_LIBRARY_PATH - so that mpicc is useable. So make sure you do this before running PETSc configure [or install openmpi static..] 
Satish On Wed, 5 Jan 2011, Satish Balay wrote: > --download-PACKAGE=PACKAGE.tar.gz usually works only with the > specified version with the URL listed in the corresponding PACKAGE.py > > However with MPI - you can install any impl/version separately with > your desired compilers [and not use --download-mpich or > --download-openmpi], and then configure PETSc with mpicc/mpif90 etc > from this install. > > Satish > > > On Wed, 5 Jan 2011, Jed Brown wrote: > > > Just use the most recent version unless you have a reason not to. PETSc does > > not depend on any specific versions (or implementations) of MPI. > > > > On Jan 5, 2011 7:55 AM, "Klaij, Christiaan" wrote: > > > > The manual states: > > > > "If one had to download a compatible external package manually, then the URL > > for this package is listed in configure source for this package. For > > example, check config/PETSc/packages/SuperLU.py for the url for download > > this package." > > > > But I cannot find anything like OpenMPI.py... Which version should I > > download? > > > > Chris > > > > > > dr. ir. Christiaan Klaij > > CFD Researcher > > Research & Development > > E mailto:C.Klaij at marin.nl > > T +31 317 49 33 44 > > > > MARIN > > 2, Haagsteeg, P.O. Box 28, 6700 AA Wageningen, The Netherlands > > T +31 317 49 39 11, F +31 317 49 32 45, I www.marin.nl > > > > From ecoon at lanl.gov Wed Jan 5 10:45:21 2011 From: ecoon at lanl.gov (Ethan Coon) Date: Wed, 05 Jan 2011 09:45:21 -0700 Subject: [petsc-users] error in calling VecGetArrayf90() In-Reply-To: References: , Message-ID: <1294245921.4768.14.camel@echo.lanl.gov> On Tue, 2011-01-04 at 22:49 -0600, Satish Balay wrote: > The global index starts at Istart - but the array index starts at 1 [for fortran array] > What Satish said. But you can get around this by declaring the size/shape (by passing into a subroutine). See http://www.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-current/src/snes/examples/tutorials/ex5f90.F.html lines 113-121 and the subroutine FormFunctionLocal included therein for an example. Ethan > Satish > > On Tue, 4 Jan 2011, Peter Wang wrote: > > > > > In last question, the pointer xx_v is local data. However, if write them to the monitor or assign them to another array, the value is incorrect. > > > > The protion of the code to display them on the monitor is like as following: > > call MatGetOwnershipRange(A,Istart,Iend,ierr) > > call VecGetArrayF90(x,xx_v,ierr) ! Vector x is matched with Matrix A in the same communicator > > > > write(*,*)xx_v,myid ! write the poiner array together > > > > do i=Istart,Iend-1 > > write(6,*)'check xx_v',i,xx_v(i),myid !write the element of the array one by one with local range (Istart to Iend-1) > > enddo > > > > > > =========The result is as following: ( the values of the elements from 7 to 20 are not correct !!) 
> > > > 3999.9999999999982 3999.9999999999982 3999.9999999999982 3999.9999999999982 3999.9999999999982 3000.0000000000005 3000.0000000000005 0 > > > > 3000.0000000000009 3000.0000000000009 3000.0000000000009 2000.0000000000011 2000.0000000000011 2000.0000000000000 1 > > > > 2000.0000000000009 2000.0000000000009 1000.0000000000003 1000.0000000000003 1000.0000000000003 999.99999999999989 2 > > > > 1000.0000000000000 0.0000000000000000 0.0000000000000000 0.0000000000000000 0.0000000000000000 0.0000000000000000 3 > > > > > > check xx_v 0 0.0000000000000000 0.0000000000000000 0 > > check xx_v 1 3999.9999999999982 3999.9999999999982 0 > > check xx_v 2 3999.9999999999982 3999.9999999999982 0 > > check xx_v 3 3999.9999999999982 3999.9999999999982 0 > > check xx_v 4 3999.9999999999982 3999.9999999999982 0 > > check xx_v 5 3999.9999999999982 3999.9999999999982 0 > > check xx_v 6 3000.0000000000005 3000.0000000000005 0 > > check xx_v 7 1.99665037664579820E-314 1.99665037664579820E-314 1 > > check xx_v 8 2.61360726650019422E-321 2.61360726650019422E-321 1 > > check xx_v 9 7.90505033345994471E-323 7.90505033345994471E-323 1 > > check xx_v 10 1.69759663277221785E-312 1.69759663277221785E-312 1 > > check xx_v 11 6.16846344148335980E-317 6.16846344148335980E-317 1 > > check xx_v 12 6.16846640587723485E-317 6.16846640587723485E-317 1 > > check xx_v 13 6.16838982570212945E-317 6.16838982570212945E-317 2 > > check xx_v 14 1.99665037664579820E-314 1.99665037664579820E-314 2 > > check xx_v 15 6.19790333112210216E-317 6.19790333112210216E-317 2 > > check xx_v 16 6.20255545324334334E-317 6.20255545324334334E-317 2 > > check xx_v 17 6.20225061473985929E-317 6.20225061473985929E-317 2 > > check xx_v 18 6.18242376037225006E-317 6.18242376037225006E-317 2 > > check xx_v 19 6.16846640587723485E-317 6.16846640587723485E-317 3 > > check xx_v 20 6.18113523716789609E-317 6.18113523716789609E-317 3 > > check xx_v 21 0.0000000000000000 0.0000000000000000 3 > > check xx_v 22 0.0000000000000000 0.0000000000000000 3 > > check xx_v 23 0.0000000000000000 0.0000000000000000 3 > > check xx_v 24 0.0000000000000000 0.0000000000000000 3 > > > > ======The vector x is : > > Process [0] > > 4000 > > 4000 > > 4000 > > 4000 > > 4000 > > 3000 > > 3000 > > Process [1] > > 3000 > > 3000 > > 3000 > > 2000 > > 2000 > > 2000 > > Process [2] > > 2000 > > 2000 > > 1000 > > 1000 > > 1000 > > 1000 > > Process [3] > > 1000 > > 0 > > 0 > > 0 > > 0 > > 0 > > > > > > > > > > > Date: Tue, 4 Jan 2011 17:50:11 -0600 > > > From: balay at mcs.anl.gov > > > To: petsc-users at mcs.anl.gov > > > Subject: Re: [petsc-users] error in calling VecGetArrayf90() > > > > > > Did you included "finclude/petscvec.h90" in your code - as the example did? > > > > > > satish > > > > > > On Tue, 4 Jan 2011, Peter Wang wrote: > > > > > > > > > > > I am trying to obtain the value of each element of a solution Vector by KSPsolve(). > > > > > > > > The variables are defined according the example of ex4f90.F in \petsc-3.1-p5\src\snes\examples\tutorials\ as following, > > > > > > > > PetscScalar, pointer :: xx_v(:) > > > > > > > > ... > > > > call KSPSolve(ksp,b,x,ierr) > > > > call VecView(x,PETSC_VIEWER_STDOUT_WORLD,ierr) > > > > > > > > call VecGetArrayF90(x,xx_v,ierr) > > > > call VecRestoreArrayF90(x,xx_v,ierr) > > > > > > > > ... > > > > > > > > But, the error keeps coming out when call VecGetArrayF90(x,xx_v,ierr) and call VecRestoreArrayF90(x,xx_v,ierr) are not commented off. 
> > > > > > > > > > > > The error information shows: > > > > Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range > > > > > > > > [0]PETSC ERROR: --------------------- Stack Frames ------------------------------------ > > > > [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, > > > > [0]PETSC ERROR: INSTEAD the line number of the start of the function > > > > [0]PETSC ERROR: is given. > > > > [0]PETSC ERROR: [0] F90Array1dCreate line 52 src/sys/f90-src/f90_cwrap.c > > > > [0]PETSC ERROR: --------------------- Error Message ------------------------------------ > > > > > > > > I checked the code according the example, but cannot see any difference to that. Just don't know why the pointer array xx_v doesn't work here? Thanks. > > > > > > > > > > > > > > -- ------------------------------------- Ethan Coon Post-Doctoral Researcher Mathematical Modeling and Analysis Los Alamos National Laboratory 505-665-8289 http://www.ldeo.columbia.edu/~ecoon/ ------------------------------------- From pengxwang at hotmail.com Wed Jan 5 11:23:07 2011 From: pengxwang at hotmail.com (Peter Wang) Date: Wed, 5 Jan 2011 11:23:07 -0600 Subject: [petsc-users] error in calling VecGetArrayf90() In-Reply-To: References: , , , , Message-ID: Thanks, Satish, The index of the array is modified to i+1: !=================== do i=Istart,Iend-1 write(6,*)'check xx_v',i+1,xx_v(i+1),myid enddo !=================== However, only the elements on root process (process 0) and the last process (process 3) are corrent, is there any ohter logical error? check xx_v 1 3999.9999999999982 0 check xx_v 2 3999.9999999999982 0 check xx_v 3 3999.9999999999982 0 check xx_v 4 3999.9999999999982 0 check xx_v 5 3999.9999999999982 0 check xx_v 6 3000.0000000000005 0 check xx_v 7 3000.0000000000005 0 check xx_v 8 2.61360726650019422E-321 1 check xx_v 9 7.90505033345994471E-323 1 check xx_v 10 1.69759663277221785E-312 1 check xx_v 11 6.16840020108069212E-317 1 check xx_v 12 6.16840316547456717E-317 1 check xx_v 13 6.16832658529946177E-317 1 check xx_v 14 1.99665037664579820E-314 2 check xx_v 15 6.19784009071943448E-317 2 check xx_v 16 6.20249221284067566E-317 2 check xx_v 17 6.20218737433719161E-317 2 check xx_v 18 6.18236051996958238E-317 2 check xx_v 19 6.16840316547456717E-317 2 check xx_v 20 6.18107199676522841E-317 3 check xx_v 21 0.0000000000000000 3 check xx_v 22 0.0000000000000000 3 check xx_v 23 0.0000000000000000 3 check xx_v 24 0.0000000000000000 3 check xx_v 25 0.0000000000000000 3 > Date: Tue, 4 Jan 2011 22:49:50 -0600 > From: balay at mcs.anl.gov > To: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] error in calling VecGetArrayf90() > > The global index starts at Istart - but the array index starts at 1 [for fortran array] > > Satish > > On Tue, 4 Jan 2011, Peter Wang wrote: > > > > > In last question, the pointer xx_v is local data. However, if write them to the monitor or assign them to another array, the value is incorrect. > > > > The protion of the code to display them on the monitor is like as following: > > call MatGetOwnershipRange(A,Istart,Iend,ierr) > > call VecGetArrayF90(x,xx_v,ierr) ! Vector x is matched with Matrix A in the same communicator > > > > write(*,*)xx_v,myid ! 
write the poiner array together > > > > do i=Istart,Iend-1 > > write(6,*)'check xx_v',i,xx_v(i),myid !write the element of the array one by one with local range (Istart to Iend-1) > > enddo > > > > > > =========The result is as following: ( the values of the elements from 7 to 20 are not correct !!) > > > > 3999.9999999999982 3999.9999999999982 3999.9999999999982 3999.9999999999982 3999.9999999999982 3000.0000000000005 3000.0000000000005 0 > > > > 3000.0000000000009 3000.0000000000009 3000.0000000000009 2000.0000000000011 2000.0000000000011 2000.0000000000000 1 > > > > 2000.0000000000009 2000.0000000000009 1000.0000000000003 1000.0000000000003 1000.0000000000003 999.99999999999989 2 > > > > 1000.0000000000000 0.0000000000000000 0.0000000000000000 0.0000000000000000 0.0000000000000000 0.0000000000000000 3 > > > > > > check xx_v 0 0.0000000000000000 0.0000000000000000 0 > > check xx_v 1 3999.9999999999982 3999.9999999999982 0 > > check xx_v 2 3999.9999999999982 3999.9999999999982 0 > > check xx_v 3 3999.9999999999982 3999.9999999999982 0 > > check xx_v 4 3999.9999999999982 3999.9999999999982 0 > > check xx_v 5 3999.9999999999982 3999.9999999999982 0 > > check xx_v 6 3000.0000000000005 3000.0000000000005 0 > > check xx_v 7 1.99665037664579820E-314 1.99665037664579820E-314 1 > > check xx_v 8 2.61360726650019422E-321 2.61360726650019422E-321 1 > > check xx_v 9 7.90505033345994471E-323 7.90505033345994471E-323 1 > > check xx_v 10 1.69759663277221785E-312 1.69759663277221785E-312 1 > > check xx_v 11 6.16846344148335980E-317 6.16846344148335980E-317 1 > > check xx_v 12 6.16846640587723485E-317 6.16846640587723485E-317 1 > > check xx_v 13 6.16838982570212945E-317 6.16838982570212945E-317 2 > > check xx_v 14 1.99665037664579820E-314 1.99665037664579820E-314 2 > > check xx_v 15 6.19790333112210216E-317 6.19790333112210216E-317 2 > > check xx_v 16 6.20255545324334334E-317 6.20255545324334334E-317 2 > > check xx_v 17 6.20225061473985929E-317 6.20225061473985929E-317 2 > > check xx_v 18 6.18242376037225006E-317 6.18242376037225006E-317 2 > > check xx_v 19 6.16846640587723485E-317 6.16846640587723485E-317 3 > > check xx_v 20 6.18113523716789609E-317 6.18113523716789609E-317 3 > > check xx_v 21 0.0000000000000000 0.0000000000000000 3 > > check xx_v 22 0.0000000000000000 0.0000000000000000 3 > > check xx_v 23 0.0000000000000000 0.0000000000000000 3 > > check xx_v 24 0.0000000000000000 0.0000000000000000 3 > > > > ======The vector x is : > > Process [0] > > 4000 > > 4000 > > 4000 > > 4000 > > 4000 > > 3000 > > 3000 > > Process [1] > > 3000 > > 3000 > > 3000 > > 2000 > > 2000 > > 2000 > > Process [2] > > 2000 > > 2000 > > 1000 > > 1000 > > 1000 > > 1000 > > Process [3] > > 1000 > > 0 > > 0 > > 0 > > 0 > > 0 > > > > > > > > > > > Date: Tue, 4 Jan 2011 17:50:11 -0600 > > > From: balay at mcs.anl.gov > > > To: petsc-users at mcs.anl.gov > > > Subject: Re: [petsc-users] error in calling VecGetArrayf90() > > > > > > Did you included "finclude/petscvec.h90" in your code - as the example did? > > > > > > satish > > > > > > On Tue, 4 Jan 2011, Peter Wang wrote: > > > > > > > > > > > I am trying to obtain the value of each element of a solution Vector by KSPsolve(). > > > > > > > > The variables are defined according the example of ex4f90.F in \petsc-3.1-p5\src\snes\examples\tutorials\ as following, > > > > > > > > PetscScalar, pointer :: xx_v(:) > > > > > > > > ... 
> > > > call KSPSolve(ksp,b,x,ierr) > > > > call VecView(x,PETSC_VIEWER_STDOUT_WORLD,ierr) > > > > > > > > call VecGetArrayF90(x,xx_v,ierr) > > > > call VecRestoreArrayF90(x,xx_v,ierr) > > > > > > > > ... > > > > > > > > But, the error keeps coming out when call VecGetArrayF90(x,xx_v,ierr) and call VecRestoreArrayF90(x,xx_v,ierr) are not commented off. > > > > > > > > > > > > The error information shows: > > > > Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range > > > > > > > > [0]PETSC ERROR: --------------------- Stack Frames ------------------------------------ > > > > [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, > > > > [0]PETSC ERROR: INSTEAD the line number of the start of the function > > > > [0]PETSC ERROR: is given. > > > > [0]PETSC ERROR: [0] F90Array1dCreate line 52 src/sys/f90-src/f90_cwrap.c > > > > [0]PETSC ERROR: --------------------- Error Message ------------------------------------ > > > > > > > > I checked the code according the example, but cannot see any difference to that. Just don't know why the pointer array xx_v doesn't work here? Thanks. > > > > > > > > > > > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ecoon at lanl.gov Wed Jan 5 13:03:48 2011 From: ecoon at lanl.gov (Ethan Coon) Date: Wed, 05 Jan 2011 12:03:48 -0700 Subject: [petsc-users] error in calling VecGetArrayf90() In-Reply-To: References: , , , , Message-ID: <1294254228.4768.36.camel@echo.lanl.gov> On all processors, the array you get is indexed: xx_v(1:(Iend-Istart)) while the Istart and Iend are global indices into the global Vec. It's only by luck that the values on proc 3 are correct (this code should probably seg fault as it is accessing memory outside of the bounds of xx_v). To access the array, you'll want: do i=1, (Iend-Istart) write(6,*)'check xx_v',i,xx_v(i),myid enddo or (better yet) pass it into a subroutine to get the array indexed correctly, like demonstrated in the example. Ethan On Wed, 2011-01-05 at 11:23 -0600, Peter Wang wrote: > Thanks, Satish, > > The index of the array is modified to i+1: > !=================== > do i=Istart,Iend-1 > write(6,*)'check xx_v',i+1,xx_v(i+1),myid > enddo > !=================== > > However, only the elements on root process (process 0) and the > last process (process 3) are corrent, is there any ohter logical > error? 
> > check xx_v 1 3999.9999999999982 0 > check xx_v 2 3999.9999999999982 0 > check xx_v 3 3999.9999999999982 0 > check xx_v 4 3999.9999999999982 0 > check xx_v 5 3999.9999999999982 0 > check xx_v 6 3000.0000000000005 0 > check xx_v 7 3000.0000000000005 0 > check xx_v 8 2.61360726650019422E-321 1 > check xx_v 9 7.90505033345994471E-323 1 > check xx_v 10 1.69759663277221785E-312 1 > check xx_v 11 6.16840020108069212E-317 1 > check xx_v 12 6.16840316547456717E-317 1 > check xx_v 13 6.16832658529946177E-317 1 > check xx_v 14 1.99665037664579820E-314 2 > check xx_v 15 6.19784009071943448E-317 2 > check xx_v 16 6.20249221284067566E-317 2 > check xx_v 17 6.20218737433719161E-317 2 > check xx_v 18 6.18236051996958238E-317 2 > check xx_v 19 6.16840316547456717E-317 2 > check xx_v 20 6.18107199676522841E-317 3 > check xx_v 21 0.0000000000000000 3 > check xx_v 22 0.0000000000000000 3 > check xx_v 23 0.0000000000000000 3 > check xx_v 24 0.0000000000000000 3 > check xx_v 25 0.0000000000000000 3 > > > > > Date: Tue, 4 Jan 2011 22:49:50 -0600 > > From: balay at mcs.anl.gov > > To: petsc-users at mcs.anl.gov > > Subject: Re: [petsc-users] error in calling VecGetArrayf90() > > > > The global index starts at Istart - but the array index starts at 1 > [for fortran array] > > > > Satish > > > > On Tue, 4 Jan 2011, Peter Wang wrote: > > > > > > > > In last question, the pointer xx_v is local data. However, if > write them to the monitor or assign them to another array, the value > is incorrect. > > > > > > The protion of the code to display them on the monitor is like as > following: > > > call MatGetOwnershipRange(A,Istart,Iend,ierr) > > > call VecGetArrayF90(x,xx_v,ierr) ! Vector x is matched with Matrix > A in the same communicator > > > > > > write(*,*)xx_v,myid ! write the poiner array together > > > > > > do i=Istart,Iend-1 > > > write(6,*)'check xx_v',i,xx_v(i),myid !write the element of the > array one by one with local range (Istart to Iend-1) > > > enddo > > > > > > > > > =========The result is as following: ( the values of the elements > from 7 to 20 are not correct !!) 
> > > > > > 3999.9999999999982 3999.9999999999982 3999.9999999999982 > 3999.9999999999982 3999.9999999999982 3000.0000000000005 > 3000.0000000000005 0 > > > > > > 3000.0000000000009 3000.0000000000009 3000.0000000000009 > 2000.0000000000011 2000.0000000000011 2000.0000000000000 1 > > > > > > 2000.0000000000009 2000.0000000000009 1000.0000000000003 > 1000.0000000000003 1000.0000000000003 999.99999999999989 2 > > > > > > 1000.0000000000000 0.0000000000000000 0.0000000000000000 > 0.0000000000000000 0.0000000000000000 0.0000000000000000 3 > > > > > > > > > check xx_v 0 0.0000000000000000 0.0000000000000000 0 > > > check xx_v 1 3999.9999999999982 3999.9999999999982 0 > > > check xx_v 2 3999.9999999999982 3999.9999999999982 0 > > > check xx_v 3 3999.9999999999982 3999.9999999999982 0 > > > check xx_v 4 3999.9999999999982 3999.9999999999982 0 > > > check xx_v 5 3999.9999999999982 3999.9999999999982 0 > > > check xx_v 6 3000.0000000000005 3000.0000000000005 0 > > > check xx_v 7 1.99665037664579820E-314 1.99665037664579820E-314 1 > > > check xx_v 8 2.61360726650019422E-321 2.61360726650019422E-321 1 > > > check xx_v 9 7.90505033345994471E-323 7.90505033345994471E-323 1 > > > check xx_v 10 1.69759663277221785E-312 1.69759663277221785E-312 1 > > > check xx_v 11 6.16846344148335980E-317 6.16846344148335980E-317 1 > > > check xx_v 12 6.16846640587723485E-317 6.16846640587723485E-317 1 > > > check xx_v 13 6.16838982570212945E-317 6.16838982570212945E-317 2 > > > check xx_v 14 1.99665037664579820E-314 1.99665037664579820E-314 2 > > > check xx_v 15 6.19790333112210216E-317 6.19790333112210216E-317 2 > > > check xx_v 16 6.20255545324334334E-317 6.20255545324334334E-317 2 > > > check xx_v 17 6.20225061473985929E-317 6.20225061473985929E-317 2 > > > check xx_v 18 6.18242376037225006E-317 6.18242376037225006E-317 2 > > > check xx_v 19 6.16846640587723485E-317 6.16846640587723485E-317 3 > > > check xx_v 20 6.18113523716789609E-317 6.18113523716789609E-317 3 > > > check xx_v 21 0.0000000000000000 0.0000000000000000 3 > > > check xx_v 22 0.0000000000000000 0.0000000000000000 3 > > > check xx_v 23 0.0000000000000000 0.0000000000000000 3 > > > check xx_v 24 0.0000000000000000 0.0000000000000000 3 > > > > > > ======The vector x is : > > > Process [0] > > > 4000 > > > 4000 > > > 4000 > > > 4000 > > > 4000 > > > 3000 > > > 3000 > > > Process [1] > > > 3000 > > > 3000 > > > 3000 > > > 2000 > > > 2000 > > > 2000 > > > Process [2] > > > 2000 > > > 2000 > > > 1000 > > > 1000 > > > 1000 > > > 1000 > > > Process [3] > > > 1000 > > > 0 > > > 0 > > > 0 > > > 0 > > > 0 > > > > > > > > > > > > > > > > Date: Tue, 4 Jan 2011 17:50:11 -0600 > > > > From: balay at mcs.anl.gov > > > > To: petsc-users at mcs.anl.gov > > > > Subject: Re: [petsc-users] error in calling VecGetArrayf90() > > > > > > > > Did you included "finclude/petscvec.h90" in your code - as the > example did? > > > > > > > > satish > > > > > > > > On Tue, 4 Jan 2011, Peter Wang wrote: > > > > > > > > > > > > > > I am trying to obtain the value of each element of a solution > Vector by KSPsolve(). > > > > > > > > > > The variables are defined according the example of ex4f90.F in > \petsc-3.1-p5\src\snes\examples\tutorials\ as following, > > > > > > > > > > PetscScalar, pointer :: xx_v(:) > > > > > > > > > > ... > > > > > call KSPSolve(ksp,b,x,ierr) > > > > > call VecView(x,PETSC_VIEWER_STDOUT_WORLD,ierr) > > > > > > > > > > call VecGetArrayF90(x,xx_v,ierr) > > > > > call VecRestoreArrayF90(x,xx_v,ierr) > > > > > > > > > > ... 
> > > > > > > > > > But, the error keeps coming out when call > VecGetArrayF90(x,xx_v,ierr) and call VecRestoreArrayF90(x,xx_v,ierr) > are not commented off. > > > > > > > > > > > > > > > The error information shows: > > > > > Caught signal number 11 SEGV: Segmentation Violation, probably > memory access out of range > > > > > > > > > > [0]PETSC ERROR: --------------------- Stack Frames > ------------------------------------ > > > > > [0]PETSC ERROR: Note: The EXACT line numbers in the stack are > not available, > > > > > [0]PETSC ERROR: INSTEAD the line number of the start of the > function > > > > > [0]PETSC ERROR: is given. > > > > > [0]PETSC ERROR: [0] F90Array1dCreate line 52 > src/sys/f90-src/f90_cwrap.c > > > > > [0]PETSC ERROR: --------------------- Error Message > ------------------------------------ > > > > > > > > > > I checked the code according the example, but cannot see any > difference to that. Just don't know why the pointer array xx_v doesn't > work here? Thanks. > > > > > > > > > > > > > > > > > > > -- ------------------------------------- Ethan Coon Post-Doctoral Researcher Mathematical Modeling and Analysis Los Alamos National Laboratory 505-665-8289 http://www.ldeo.columbia.edu/~ecoon/ ------------------------------------- From bsmith at mcs.anl.gov Wed Jan 5 13:21:02 2011 From: bsmith at mcs.anl.gov (Barry Smith) Date: Wed, 5 Jan 2011 13:21:02 -0600 Subject: [petsc-users] error in calling VecGetArrayf90() In-Reply-To: <1294254228.4768.36.camel@echo.lanl.gov> References: , , , , <1294254228.4768.36.camel@echo.lanl.gov> Message-ID: <8ECB9816-7803-4DDE-B179-A847B72F44C2@mcs.anl.gov> I don't think this is correct. You are suppose to use the local indexing for each process. With the strange index starting at 1 instead of 0. Barry On Jan 5, 2011, at 1:03 PM, Ethan Coon wrote: > On all processors, the array you get is indexed: > > xx_v(1:(Iend-Istart)) > > while the Istart and Iend are global indices into the global Vec. > > It's only by luck that the values on proc 3 are correct (this code > should probably seg fault as it is accessing memory outside of the > bounds of xx_v). > > To access the array, you'll want: > > do i=1, (Iend-Istart) > write(6,*)'check xx_v',i,xx_v(i),myid > enddo > > or (better yet) pass it into a subroutine to get the array indexed > correctly, like demonstrated in the example. > > Ethan > > On Wed, 2011-01-05 at 11:23 -0600, Peter Wang wrote: >> Thanks, Satish, >> >> The index of the array is modified to i+1: >> !=================== >> do i=Istart,Iend-1 >> write(6,*)'check xx_v',i+1,xx_v(i+1),myid >> enddo >> !=================== >> >> However, only the elements on root process (process 0) and the >> last process (process 3) are corrent, is there any ohter logical >> error? 
>> >> check xx_v 1 3999.9999999999982 0 >> check xx_v 2 3999.9999999999982 0 >> check xx_v 3 3999.9999999999982 0 >> check xx_v 4 3999.9999999999982 0 >> check xx_v 5 3999.9999999999982 0 >> check xx_v 6 3000.0000000000005 0 >> check xx_v 7 3000.0000000000005 0 >> check xx_v 8 2.61360726650019422E-321 1 >> check xx_v 9 7.90505033345994471E-323 1 >> check xx_v 10 1.69759663277221785E-312 1 >> check xx_v 11 6.16840020108069212E-317 1 >> check xx_v 12 6.16840316547456717E-317 1 >> check xx_v 13 6.16832658529946177E-317 1 >> check xx_v 14 1.99665037664579820E-314 2 >> check xx_v 15 6.19784009071943448E-317 2 >> check xx_v 16 6.20249221284067566E-317 2 >> check xx_v 17 6.20218737433719161E-317 2 >> check xx_v 18 6.18236051996958238E-317 2 >> check xx_v 19 6.16840316547456717E-317 2 >> check xx_v 20 6.18107199676522841E-317 3 >> check xx_v 21 0.0000000000000000 3 >> check xx_v 22 0.0000000000000000 3 >> check xx_v 23 0.0000000000000000 3 >> check xx_v 24 0.0000000000000000 3 >> check xx_v 25 0.0000000000000000 3 >> >> >> >>> Date: Tue, 4 Jan 2011 22:49:50 -0600 >>> From: balay at mcs.anl.gov >>> To: petsc-users at mcs.anl.gov >>> Subject: Re: [petsc-users] error in calling VecGetArrayf90() >>> >>> The global index starts at Istart - but the array index starts at 1 >> [for fortran array] >>> >>> Satish >>> >>> On Tue, 4 Jan 2011, Peter Wang wrote: >>> >>>> >>>> In last question, the pointer xx_v is local data. However, if >> write them to the monitor or assign them to another array, the value >> is incorrect. >>>> >>>> The protion of the code to display them on the monitor is like as >> following: >>>> call MatGetOwnershipRange(A,Istart,Iend,ierr) >>>> call VecGetArrayF90(x,xx_v,ierr) ! Vector x is matched with Matrix >> A in the same communicator >>>> >>>> write(*,*)xx_v,myid ! write the poiner array together >>>> >>>> do i=Istart,Iend-1 >>>> write(6,*)'check xx_v',i,xx_v(i),myid !write the element of the >> array one by one with local range (Istart to Iend-1) >>>> enddo >>>> >>>> >>>> =========The result is as following: ( the values of the elements >> from 7 to 20 are not correct !!) 
>>>> >>>> 3999.9999999999982 3999.9999999999982 3999.9999999999982 >> 3999.9999999999982 3999.9999999999982 3000.0000000000005 >> 3000.0000000000005 0 >>>> >>>> 3000.0000000000009 3000.0000000000009 3000.0000000000009 >> 2000.0000000000011 2000.0000000000011 2000.0000000000000 1 >>>> >>>> 2000.0000000000009 2000.0000000000009 1000.0000000000003 >> 1000.0000000000003 1000.0000000000003 999.99999999999989 2 >>>> >>>> 1000.0000000000000 0.0000000000000000 0.0000000000000000 >> 0.0000000000000000 0.0000000000000000 0.0000000000000000 3 >>>> >>>> >>>> check xx_v 0 0.0000000000000000 0.0000000000000000 0 >>>> check xx_v 1 3999.9999999999982 3999.9999999999982 0 >>>> check xx_v 2 3999.9999999999982 3999.9999999999982 0 >>>> check xx_v 3 3999.9999999999982 3999.9999999999982 0 >>>> check xx_v 4 3999.9999999999982 3999.9999999999982 0 >>>> check xx_v 5 3999.9999999999982 3999.9999999999982 0 >>>> check xx_v 6 3000.0000000000005 3000.0000000000005 0 >>>> check xx_v 7 1.99665037664579820E-314 1.99665037664579820E-314 1 >>>> check xx_v 8 2.61360726650019422E-321 2.61360726650019422E-321 1 >>>> check xx_v 9 7.90505033345994471E-323 7.90505033345994471E-323 1 >>>> check xx_v 10 1.69759663277221785E-312 1.69759663277221785E-312 1 >>>> check xx_v 11 6.16846344148335980E-317 6.16846344148335980E-317 1 >>>> check xx_v 12 6.16846640587723485E-317 6.16846640587723485E-317 1 >>>> check xx_v 13 6.16838982570212945E-317 6.16838982570212945E-317 2 >>>> check xx_v 14 1.99665037664579820E-314 1.99665037664579820E-314 2 >>>> check xx_v 15 6.19790333112210216E-317 6.19790333112210216E-317 2 >>>> check xx_v 16 6.20255545324334334E-317 6.20255545324334334E-317 2 >>>> check xx_v 17 6.20225061473985929E-317 6.20225061473985929E-317 2 >>>> check xx_v 18 6.18242376037225006E-317 6.18242376037225006E-317 2 >>>> check xx_v 19 6.16846640587723485E-317 6.16846640587723485E-317 3 >>>> check xx_v 20 6.18113523716789609E-317 6.18113523716789609E-317 3 >>>> check xx_v 21 0.0000000000000000 0.0000000000000000 3 >>>> check xx_v 22 0.0000000000000000 0.0000000000000000 3 >>>> check xx_v 23 0.0000000000000000 0.0000000000000000 3 >>>> check xx_v 24 0.0000000000000000 0.0000000000000000 3 >>>> >>>> ======The vector x is : >>>> Process [0] >>>> 4000 >>>> 4000 >>>> 4000 >>>> 4000 >>>> 4000 >>>> 3000 >>>> 3000 >>>> Process [1] >>>> 3000 >>>> 3000 >>>> 3000 >>>> 2000 >>>> 2000 >>>> 2000 >>>> Process [2] >>>> 2000 >>>> 2000 >>>> 1000 >>>> 1000 >>>> 1000 >>>> 1000 >>>> Process [3] >>>> 1000 >>>> 0 >>>> 0 >>>> 0 >>>> 0 >>>> 0 >>>> >>>> >>>> >>>> >>>>> Date: Tue, 4 Jan 2011 17:50:11 -0600 >>>>> From: balay at mcs.anl.gov >>>>> To: petsc-users at mcs.anl.gov >>>>> Subject: Re: [petsc-users] error in calling VecGetArrayf90() >>>>> >>>>> Did you included "finclude/petscvec.h90" in your code - as the >> example did? >>>>> >>>>> satish >>>>> >>>>> On Tue, 4 Jan 2011, Peter Wang wrote: >>>>> >>>>>> >>>>>> I am trying to obtain the value of each element of a solution >> Vector by KSPsolve(). >>>>>> >>>>>> The variables are defined according the example of ex4f90.F in >> \petsc-3.1-p5\src\snes\examples\tutorials\ as following, >>>>>> >>>>>> PetscScalar, pointer :: xx_v(:) >>>>>> >>>>>> ... >>>>>> call KSPSolve(ksp,b,x,ierr) >>>>>> call VecView(x,PETSC_VIEWER_STDOUT_WORLD,ierr) >>>>>> >>>>>> call VecGetArrayF90(x,xx_v,ierr) >>>>>> call VecRestoreArrayF90(x,xx_v,ierr) >>>>>> >>>>>> ... >>>>>> >>>>>> But, the error keeps coming out when call >> VecGetArrayF90(x,xx_v,ierr) and call VecRestoreArrayF90(x,xx_v,ierr) >> are not commented off. 
>>>>>> >>>>>> >>>>>> The error information shows: >>>>>> Caught signal number 11 SEGV: Segmentation Violation, probably >> memory access out of range >>>>>> >>>>>> [0]PETSC ERROR: --------------------- Stack Frames >> ------------------------------------ >>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are >> not available, >>>>>> [0]PETSC ERROR: INSTEAD the line number of the start of the >> function >>>>>> [0]PETSC ERROR: is given. >>>>>> [0]PETSC ERROR: [0] F90Array1dCreate line 52 >> src/sys/f90-src/f90_cwrap.c >>>>>> [0]PETSC ERROR: --------------------- Error Message >> ------------------------------------ >>>>>> >>>>>> I checked the code according the example, but cannot see any >> difference to that. Just don't know why the pointer array xx_v doesn't >> work here? Thanks. >>>>>> >>>>>> >>>>> >>>> >>> > > -- > ------------------------------------- > Ethan Coon > Post-Doctoral Researcher > Mathematical Modeling and Analysis > Los Alamos National Laboratory > 505-665-8289 > > http://www.ldeo.columbia.edu/~ecoon/ > ------------------------------------- > From jed at 59A2.org Wed Jan 5 13:28:58 2011 From: jed at 59A2.org (Jed Brown) Date: Wed, 5 Jan 2011 11:28:58 -0800 Subject: [petsc-users] error in calling VecGetArrayf90() In-Reply-To: References: <1294254228.4768.36.camel@echo.lanl.gov> <8ECB9816-7803-4DDE-B179-A847B72F44C2@mcs.anl.gov> Message-ID: 1:(Iend-Istart) is "local" indexing in the global vector. Since there us no scatter, there is no local vector and thus no indexing that includes ghosts (which would be consistent use of the term "local"). I think Ethan is correct that the only way to get proper "natural" indexing is to have an inner function. Am I missing something? On Jan 5, 2011 11:21 AM, "Barry Smith" wrote: I don't think this is correct. You are suppose to use the local indexing for each process. With the strange index starting at 1 instead of 0. Barry On Jan 5, 2011, at 1:03 PM, Ethan Coon wrote: > On all processors, the array you get is indexed: ... -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Wed Jan 5 13:35:24 2011 From: bsmith at mcs.anl.gov (Barry Smith) Date: Wed, 5 Jan 2011 13:35:24 -0600 Subject: [petsc-users] error in calling VecGetArrayf90() In-Reply-To: References: <1294254228.4768.36.camel@echo.lanl.gov> <8ECB9816-7803-4DDE-B179-A847B72F44C2@mcs.anl.gov> Message-ID: <3920DEE1-ACD4-46C6-A43A-B2BE249CC081@mcs.anl.gov> I totally lost it. Yes VecGetArrayF90 starts with 1 on each process. Barry I lost my mind and was thinking of DAVecGetArrayF90() On Jan 5, 2011, at 1:28 PM, Jed Brown wrote: > 1:(Iend-Istart) is "local" indexing in the global vector. Since there us no scatter, there is no local vector and thus no indexing that includes ghosts (which would be consistent use of the term "local"). I think Ethan is correct that the only way to get proper "natural" indexing is to have an inner function. Am I missing something? > > >> On Jan 5, 2011 11:21 AM, "Barry Smith" wrote: >> >> >> I don't think this is correct. You are suppose to use the local indexing for each process. With the strange index starting at 1 instead of 0. >> >> >> Barry >> >> >> On Jan 5, 2011, at 1:03 PM, Ethan Coon wrote: >> >> > On all processors, the array you get is indexed: >> ... 
>> > From ecoon at lanl.gov Wed Jan 5 13:52:20 2011 From: ecoon at lanl.gov (Ethan Coon) Date: Wed, 05 Jan 2011 12:52:20 -0700 Subject: [petsc-users] error in calling VecGetArrayf90() In-Reply-To: <8ECB9816-7803-4DDE-B179-A847B72F44C2@mcs.anl.gov> References: , , , , <1294254228.4768.36.camel@echo.lanl.gov> <8ECB9816-7803-4DDE-B179-A847B72F44C2@mcs.anl.gov> Message-ID: <1294257140.4768.47.camel@echo.lanl.gov> See the below program and output (done with dev, but I don't think this has changed). In F90, the local arrays, gotten by VecGetArrayF90, are all indexed: xx_v(1:(Iend-Istart)). This should really be moot -- It's way easier to follow the example code for ex50f90.F and pass their arrays into a subroutine which casts the shape in the "PETSc way", indexed from (Istart:Iend-1). Ethan --- #define PETSC_USE_FORTRAN_MODULES 1 #include "finclude/petscsysdef.h" #include "finclude/petscvecdef.h" program test use petsc implicit none Vec x PetscInt Istart, Iend PetscScalar, pointer:: xx_v(:) PetscInt i PetscErrorCode ierr PetscInt twenty PetscInt rank twenty = 20 call PetscInitialize(PETSC_NULL_CHARACTER, ierr) call mpi_comm_rank(PETSC_COMM_WORLD,rank,ierr) call VecCreateMPI(PETSC_COMM_WORLD, PETSC_DECIDE, twenty, x, ierr) call VecGetArrayF90(x, xx_v, ierr) xx_v = rank+1 call VecRestoreArrayF90(x, xx_v, ierr) call VecView(x, PETSC_VIEWER_STDOUT_WORLD, ierr) call VecGetOwnershipRange(x, Istart, Iend) write (*,*) 'range on rank', rank, ':', Istart, Iend call VecGetArrayF90(x, xx_v, ierr) CHKMEMQ do i = 1,(Iend-Istart) write (*,*) i, xx_v(i), rank enddo CHKMEMQ call VecRestoreArrayF90(x, xx_v, ierr) call VecDestroy(x, ierr) call PetscFinalize(ierr) end program test ----- $> ${PETSC_DIR}/${PETSC_ARCH}/bin/mpiexec -n 2 ./test Vector Object: type: mpi Process [0] 1 1 1 1 1 1 1 1 1 1 Process [1] 2 2 2 2 2 2 2 2 2 2 range on rank 1 : 10 20 1 2.0000000000000000 1 2 2.0000000000000000 1 3 2.0000000000000000 1 4 2.0000000000000000 1 5 2.0000000000000000 1 6 2.0000000000000000 1 7 2.0000000000000000 1 8 2.0000000000000000 1 9 2.0000000000000000 1 10 2.0000000000000000 1 range on rank 0 : 0 10 1 1.00000000000000000 0 2 1.00000000000000000 0 3 1.00000000000000000 0 4 1.00000000000000000 0 5 1.00000000000000000 0 6 1.00000000000000000 0 7 1.00000000000000000 0 8 1.00000000000000000 0 9 1.00000000000000000 0 10 1.00000000000000000 0 --- On Wed, 2011-01-05 at 13:21 -0600, Barry Smith wrote: > I don't think this is correct. You are suppose to use the local indexing for each process. With the strange index starting at 1 instead of 0. > > > Barry > > > On Jan 5, 2011, at 1:03 PM, Ethan Coon wrote: > > > On all processors, the array you get is indexed: > > > > xx_v(1:(Iend-Istart)) > > > > while the Istart and Iend are global indices into the global Vec. > > > > It's only by luck that the values on proc 3 are correct (this code > > should probably seg fault as it is accessing memory outside of the > > bounds of xx_v). > > > > To access the array, you'll want: > > > > do i=1, (Iend-Istart) > > write(6,*)'check xx_v',i,xx_v(i),myid > > enddo > > > > or (better yet) pass it into a subroutine to get the array indexed > > correctly, like demonstrated in the example. 
> > > > Ethan > > > > On Wed, 2011-01-05 at 11:23 -0600, Peter Wang wrote: > >> Thanks, Satish, > >> > >> The index of the array is modified to i+1: > >> !=================== > >> do i=Istart,Iend-1 > >> write(6,*)'check xx_v',i+1,xx_v(i+1),myid > >> enddo > >> !=================== > >> > >> However, only the elements on root process (process 0) and the > >> last process (process 3) are corrent, is there any ohter logical > >> error? > >> > >> check xx_v 1 3999.9999999999982 0 > >> check xx_v 2 3999.9999999999982 0 > >> check xx_v 3 3999.9999999999982 0 > >> check xx_v 4 3999.9999999999982 0 > >> check xx_v 5 3999.9999999999982 0 > >> check xx_v 6 3000.0000000000005 0 > >> check xx_v 7 3000.0000000000005 0 > >> check xx_v 8 2.61360726650019422E-321 1 > >> check xx_v 9 7.90505033345994471E-323 1 > >> check xx_v 10 1.69759663277221785E-312 1 > >> check xx_v 11 6.16840020108069212E-317 1 > >> check xx_v 12 6.16840316547456717E-317 1 > >> check xx_v 13 6.16832658529946177E-317 1 > >> check xx_v 14 1.99665037664579820E-314 2 > >> check xx_v 15 6.19784009071943448E-317 2 > >> check xx_v 16 6.20249221284067566E-317 2 > >> check xx_v 17 6.20218737433719161E-317 2 > >> check xx_v 18 6.18236051996958238E-317 2 > >> check xx_v 19 6.16840316547456717E-317 2 > >> check xx_v 20 6.18107199676522841E-317 3 > >> check xx_v 21 0.0000000000000000 3 > >> check xx_v 22 0.0000000000000000 3 > >> check xx_v 23 0.0000000000000000 3 > >> check xx_v 24 0.0000000000000000 3 > >> check xx_v 25 0.0000000000000000 3 > >> > >> > >> > >>> Date: Tue, 4 Jan 2011 22:49:50 -0600 > >>> From: balay at mcs.anl.gov > >>> To: petsc-users at mcs.anl.gov > >>> Subject: Re: [petsc-users] error in calling VecGetArrayf90() > >>> > >>> The global index starts at Istart - but the array index starts at 1 > >> [for fortran array] > >>> > >>> Satish > >>> > >>> On Tue, 4 Jan 2011, Peter Wang wrote: > >>> > >>>> > >>>> In last question, the pointer xx_v is local data. However, if > >> write them to the monitor or assign them to another array, the value > >> is incorrect. > >>>> > >>>> The protion of the code to display them on the monitor is like as > >> following: > >>>> call MatGetOwnershipRange(A,Istart,Iend,ierr) > >>>> call VecGetArrayF90(x,xx_v,ierr) ! Vector x is matched with Matrix > >> A in the same communicator > >>>> > >>>> write(*,*)xx_v,myid ! write the poiner array together > >>>> > >>>> do i=Istart,Iend-1 > >>>> write(6,*)'check xx_v',i,xx_v(i),myid !write the element of the > >> array one by one with local range (Istart to Iend-1) > >>>> enddo > >>>> > >>>> > >>>> =========The result is as following: ( the values of the elements > >> from 7 to 20 are not correct !!) 
> >>>> > >>>> 3999.9999999999982 3999.9999999999982 3999.9999999999982 > >> 3999.9999999999982 3999.9999999999982 3000.0000000000005 > >> 3000.0000000000005 0 > >>>> > >>>> 3000.0000000000009 3000.0000000000009 3000.0000000000009 > >> 2000.0000000000011 2000.0000000000011 2000.0000000000000 1 > >>>> > >>>> 2000.0000000000009 2000.0000000000009 1000.0000000000003 > >> 1000.0000000000003 1000.0000000000003 999.99999999999989 2 > >>>> > >>>> 1000.0000000000000 0.0000000000000000 0.0000000000000000 > >> 0.0000000000000000 0.0000000000000000 0.0000000000000000 3 > >>>> > >>>> > >>>> check xx_v 0 0.0000000000000000 0.0000000000000000 0 > >>>> check xx_v 1 3999.9999999999982 3999.9999999999982 0 > >>>> check xx_v 2 3999.9999999999982 3999.9999999999982 0 > >>>> check xx_v 3 3999.9999999999982 3999.9999999999982 0 > >>>> check xx_v 4 3999.9999999999982 3999.9999999999982 0 > >>>> check xx_v 5 3999.9999999999982 3999.9999999999982 0 > >>>> check xx_v 6 3000.0000000000005 3000.0000000000005 0 > >>>> check xx_v 7 1.99665037664579820E-314 1.99665037664579820E-314 1 > >>>> check xx_v 8 2.61360726650019422E-321 2.61360726650019422E-321 1 > >>>> check xx_v 9 7.90505033345994471E-323 7.90505033345994471E-323 1 > >>>> check xx_v 10 1.69759663277221785E-312 1.69759663277221785E-312 1 > >>>> check xx_v 11 6.16846344148335980E-317 6.16846344148335980E-317 1 > >>>> check xx_v 12 6.16846640587723485E-317 6.16846640587723485E-317 1 > >>>> check xx_v 13 6.16838982570212945E-317 6.16838982570212945E-317 2 > >>>> check xx_v 14 1.99665037664579820E-314 1.99665037664579820E-314 2 > >>>> check xx_v 15 6.19790333112210216E-317 6.19790333112210216E-317 2 > >>>> check xx_v 16 6.20255545324334334E-317 6.20255545324334334E-317 2 > >>>> check xx_v 17 6.20225061473985929E-317 6.20225061473985929E-317 2 > >>>> check xx_v 18 6.18242376037225006E-317 6.18242376037225006E-317 2 > >>>> check xx_v 19 6.16846640587723485E-317 6.16846640587723485E-317 3 > >>>> check xx_v 20 6.18113523716789609E-317 6.18113523716789609E-317 3 > >>>> check xx_v 21 0.0000000000000000 0.0000000000000000 3 > >>>> check xx_v 22 0.0000000000000000 0.0000000000000000 3 > >>>> check xx_v 23 0.0000000000000000 0.0000000000000000 3 > >>>> check xx_v 24 0.0000000000000000 0.0000000000000000 3 > >>>> > >>>> ======The vector x is : > >>>> Process [0] > >>>> 4000 > >>>> 4000 > >>>> 4000 > >>>> 4000 > >>>> 4000 > >>>> 3000 > >>>> 3000 > >>>> Process [1] > >>>> 3000 > >>>> 3000 > >>>> 3000 > >>>> 2000 > >>>> 2000 > >>>> 2000 > >>>> Process [2] > >>>> 2000 > >>>> 2000 > >>>> 1000 > >>>> 1000 > >>>> 1000 > >>>> 1000 > >>>> Process [3] > >>>> 1000 > >>>> 0 > >>>> 0 > >>>> 0 > >>>> 0 > >>>> 0 > >>>> > >>>> > >>>> > >>>> > >>>>> Date: Tue, 4 Jan 2011 17:50:11 -0600 > >>>>> From: balay at mcs.anl.gov > >>>>> To: petsc-users at mcs.anl.gov > >>>>> Subject: Re: [petsc-users] error in calling VecGetArrayf90() > >>>>> > >>>>> Did you included "finclude/petscvec.h90" in your code - as the > >> example did? > >>>>> > >>>>> satish > >>>>> > >>>>> On Tue, 4 Jan 2011, Peter Wang wrote: > >>>>> > >>>>>> > >>>>>> I am trying to obtain the value of each element of a solution > >> Vector by KSPsolve(). > >>>>>> > >>>>>> The variables are defined according the example of ex4f90.F in > >> \petsc-3.1-p5\src\snes\examples\tutorials\ as following, > >>>>>> > >>>>>> PetscScalar, pointer :: xx_v(:) > >>>>>> > >>>>>> ... 
> >>>>>> call KSPSolve(ksp,b,x,ierr) > >>>>>> call VecView(x,PETSC_VIEWER_STDOUT_WORLD,ierr) > >>>>>> > >>>>>> call VecGetArrayF90(x,xx_v,ierr) > >>>>>> call VecRestoreArrayF90(x,xx_v,ierr) > >>>>>> > >>>>>> ... > >>>>>> > >>>>>> But, the error keeps coming out when call > >> VecGetArrayF90(x,xx_v,ierr) and call VecRestoreArrayF90(x,xx_v,ierr) > >> are not commented off. > >>>>>> > >>>>>> > >>>>>> The error information shows: > >>>>>> Caught signal number 11 SEGV: Segmentation Violation, probably > >> memory access out of range > >>>>>> > >>>>>> [0]PETSC ERROR: --------------------- Stack Frames > >> ------------------------------------ > >>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are > >> not available, > >>>>>> [0]PETSC ERROR: INSTEAD the line number of the start of the > >> function > >>>>>> [0]PETSC ERROR: is given. > >>>>>> [0]PETSC ERROR: [0] F90Array1dCreate line 52 > >> src/sys/f90-src/f90_cwrap.c > >>>>>> [0]PETSC ERROR: --------------------- Error Message > >> ------------------------------------ > >>>>>> > >>>>>> I checked the code according the example, but cannot see any > >> difference to that. Just don't know why the pointer array xx_v doesn't > >> work here? Thanks. > >>>>>> > >>>>>> > >>>>> > >>>> > >>> > > > > -- > > ------------------------------------- > > Ethan Coon > > Post-Doctoral Researcher > > Mathematical Modeling and Analysis > > Los Alamos National Laboratory > > 505-665-8289 > > > > http://www.ldeo.columbia.edu/~ecoon/ > > ------------------------------------- > > > -- ------------------------------------- Ethan Coon Post-Doctoral Researcher Mathematical Modeling and Analysis Los Alamos National Laboratory 505-665-8289 http://www.ldeo.columbia.edu/~ecoon/ ------------------------------------- From pengxwang at hotmail.com Wed Jan 5 14:21:47 2011 From: pengxwang at hotmail.com (Peter Wang) Date: Wed, 5 Jan 2011 14:21:47 -0600 Subject: [petsc-users] error in calling VecGetArrayf90() In-Reply-To: <1294257140.4768.47.camel@echo.lanl.gov> References: ,, , ,, ,, , , <1294254228.4768.36.camel@echo.lanl.gov>, <8ECB9816-7803-4DDE-B179-A847B72F44C2@mcs.anl.gov>, <1294257140.4768.47.camel@echo.lanl.gov> Message-ID: Thanks to all of you. Yes, Ethan is correct. I just changed the code as shown as below, and the result is correct: !======================== do i=1,Iend-Istart write(6,*)'check xx_v',i,xx_v(i),myid enddo !======================== Now, I got the main concept of the pointer to the array, it is locally and begin with 1. If I want to use them globally, I should convert the local index to the global. Thanks again for all the considerate responses. Your advices always clear the trouble on the way to the final goal. I appreciate very much. > From: ecoon at lanl.gov > To: petsc-users at mcs.anl.gov > Date: Wed, 5 Jan 2011 12:52:20 -0700 > Subject: Re: [petsc-users] error in calling VecGetArrayf90() > > See the below program and output (done with dev, but I don't think this > has changed). In F90, the local arrays, gotten by VecGetArrayF90, are > all indexed: > > xx_v(1:(Iend-Istart)). > > This should really be moot -- It's way easier to follow the example code > for ex50f90.F and pass their arrays into a subroutine which casts the > shape in the "PETSc way", indexed from (Istart:Iend-1). 
> > Ethan > > > --- > > > #define PETSC_USE_FORTRAN_MODULES 1 > #include "finclude/petscsysdef.h" > #include "finclude/petscvecdef.h" > > program test > use petsc > implicit none > > Vec x > PetscInt Istart, Iend > PetscScalar, pointer:: xx_v(:) > PetscInt i > PetscErrorCode ierr > PetscInt twenty > PetscInt rank > > twenty = 20 > > call PetscInitialize(PETSC_NULL_CHARACTER, ierr) > call mpi_comm_rank(PETSC_COMM_WORLD,rank,ierr) > call VecCreateMPI(PETSC_COMM_WORLD, PETSC_DECIDE, twenty, x, ierr) > > call VecGetArrayF90(x, xx_v, ierr) > xx_v = rank+1 > call VecRestoreArrayF90(x, xx_v, ierr) > > call VecView(x, PETSC_VIEWER_STDOUT_WORLD, ierr) > > call VecGetOwnershipRange(x, Istart, Iend) > write (*,*) 'range on rank', rank, ':', Istart, Iend > > call VecGetArrayF90(x, xx_v, ierr) > CHKMEMQ > do i = 1,(Iend-Istart) > write (*,*) i, xx_v(i), rank > enddo > CHKMEMQ > call VecRestoreArrayF90(x, xx_v, ierr) > > call VecDestroy(x, ierr) > call PetscFinalize(ierr) > end program test > > ----- > > $> ${PETSC_DIR}/${PETSC_ARCH}/bin/mpiexec -n 2 ./test > Vector Object: > type: mpi > Process [0] > 1 > 1 > 1 > 1 > 1 > 1 > 1 > 1 > 1 > 1 > Process [1] > 2 > 2 > 2 > 2 > 2 > 2 > 2 > 2 > 2 > 2 > range on rank 1 : 10 20 > 1 2.0000000000000000 1 > 2 2.0000000000000000 1 > 3 2.0000000000000000 1 > 4 2.0000000000000000 1 > 5 2.0000000000000000 1 > 6 2.0000000000000000 1 > 7 2.0000000000000000 1 > 8 2.0000000000000000 1 > 9 2.0000000000000000 1 > 10 2.0000000000000000 1 > range on rank 0 : 0 10 > 1 1.00000000000000000 0 > 2 1.00000000000000000 0 > 3 1.00000000000000000 0 > 4 1.00000000000000000 0 > 5 1.00000000000000000 0 > 6 1.00000000000000000 0 > 7 1.00000000000000000 0 > 8 1.00000000000000000 0 > 9 1.00000000000000000 0 > 10 1.00000000000000000 0 > > --- > > > On Wed, 2011-01-05 at 13:21 -0600, Barry Smith wrote: > > I don't think this is correct. You are suppose to use the local indexing for each process. With the strange index starting at 1 instead of 0. > > > > > > Barry > > > > > > On Jan 5, 2011, at 1:03 PM, Ethan Coon wrote: > > > > > On all processors, the array you get is indexed: > > > > > > xx_v(1:(Iend-Istart)) > > > > > > while the Istart and Iend are global indices into the global Vec. > > > > > > It's only by luck that the values on proc 3 are correct (this code > > > should probably seg fault as it is accessing memory outside of the > > > bounds of xx_v). > > > > > > To access the array, you'll want: > > > > > > do i=1, (Iend-Istart) > > > write(6,*)'check xx_v',i,xx_v(i),myid > > > enddo > > > > > > or (better yet) pass it into a subroutine to get the array indexed > > > correctly, like demonstrated in the example. > > > > > > Ethan > > > > > > On Wed, 2011-01-05 at 11:23 -0600, Peter Wang wrote: > > >> Thanks, Satish, > > >> > > >> The index of the array is modified to i+1: > > >> !=================== > > >> do i=Istart,Iend-1 > > >> write(6,*)'check xx_v',i+1,xx_v(i+1),myid > > >> enddo > > >> !=================== > > >> > > >> However, only the elements on root process (process 0) and the > > >> last process (process 3) are corrent, is there any ohter logical > > >> error? 
> > >> > > >> check xx_v 1 3999.9999999999982 0 > > >> check xx_v 2 3999.9999999999982 0 > > >> check xx_v 3 3999.9999999999982 0 > > >> check xx_v 4 3999.9999999999982 0 > > >> check xx_v 5 3999.9999999999982 0 > > >> check xx_v 6 3000.0000000000005 0 > > >> check xx_v 7 3000.0000000000005 0 > > >> check xx_v 8 2.61360726650019422E-321 1 > > >> check xx_v 9 7.90505033345994471E-323 1 > > >> check xx_v 10 1.69759663277221785E-312 1 > > >> check xx_v 11 6.16840020108069212E-317 1 > > >> check xx_v 12 6.16840316547456717E-317 1 > > >> check xx_v 13 6.16832658529946177E-317 1 > > >> check xx_v 14 1.99665037664579820E-314 2 > > >> check xx_v 15 6.19784009071943448E-317 2 > > >> check xx_v 16 6.20249221284067566E-317 2 > > >> check xx_v 17 6.20218737433719161E-317 2 > > >> check xx_v 18 6.18236051996958238E-317 2 > > >> check xx_v 19 6.16840316547456717E-317 2 > > >> check xx_v 20 6.18107199676522841E-317 3 > > >> check xx_v 21 0.0000000000000000 3 > > >> check xx_v 22 0.0000000000000000 3 > > >> check xx_v 23 0.0000000000000000 3 > > >> check xx_v 24 0.0000000000000000 3 > > >> check xx_v 25 0.0000000000000000 3 > > >> > > >> > > >> > > >>> Date: Tue, 4 Jan 2011 22:49:50 -0600 > > >>> From: balay at mcs.anl.gov > > >>> To: petsc-users at mcs.anl.gov > > >>> Subject: Re: [petsc-users] error in calling VecGetArrayf90() > > >>> > > >>> The global index starts at Istart - but the array index starts at 1 > > >> [for fortran array] > > >>> > > >>> Satish > > >>> > > >>> On Tue, 4 Jan 2011, Peter Wang wrote: > > >>> > > >>>> > > >>>> In last question, the pointer xx_v is local data. However, if > > >> write them to the monitor or assign them to another array, the value > > >> is incorrect. > > >>>> > > >>>> The protion of the code to display them on the monitor is like as > > >> following: > > >>>> call MatGetOwnershipRange(A,Istart,Iend,ierr) > > >>>> call VecGetArrayF90(x,xx_v,ierr) ! Vector x is matched with Matrix > > >> A in the same communicator > > >>>> > > >>>> write(*,*)xx_v,myid ! write the poiner array together > > >>>> > > >>>> do i=Istart,Iend-1 > > >>>> write(6,*)'check xx_v',i,xx_v(i),myid !write the element of the > > >> array one by one with local range (Istart to Iend-1) > > >>>> enddo > > >>>> > > >>>> > > >>>> =========The result is as following: ( the values of the elements > > >> from 7 to 20 are not correct !!) 
> > >>>> > > >>>> 3999.9999999999982 3999.9999999999982 3999.9999999999982 > > >> 3999.9999999999982 3999.9999999999982 3000.0000000000005 > > >> 3000.0000000000005 0 > > >>>> > > >>>> 3000.0000000000009 3000.0000000000009 3000.0000000000009 > > >> 2000.0000000000011 2000.0000000000011 2000.0000000000000 1 > > >>>> > > >>>> 2000.0000000000009 2000.0000000000009 1000.0000000000003 > > >> 1000.0000000000003 1000.0000000000003 999.99999999999989 2 > > >>>> > > >>>> 1000.0000000000000 0.0000000000000000 0.0000000000000000 > > >> 0.0000000000000000 0.0000000000000000 0.0000000000000000 3 > > >>>> > > >>>> > > >>>> check xx_v 0 0.0000000000000000 0.0000000000000000 0 > > >>>> check xx_v 1 3999.9999999999982 3999.9999999999982 0 > > >>>> check xx_v 2 3999.9999999999982 3999.9999999999982 0 > > >>>> check xx_v 3 3999.9999999999982 3999.9999999999982 0 > > >>>> check xx_v 4 3999.9999999999982 3999.9999999999982 0 > > >>>> check xx_v 5 3999.9999999999982 3999.9999999999982 0 > > >>>> check xx_v 6 3000.0000000000005 3000.0000000000005 0 > > >>>> check xx_v 7 1.99665037664579820E-314 1.99665037664579820E-314 1 > > >>>> check xx_v 8 2.61360726650019422E-321 2.61360726650019422E-321 1 > > >>>> check xx_v 9 7.90505033345994471E-323 7.90505033345994471E-323 1 > > >>>> check xx_v 10 1.69759663277221785E-312 1.69759663277221785E-312 1 > > >>>> check xx_v 11 6.16846344148335980E-317 6.16846344148335980E-317 1 > > >>>> check xx_v 12 6.16846640587723485E-317 6.16846640587723485E-317 1 > > >>>> check xx_v 13 6.16838982570212945E-317 6.16838982570212945E-317 2 > > >>>> check xx_v 14 1.99665037664579820E-314 1.99665037664579820E-314 2 > > >>>> check xx_v 15 6.19790333112210216E-317 6.19790333112210216E-317 2 > > >>>> check xx_v 16 6.20255545324334334E-317 6.20255545324334334E-317 2 > > >>>> check xx_v 17 6.20225061473985929E-317 6.20225061473985929E-317 2 > > >>>> check xx_v 18 6.18242376037225006E-317 6.18242376037225006E-317 2 > > >>>> check xx_v 19 6.16846640587723485E-317 6.16846640587723485E-317 3 > > >>>> check xx_v 20 6.18113523716789609E-317 6.18113523716789609E-317 3 > > >>>> check xx_v 21 0.0000000000000000 0.0000000000000000 3 > > >>>> check xx_v 22 0.0000000000000000 0.0000000000000000 3 > > >>>> check xx_v 23 0.0000000000000000 0.0000000000000000 3 > > >>>> check xx_v 24 0.0000000000000000 0.0000000000000000 3 > > >>>> > > >>>> ======The vector x is : > > >>>> Process [0] > > >>>> 4000 > > >>>> 4000 > > >>>> 4000 > > >>>> 4000 > > >>>> 4000 > > >>>> 3000 > > >>>> 3000 > > >>>> Process [1] > > >>>> 3000 > > >>>> 3000 > > >>>> 3000 > > >>>> 2000 > > >>>> 2000 > > >>>> 2000 > > >>>> Process [2] > > >>>> 2000 > > >>>> 2000 > > >>>> 1000 > > >>>> 1000 > > >>>> 1000 > > >>>> 1000 > > >>>> Process [3] > > >>>> 1000 > > >>>> 0 > > >>>> 0 > > >>>> 0 > > >>>> 0 > > >>>> 0 > > >>>> > > >>>> > > >>>> > > >>>> > > >>>>> Date: Tue, 4 Jan 2011 17:50:11 -0600 > > >>>>> From: balay at mcs.anl.gov > > >>>>> To: petsc-users at mcs.anl.gov > > >>>>> Subject: Re: [petsc-users] error in calling VecGetArrayf90() > > >>>>> > > >>>>> Did you included "finclude/petscvec.h90" in your code - as the > > >> example did? > > >>>>> > > >>>>> satish > > >>>>> > > >>>>> On Tue, 4 Jan 2011, Peter Wang wrote: > > >>>>> > > >>>>>> > > >>>>>> I am trying to obtain the value of each element of a solution > > >> Vector by KSPsolve(). 
> > >>>>>> > > >>>>>> The variables are defined according the example of ex4f90.F in > > >> \petsc-3.1-p5\src\snes\examples\tutorials\ as following, > > >>>>>> > > >>>>>> PetscScalar, pointer :: xx_v(:) > > >>>>>> > > >>>>>> ... > > >>>>>> call KSPSolve(ksp,b,x,ierr) > > >>>>>> call VecView(x,PETSC_VIEWER_STDOUT_WORLD,ierr) > > >>>>>> > > >>>>>> call VecGetArrayF90(x,xx_v,ierr) > > >>>>>> call VecRestoreArrayF90(x,xx_v,ierr) > > >>>>>> > > >>>>>> ... > > >>>>>> > > >>>>>> But, the error keeps coming out when call > > >> VecGetArrayF90(x,xx_v,ierr) and call VecRestoreArrayF90(x,xx_v,ierr) > > >> are not commented off. > > >>>>>> > > >>>>>> > > >>>>>> The error information shows: > > >>>>>> Caught signal number 11 SEGV: Segmentation Violation, probably > > >> memory access out of range > > >>>>>> > > >>>>>> [0]PETSC ERROR: --------------------- Stack Frames > > >> ------------------------------------ > > >>>>>> [0]PETSC ERROR: Note: The EXACT line numbers in the stack are > > >> not available, > > >>>>>> [0]PETSC ERROR: INSTEAD the line number of the start of the > > >> function > > >>>>>> [0]PETSC ERROR: is given. > > >>>>>> [0]PETSC ERROR: [0] F90Array1dCreate line 52 > > >> src/sys/f90-src/f90_cwrap.c > > >>>>>> [0]PETSC ERROR: --------------------- Error Message > > >> ------------------------------------ > > >>>>>> > > >>>>>> I checked the code according the example, but cannot see any > > >> difference to that. Just don't know why the pointer array xx_v doesn't > > >> work here? Thanks. > > >>>>>> > > >>>>>> > > >>>>> > > >>>> > > >>> > > > > > > -- > > > ------------------------------------- > > > Ethan Coon > > > Post-Doctoral Researcher > > > Mathematical Modeling and Analysis > > > Los Alamos National Laboratory > > > 505-665-8289 > > > > > > http://www.ldeo.columbia.edu/~ecoon/ > > > ------------------------------------- > > > > > > > -- > ------------------------------------- > Ethan Coon > Post-Doctoral Researcher > Mathematical Modeling and Analysis > Los Alamos National Laboratory > 505-665-8289 > > http://www.ldeo.columbia.edu/~ecoon/ > ------------------------------------- > -------------- next part -------------- An HTML attachment was scrubbed... URL: From gshy2014 at gmail.com Wed Jan 5 14:43:35 2011 From: gshy2014 at gmail.com (Gu Shiyuan) Date: Wed, 5 Jan 2011 14:43:35 -0600 Subject: [petsc-users] PetscLogEvent Message-ID: Hi all, I make up a learning examples but cannot explain the result: In the following codes, everything inside the stage[1] is between PetscLogEventBegin() and PetscLogEventEnd(), so I expect the time of the EVENT_MGSolver should take up 100% percent of the time in this stage. But log_summary only gives me 97%. Where is the 3%? 
PetscLogStagePush(stages[1]); PetscLogEventBegin(MGSolver,0,0,0,0); for(l=0;l<100;l++){ ///// some functions calls } PetscLogEventEnd(MGSolver,0,0,0,0); PetscLogStagePop(); -log_summary: --- Event Stage 2: MatMult 241600 1.0 3.4837e+01 1.0 1.06e+10 1.0 0.0e+00 0.0e+00 0.0e+00 64 75 0 0 0 65 75 0 0 0 303 MatMultAdd 53200 1.0 2.9537e+00 1.0 6.91e+08 1.0 0.0e+00 0.0e+00 0.0e+00 5 5 0 0 0 6 5 0 0 0 234 MatSolve 35200 1.0 6.0787e-02 1.0 3.01e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 50 VecDot 272000 1.0 1.6983e+00 1.0 7.64e+08 1.0 0.0e+00 0.0e+00 0.0e+00 3 5 0 0 0 3 5 0 0 0 450 VecNorm 35200 1.0 4.8882e-02 1.0 5.28e+05 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 11 VecScale 117600 1.0 6.2790e-01 1.0 2.62e+08 1.0 0.0e+00 0.0e+00 0.0e+00 1 2 0 0 0 1 2 0 0 0 417 VecCopy 141600 1.0 1.4639e+00 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 3 0 0 0 0 3 0 0 0 0 0 VecSet 35200 1.0 3.4475e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAXPY 35200 1.0 3.4980e-02 1.0 5.63e+05 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 16 VecAYPX 288400 1.0 3.4488e+00 1.0 8.93e+08 1.0 0.0e+00 0.0e+00 0.0e+00 6 6 0 0 0 6 6 0 0 0 259 VecWAXPY 100800 1.0 1.3907e+00 1.0 3.81e+08 1.0 0.0e+00 0.0e+00 0.0e+00 3 3 0 0 0 3 3 0 0 0 274 VecPointwiseMult 320000 1.0 4.8273e+00 1.0 5.72e+08 1.0 0.0e+00 0.0e+00 0.0e+00 9 4 0 0 0 9 4 0 0 0 119 KSPSolve 17600 1.0 5.6411e-01 1.0 6.14e+06 1.0 0.0e+00 0.0e+00 0.0e+00 1 0 0 0 0 1 0 0 0 0 11 PCApply 35200 1.0 1.2111e-01 1.0 3.01e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 25 MGSolver 1 1.0 5.3215e+01 1.0 1.41e+10 1.0 0.0e+00 0.0e+00 0.0e+00 97100 0 0 0 100100 0 0 0 265 ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- Moreover, What are the last four parameters of PetscLogEventBegin() used for? The manually vaguely states that "objects associated with the event". The provided examples set them to zeros. In what situation should we pass a non-zero? I can obtain profiling information with -log_summary but I get nothing with -log or -log_all. Did I miss any steps for -log/-log_all to work? Thanks. -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed Jan 5 14:58:48 2011 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 5 Jan 2011 20:58:48 +0000 Subject: [petsc-users] PetscLogEvent In-Reply-To: References: Message-ID: On Wed, Jan 5, 2011 at 8:43 PM, Gu Shiyuan wrote: > Hi all, > I make up a learning examples but cannot explain the result: > In the following codes, everything inside the stage[1] is between > PetscLogEventBegin() and PetscLogEventEnd(), so I expect the time of the > EVENT_MGSolver should take up 100% percent of the time in this stage. But > log_summary only gives me 97%. Where is the 3%? > I would need to see the code to answer completely, however things happen from the program launch until your EventBegin() and from your EventEnd() until termination. 
> > PetscLogStagePush(stages[1]); > PetscLogEventBegin(MGSolver,0,0,0,0); > for(l=0;l<100;l++){ > ///// some functions calls > } > PetscLogEventEnd(MGSolver,0,0,0,0); > PetscLogStagePop(); > > -log_summary: > --- Event Stage 2: > > MatMult 241600 1.0 3.4837e+01 1.0 1.06e+10 1.0 0.0e+00 0.0e+00 > 0.0e+00 64 75 0 0 0 65 75 0 0 0 303 > MatMultAdd 53200 1.0 2.9537e+00 1.0 6.91e+08 1.0 0.0e+00 0.0e+00 > 0.0e+00 5 5 0 0 0 6 5 0 0 0 234 > MatSolve 35200 1.0 6.0787e-02 1.0 3.01e+06 1.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 50 > VecDot 272000 1.0 1.6983e+00 1.0 7.64e+08 1.0 0.0e+00 0.0e+00 > 0.0e+00 3 5 0 0 0 3 5 0 0 0 450 > VecNorm 35200 1.0 4.8882e-02 1.0 5.28e+05 1.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 11 > VecScale 117600 1.0 6.2790e-01 1.0 2.62e+08 1.0 0.0e+00 0.0e+00 > 0.0e+00 1 2 0 0 0 1 2 0 0 0 417 > VecCopy 141600 1.0 1.4639e+00 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 3 0 0 0 0 3 0 0 0 0 0 > VecSet 35200 1.0 3.4475e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > VecAXPY 35200 1.0 3.4980e-02 1.0 5.63e+05 1.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 16 > VecAYPX 288400 1.0 3.4488e+00 1.0 8.93e+08 1.0 0.0e+00 0.0e+00 > 0.0e+00 6 6 0 0 0 6 6 0 0 0 259 > VecWAXPY 100800 1.0 1.3907e+00 1.0 3.81e+08 1.0 0.0e+00 0.0e+00 > 0.0e+00 3 3 0 0 0 3 3 0 0 0 274 > VecPointwiseMult 320000 1.0 4.8273e+00 1.0 5.72e+08 1.0 0.0e+00 0.0e+00 > 0.0e+00 9 4 0 0 0 9 4 0 0 0 119 > KSPSolve 17600 1.0 5.6411e-01 1.0 6.14e+06 1.0 0.0e+00 0.0e+00 > 0.0e+00 1 0 0 0 0 1 0 0 0 0 11 > PCApply 35200 1.0 1.2111e-01 1.0 3.01e+06 1.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 25 > MGSolver 1 1.0 5.3215e+01 1.0 1.41e+10 1.0 0.0e+00 0.0e+00 > 0.0e+00 97100 0 0 0 100100 0 0 0 265 > > ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- > Moreover, > What are the last four parameters of PetscLogEventBegin() used for? The > manually vaguely states that "objects associated with the event". > The provided examples set them to zeros. In what situation should we pass a > non-zero? > Never, they were there for an old visualization tool. > I can obtain profiling information with -log_summary but I get nothing > with -log or -log_all. Did I miss any steps for -log/-log_all to work? > Those are for tracing. Thanks, Matt > Thanks. > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From gshy2014 at gmail.com Wed Jan 5 16:25:05 2011 From: gshy2014 at gmail.com (Gu Shiyuan) Date: Wed, 5 Jan 2011 16:25:05 -0600 Subject: [petsc-users] PetscLogEvent Message-ID: Please forget my first question. I was looking at the wrong column. -------------- next part -------------- An HTML attachment was scrubbed... URL: From vishy at stat.purdue.edu Wed Jan 5 21:06:30 2011 From: vishy at stat.purdue.edu (S V N Vishwanathan) Date: Wed, 05 Jan 2011 22:06:30 -0500 Subject: [petsc-users] Reshaping a vector into a matrix Message-ID: <1294283190.21492.126.camel@vishy-laptop> Hi In one of my programs I need to reshape a PETSc dense vector of dimensions n*k into a dense matrix of dimension n x k (similar to what you can do in, say, Matlab). The vector is either sequential or parallel. What is the most painless way to achieve this? 
My first crude attempt was as follows: info=MPI_Comm_size(PETSC_COMM_WORLD,&size);CHKERRQ(info); if(size==1){ PetscScalar *vec_array; info=VecGetArray(vec,&vec_array);CHKERRQ(info); info=MatCreateSeqDense(PETSC_COMM_WORLD,n,k,vec_array,&matrix);CHKERRQ(info); MatView(matrix,PETSC_VIEWER_STDOUT_WORLD); info=VecRestoreArray(vec,&vec_array);CHKERRQ(info); // MatDestroy here? }else{ // Don't know how to handle this } This gave me a access violation :( If you are curious as to the application where this arises, I am using TAO to solve a machine learning problem. TAO expects the optimization parameters to be a vector, but when I evaluate the objective function I need to reshape them into a matrix. vishy From bsmith at mcs.anl.gov Wed Jan 5 22:38:23 2011 From: bsmith at mcs.anl.gov (Barry Smith) Date: Wed, 5 Jan 2011 22:38:23 -0600 Subject: [petsc-users] Reshaping a vector into a matrix In-Reply-To: <1294283190.21492.126.camel@vishy-laptop> References: <1294283190.21492.126.camel@vishy-laptop> Message-ID: <4E5B3ADE-C2CA-47A8-B9AD-AA856096EA5F@mcs.anl.gov> On Jan 5, 2011, at 9:06 PM, S V N Vishwanathan wrote: > Hi > > In one of my programs I need to reshape a PETSc dense vector of > dimensions n*k into a dense matrix of dimension n x k (similar to what > you can do in, say, Matlab). The vector is either sequential or > parallel. What is the most painless way to achieve this? > > My first crude attempt was as follows: > > info=MPI_Comm_size(PETSC_COMM_WORLD,&size);CHKERRQ(info); > > if(size==1){ > PetscScalar *vec_array; > info=VecGetArray(vec,&vec_array);CHKERRQ(info); > info=MatCreateSeqDense(PETSC_COMM_WORLD,n,k,vec_array,&matrix);CHKERRQ(info); > MatView(matrix,PETSC_VIEWER_STDOUT_WORLD); > info=VecRestoreArray(vec,&vec_array);CHKERRQ(info); > // MatDestroy here? > }else{ > // Don't know how to handle this > } > > This gave me a access violation :( > > If you are curious as to the application where this arises, I am using > TAO to solve a machine learning problem. TAO expects the optimization > parameters to be a vector, but when I evaluate the objective function I > need to reshape them into a matrix. > Do you mean a two dimensional array or do you mean a matrix -- that is the representation of a linear operator that applies to a vector giving a new vector? Barry > vishy > From vishy at stat.purdue.edu Thu Jan 6 00:02:46 2011 From: vishy at stat.purdue.edu (S V N Vishwanathan) Date: Thu, 06 Jan 2011 01:02:46 -0500 Subject: [petsc-users] Reshaping a vector into a matrix In-Reply-To: <4E5B3ADE-C2CA-47A8-B9AD-AA856096EA5F@mcs.anl.gov> References: <1294283190.21492.126.camel@vishy-laptop> <4E5B3ADE-C2CA-47A8-B9AD-AA856096EA5F@mcs.anl.gov> Message-ID: <1294293766.21492.144.camel@vishy-laptop> > > If you are curious as to the application where this arises, I am using > > TAO to solve a machine learning problem. TAO expects the optimization > > parameters to be a vector, but when I evaluate the objective function I > > need to reshape them into a matrix. > > > > Do you mean a two dimensional array or do you mean a matrix -- that is the representation of a linear operator that applies to a vector giving a new vector? I mean a two dimensional array. Basically my parameter vector is of the form vec = (vec1^t, vec2^t, vec3^t,...veck^t)^t which I represent as a Petsc Vector. 
In my objective function calculation I am given a matrix X and need to compute fx = (X.vec1, X.vec2, ..., X.veck)^t vishy From C.Klaij at marin.nl Thu Jan 6 01:20:15 2011 From: C.Klaij at marin.nl (Klaij, Christiaan) Date: Thu, 6 Jan 2011 07:20:15 +0000 Subject: [petsc-users] which version of openmpi for petsc-3.1 Message-ID: Thanks for your advice! I'll take the most recent version. Chris dr. ir. Christiaan Klaij CFD Researcher Research & Development E mailto:C.Klaij at marin.nl T +31 317 49 33 44 MARIN 2, Haagsteeg, P.O. Box 28, 6700 AA Wageningen, The Netherlands T +31 317 49 39 11, F +31 317 49 32 45, I www.marin.nl From gianmail at gmail.com Thu Jan 6 06:01:06 2011 From: gianmail at gmail.com (Gianluca Meneghello) Date: Thu, 6 Jan 2011 13:01:06 +0100 Subject: [petsc-users] VecGetSubVector In-Reply-To: References: Message-ID: Dear Barry, thanks a lot for your answer. I tried to do some experiments with MatGetSubMatrix, but I guess I'm doing something wrong as it looks like being painfully slow. I changed approach and now I'm using the ASM preconditioner. What I'm actually trying to do is to split the domain in different parts --- like interior and boundaries --- and relax (solve) each one with a different smoother (solver). In your opinion, is this the right approach? So far it looks much faster than my previous approach of extracting each submatrix. Also, please let me ask you one more thing. When using ASM with different subdomains on the same process, is the order in which the domains are solved the same as the one in which they are stored in the IS array passed to PCASMSetLocalSubdomains()? I would be interested in controlling this in order to build a downstream marching smoother. Looking at the references, I've noticed you have worked on multigrid. What I'm trying to do is close to what is described in Diskin, Thomas, Mineck, "Textbook Multigrid Efficiency for Leading Edge Stagnation", in case you already know the paper. http://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/20040081104_2004082284.pdf Again, thanks a lot. Gianluca On 3 January 2011 17:43, Barry Smith wrote: > > ?Gianluca, > > ? ?The expected use is with the VecScatter object. First you create a VecScatter object with VecScatterCreate() then each time you need the "subvector" you call VecScatterBegin() followed by VecScatterEnd() Note that usually the VecScatter object is retained and used many times. > > ? Barry > > > > On Jan 3, 2011, at 5:22 AM, Gianluca Meneghello wrote: > >> Hi, >> >> I'm new to PETSc, so that this can be a very simple question: >> >> I'm looking for something like VecGetSubVector, which I've seen it >> exists in the dev version but not in the released one. >> >> I need to write a smoother for a multigrid algorithm (something like a >> block Gauss Seidel) which can be written in matlab as >> >> for j = 1:ny >> ?P = ; >> ?du(P) = L(P,P) \ ( ?rhs(P) - L(P,:)*du + D2(P,P)*du(P) ); >> end >> >> where L is a matrix (in my case the linearized Navier Stokes). >> >> I was thinking about using IS for declaring P, so that D2(P,P) can be >> obtained using MatGetSubMatrix. I would need the same for the vector >> du. >> >> Is there a way to do that without using the developer version? (I >> really don't feel like being "experienced with building, using and >> debugging PETSc). 
>> >> Thanks in advance >> >> Gianluca > > From jed at 59A2.org Thu Jan 6 10:36:10 2011 From: jed at 59A2.org (Jed Brown) Date: Thu, 6 Jan 2011 08:36:10 -0800 Subject: [petsc-users] VecGetSubVector In-Reply-To: References: Message-ID: On Thu, Jan 6, 2011 at 04:01, Gianluca Meneghello wrote: > When using ASM with > different subdomains on the same process, is the order in which the > domains are solved the same as the one in which they are stored in the > IS array passed to PCASMSetLocalSubdomains()? I would be interested in > controlling this in order to build a downstream marching smoother. > PCASM is "Additive" rather than multiplicative so there is no order. PCFieldSplit includes multiplicative and symmetric-multiplicative variants, but each subdomain ("split" in this context) is global which exposes less concurrency (subdomains can be empty on some processes so this only affects performance, not which Schwarz-like algorithms are implementable). -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at 59A2.org Thu Jan 6 10:40:09 2011 From: jed at 59A2.org (Jed Brown) Date: Thu, 6 Jan 2011 08:40:09 -0800 Subject: [petsc-users] Reshaping a vector into a matrix In-Reply-To: <1294293766.21492.144.camel@vishy-laptop> References: <1294283190.21492.126.camel@vishy-laptop> <4E5B3ADE-C2CA-47A8-B9AD-AA856096EA5F@mcs.anl.gov> <1294293766.21492.144.camel@vishy-laptop> Message-ID: On Wed, Jan 5, 2011 at 22:02, S V N Vishwanathan wrote: > I mean a two dimensional array. Basically my parameter vector is of the > form > > vec = (vec1^t, vec2^t, vec3^t,...veck^t)^t > > which I represent as a Petsc Vector. In my objective function > calculation I am given a matrix X and need to compute > > fx = (X.vec1, X.vec2, ..., X.veck)^t > 1. How big is "k"? 2. Since each vec1,vec2,... is the same size, this is likely to produce poor memory performance. If you care about speed, I suggest interlacing the values in vec1,...,veck. In that case, you can create an MAIJ matrix that acts on this "multi-vector". -------------- next part -------------- An HTML attachment was scrubbed... URL: From spam.wax at gmail.com Thu Jan 6 11:46:32 2011 From: spam.wax at gmail.com (Hamid M.) Date: Thu, 6 Jan 2011 12:46:32 -0500 Subject: [petsc-users] PETSc and dense matrices Message-ID: Hello, In our research, we solve the diffusion equation PDE using Boundary Element Method (BEM). I am trying to parallelize the code we already have and I was wondering if PETSc is the right tool for us. As you know, BEM produces a dense LHS matrix that needs to be solved. Also due to the size of our problems, populating the entities of the LHS matrix needs to be done on different processes as it won't fit on a single process of our cluster. So I was wondering if you guys can answer my questions: 1- Can I use PETSc to build/populate my LHS matrix on different nodes of a cluster (as opposed to constructing it on a single node and then distributing it) ? 2- Are there optimized parallel solvers for dense matrices in PETSc ? 3- If the answer to question 1 is 'No', can I build my LHS matrix independent of PETSc and then direct PETSc to solve it for me ? 
thanks in advance, Hamid From vishy at stat.purdue.edu Thu Jan 6 11:48:37 2011 From: vishy at stat.purdue.edu (S V N Vishwanathan) Date: Thu, 06 Jan 2011 12:48:37 -0500 Subject: [petsc-users] Reshaping a vector into a matrix In-Reply-To: References: <1294283190.21492.126.camel@vishy-laptop> <4E5B3ADE-C2CA-47A8-B9AD-AA856096EA5F@mcs.anl.gov> <1294293766.21492.144.camel@vishy-laptop> Message-ID: <1294336117.21492.153.camel@vishy-laptop> On Thu, 2011-01-06 at 08:40 -0800, Jed Brown wrote: > On Wed, Jan 5, 2011 at 22:02, S V N Vishwanathan > wrote: > I mean a two dimensional array. Basically my parameter vector > is of the > form > > vec = (vec1^t, vec2^t, vec3^t,...veck^t)^t > > which I represent as a Petsc Vector. In my objective function > calculation I am given a matrix X and need to compute > > fx = (X.vec1, X.vec2, ..., X.veck)^t > > 1. How big is "k"? My k is typically of the order of 10 to a 100 (max). > 2. Since each vec1,vec2,... is the same size, this is likely to > produce poor memory performance. If you care about speed, I suggest > interlacing the values in vec1,...,veck. In that case, you can create > an MAIJ matrix that acts on this "multi-vector". I do not want to repeat X since it is a large sparse matrix (typically 1 million x 50 thousand) with around 4 -5% non-zero entries. However this varies a lot from problem to problem. In some cases a dense matrix is the most appropriate representation for X (in this case the dimensions are typically 50thousand x 200). I am trying to use the block matrices and vectors provided by PetscExt to see if they can solve my problem. vishy From jed at 59A2.org Thu Jan 6 11:55:48 2011 From: jed at 59A2.org (Jed Brown) Date: Thu, 6 Jan 2011 09:55:48 -0800 Subject: [petsc-users] PETSc and dense matrices In-Reply-To: References: Message-ID: On Thu, Jan 6, 2011 at 09:46, Hamid M. wrote: > 1- Can I use PETSc to build/populate my LHS matrix on different nodes > of a cluster (as opposed to constructing it on a single node and then > distributing it) ? > You should always do this. > > 2- Are there optimized parallel solvers for dense matrices in PETSc ? > The current implementation uses PLAPACK. You might also consider using FMM (see http://barbagroup.bu.edu/Barba_group/PetFMM.html) since explicit storage of dense matrices cannot scale. -------------- next part -------------- An HTML attachment was scrubbed... URL: From PRaeth at hpti.com Thu Jan 6 12:04:48 2011 From: PRaeth at hpti.com (Raeth, Peter) Date: Thu, 6 Jan 2011 18:04:48 +0000 Subject: [petsc-users] PETSc and dense matrices In-Reply-To: References: Message-ID: <3474F869C1954540B771FD9CAEBCB65704A9AF68@CORTINA.HPTI.COM> Am only myself just learning PETSc. But, I have used it to compute the Kronecker Tensor Product K = A ! B where K and A are distributed. Someone with more experience may have different answers. BEM produces a dense LHS matrix that needs to be solved. While I can not put my finger on it, I thought I saw on a man page that PETSc was only designed for the solution of sparse matrices. Can I use PETSc to build/populate my LHS matrix on different nodes of a cluster (as opposed to constructing it on a single node and then distributing it) ? This is something I do all the time. I find what part of a matrix is co-located with the process and then have the process operate on that part of the matrix. This can come down to matrix initialization. Best, Peter. Peter G. Raeth, Ph.D. 
Senior Staff Scientist Signal and Image Processing High Performance Technologies, Inc 937-904-5147 praeth at hpti.com ________________________________________ From: petsc-users-bounces at mcs.anl.gov [petsc-users-bounces at mcs.anl.gov] on behalf of Hamid M. [spam.wax at gmail.com] Sent: Thursday, January 06, 2011 12:46 PM To: petsc-users at mcs.anl.gov Subject: [petsc-users] PETSc and dense matrices Hello, In our research, we solve the diffusion equation PDE using Boundary Element Method (BEM). I am trying to parallelize the code we already have and I was wondering if PETSc is the right tool for us. As you know, BEM produces a dense LHS matrix that needs to be solved. Also due to the size of our problems, populating the entities of the LHS matrix needs to be done on different processes as it won't fit on a single process of our cluster. So I was wondering if you guys can answer my questions: 1- Can I use PETSc to build/populate my LHS matrix on different nodes of a cluster (as opposed to constructing it on a single node and then distributing it) ? 2- Are there optimized parallel solvers for dense matrices in PETSc ? 3- If the answer to question 1 is 'No', can I build my LHS matrix independent of PETSc and then direct PETSc to solve it for me ? thanks in advance, Hamid From renzhengyong at gmail.com Thu Jan 6 12:18:00 2011 From: renzhengyong at gmail.com (RenZhengYong) Date: Thu, 6 Jan 2011 19:18:00 +0100 Subject: [petsc-users] PETSc and dense matrices In-Reply-To: References: Message-ID: Hi, Hamid, I am also using BEM to solve 3D EM problems. But I only am using the PETSc on one machine. Kind regards, Zhengyong On Thu, Jan 6, 2011 at 6:46 PM, Hamid M. wrote: > Hello, > > In our research, we solve the diffusion equation PDE using Boundary > Element Method (BEM). > I am trying to parallelize the code we already have and I was > wondering if PETSc is the right tool for us. > > As you know, BEM produces a dense LHS matrix that needs to be solved. > Also due to the size of our problems, populating the entities of the > LHS matrix needs to be done on different processes as it won't fit on > a single process of our cluster. > > So I was wondering if you guys can answer my questions: > > 1- Can I use PETSc to build/populate my LHS matrix on different nodes > of a cluster (as opposed to constructing it on a single node and then > distributing it) ? > > 2- Are there optimized parallel solvers for dense matrices in PETSc ? > > 3- If the answer to question 1 is 'No', can I build my LHS matrix > independent of PETSc and then direct PETSc to solve it for me ? > > thanks in advance, > > Hamid > -- Zhengyong Ren AUG Group, Institute of Geophysics Department of Geosciences, ETH Zurich NO H 47 Sonneggstrasse 5 CH-8092, Z?rich, Switzerland Tel: +41 44 633 37561 e-mail: zhengyong.ren at aug.ig.erdw.ethz.ch Gmail: renzhengyong at gmail.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From PRaeth at hpti.com Thu Jan 6 12:19:01 2011 From: PRaeth at hpti.com (Raeth, Peter) Date: Thu, 6 Jan 2011 18:19:01 +0000 Subject: [petsc-users] PETSc and dense matrices In-Reply-To: References: , Message-ID: <3474F869C1954540B771FD9CAEBCB65704A9AF87@CORTINA.HPTI.COM> Are your matrices dense? Peter G. Raeth, Ph.D. 
Senior Staff Scientist Signal and Image Processing High Performance Technologies, Inc 937-904-5147 praeth at hpti.com ________________________________ From: petsc-users-bounces at mcs.anl.gov [petsc-users-bounces at mcs.anl.gov] on behalf of RenZhengYong [renzhengyong at gmail.com] Sent: Thursday, January 06, 2011 1:18 PM To: PETSc users list Subject: Re: [petsc-users] PETSc and dense matrices Hi, Hamid, I am also using BEM to solve 3D EM problems. But I only am using the PETSc on one machine. Kind regards, Zhengyong On Thu, Jan 6, 2011 at 6:46 PM, Hamid M. > wrote: Hello, In our research, we solve the diffusion equation PDE using Boundary Element Method (BEM). I am trying to parallelize the code we already have and I was wondering if PETSc is the right tool for us. As you know, BEM produces a dense LHS matrix that needs to be solved. Also due to the size of our problems, populating the entities of the LHS matrix needs to be done on different processes as it won't fit on a single process of our cluster. So I was wondering if you guys can answer my questions: 1- Can I use PETSc to build/populate my LHS matrix on different nodes of a cluster (as opposed to constructing it on a single node and then distributing it) ? 2- Are there optimized parallel solvers for dense matrices in PETSc ? 3- If the answer to question 1 is 'No', can I build my LHS matrix independent of PETSc and then direct PETSc to solve it for me ? thanks in advance, Hamid -- Zhengyong Ren AUG Group, Institute of Geophysics Department of Geosciences, ETH Zurich NO H 47 Sonneggstrasse 5 CH-8092, Z?rich, Switzerland Tel: +41 44 633 37561 e-mail: zhengyong.ren at aug.ig.erdw.ethz.ch Gmail: renzhengyong at gmail.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From renzhengyong at gmail.com Thu Jan 6 12:21:23 2011 From: renzhengyong at gmail.com (RenZhengYong) Date: Thu, 6 Jan 2011 19:21:23 +0100 Subject: [petsc-users] PETSc and dense matrices In-Reply-To: <3474F869C1954540B771FD9CAEBCB65704A9AF87@CORTINA.HPTI.COM> References: <3474F869C1954540B771FD9CAEBCB65704A9AF87@CORTINA.HPTI.COM> Message-ID: YES, :). On Thu, Jan 6, 2011 at 7:19 PM, Raeth, Peter wrote: > Are your matrices dense? > > Peter G. Raeth, Ph.D. > Senior Staff Scientist > Signal and Image Processing > High Performance Technologies, Inc > 937-904-5147 > praeth at hpti.com > ------------------------------ > *From:* petsc-users-bounces at mcs.anl.gov [petsc-users-bounces at mcs.anl.gov] > on behalf of RenZhengYong [renzhengyong at gmail.com] > *Sent:* Thursday, January 06, 2011 1:18 PM > *To:* PETSc users list > *Subject:* Re: [petsc-users] PETSc and dense matrices > > Hi, Hamid, > I am also using BEM to solve 3D EM problems. > But I only am using the PETSc on one machine. > > Kind regards, > Zhengyong > > On Thu, Jan 6, 2011 at 6:46 PM, Hamid M. wrote: > >> Hello, >> >> In our research, we solve the diffusion equation PDE using Boundary >> Element Method (BEM). >> I am trying to parallelize the code we already have and I was >> wondering if PETSc is the right tool for us. >> >> As you know, BEM produces a dense LHS matrix that needs to be solved. >> Also due to the size of our problems, populating the entities of the >> LHS matrix needs to be done on different processes as it won't fit on >> a single process of our cluster. 
>> >> So I was wondering if you guys can answer my questions: >> >> 1- Can I use PETSc to build/populate my LHS matrix on different nodes >> of a cluster (as opposed to constructing it on a single node and then >> distributing it) ? >> >> 2- Are there optimized parallel solvers for dense matrices in PETSc ? >> >> 3- If the answer to question 1 is 'No', can I build my LHS matrix >> independent of PETSc and then direct PETSc to solve it for me ? >> >> thanks in advance, >> >> Hamid >> > > > > -- > Zhengyong Ren > AUG Group, Institute of Geophysics > Department of Geosciences, ETH Zurich > NO H 47 Sonneggstrasse 5 > CH-8092, Z?rich, Switzerland > Tel: +41 44 633 37561 > e-mail: zhengyong.ren at aug.ig.erdw.ethz.ch > Gmail: renzhengyong at gmail.com > -- Zhengyong Ren AUG Group, Institute of Geophysics Department of Geosciences, ETH Zurich NO H 47 Sonneggstrasse 5 CH-8092, Z?rich, Switzerland Tel: +41 44 633 37561 e-mail: zhengyong.ren at aug.ig.erdw.ethz.ch Gmail: renzhengyong at gmail.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From PRaeth at hpti.com Thu Jan 6 12:21:49 2011 From: PRaeth at hpti.com (Raeth, Peter) Date: Thu, 6 Jan 2011 18:21:49 +0000 Subject: [petsc-users] PETSc and dense matrices In-Reply-To: References: <3474F869C1954540B771FD9CAEBCB65704A9AF87@CORTINA.HPTI.COM>, Message-ID: <3474F869C1954540B771FD9CAEBCB65704A9AF9D@CORTINA.HPTI.COM> Thanks. Needed to check my understanding of what PETSc will deal with. Peter G. Raeth, Ph.D. Senior Staff Scientist Signal and Image Processing High Performance Technologies, Inc 937-904-5147 praeth at hpti.com ________________________________ From: petsc-users-bounces at mcs.anl.gov [petsc-users-bounces at mcs.anl.gov] on behalf of RenZhengYong [renzhengyong at gmail.com] Sent: Thursday, January 06, 2011 1:21 PM To: PETSc users list Subject: Re: [petsc-users] PETSc and dense matrices YES, :). On Thu, Jan 6, 2011 at 7:19 PM, Raeth, Peter > wrote: Are your matrices dense? Peter G. Raeth, Ph.D. Senior Staff Scientist Signal and Image Processing High Performance Technologies, Inc 937-904-5147 praeth at hpti.com ________________________________ From: petsc-users-bounces at mcs.anl.gov [petsc-users-bounces at mcs.anl.gov] on behalf of RenZhengYong [renzhengyong at gmail.com] Sent: Thursday, January 06, 2011 1:18 PM To: PETSc users list Subject: Re: [petsc-users] PETSc and dense matrices Hi, Hamid, I am also using BEM to solve 3D EM problems. But I only am using the PETSc on one machine. Kind regards, Zhengyong On Thu, Jan 6, 2011 at 6:46 PM, Hamid M. > wrote: Hello, In our research, we solve the diffusion equation PDE using Boundary Element Method (BEM). I am trying to parallelize the code we already have and I was wondering if PETSc is the right tool for us. As you know, BEM produces a dense LHS matrix that needs to be solved. Also due to the size of our problems, populating the entities of the LHS matrix needs to be done on different processes as it won't fit on a single process of our cluster. So I was wondering if you guys can answer my questions: 1- Can I use PETSc to build/populate my LHS matrix on different nodes of a cluster (as opposed to constructing it on a single node and then distributing it) ? 2- Are there optimized parallel solvers for dense matrices in PETSc ? 3- If the answer to question 1 is 'No', can I build my LHS matrix independent of PETSc and then direct PETSc to solve it for me ? 
thanks in advance, Hamid -- Zhengyong Ren AUG Group, Institute of Geophysics Department of Geosciences, ETH Zurich NO H 47 Sonneggstrasse 5 CH-8092, Z?rich, Switzerland Tel: +41 44 633 37561 e-mail: zhengyong.ren at aug.ig.erdw.ethz.ch Gmail: renzhengyong at gmail.com -- Zhengyong Ren AUG Group, Institute of Geophysics Department of Geosciences, ETH Zurich NO H 47 Sonneggstrasse 5 CH-8092, Z?rich, Switzerland Tel: +41 44 633 37561 e-mail: zhengyong.ren at aug.ig.erdw.ethz.ch Gmail: renzhengyong at gmail.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From spam.wax at gmail.com Thu Jan 6 12:29:35 2011 From: spam.wax at gmail.com (Hamid M.) Date: Thu, 6 Jan 2011 13:29:35 -0500 Subject: [petsc-users] PETSc and dense matrices In-Reply-To: References: Message-ID: On Thu, Jan 6, 2011 at 12:55 PM, Jed Brown wrote: >> >> 2- Are there optimized parallel solvers for dense matrices in PETSc ? > > The current implementation uses PLAPACK. > > You might also consider using FMM (see > http://barbagroup.bu.edu/Barba_group/PetFMM.html) since explicit storage of > dense matrices cannot scale. I am not sure why my case would be explicit storage, even if I use MatCreate and MatSetValues do I still need to be concerned about scalability issues ? thanks, Hamid From bsmith at mcs.anl.gov Thu Jan 6 12:30:57 2011 From: bsmith at mcs.anl.gov (Barry Smith) Date: Thu, 6 Jan 2011 12:30:57 -0600 Subject: [petsc-users] PETSc and dense matrices In-Reply-To: References: Message-ID: <93F1DB79-0D2F-4F8D-ABED-81E895F00ED1@mcs.anl.gov> On Jan 6, 2011, at 11:46 AM, Hamid M. wrote: > Hello, > > In our research, we solve the diffusion equation PDE using Boundary > Element Method (BEM). > I am trying to parallelize the code we already have and I was > wondering if PETSc is the right tool for us. > > As you know, BEM produces a dense LHS matrix that needs to be solved. > Also due to the size of our problems, populating the entities of the > LHS matrix needs to be done on different processes as it won't fit on > a single process of our cluster. > > So I was wondering if you guys can answer my questions: > > 1- Can I use PETSc to build/populate my LHS matrix on different nodes > of a cluster (as opposed to constructing it on a single node and then > distributing it) ? MatCreateMPIDense() then use MatGetArray() to access the raw array on each process or use MatSetValues() to put values into the matrix. > > 2- Are there optimized parallel solvers for dense matrices in PETSc ? The Krylov solvers all work with parallel dense matrices so if your matrix is well conditioned (which it often is with BEM) you can just use GMRES or CG with diagonal scaling. -ksp_type gmres -pc_type jacobi You can also use the direct solvers in PLAPACK with PETSc (configure PETSc with --download-plapack) but frankly if you need to use direct solvers you are in trouble in how large a problem you can run. Barry > > 3- If the answer to question 1 is 'No', can I build my LHS matrix > independent of PETSc and then direct PETSc to solve it for me ? 
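The MatCreateMPIDense()/MatSetValues() route suggested above might look roughly like the following sketch, using PETSc 3.1-era calling sequences. The matrix size and the "kernel" used to fill the entries are invented placeholders rather than a real BEM kernel, and the solver is selected on the command line with -ksp_type gmres -pc_type jacobi as suggested.

#include "petscksp.h"

int main(int argc,char **argv)
{
  PetscErrorCode ierr;
  Mat            A;
  Vec            b,x;
  KSP            ksp;
  PetscInt       N = 1000,i,j,rstart,rend;
  PetscScalar    v;

  ierr = PetscInitialize(&argc,&argv,PETSC_NULL,PETSC_NULL);CHKERRQ(ierr);

  /* parallel dense matrix; PETSc chooses the row distribution */
  ierr = MatCreateMPIDense(PETSC_COMM_WORLD,PETSC_DECIDE,PETSC_DECIDE,N,N,PETSC_NULL,&A);CHKERRQ(ierr);
  ierr = MatGetOwnershipRange(A,&rstart,&rend);CHKERRQ(ierr);

  /* each process computes and inserts only the rows it owns */
  for (i=rstart; i<rend; i++) {
    for (j=0; j<N; j++) {
      v = 1.0/(1.0 + (i > j ? i - j : j - i));   /* placeholder "kernel" */
      ierr = MatSetValues(A,1,&i,1,&j,&v,INSERT_VALUES);CHKERRQ(ierr);
    }
  }
  ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

  ierr = MatGetVecs(A,&x,&b);CHKERRQ(ierr);
  ierr = VecSet(b,1.0);CHKERRQ(ierr);

  ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);   /* picks up -ksp_type gmres -pc_type jacobi */
  ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);

  ierr = KSPDestroy(ksp);CHKERRQ(ierr);
  ierr = MatDestroy(A);CHKERRQ(ierr);
  ierr = VecDestroy(b);CHKERRQ(ierr);
  ierr = VecDestroy(x);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return 0;
}

For large problems the per-entry MatSetValues() calls would normally be replaced by inserting whole locally owned rows at once, or by filling the local storage obtained with MatGetArray() directly, as mentioned above.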
> > thanks in advance, > > Hamid From bsmith at mcs.anl.gov Thu Jan 6 14:37:25 2011 From: bsmith at mcs.anl.gov (Barry Smith) Date: Thu, 6 Jan 2011 14:37:25 -0600 Subject: [petsc-users] Reshaping a vector into a matrix In-Reply-To: <1294336117.21492.153.camel@vishy-laptop> References: <1294283190.21492.126.camel@vishy-laptop> <4E5B3ADE-C2CA-47A8-B9AD-AA856096EA5F@mcs.anl.gov> <1294293766.21492.144.camel@vishy-laptop> <1294336117.21492.153.camel@vishy-laptop> Message-ID: On Jan 6, 2011, at 11:48 AM, S V N Vishwanathan wrote: > On Thu, 2011-01-06 at 08:40 -0800, Jed Brown wrote: >> On Wed, Jan 5, 2011 at 22:02, S V N Vishwanathan >> wrote: >> I mean a two dimensional array. Basically my parameter vector >> is of the >> form >> >> vec = (vec1^t, vec2^t, vec3^t,...veck^t)^t >> >> which I represent as a Petsc Vector. In my objective function >> calculation I am given a matrix X and need to compute >> >> fx = (X.vec1, X.vec2, ..., X.veck)^t >> >> 1. How big is "k"? > > My k is typically of the order of 10 to a 100 (max). > >> 2. Since each vec1,vec2,... is the same size, this is likely to >> produce poor memory performance. If you care about speed, I suggest >> interlacing the values in vec1,...,veck. In that case, you can create >> an MAIJ matrix that acts on this "multi-vector". > > I do not want to repeat X since it is a large sparse matrix (typically 1 > million x 50 thousand) with around 4 -5% non-zero entries. However this > varies a lot from problem to problem. In some cases a dense matrix is > the most appropriate representation for X (in this case the dimensions > are typically 50thousand x 200). > > I am trying to use the block matrices and vectors provided by PetscExt > to see if they can solve my problem. You most definitely want to use the MAIJ. MAIJ does not "repeat" X, it uses the original matrix passed in but does efficient multiple matrix-vector products at the same time. Barry > > vishy > > From bsmith at mcs.anl.gov Thu Jan 6 14:55:37 2011 From: bsmith at mcs.anl.gov (Barry Smith) Date: Thu, 6 Jan 2011 14:55:37 -0600 Subject: [petsc-users] VecGetSubVector In-Reply-To: References: Message-ID: <76A8F83B-2110-4056-98EF-AC998FA9EE1C@mcs.anl.gov> On Jan 6, 2011, at 6:01 AM, Gianluca Meneghello wrote: > Dear Barry, > > thanks a lot for your answer. > > I tried to do some experiments with MatGetSubMatrix, but I guess I'm > doing something wrong as it looks like being painfully slow. Hmm, we've always found the getsubmatrix takes a few percent of the time. Perhaps you are calling it repeatedly for the each domain, rather than once and reusing it? Also use MatGetSubMatrices() and get all the submatrices in one call rather than one at a time. > > I changed approach and now I'm using the ASM preconditioner. What I'm > actually trying to do is to split the domain in different parts --- > like interior and boundaries --- and relax (solve) each one with a > different smoother (solver). In your opinion, is this the right > approach? Worth trying since it is easy. You can experiment with different smoothers on the subdomains using the -sub_pc_type etc options and set different prefixes for different subdomains. > So far it looks much faster than my previous approach of > extracting each submatrix. ASM just uses MatGetSubMatrices() so shouldn't be faster or slower than a custom code that does the same thing. > > Also, please let me ask you one more thing. 
When using ASM with > different subdomains on the same process, is the order in which the > domains are solved the same as the one in which they are stored in the > IS array passed to PCASMSetLocalSubdomains()? I would be interested in > controlling this in order to build a downstream marching smoother. It is only additive, there is no order as Jed noted. Doing multiplicative in general is tricky because you want to just update the parts of the residual that need to be updated. > > Looking at the references, I've noticed you have worked on multigrid. > What I'm trying to do is close to what is described in Diskin, Thomas, > Mineck, "Textbook Multigrid Efficiency for Leading Edge Stagnation", > in case you already know the paper. > http://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/20040081104_2004082284.pdf > > Again, thanks a lot. > > Gianluca > > On 3 January 2011 17:43, Barry Smith wrote: >> >> Gianluca, >> >> The expected use is with the VecScatter object. First you create a VecScatter object with VecScatterCreate() then each time you need the "subvector" you call VecScatterBegin() followed by VecScatterEnd() Note that usually the VecScatter object is retained and used many times. >> >> Barry >> >> >> >> On Jan 3, 2011, at 5:22 AM, Gianluca Meneghello wrote: >> >>> Hi, >>> >>> I'm new to PETSc, so that this can be a very simple question: >>> >>> I'm looking for something like VecGetSubVector, which I've seen it >>> exists in the dev version but not in the released one. >>> >>> I need to write a smoother for a multigrid algorithm (something like a >>> block Gauss Seidel) which can be written in matlab as >>> >>> for j = 1:ny >>> P = ; >>> du(P) = L(P,P) \ ( rhs(P) - L(P,:)*du + D2(P,P)*du(P) ); >>> end >>> >>> where L is a matrix (in my case the linearized Navier Stokes). >>> >>> I was thinking about using IS for declaring P, so that D2(P,P) can be >>> obtained using MatGetSubMatrix. I would need the same for the vector >>> du. >>> >>> Is there a way to do that without using the developer version? (I >>> really don't feel like being "experienced with building, using and >>> debugging PETSc). >>> >>> Thanks in advance >>> >>> Gianluca >> >> From jed at 59A2.org Thu Jan 6 15:01:52 2011 From: jed at 59A2.org (Jed Brown) Date: Thu, 6 Jan 2011 13:01:52 -0800 Subject: [petsc-users] PETSc and dense matrices In-Reply-To: <3474F869C1954540B771FD9CAEBCB65704A9AF68@CORTINA.HPTI.COM> References: <3474F869C1954540B771FD9CAEBCB65704A9AF68@CORTINA.HPTI.COM> Message-ID: On Thu, Jan 6, 2011 at 10:04, Raeth, Peter wrote: > While I can not put my finger on it, I thought I saw on a man page that > PETSc was only designed for the solution of sparse matrices. I can try to explain: Some choices in PETSc are not great if the problem has no "structure". I loosely define "structure" to mean that there is a way to store the forward operator in less than O(N^2) space and that there is a way to multiply the forward operator my a vector in less than O(N^2) time. In addition to sparse matrices, this includes operators that can be applied using fast transforms like FFT and FMM, have tensor product structure, are the Schur complement or a low-rank correction of something fitting the above description, etc. If you only solve dense problems with no additional structure, then PETSc cannot have the absolute best performance. 
But it should be perfectly adequate for most dense problems and if you have some problems with structure and some dense problems, it offers a uniform high-level interface and a lot of algorithmic flexibility that you won't find in a dense-only package. -------------- next part -------------- An HTML attachment was scrubbed... URL: From vishy at stat.purdue.edu Thu Jan 6 17:24:51 2011 From: vishy at stat.purdue.edu (S V N Vishwanathan) Date: Thu, 06 Jan 2011 18:24:51 -0500 Subject: [petsc-users] Reshaping a vector into a matrix In-Reply-To: References: <1294283190.21492.126.camel@vishy-laptop> <4E5B3ADE-C2CA-47A8-B9AD-AA856096EA5F@mcs.anl.gov> <1294293766.21492.144.camel@vishy-laptop> <1294336117.21492.153.camel@vishy-laptop> Message-ID: <1294356291.21492.225.camel@vishy-laptop> > >> 2. Since each vec1,vec2,... is the same size, this is likely to > >> produce poor memory performance. If you care about speed, I suggest > >> interlacing the values in vec1,...,veck. In that case, you can create > >> an MAIJ matrix that acts on this "multi-vector". > You most definitely want to use the MAIJ. MAIJ does not "repeat" X, it uses the > original matrix passed in but does efficient multiple matrix-vector products at the same time. Is there any place to read up about MAIJ matrices. I read the manual pages but they were rather cryptic. Perhaps some example code? vishy From bsmith at mcs.anl.gov Thu Jan 6 17:32:05 2011 From: bsmith at mcs.anl.gov (Barry Smith) Date: Thu, 6 Jan 2011 17:32:05 -0600 Subject: [petsc-users] Reshaping a vector into a matrix In-Reply-To: <1294356291.21492.225.camel@vishy-laptop> References: <1294283190.21492.126.camel@vishy-laptop> <4E5B3ADE-C2CA-47A8-B9AD-AA856096EA5F@mcs.anl.gov> <1294293766.21492.144.camel@vishy-laptop> <1294336117.21492.153.camel@vishy-laptop> <1294356291.21492.225.camel@vishy-laptop> Message-ID: <1A2A44F6-679A-44CE-8AAB-FA278A6524E7@mcs.anl.gov> On Jan 6, 2011, at 5:24 PM, S V N Vishwanathan wrote: >>>> 2. Since each vec1,vec2,... is the same size, this is likely to >>>> produce poor memory performance. If you care about speed, I suggest >>>> interlacing the values in vec1,...,veck. In that case, you can create >>>> an MAIJ matrix that acts on this "multi-vector". > >> You most definitely want to use the MAIJ. MAIJ does not "repeat" X, it uses the >> original matrix passed in but does efficient multiple matrix-vector products at the same time. > > Is there any place to read up about MAIJ matrices. I read the manual > pages but they were rather cryptic. Perhaps some example code? http://www.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-current/src/snes/examples/tutorials/ex48.c.html is the only example that uses. You can also look at its source code (linked directly from the manual page) to see what it does. > > vishy > > > From PRaeth at hpti.com Thu Jan 6 18:27:02 2011 From: PRaeth at hpti.com (Raeth, Peter) Date: Fri, 7 Jan 2011 00:27:02 +0000 Subject: [petsc-users] PETSc and dense matrices In-Reply-To: References: <3474F869C1954540B771FD9CAEBCB65704A9AF68@CORTINA.HPTI.COM>, Message-ID: <3474F869C1954540B771FD9CAEBCB65704A9B00C@CORTINA.HPTI.COM> So then, while PETSc may not be efficient in such cases, it will still work correctly. Thanks for this. It is a good explanation. Best, Peter. Peter G. Raeth, Ph.D. 
Senior Staff Scientist Signal and Image Processing High Performance Technologies, Inc 937-904-5147 praeth at hpti.com ________________________________ From: petsc-users-bounces at mcs.anl.gov [petsc-users-bounces at mcs.anl.gov] on behalf of Jed Brown [jed at 59A2.org] Sent: Thursday, January 06, 2011 4:01 PM To: PETSc users list Subject: Re: [petsc-users] PETSc and dense matrices On Thu, Jan 6, 2011 at 10:04, Raeth, Peter > wrote: While I can not put my finger on it, I thought I saw on a man page that PETSc was only designed for the solution of sparse matrices. I can try to explain: Some choices in PETSc are not great if the problem has no "structure". I loosely define "structure" to mean that there is a way to store the forward operator in less than O(N^2) space and that there is a way to multiply the forward operator my a vector in less than O(N^2) time. In addition to sparse matrices, this includes operators that can be applied using fast transforms like FFT and FMM, have tensor product structure, are the Schur complement or a low-rank correction of something fitting the above description, etc. If you only solve dense problems with no additional structure, then PETSc cannot have the absolute best performance. But it should be perfectly adequate for most dense problems and if you have some problems with structure and some dense problems, it offers a uniform high-level interface and a lot of algorithmic flexibility that you won't find in a dense-only package. -------------- next part -------------- An HTML attachment was scrubbed... URL: From vishy at stat.purdue.edu Thu Jan 6 21:46:12 2011 From: vishy at stat.purdue.edu (S V N Vishwanathan) Date: Thu, 06 Jan 2011 22:46:12 -0500 Subject: [petsc-users] Reshaping a vector into a matrix In-Reply-To: <1A2A44F6-679A-44CE-8AAB-FA278A6524E7@mcs.anl.gov> References: <1294283190.21492.126.camel@vishy-laptop> <4E5B3ADE-C2CA-47A8-B9AD-AA856096EA5F@mcs.anl.gov> <1294293766.21492.144.camel@vishy-laptop> <1294336117.21492.153.camel@vishy-laptop> <1294356291.21492.225.camel@vishy-laptop> <1A2A44F6-679A-44CE-8AAB-FA278A6524E7@mcs.anl.gov> Message-ID: <1294371972.21492.231.camel@vishy-laptop> > > > > Is there any place to read up about MAIJ matrices. I read the manual > > pages but they were rather cryptic. Perhaps some example code? > > http://www.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-current/src/snes/examples/tutorials/ex48.c.html is the only example that uses. I wrote a little example (attached) to test my understanding of the MAIJ matrices. When I run it on a single processor I have no trouble but when I use mpiexec -n 2 I get the following errors. Any pointers on how to overcome this problem are appreciated. The code loads a matrix from a file, which can be downloaded from http://www.stat.purdue.edu/~vishy/adult9.train.x Before someone has ideas ;-), I should clarify that this is a standard machine learning dataset. [0]PETSC ERROR: --------------------- Error Message ------------------------------------ [0]PETSC ERROR: Nonconforming object sizes! [0]PETSC ERROR: Mat mat,Vec y: local dim 48843 48842! [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: Petsc Release Version 3.1.0, Patch 7, Mon Dec 20 14:26:37 CST 2010 [0]PETSC ERROR: See docs/changes/index.html for recent updates. [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [0]PETSC ERROR: See docs/index.html for manual pages. 
[0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: ./chumma on a linux-gnu named vishy-laptop by vishy Thu Jan 6 22:28:26 2011 [0]PETSC ERROR: Libraries linked from /home/vishy/Repositories/GenEntropy/Code/hpc/petsc-3.1-p7/linux-gnu-cxx-debug/lib [0]PETSC ERROR: Configure run at Thu Dec 30 23:48:41 2010 [0]PETSC ERROR: Configure options --with-debugging=yes --with-clanguage=C++ --with-shared --download-mpich=ifneeded [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: MatMult() line 1889 in src/mat/interface/matrix.c [0]PETSC ERROR: main() line 61 in chumma.cpp application called MPI_Abort(MPI_COMM_WORLD, 60) - process 0[cli_0]: aborting job: application called MPI_Abort(MPI_COMM_WORLD, 60) - process 0 [0]0:Return code = 60 [0]1:Return code = 0, signaled with Interrupt vishy -------------- next part -------------- A non-text attachment was scrubbed... Name: chumma.cpp Type: text/x-c++src Size: 2464 bytes Desc: not available URL: From bsmith at mcs.anl.gov Thu Jan 6 21:53:19 2011 From: bsmith at mcs.anl.gov (Barry Smith) Date: Thu, 6 Jan 2011 21:53:19 -0600 Subject: [petsc-users] Reshaping a vector into a matrix In-Reply-To: <1294371972.21492.231.camel@vishy-laptop> References: <1294283190.21492.126.camel@vishy-laptop> <4E5B3ADE-C2CA-47A8-B9AD-AA856096EA5F@mcs.anl.gov> <1294293766.21492.144.camel@vishy-laptop> <1294336117.21492.153.camel@vishy-laptop> <1294356291.21492.225.camel@vishy-laptop> <1A2A44F6-679A-44CE-8AAB-FA278A6524E7@mcs.anl.gov> <1294371972.21492.231.camel@vishy-laptop> Message-ID: <4771856B-9092-4BFB-BC84-2B140DFDCC40@mcs.anl.gov> You've assumed that the PETSC_DECIDE for the MatLoad() for local sizes will match that for the VecCreate(). In general they will not be the same. Once you've created your MATMAIJ (by they way you don't need to call MatAssembly on the MAIJ matrix) call MatGetLocalSize() and use that local size in the construction of the vector. Barry On Jan 6, 2011, at 9:46 PM, S V N Vishwanathan wrote: >>> >>> Is there any place to read up about MAIJ matrices. I read the manual >>> pages but they were rather cryptic. Perhaps some example code? >> >> http://www.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-current/src/snes/examples/tutorials/ex48.c.html is the only example that uses. > > I wrote a little example (attached) to test my understanding of the MAIJ > matrices. When I run it on a single processor I have no trouble but when > I use mpiexec -n 2 I get the following errors. Any pointers on how to > overcome this problem are appreciated. The code loads a matrix from a > file, which can be downloaded from > > http://www.stat.purdue.edu/~vishy/adult9.train.x > > Before someone has ideas ;-), I should clarify that this is a standard > machine learning dataset. > > [0]PETSC ERROR: --------------------- Error Message > ------------------------------------ > [0]PETSC ERROR: Nonconforming object sizes! > [0]PETSC ERROR: Mat mat,Vec y: local dim 48843 48842! > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: Petsc Release Version 3.1.0, Patch 7, Mon Dec 20 > 14:26:37 CST 2010 > [0]PETSC ERROR: See docs/changes/index.html for recent updates. > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [0]PETSC ERROR: See docs/index.html for manual pages. 
> [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: ./chumma on a linux-gnu named vishy-laptop by vishy Thu > Jan 6 22:28:26 2011 > [0]PETSC ERROR: Libraries linked > from /home/vishy/Repositories/GenEntropy/Code/hpc/petsc-3.1-p7/linux-gnu-cxx-debug/lib > [0]PETSC ERROR: Configure run at Thu Dec 30 23:48:41 2010 > [0]PETSC ERROR: Configure options --with-debugging=yes > --with-clanguage=C++ --with-shared --download-mpich=ifneeded > [0]PETSC ERROR: > ------------------------------------------------------------------------ > [0]PETSC ERROR: MatMult() line 1889 in src/mat/interface/matrix.c > [0]PETSC ERROR: main() line 61 in chumma.cpp > application called MPI_Abort(MPI_COMM_WORLD, 60) - process 0[cli_0]: > aborting job: > application called MPI_Abort(MPI_COMM_WORLD, 60) - process 0 > [0]0:Return code = 60 > [0]1:Return code = 0, signaled with Interrupt > > vishy > From vishy at stat.purdue.edu Fri Jan 7 12:21:10 2011 From: vishy at stat.purdue.edu (S V N Vishwanathan) Date: Fri, 07 Jan 2011 13:21:10 -0500 Subject: [petsc-users] Reshaping a vector into a matrix In-Reply-To: <4771856B-9092-4BFB-BC84-2B140DFDCC40@mcs.anl.gov> References: <1294283190.21492.126.camel@vishy-laptop> <4E5B3ADE-C2CA-47A8-B9AD-AA856096EA5F@mcs.anl.gov> <1294293766.21492.144.camel@vishy-laptop> <1294336117.21492.153.camel@vishy-laptop> <1294356291.21492.225.camel@vishy-laptop> <1A2A44F6-679A-44CE-8AAB-FA278A6524E7@mcs.anl.gov> <1294371972.21492.231.camel@vishy-laptop> <4771856B-9092-4BFB-BC84-2B140DFDCC40@mcs.anl.gov> Message-ID: <1294424470.15672.17.camel@vishy-laptop> Dear Barry, Thanks for your help. Now the code works as expected (attached). There is one more problem that I am facing. If you remember I have vec = (vec1^t, vec2^t,...,veck^t)^t and given the MAIJ matrix X I need to compute fx = (X.vec1, X.vec2, ...,X.veck) >From what I understand so far, if I compute X.vec then I get a vector of the form x_1.vec1 x_1.vec2 . . . x_n.veck where x_i denotes the i-th row of X. In my application I need access to (x_i.vec1, x_i.vec2, ...., x_i.veck) on the same processor. If I use the local size provided by the MatGetLocalSize how do I ensure that the local vectors are aligned w.r.t k? vishy On Thu, 2011-01-06 at 21:53 -0600, Barry Smith wrote: > You've assumed that the PETSC_DECIDE for the MatLoad() for local sizes will match that for the VecCreate(). In general they will not be the same. > Once you've created your MATMAIJ (by they way you don't need to call MatAssembly on the MAIJ matrix) call MatGetLocalSize() and use that local size in the construction of the vector. > > Barry > > > On Jan 6, 2011, at 9:46 PM, S V N Vishwanathan wrote: > > >>> > >>> Is there any place to read up about MAIJ matrices. I read the manual > >>> pages but they were rather cryptic. Perhaps some example code? > >> > >> http://www.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-current/src/snes/examples/tutorials/ex48.c.html is the only example that uses. > > > > I wrote a little example (attached) to test my understanding of the MAIJ > > matrices. When I run it on a single processor I have no trouble but when > > I use mpiexec -n 2 I get the following errors. Any pointers on how to > > overcome this problem are appreciated. The code loads a matrix from a > > file, which can be downloaded from > > > > http://www.stat.purdue.edu/~vishy/adult9.train.x > > > > Before someone has ideas ;-), I should clarify that this is a standard > > machine learning dataset. 
> > > > [0]PETSC ERROR: --------------------- Error Message > > ------------------------------------ > > [0]PETSC ERROR: Nonconforming object sizes! > > [0]PETSC ERROR: Mat mat,Vec y: local dim 48843 48842! > > [0]PETSC ERROR: > > ------------------------------------------------------------------------ > > [0]PETSC ERROR: Petsc Release Version 3.1.0, Patch 7, Mon Dec 20 > > 14:26:37 CST 2010 > > [0]PETSC ERROR: See docs/changes/index.html for recent updates. > > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > > [0]PETSC ERROR: See docs/index.html for manual pages. > > [0]PETSC ERROR: > > ------------------------------------------------------------------------ > > [0]PETSC ERROR: ./chumma on a linux-gnu named vishy-laptop by vishy Thu > > Jan 6 22:28:26 2011 > > [0]PETSC ERROR: Libraries linked > > from /home/vishy/Repositories/GenEntropy/Code/hpc/petsc-3.1-p7/linux-gnu-cxx-debug/lib > > [0]PETSC ERROR: Configure run at Thu Dec 30 23:48:41 2010 > > [0]PETSC ERROR: Configure options --with-debugging=yes > > --with-clanguage=C++ --with-shared --download-mpich=ifneeded > > [0]PETSC ERROR: > > ------------------------------------------------------------------------ > > [0]PETSC ERROR: MatMult() line 1889 in src/mat/interface/matrix.c > > [0]PETSC ERROR: main() line 61 in chumma.cpp > > application called MPI_Abort(MPI_COMM_WORLD, 60) - process 0[cli_0]: > > aborting job: > > application called MPI_Abort(MPI_COMM_WORLD, 60) - process 0 > > [0]0:Return code = 60 > > [0]1:Return code = 0, signaled with Interrupt > > > > vishy > > > -------------- next part -------------- A non-text attachment was scrubbed... Name: chumma.cpp Type: text/x-c++src Size: 2659 bytes Desc: not available URL: From bsmith at mcs.anl.gov Fri Jan 7 14:07:03 2011 From: bsmith at mcs.anl.gov (Barry Smith) Date: Fri, 7 Jan 2011 14:07:03 -0600 Subject: [petsc-users] Reshaping a vector into a matrix In-Reply-To: <1294424470.15672.17.camel@vishy-laptop> References: <1294283190.21492.126.camel@vishy-laptop> <4E5B3ADE-C2CA-47A8-B9AD-AA856096EA5F@mcs.anl.gov> <1294293766.21492.144.camel@vishy-laptop> <1294336117.21492.153.camel@vishy-laptop> <1294356291.21492.225.camel@vishy-laptop> <1A2A44F6-679A-44CE-8AAB-FA278A6524E7@mcs.anl.gov> <1294371972.21492.231.camel@vishy-laptop> <4771856B-9092-4BFB-BC84-2B140DFDCC40@mcs.anl.gov> <1294424470.15672.17.camel@vishy-laptop> Message-ID: <97A5080D-5F8C-4013-AC15-EAA102AA479F@mcs.anl.gov> On Jan 7, 2011, at 12:21 PM, S V N Vishwanathan wrote: > Dear Barry, > > Thanks for your help. Now the code works as expected (attached). There > is one more problem that I am facing. > > If you remember I have > > vec = (vec1^t, vec2^t,...,veck^t)^t > > and given the MAIJ matrix X I need to compute > > fx = (X.vec1, X.vec2, ...,X.veck) > >> From what I understand so far, if I compute X.vec then I get a vector of > the form > > x_1.vec1 > x_1.vec2 > . > . > . > x_n.veck > > where x_i denotes the i-th row of X. > > In my application I need access to (x_i.vec1, x_i.vec2, ...., x_i.veck) > on the same processor. If I use the local size provided by the > MatGetLocalSize how do I ensure that the local vectors are aligned w.r.t After you do the MatLoad() of the AIJ matrix get its local size, say nlocal. The local size of the MAIJ matrix created from the AIJ matrix will be k*nlocal. You will then create your vector with a local size of k*nlocal. I think that everything lines up perfectly. 
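Spelled out, that sizing might look like the sketch below (PETSc 3.1 calling sequences; the file name and the value of k are placeholders, and the data file is assumed to already be in PETSc binary format). The input and output vectors use the local column and local row sizes of X respectively:

#include "petscmat.h"

int main(int argc,char **argv)
{
  PetscErrorCode ierr;
  PetscViewer    viewer;
  Mat            X,Xk;
  Vec            vin,vout;
  PetscInt       mlocal,nlocal,k = 4;

  ierr = PetscInitialize(&argc,&argv,PETSC_NULL,PETSC_NULL);CHKERRQ(ierr);

  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"adult9.train.x",FILE_MODE_READ,&viewer);CHKERRQ(ierr);
  ierr = MatLoad(viewer,MATMPIAIJ,&X);CHKERRQ(ierr);        /* 3.1-style MatLoad */
  ierr = PetscViewerDestroy(viewer);CHKERRQ(ierr);

  ierr = MatGetLocalSize(X,&mlocal,&nlocal);CHKERRQ(ierr);  /* local rows, local cols of X */
  ierr = MatCreateMAIJ(X,k,&Xk);CHKERRQ(ierr);              /* applies X to k interlaced vectors */

  /* input: k interlaced vectors, so local size k*nlocal */
  ierr = VecCreateMPI(PETSC_COMM_WORLD,k*nlocal,PETSC_DECIDE,&vin);CHKERRQ(ierr);
  /* output: local size k*mlocal; for each locally owned row x_i of X the k
     results (x_i.vec1,...,x_i.veck) end up contiguous on the same process */
  ierr = VecCreateMPI(PETSC_COMM_WORLD,k*mlocal,PETSC_DECIDE,&vout);CHKERRQ(ierr);

  ierr = VecSet(vin,1.0);CHKERRQ(ierr);
  ierr = MatMult(Xk,vin,vout);CHKERRQ(ierr);

  ierr = VecDestroy(vin);CHKERRQ(ierr);
  ierr = VecDestroy(vout);CHKERRQ(ierr);
  ierr = MatDestroy(Xk);CHKERRQ(ierr);
  ierr = MatDestroy(X);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return 0;
}
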
If this is not the case then please send a demonstration of its failure to petsc-maint at mcs.anl.gov (code) and we'll see what is going on. Barry > k? > > vishy > > > > On Thu, 2011-01-06 at 21:53 -0600, Barry Smith wrote: >> You've assumed that the PETSC_DECIDE for the MatLoad() for local sizes will match that for the VecCreate(). In general they will not be the same. >> Once you've created your MATMAIJ (by they way you don't need to call MatAssembly on the MAIJ matrix) call MatGetLocalSize() and use that local size in the construction of the vector. >> >> Barry >> >> >> On Jan 6, 2011, at 9:46 PM, S V N Vishwanathan wrote: >> >>>>> >>>>> Is there any place to read up about MAIJ matrices. I read the manual >>>>> pages but they were rather cryptic. Perhaps some example code? >>>> >>>> http://www.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-current/src/snes/examples/tutorials/ex48.c.html is the only example that uses. >>> >>> I wrote a little example (attached) to test my understanding of the MAIJ >>> matrices. When I run it on a single processor I have no trouble but when >>> I use mpiexec -n 2 I get the following errors. Any pointers on how to >>> overcome this problem are appreciated. The code loads a matrix from a >>> file, which can be downloaded from >>> >>> http://www.stat.purdue.edu/~vishy/adult9.train.x >>> >>> Before someone has ideas ;-), I should clarify that this is a standard >>> machine learning dataset. >>> >>> [0]PETSC ERROR: --------------------- Error Message >>> ------------------------------------ >>> [0]PETSC ERROR: Nonconforming object sizes! >>> [0]PETSC ERROR: Mat mat,Vec y: local dim 48843 48842! >>> [0]PETSC ERROR: >>> ------------------------------------------------------------------------ >>> [0]PETSC ERROR: Petsc Release Version 3.1.0, Patch 7, Mon Dec 20 >>> 14:26:37 CST 2010 >>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>> [0]PETSC ERROR: See docs/index.html for manual pages. >>> [0]PETSC ERROR: >>> ------------------------------------------------------------------------ >>> [0]PETSC ERROR: ./chumma on a linux-gnu named vishy-laptop by vishy Thu >>> Jan 6 22:28:26 2011 >>> [0]PETSC ERROR: Libraries linked >>> from /home/vishy/Repositories/GenEntropy/Code/hpc/petsc-3.1-p7/linux-gnu-cxx-debug/lib >>> [0]PETSC ERROR: Configure run at Thu Dec 30 23:48:41 2010 >>> [0]PETSC ERROR: Configure options --with-debugging=yes >>> --with-clanguage=C++ --with-shared --download-mpich=ifneeded >>> [0]PETSC ERROR: >>> ------------------------------------------------------------------------ >>> [0]PETSC ERROR: MatMult() line 1889 in src/mat/interface/matrix.c >>> [0]PETSC ERROR: main() line 61 in chumma.cpp >>> application called MPI_Abort(MPI_COMM_WORLD, 60) - process 0[cli_0]: >>> aborting job: >>> application called MPI_Abort(MPI_COMM_WORLD, 60) - process 0 >>> [0]0:Return code = 60 >>> [0]1:Return code = 0, signaled with Interrupt >>> >>> vishy >>> >> > > From aron.ahmadia at kaust.edu.sa Sat Jan 8 11:42:02 2011 From: aron.ahmadia at kaust.edu.sa (Aron Ahmadia) Date: Sat, 8 Jan 2011 14:42:02 -0300 Subject: [petsc-users] Questions about PETSc In-Reply-To: References: Message-ID: if the file is stored in the PETSc format, you can use PETSc to pull the file in using MPIIO, which should be (hopefully) faster. 
http://www.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-current/docs/manualpages/Viewer/PetscViewerBinarySetMPIIO.html Feel free to pop these questions to petsc-users (cc'd), there are more of them and they usually know more than me :) -Aron On Sat, Jan 8, 2011 at 2:21 PM, Zuhair Khayyat wrote: > Dear Aron, > > Currently I am using PETSc for my research to implement a graph mining > algorithm on a large cluster, and I would like ask you some questions due to > your experience with this tool. > > Have you ever tried to optimize allocating a very large matrix from input > file in parallel? I have a very large file (around 10 GB) and it takes too > long to allocate the matrix through the main node. I am planning to > implement a distributed parallel file reader that split the original file > and make each node reads it separately into a common matrix. > > Is there and other tools that are comparable to PETSc in which they have > parallel file reader? > > Thank you for your help > > Regards, > Zuhair Khayyat > -------------- next part -------------- An HTML attachment was scrubbed... URL: From aron.ahmadia at kaust.edu.sa Sun Jan 9 04:57:59 2011 From: aron.ahmadia at kaust.edu.sa (Aron Ahmadia) Date: Sun, 9 Jan 2011 07:57:59 -0300 Subject: [petsc-users] Questions about PETSc In-Reply-To: References: Message-ID: MPIIO is a parallel file reader. Your application should output in the PETSc binary format (it's easy) for the best performance, an ASCII file cannot be brought in quickly using any existing code. Need to dash, there are MATLAB code examples in bin/ that show how to write out the binary format. A On Sun, Jan 9, 2011 at 4:45 AM, Zuhair Khayyat wrote: > Dear Aron, > > Actually the file is an output of another application, and is stored in a > simple text file. I have read about MPIIO, however couldn't figure out if it > is a parallel file reader or not. > > Have you ever tried or worked on a parallel file reader to a common global > matrix in PETSc, where each processor has part of the file? Thank you > > Regards, > Zuhair Khayyat > > > On Sat, Jan 8, 2011 at 8:42 PM, Aron Ahmadia wrote: > >> if the file is stored in the PETSc format, you can use PETSc to pull the >> file in using MPIIO, which should be (hopefully) faster. >> >> >> http://www.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-current/docs/manualpages/Viewer/PetscViewerBinarySetMPIIO.html >> >> Feel free to pop these questions to petsc-users (cc'd), there are more of >> them and they usually know more than me :) >> >> -Aron >> >> >> On Sat, Jan 8, 2011 at 2:21 PM, Zuhair Khayyat < >> zuhair.khayyat at kaust.edu.sa> wrote: >> >>> Dear Aron, >>> >>> Currently I am using PETSc for my research to implement a graph mining >>> algorithm on a large cluster, and I would like ask you some questions due to >>> your experience with this tool. >>> >>> Have you ever tried to optimize allocating a very large matrix from input >>> file in parallel? I have a very large file (around 10 GB) and it takes too >>> long to allocate the matrix through the main node. I am planning to >>> implement a distributed parallel file reader that split the original file >>> and make each node reads it separately into a common matrix. >>> >>> Is there and other tools that are comparable to PETSc in which they have >>> parallel file reader? >>> >>> Thank you for your help >>> >>> Regards, >>> Zuhair Khayyat >>> >> >> > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From zuhair.khayyat at kaust.edu.sa Sun Jan 9 01:45:34 2011 From: zuhair.khayyat at kaust.edu.sa (Zuhair Khayyat) Date: Sun, 9 Jan 2011 10:45:34 +0300 Subject: [petsc-users] Questions about PETSc In-Reply-To: References: Message-ID: Dear Aron, Actually the file is an output of another application, and is stored in a simple text file. I have read about MPIIO, however couldn't figure out if it is a parallel file reader or not. Have you ever tried or worked on a parallel file reader to a common global matrix in PETSc, where each processor has part of the file? Thank you Regards, Zuhair Khayyat On Sat, Jan 8, 2011 at 8:42 PM, Aron Ahmadia wrote: > if the file is stored in the PETSc format, you can use PETSc to pull the > file in using MPIIO, which should be (hopefully) faster. > > > http://www.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-current/docs/manualpages/Viewer/PetscViewerBinarySetMPIIO.html > > Feel free to pop these questions to petsc-users (cc'd), there are more of > them and they usually know more than me :) > > -Aron > > > On Sat, Jan 8, 2011 at 2:21 PM, Zuhair Khayyat < > zuhair.khayyat at kaust.edu.sa> wrote: > >> Dear Aron, >> >> Currently I am using PETSc for my research to implement a graph mining >> algorithm on a large cluster, and I would like ask you some questions due to >> your experience with this tool. >> >> Have you ever tried to optimize allocating a very large matrix from input >> file in parallel? I have a very large file (around 10 GB) and it takes too >> long to allocate the matrix through the main node. I am planning to >> implement a distributed parallel file reader that split the original file >> and make each node reads it separately into a common matrix. >> >> Is there and other tools that are comparable to PETSc in which they have >> parallel file reader? >> >> Thank you for your help >> >> Regards, >> Zuhair Khayyat >> > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From vishy at stat.purdue.edu Sun Jan 9 10:31:02 2011 From: vishy at stat.purdue.edu (S V N Vishwanathan) Date: Sun, 09 Jan 2011 11:31:02 -0500 Subject: [petsc-users] Reshaping a vector into a matrix In-Reply-To: <97A5080D-5F8C-4013-AC15-EAA102AA479F@mcs.anl.gov> References: <1294283190.21492.126.camel@vishy-laptop> <4E5B3ADE-C2CA-47A8-B9AD-AA856096EA5F@mcs.anl.gov> <1294293766.21492.144.camel@vishy-laptop> <1294336117.21492.153.camel@vishy-laptop> <1294356291.21492.225.camel@vishy-laptop> <1A2A44F6-679A-44CE-8AAB-FA278A6524E7@mcs.anl.gov> <1294371972.21492.231.camel@vishy-laptop> <4771856B-9092-4BFB-BC84-2B140DFDCC40@mcs.anl.gov> <1294424470.15672.17.camel@vishy-laptop> <97A5080D-5F8C-4013-AC15-EAA102AA479F@mcs.anl.gov> Message-ID: <1294590662.5114.7.camel@vishy-laptop> > > > > In my application I need access to (x_i.vec1, x_i.vec2, ...., x_i.veck) > > on the same processor. If I use the local size provided by the > > MatGetLocalSize how do I ensure that the local vectors are aligned w.r.t > > After you do the MatLoad() of the AIJ matrix get its local size, say nlocal. The local size of the MAIJ matrix created from the AIJ matrix will be k*nlocal. > You will then create your vector with a local size of k*nlocal. I think that everything lines up perfectly. > If this is not the case then please send a demonstration of its failure to petsc-maint at mcs.anl.gov (code) and we'll see what is going on. This is indeed the case but I was not sure if it was just fluke or by design. Thanks for the clarification. 
vishy From bsmith at mcs.anl.gov Sun Jan 9 11:38:45 2011 From: bsmith at mcs.anl.gov (Barry Smith) Date: Sun, 9 Jan 2011 11:38:45 -0600 Subject: [petsc-users] Questions about PETSc In-Reply-To: References: Message-ID: <681FFBF5-6ADF-4871-A323-34A0266500A7@mcs.anl.gov> On Jan 9, 2011, at 1:45 AM, Zuhair Khayyat wrote: > Dear Aron, > > Actually the file is an output of another application, and is stored in a simple text file. I have read about MPIIO, however couldn't figure out if it is a parallel file reader or not. > > Have you ever tried or worked on a parallel file reader to a common global matrix in PETSc, where each processor has part of the file? Thank you If the file is ASCII there is no way to load it in efficiently if it is large. Say more than a couple of megabytes. If it is ASCII and small just read it in on process 0. If it is large you will have to change the other application to save in a binary format instead of ASCII. Barry > > Regards, > Zuhair Khayyat > > On Sat, Jan 8, 2011 at 8:42 PM, Aron Ahmadia wrote: > if the file is stored in the PETSc format, you can use PETSc to pull the file in using MPIIO, which should be (hopefully) faster. > > http://www.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-current/docs/manualpages/Viewer/PetscViewerBinarySetMPIIO.html > > Feel free to pop these questions to petsc-users (cc'd), there are more of them and they usually know more than me :) > > -Aron > > > On Sat, Jan 8, 2011 at 2:21 PM, Zuhair Khayyat wrote: > Dear Aron, > > Currently I am using PETSc for my research to implement a graph mining algorithm on a large cluster, and I would like ask you some questions due to your experience with this tool. > > Have you ever tried to optimize allocating a very large matrix from input file in parallel? I have a very large file (around 10 GB) and it takes too long to allocate the matrix through the main node. I am planning to implement a distributed parallel file reader that split the original file and make each node reads it separately into a common matrix. > > Is there and other tools that are comparable to PETSc in which they have parallel file reader? > > Thank you for your help > > Regards, > Zuhair Khayyat > > From C.Klaij at marin.nl Mon Jan 10 04:39:03 2011 From: C.Klaij at marin.nl (Klaij, Christiaan) Date: Mon, 10 Jan 2011 10:39:03 +0000 Subject: [petsc-users] KSPGCR and fortran? Message-ID: I'm trying to use KSPGCR in my F90 program (petsc-3.1-p7) but I get a compiliation error: error #6404: This name does not have a type, and must have an explicit type. [KSPGCR] call KSPSetType(AAksp,KSPGCR,ierr) My program uses #include "finclude/petsckspdef.h" and "use petscksp" following strategy 4 of the UsingFortran instructions; it compiles fine with KSPFGMRES. Apparently GCR is not available for fortran? At least, this is what I get on my system: $ grep -i KSPGMRES $PETSC_DIR/include/finclude/petsckspdef.h #define KSPGMRESCGSRefinementType PetscEnum #define KSPGMRES 'gmres' $ grep -i KSPGCR $PETSC_DIR/include/finclude/petsckspdef.h $ grep -i KSPGCR $PETSC_DIR/include/finclude/*.h $ Chris dr. ir. Christiaan Klaij CFD Researcher Research & Development E mailto:C.Klaij at marin.nl T +31 317 49 33 44 MARIN 2, Haagsteeg, P.O. Box 28, 6700 AA Wageningen, The Netherlands T +31 317 49 39 11, F +31 317 49 32 45, I www.marin.nl From bsmith at mcs.anl.gov Mon Jan 10 07:45:26 2011 From: bsmith at mcs.anl.gov (Barry Smith) Date: Mon, 10 Jan 2011 07:45:26 -0600 Subject: [petsc-users] KSPGCR and fortran? 
In-Reply-To: References: Message-ID: Just an oversight in our Fortran includes. You can either just call it 'gcr' in your code instead of KSPGCR or add the following line to include/finclude/petsckspdef.h #define KSPGCR 'gcr' after the line #define KSPPYTHON 'python' Barry On Jan 10, 2011, at 4:39 AM, Klaij, Christiaan wrote: > I'm trying to use KSPGCR in my F90 program (petsc-3.1-p7) > but I get a compiliation error: > > error #6404: This name does not have a type, and must have an explicit type. [KSPGCR] > call KSPSetType(AAksp,KSPGCR,ierr) > > My program uses #include "finclude/petsckspdef.h" and "use > petscksp" following strategy 4 of the UsingFortran instructions; > it compiles fine with KSPFGMRES. > > Apparently GCR is not available for fortran? At least, this is > what I get on my system: > > $ grep -i KSPGMRES $PETSC_DIR/include/finclude/petsckspdef.h > #define KSPGMRESCGSRefinementType PetscEnum > #define KSPGMRES 'gmres' > $ grep -i KSPGCR $PETSC_DIR/include/finclude/petsckspdef.h > $ grep -i KSPGCR $PETSC_DIR/include/finclude/*.h > $ > > Chris > > > dr. ir. Christiaan Klaij > CFD Researcher > Research & Development > E mailto:C.Klaij at marin.nl > T +31 317 49 33 44 > > MARIN > 2, Haagsteeg, P.O. Box 28, 6700 AA Wageningen, The Netherlands > T +31 317 49 39 11, F +31 317 49 32 45, I www.marin.nl > From C.Klaij at marin.nl Tue Jan 11 01:34:42 2011 From: C.Klaij at marin.nl (Klaij, Christiaan) Date: Tue, 11 Jan 2011 07:34:42 +0000 Subject: [petsc-users] KSPGCR and fortran? Message-ID: Thanks Barry, I edited petsckspdef and now it works. Chris Date: Mon, 10 Jan 2011 07:45:26 -0600 From: Barry Smith Subject: Re: [petsc-users] KSPGCR and fortran? To: PETSc users list Message-ID: Content-Type: text/plain; charset=us-ascii Just an oversight in our Fortran includes. You can either just call it 'gcr' in your code instead of KSPGCR or add the following line to include/finclude/petsckspdef.h #define KSPGCR 'gcr' after the line #define KSPPYTHON 'python' Barry dr. ir. Christiaan Klaij CFD Researcher Research & Development E mailto:C.Klaij at marin.nl T +31 317 49 33 44 MARIN 2, Haagsteeg, P.O. Box 28, 6700 AA Wageningen, The Netherlands T +31 317 49 39 11, F +31 317 49 32 45, I www.marin.nl From a.mesgarnejad at gmail.com Wed Jan 12 13:30:26 2011 From: a.mesgarnejad at gmail.com (Ataollah Mesgarnejad) Date: Wed, 12 Jan 2011 13:30:26 -0600 Subject: [petsc-users] PetscMallocAlign Message-ID: Dear All, I have a memory leak problem with my program; it eventually exhausts all the memory on my system and program aborts. I was checking memory usage with Valgrind and there is a persistent error that always traces back PetscMallocAlign. 
Something like: ==2918== ==2918== 18,792 bytes in 2 blocks are possibly lost in loss record 2,256 of 2,257 ==2918== at 0x4A04360: memalign (vg_replace_malloc.c:532) ==2918== by 0x8AC18A: PetscMallocAlign(unsigned long, int, char const*, char const*, char const*, void**) (mal.c:30) ==2918== by 0x8ADA8E: PetscTrMallocDefault(unsigned long, int, char const*, char const*, char const*, void**) (mtr.c:192) ==2918== by 0x9E4230: VecCreate_MPI_Private(_p_Vec*, PetscTruth, int, double const*) (pbvec.c:187) ==2918== by 0x9E48EA: VecCreate_MPI (pbvec.c:232) ==2918== by 0x9A9E12: VecSetType(_p_Vec*, char const*) (vecreg.c:54) ==2918== by 0x9FEA80: VecCreateMPI(ompi_communicator_t*, int, int, _p_Vec**) (vmpicr.c:42) ==2918== by 0xBBFA6C: DACreateGlobalVector(_p_DA*, _p_Vec**) (dadist.c:42) ==2918== by 0x6CE34F: main (PFMAT-main.cpp:76) ==2918== I'm wondering if this is related to a some error in my declarations or a known issue or an issue at all? PS: I get the same kind of error both on my Mac os X 10.6 and Fedora 13. Best, Ata Mesgarnejad -------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Wed Jan 12 13:36:30 2011 From: balay at mcs.anl.gov (Satish Balay) Date: Wed, 12 Jan 2011 13:36:30 -0600 (CST) Subject: [petsc-users] PetscMallocAlign In-Reply-To: References: Message-ID: perhaps you are missing a call to VecDestroy() on the vec created by DACreateGlobalVector? Satish On Wed, 12 Jan 2011, Ataollah Mesgarnejad wrote: > Dear All, > > I have a memory leak problem with my program; it eventually exhausts all the memory on my system and program aborts. I was checking memory usage with Valgrind and there is a persistent error that always traces back PetscMallocAlign. Something like: > > ==2918== > ==2918== 18,792 bytes in 2 blocks are possibly lost in loss record 2,256 of 2,257 > ==2918== at 0x4A04360: memalign (vg_replace_malloc.c:532) > ==2918== by 0x8AC18A: PetscMallocAlign(unsigned long, int, char const*, char const*, char const*, void**) (mal.c:30) > ==2918== by 0x8ADA8E: PetscTrMallocDefault(unsigned long, int, char const*, char const*, char const*, void**) (mtr.c:192) > ==2918== by 0x9E4230: VecCreate_MPI_Private(_p_Vec*, PetscTruth, int, double const*) (pbvec.c:187) > ==2918== by 0x9E48EA: VecCreate_MPI (pbvec.c:232) > ==2918== by 0x9A9E12: VecSetType(_p_Vec*, char const*) (vecreg.c:54) > ==2918== by 0x9FEA80: VecCreateMPI(ompi_communicator_t*, int, int, _p_Vec**) (vmpicr.c:42) > ==2918== by 0xBBFA6C: DACreateGlobalVector(_p_DA*, _p_Vec**) (dadist.c:42) > ==2918== by 0x6CE34F: main (PFMAT-main.cpp:76) > ==2918== > > > I'm wondering if this is related to a some error in my declarations or a known issue or an issue at all? > > PS: I get the same kind of error both on my Mac os X 10.6 and Fedora 13. 
> > Best, > Ata Mesgarnejad From a.mesgarnejad at gmail.com Wed Jan 12 13:59:36 2011 From: a.mesgarnejad at gmail.com (Ataollah Mesgarnejad) Date: Wed, 12 Jan 2011 13:59:36 -0600 Subject: [petsc-users] PetscMallocAlign In-Reply-To: References: Message-ID: <5FBF69E8-FF9D-41CD-A14D-3ADC5663DAE7@gmail.com> It's not only the case with Vectors it come up again for example when I use VecView,even though I flush and destroy the viewer : ==3347== ==3347== 18,276 bytes in 1 blocks are possibly lost in loss record 3,420 of 3,439 ==3347== at 0x4A04360: memalign (vg_replace_malloc.c:532) ==3347== by 0x8AC2AA: PetscMallocAlign(unsigned long, int, char const*, char const*, char const*, void**) (mal.c:30) ==3347== by 0x8ADBAE: PetscTrMallocDefault(unsigned long, int, char const*, char const*, char const*, void**) (mtr.c:192) ==3347== by 0x9E4350: VecCreate_MPI_Private(_p_Vec*, PetscTruth, int, double const*) (pbvec.c:187) ==3347== by 0x9E4A0A: VecCreate_MPI (pbvec.c:232) ==3347== by 0x9A9F32: VecSetType(_p_Vec*, char const*) (vecreg.c:54) ==3347== by 0x9FEBA0: VecCreateMPI(ompi_communicator_t*, int, int, _p_Vec**) (vmpicr.c:42) ==3347== by 0xBC06F2: DACreateNaturalVector(_p_DA*, _p_Vec**) (dadist.c:99) ==3347== by 0xBC24B8: DAView_VTK(_p_DA*, _p_PetscViewer*) (daview.c:145) ==3347== by 0xBC3336: DAView(_p_DA*, _p_PetscViewer*) (daview.c:244) ==3347== by 0x6D612C: WriteOutput(AppCtx*, int) (PFMAT-Init.cpp:312) ==3347== by 0x6CFE59: main (PFMAT-main.cpp:206) ==3347== Ata On Jan 12, 2011, at 1:36 PM, Satish Balay wrote: > perhaps you are missing a call to VecDestroy() on the vec created by > DACreateGlobalVector? > > Satish > > On Wed, 12 Jan 2011, Ataollah Mesgarnejad wrote: > >> Dear All, >> >> I have a memory leak problem with my program; it eventually exhausts all the memory on my system and program aborts. I was checking memory usage with Valgrind and there is a persistent error that always traces back PetscMallocAlign. Something like: >> >> ==2918== >> ==2918== 18,792 bytes in 2 blocks are possibly lost in loss record 2,256 of 2,257 >> ==2918== at 0x4A04360: memalign (vg_replace_malloc.c:532) >> ==2918== by 0x8AC18A: PetscMallocAlign(unsigned long, int, char const*, char const*, char const*, void**) (mal.c:30) >> ==2918== by 0x8ADA8E: PetscTrMallocDefault(unsigned long, int, char const*, char const*, char const*, void**) (mtr.c:192) >> ==2918== by 0x9E4230: VecCreate_MPI_Private(_p_Vec*, PetscTruth, int, double const*) (pbvec.c:187) >> ==2918== by 0x9E48EA: VecCreate_MPI (pbvec.c:232) >> ==2918== by 0x9A9E12: VecSetType(_p_Vec*, char const*) (vecreg.c:54) >> ==2918== by 0x9FEA80: VecCreateMPI(ompi_communicator_t*, int, int, _p_Vec**) (vmpicr.c:42) >> ==2918== by 0xBBFA6C: DACreateGlobalVector(_p_DA*, _p_Vec**) (dadist.c:42) >> ==2918== by 0x6CE34F: main (PFMAT-main.cpp:76) >> ==2918== >> >> >> I'm wondering if this is related to a some error in my declarations or a known issue or an issue at all? >> >> PS: I get the same kind of error both on my Mac os X 10.6 and Fedora 13. >> >> Best, >> Ata Mesgarnejad > From bsmith at mcs.anl.gov Wed Jan 12 14:18:38 2011 From: bsmith at mcs.anl.gov (Barry Smith) Date: Wed, 12 Jan 2011 14:18:38 -0600 Subject: [petsc-users] PetscMallocAlign In-Reply-To: <5FBF69E8-FF9D-41CD-A14D-3ADC5663DAE7@gmail.com> References: <5FBF69E8-FF9D-41CD-A14D-3ADC5663DAE7@gmail.com> Message-ID: All mallocs in PETSc go through PetscMallocAlign() (it is just a simple wrapper that makes sure all returned values are double aligned). The problem is not with PetscMallocAlign(). 
The problem is some objects are not getting destroyed. Likely it is the DA object itself. Barry On Jan 12, 2011, at 1:59 PM, Ataollah Mesgarnejad wrote: > It's not only the case with Vectors it come up again for example when I use VecView,even though I flush and destroy the viewer : > > ==3347== > ==3347== 18,276 bytes in 1 blocks are possibly lost in loss record 3,420 of 3,439 > ==3347== at 0x4A04360: memalign (vg_replace_malloc.c:532) > ==3347== by 0x8AC2AA: PetscMallocAlign(unsigned long, int, char const*, char const*, char const*, void**) (mal.c:30) > ==3347== by 0x8ADBAE: PetscTrMallocDefault(unsigned long, int, char const*, char const*, char const*, void**) (mtr.c:192) > ==3347== by 0x9E4350: VecCreate_MPI_Private(_p_Vec*, PetscTruth, int, double const*) (pbvec.c:187) > ==3347== by 0x9E4A0A: VecCreate_MPI (pbvec.c:232) > ==3347== by 0x9A9F32: VecSetType(_p_Vec*, char const*) (vecreg.c:54) > ==3347== by 0x9FEBA0: VecCreateMPI(ompi_communicator_t*, int, int, _p_Vec**) (vmpicr.c:42) > ==3347== by 0xBC06F2: DACreateNaturalVector(_p_DA*, _p_Vec**) (dadist.c:99) > ==3347== by 0xBC24B8: DAView_VTK(_p_DA*, _p_PetscViewer*) (daview.c:145) > ==3347== by 0xBC3336: DAView(_p_DA*, _p_PetscViewer*) (daview.c:244) > ==3347== by 0x6D612C: WriteOutput(AppCtx*, int) (PFMAT-Init.cpp:312) > ==3347== by 0x6CFE59: main (PFMAT-main.cpp:206) > ==3347== > > Ata > > On Jan 12, 2011, at 1:36 PM, Satish Balay wrote: > >> perhaps you are missing a call to VecDestroy() on the vec created by >> DACreateGlobalVector? >> >> Satish >> >> On Wed, 12 Jan 2011, Ataollah Mesgarnejad wrote: >> >>> Dear All, >>> >>> I have a memory leak problem with my program; it eventually exhausts all the memory on my system and program aborts. I was checking memory usage with Valgrind and there is a persistent error that always traces back PetscMallocAlign. Something like: >>> >>> ==2918== >>> ==2918== 18,792 bytes in 2 blocks are possibly lost in loss record 2,256 of 2,257 >>> ==2918== at 0x4A04360: memalign (vg_replace_malloc.c:532) >>> ==2918== by 0x8AC18A: PetscMallocAlign(unsigned long, int, char const*, char const*, char const*, void**) (mal.c:30) >>> ==2918== by 0x8ADA8E: PetscTrMallocDefault(unsigned long, int, char const*, char const*, char const*, void**) (mtr.c:192) >>> ==2918== by 0x9E4230: VecCreate_MPI_Private(_p_Vec*, PetscTruth, int, double const*) (pbvec.c:187) >>> ==2918== by 0x9E48EA: VecCreate_MPI (pbvec.c:232) >>> ==2918== by 0x9A9E12: VecSetType(_p_Vec*, char const*) (vecreg.c:54) >>> ==2918== by 0x9FEA80: VecCreateMPI(ompi_communicator_t*, int, int, _p_Vec**) (vmpicr.c:42) >>> ==2918== by 0xBBFA6C: DACreateGlobalVector(_p_DA*, _p_Vec**) (dadist.c:42) >>> ==2918== by 0x6CE34F: main (PFMAT-main.cpp:76) >>> ==2918== >>> >>> >>> I'm wondering if this is related to a some error in my declarations or a known issue or an issue at all? >>> >>> PS: I get the same kind of error both on my Mac os X 10.6 and Fedora 13. >>> >>> Best, >>> Ata Mesgarnejad >> > From Patrice.Goulet at gci.ulaval.ca Wed Jan 12 16:36:46 2011 From: Patrice.Goulet at gci.ulaval.ca (Patrice Goulet) Date: Wed, 12 Jan 2011 17:36:46 -0500 Subject: [petsc-users] Parallel direct solver Message-ID: <7F3778E04364E74F82086EE70794A0F713373C@EXCH-MBX-A.ulaval.ca> Hi, I'm trying to use the external package SuperLU_DIST_2.4-hg-v2 with petsc-3.1-p7 to be able to solve a linear system in parallel with a direct method. My code is written in C++, so here are the configuration options I use to compile PETSc. 
--with-matlab=0 --with-dynamic=0 --with-debugging=0 --with-clanguage=C++ --with-shared=0 -PETSC_ARCH=linux-mpi --download-f-blas-lapack=1 --download-parmetis=1 --download-superlu_dist=1 Everything goes well to compile and test my PETSc installation. When I try to compile my Solver, I get the following error: petsc-3.1-p7/linux-mpi/lib/libsuperlu_dist_2.4.a(get_perm_c_parmetis.o): In function `get_perm_c_parmetis': get_perm_c_parmetis.c:(.text+0x164d): undefined reference to `ParMETIS_V3_NodeND' collect2: ld returned 1 exit status Can you please help me to solve this problem? Thanks Patrice Goulet -------------- next part -------------- An HTML attachment was scrubbed... URL: From spam.wax at gmail.com Wed Jan 12 16:53:43 2011 From: spam.wax at gmail.com (Hamid M.) Date: Wed, 12 Jan 2011 17:53:43 -0500 Subject: [petsc-users] Parallel matrix assembly questions Message-ID: Hello, I need to build a dense matrix in parallel and looking at examples in ksp/examples/tutorials it seems there are two approaches. In ex3.c MPI_Comm_rank and MPI_Comm_size are used to compute the index range for the current processor and ex5.c uses MatGetOwnershipRange to figure out the index range. 1- Is there any advantage using either of these methods or they will behave the same when it comes to performance and portability ? Based on comments in ex5.c, PETSc partitions the matrix by contiguous chunks of rows. 2- Is there a way of changing this scheme using PETSc routines ? 3- If not, can I partition my matrix in a different way, say 2D block cycling method, and still be able to use PETSc to solve it ? Since my matrix is dense I am going to start with direct solvers and I am concerned whether the partitioning scheme will affect the solver's performance. thanks in advance, Hamid From SJ_Ormiston at UManitoba.CA Wed Jan 12 16:03:01 2011 From: SJ_Ormiston at UManitoba.CA (Ormiston, Scott J.) Date: Wed, 12 Jan 2011 16:03:01 -0600 Subject: [petsc-users] Getting a sequential copy of a parallel vector Message-ID: <20110112160301.17274zw45kyyoq8s@webtools.cc.umanitoba.ca> We are new to PETSc and are testing the KSP solvers. We think we can fill a PETSc matrix and right hand side vector and get a parallel solution vector, but we are not sure how to get a copy of the solution vector that we can use in other calculations and to write results to a file. We are using Fortran 95. We are looking at a sequence of calls involving ISCreateGeneral() VecScatterCreate() VecScatterBegin() VecScatterEnd() along with ISDestroy() VecScatterDestroy(). Is this the correct approach? We can only find ex30f.F as an example. Is there a better fortran example somewhere that shows how to do this? Thanks. Scott J. Ormiston Ph.D., Professor University of Manitoba From bsmith at mcs.anl.gov Wed Jan 12 17:02:23 2011 From: bsmith at mcs.anl.gov (Barry Smith) Date: Wed, 12 Jan 2011 17:02:23 -0600 Subject: [petsc-users] Parallel direct solver In-Reply-To: <7F3778E04364E74F82086EE70794A0F713373C@EXCH-MBX-A.ulaval.ca> References: <7F3778E04364E74F82086EE70794A0F713373C@EXCH-MBX-A.ulaval.ca> Message-ID: <223B2735-2707-4DAA-9E73-8063A38943AE@mcs.anl.gov> SuperLU_dist uses parmetis, it looks like in your makefile you are not linking against the parametis libraries. If you used the PETSc makefiles it would automatically link against all needed files. 
Barry On Jan 12, 2011, at 4:36 PM, Patrice Goulet wrote: > Hi, > > I?m trying to use the external package SuperLU_DIST_2.4-hg-v2 with petsc-3.1-p7 to be able to solve a linear system in parallel with a direct method. My code is written in C++, so here are the configuration options I use to compile PETSc. > > --with-matlab=0 --with-dynamic=0 --with-debugging=0 --with-clanguage=C++ --with-shared=0 -PETSC_ARCH=linux-mpi --download-f-blas-lapack=1 --download-parmetis=1 --download-superlu_dist=1 > > Everything goes well to compile and test my PETSc installation. > > When I try to compile my Solver, I get the following error: > petsc-3.1-p7/linux-mpi/lib/libsuperlu_dist_2.4.a(get_perm_c_parmetis.o): In function `get_perm_c_parmetis': > get_perm_c_parmetis.c:(.text+0x164d): undefined reference to `ParMETIS_V3_NodeND' > collect2: ld returned 1 exit status > > Can you please help me to solve this problem? > > Thanks > > Patrice Goulet From bsmith at mcs.anl.gov Wed Jan 12 17:04:48 2011 From: bsmith at mcs.anl.gov (Barry Smith) Date: Wed, 12 Jan 2011 17:04:48 -0600 Subject: [petsc-users] Parallel matrix assembly questions In-Reply-To: References: Message-ID: <01A60751-5E5F-4CE7-960E-A41C73F531A5@mcs.anl.gov> On Jan 12, 2011, at 4:53 PM, Hamid M. wrote: > Hello, > > I need to build a dense matrix in parallel and looking at examples in > ksp/examples/tutorials > it seems there are two approaches. > In ex3.c MPI_Comm_rank and MPI_Comm_size are used to compute the index > range for the > current processor and ex5.c uses MatGetOwnershipRange to figure out > the index range. > > 1- Is there any advantage using either of these methods or they will > behave the same when it comes to performance and portability ? Use MatGetOwnershipRange because it works with any Mat layout. > > Based on comments in ex5.c, PETSc partitions the matrix by contiguous > chunks of rows. > 2- Is there a way of changing this scheme using PETSc routines ? No > 3- If not, can I partition my matrix in a different way, say 2D block > cycling method, and still be able to use PETSc to solve it ? > No > Since my matrix is dense I am going to start with direct solvers and I > am concerned whether the partitioning scheme will affect the solver's > performance. > It will affect performance, unless you hope/plan to use iterative solvers there is no good reason to use PETSc for dense matrices with direct solvers. That is another world with a different world of software. Barry > thanks in advance, > Hamid From bsmith at mcs.anl.gov Wed Jan 12 17:07:40 2011 From: bsmith at mcs.anl.gov (Barry Smith) Date: Wed, 12 Jan 2011 17:07:40 -0600 Subject: [petsc-users] Getting a sequential copy of a parallel vector In-Reply-To: <20110112160301.17274zw45kyyoq8s@webtools.cc.umanitoba.ca> References: <20110112160301.17274zw45kyyoq8s@webtools.cc.umanitoba.ca> Message-ID: To save the vector in binary format for later processing you should use VecView() with a PetscViewerOpenBinary() viewer, it saves all the monkeying around of moving a vector to one process. If you want the zeroth process or all processes to have the entire Vector you can use VecScatterCreateToAll() or VecScatterCreateToZero(), this handles all the business of building the IS and scatter for you. You just call the VecScatterBegin/End() Barry On Jan 12, 2011, at 4:03 PM, Ormiston, Scott J. wrote: > We are new to PETSc and are testing the KSP solvers. 
> > We think we can fill a PETSc matrix and right hand side vector and get a parallel solution vector, but we are not sure how to get a copy of the solution vector that we can use in other calculations and to write results to a file. > We are using Fortran 95. > > We are looking at a sequence of calls involving > > ISCreateGeneral() > VecScatterCreate() > VecScatterBegin() > VecScatterEnd() > > along with > ISDestroy() > VecScatterDestroy(). > > Is this the correct approach? > > We can only find ex30f.F as an example. Is there a better fortran example somewhere that shows how to do this? > Thanks. > > Scott J. Ormiston Ph.D., Professor > University of Manitoba > From spam.wax at gmail.com Wed Jan 12 17:31:49 2011 From: spam.wax at gmail.com (Hamid M.) Date: Wed, 12 Jan 2011 18:31:49 -0500 Subject: [petsc-users] Parallel matrix assembly questions In-Reply-To: <01A60751-5E5F-4CE7-960E-A41C73F531A5@mcs.anl.gov> References: <01A60751-5E5F-4CE7-960E-A41C73F531A5@mcs.anl.gov> Message-ID: > ? It will affect performance, unless you hope/plan to use iterative solvers there is no good reason to use PETSc for dense matrices with direct solvers. That is another world with a different world of software. > Thanks for the helpful response. Just from theoretical point of view, are iterative solvers suitable for a dense matrix or it doesn't really matter ? I know direct solvers would perform faster for problems with multiple RHS values and a fixed LHS matrix, but how what are the main factors one needs to consider when dealing with dense matrices and various solvers. Hamid From bsmith at mcs.anl.gov Wed Jan 12 19:51:39 2011 From: bsmith at mcs.anl.gov (Barry Smith) Date: Wed, 12 Jan 2011 19:51:39 -0600 Subject: [petsc-users] Parallel matrix assembly questions In-Reply-To: References: <01A60751-5E5F-4CE7-960E-A41C73F531A5@mcs.anl.gov> Message-ID: On Jan 12, 2011, at 5:31 PM, Hamid M. wrote: >> It will affect performance, unless you hope/plan to use iterative solvers there is no good reason to use PETSc for dense matrices with direct solvers. That is another world with a different world of software. >> > > Thanks for the helpful response. > > Just from theoretical point of view, are iterative solvers suitable > for a dense matrix It depends on the conditioning of the matrix. For some matrices (for example from some boundary element methods) the system can be solved with a couple dozen GMES iterations which is much much faster than using a direct solver. For other problems GMRES would be slower than a direct solver. Barry > or it doesn't really matter ? > I know direct solvers would perform faster for problems with multiple > RHS values and a fixed LHS matrix, but how what are the main factors > one needs to consider when dealing with dense matrices and various > solvers. > > Hamid From fernandez858 at gmail.com Thu Jan 13 05:38:59 2011 From: fernandez858 at gmail.com (Michel Cancelliere) Date: Thu, 13 Jan 2011 12:38:59 +0100 Subject: [petsc-users] Using PETSc from MATLAB code, experimental In-Reply-To: <28CA197A-02C8-41FD-BF2D-E344FBC3CF97@mcs.anl.gov> References: <28CA197A-02C8-41FD-BF2D-E344FBC3CF97@mcs.anl.gov> Message-ID: Hi Barry, is Matlab under Windows with petsc-cygwin supported? Thanks, Michel On Sun, Dec 26, 2010 at 5:17 AM, Barry Smith wrote: > > PETSc users, > > It is now possible to write MATLAB programs (sequential) that use PETSc > KSP, SNES, and TS solvers directly in MATLAB. The code is still experimental > and incomplete. 
But if you are interested in trying it out, get the > development release of PETSc > http://www.mcs.anl.gov/petsc/petsc-as/developers/index.html join the > development mailing list petsc-dev > http://www.mcs.anl.gov/petsc/petsc-as/miscellaneous/mailing-lists.html, > read bin/matlab/classes/PetscInitialize.m, configure and make PETSc and join > the fun. We are definitely in need of more developers for this code. > > Barry > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From dalcinl at gmail.com Thu Jan 13 11:18:35 2011 From: dalcinl at gmail.com (Lisandro Dalcin) Date: Thu, 13 Jan 2011 14:18:35 -0300 Subject: [petsc-users] Using PETSc from MATLAB code, experimental In-Reply-To: References: <28CA197A-02C8-41FD-BF2D-E344FBC3CF97@mcs.anl.gov> Message-ID: On 13 January 2011 08:38, Michel Cancelliere wrote: > Hi Barry, > is Matlab under Windows with petsc-cygwin supported? > Unlikely, we would need to build PETSc as a DLL... If anyone can manage to do that, then Barry's work should work out of the box. Or perhaps PETSc do build as a DLL under cygwin? > Thanks, > Michel > On Sun, Dec 26, 2010 at 5:17 AM, Barry Smith wrote: >> >> ?PETSc users, >> >> ? It is now possible to write MATLAB programs (sequential) that use PETSc >> KSP, SNES, and TS solvers directly in MATLAB. The code is still experimental >> and incomplete. But if you are interested in trying it out, get the >> development release of PETSc >> http://www.mcs.anl.gov/petsc/petsc-as/developers/index.html join the >> development mailing list petsc-dev >> http://www.mcs.anl.gov/petsc/petsc-as/miscellaneous/mailing-lists.html, read >> bin/matlab/classes/PetscInitialize.m, configure and make PETSc and join the >> fun. We are definitely in need of more developers for this code. >> >> ? Barry >> >> > > -- Lisandro Dalcin --------------- CIMEC (INTEC/CONICET-UNL) Predio CONICET-Santa Fe Colectora RN 168 Km 472, Paraje El Pozo Tel: +54-342-4511594 (ext 1011) Tel/Fax: +54-342-4511169 From gianmail at gmail.com Thu Jan 13 12:00:39 2011 From: gianmail at gmail.com (Gianluca Meneghello) Date: Thu, 13 Jan 2011 19:00:39 +0100 Subject: [petsc-users] VecGetSubVector In-Reply-To: <76A8F83B-2110-4056-98EF-AC998FA9EE1C@mcs.anl.gov> References: <76A8F83B-2110-4056-98EF-AC998FA9EE1C@mcs.anl.gov> Message-ID: Dear Barry and Jed, thanks again for your answers. I'm at the moment trying to understand more about how ASM and FieldSplit works. I've started reading Barry's book "Domain Decomposition, Parallel Multilevel Methods for Elliptic Partial Differential Equations". I hope it's the right starting point. Please let me know if you have further suggested readings, taking into account that I know nothing on domain decomposition in general --- but something in multigrid. Let me ask you a couple of questions for the moment: Is PCFIELDSPLIT additive the same as PCASM (I guess no as it does not use overlap)? Or is it a Substructuring Method (I haven't yet arrived to that chapter of the book!). What's the difference between multiplicative and symmetric-multiplicative for PCFIELDSPLIT? Does symmetric-multiplicative refers to eq 1.15 of "Domain Decomposition"? Is it possible use ASM and/or FieldSplit with a matrix-free method? Thanks Gianluca Il giorno 06/gen/2011, alle ore 21.55, Barry Smith ha scritto: > > On Jan 6, 2011, at 6:01 AM, Gianluca Meneghello wrote: > >> Dear Barry, >> >> thanks a lot for your answer. 
>> >> I tried to do some experiments with MatGetSubMatrix, but I guess I'm >> doing something wrong as it looks like being painfully slow. > > Hmm, we've always found the getsubmatrix takes a few percent of the time. Perhaps you are calling it repeatedly for the each domain, rather than once and reusing it? Also use MatGetSubMatrices() and get all the submatrices in one call rather than one at a time. > >> >> I changed approach and now I'm using the ASM preconditioner. What I'm >> actually trying to do is to split the domain in different parts --- >> like interior and boundaries --- and relax (solve) each one with a >> different smoother (solver). In your opinion, is this the right >> approach? > > Worth trying since it is easy. You can experiment with different smoothers on the subdomains using the -sub_pc_type etc options and set different prefixes for different subdomains. > >> So far it looks much faster than my previous approach of >> extracting each submatrix. > > ASM just uses MatGetSubMatrices() so shouldn't be faster or slower than a custom code that does the same thing. > >> >> Also, please let me ask you one more thing. When using ASM with >> different subdomains on the same process, is the order in which the >> domains are solved the same as the one in which they are stored in the >> IS array passed to PCASMSetLocalSubdomains()? I would be interested in >> controlling this in order to build a downstream marching smoother. > > It is only additive, there is no order as Jed noted. Doing multiplicative in general is tricky because you want to just update the parts of the residual that need to be updated. > >> >> Looking at the references, I've noticed you have worked on multigrid. >> What I'm trying to do is close to what is described in Diskin, Thomas, >> Mineck, "Textbook Multigrid Efficiency for Leading Edge Stagnation", >> in case you already know the paper. >> http://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/20040081104_2004082284.pdf >> >> Again, thanks a lot. >> >> Gianluca >> >> On 3 January 2011 17:43, Barry Smith wrote: >>> >>> Gianluca, >>> >>> The expected use is with the VecScatter object. First you create a VecScatter object with VecScatterCreate() then each time you need the "subvector" you call VecScatterBegin() followed by VecScatterEnd() Note that usually the VecScatter object is retained and used many times. >>> >>> Barry >>> >>> >>> >>> On Jan 3, 2011, at 5:22 AM, Gianluca Meneghello wrote: >>> >>>> Hi, >>>> >>>> I'm new to PETSc, so that this can be a very simple question: >>>> >>>> I'm looking for something like VecGetSubVector, which I've seen it >>>> exists in the dev version but not in the released one. >>>> >>>> I need to write a smoother for a multigrid algorithm (something like a >>>> block Gauss Seidel) which can be written in matlab as >>>> >>>> for j = 1:ny >>>> P = ; >>>> du(P) = L(P,P) \ ( rhs(P) - L(P,:)*du + D2(P,P)*du(P) ); >>>> end >>>> >>>> where L is a matrix (in my case the linearized Navier Stokes). >>>> >>>> I was thinking about using IS for declaring P, so that D2(P,P) can be >>>> obtained using MatGetSubMatrix. I would need the same for the vector >>>> du. >>>> >>>> Is there a way to do that without using the developer version? (I >>>> really don't feel like being "experienced with building, using and >>>> debugging PETSc). 
>>>> >>>> Thanks in advance >>>> >>>> Gianluca >>> >>> > From bsmith at mcs.anl.gov Thu Jan 13 13:14:47 2011 From: bsmith at mcs.anl.gov (Barry Smith) Date: Thu, 13 Jan 2011 13:14:47 -0600 Subject: [petsc-users] VecGetSubVector In-Reply-To: References: <76A8F83B-2110-4056-98EF-AC998FA9EE1C@mcs.anl.gov> Message-ID: <71F508EE-171E-4967-8143-96D515BF1759@mcs.anl.gov> On Jan 13, 2011, at 12:00 PM, Gianluca Meneghello wrote: > Dear Barry and Jed, > > thanks again for your answers. I'm at the moment trying to understand more about how ASM and FieldSplit works. I've started reading Barry's book "Domain Decomposition, Parallel Multilevel Methods for Elliptic Partial Differential Equations". I hope it's the right starting point. > > Please let me know if you have further suggested readings, taking into account that I know nothing on domain decomposition in general --- but something in multigrid. > > Let me ask you a couple of questions for the moment: > Is PCFIELDSPLIT additive the same as PCASM (I guess no as it does not use overlap)? It can be additive or multiplicative depending on what you set with PCFieldSplitSetType() > Or is it a Substructuring Method (I haven't yet arrived to that chapter of the book!). No, nothing to do with substructuring. > What's the difference between multiplicative and symmetric-multiplicative for PCFIELDSPLIT? Does symmetric-multiplicative refers to eq 1.15 of "Domain Decomposition"? Yes > > Is it possible use ASM and/or FieldSplit with a matrix-free method? Yes, BUT the algorithms are coded around MatGetSubMatrix() and or MatGetSubMatrices() so to do matrix free you need to have code that applies "part" of the operator at a time (that is you cannot just have a matrix vector product that applies the entire operator to the entire vector. Once you have the ability to apply "part" of the operator at a time you need to code up a MATSHELL that responds appropriately to MatGetSubMatrix() and or MatGetSubMatrices() and returns new matrix-free shell matrices that apply only "their" part of the operator. This is non-trivial for many people but possible. Barry > > Thanks > > Gianluca > > Il giorno 06/gen/2011, alle ore 21.55, Barry Smith ha scritto: > >> >> On Jan 6, 2011, at 6:01 AM, Gianluca Meneghello wrote: >> >>> Dear Barry, >>> >>> thanks a lot for your answer. >>> >>> I tried to do some experiments with MatGetSubMatrix, but I guess I'm >>> doing something wrong as it looks like being painfully slow. >> >> Hmm, we've always found the getsubmatrix takes a few percent of the time. Perhaps you are calling it repeatedly for the each domain, rather than once and reusing it? Also use MatGetSubMatrices() and get all the submatrices in one call rather than one at a time. >> >>> >>> I changed approach and now I'm using the ASM preconditioner. What I'm >>> actually trying to do is to split the domain in different parts --- >>> like interior and boundaries --- and relax (solve) each one with a >>> different smoother (solver). In your opinion, is this the right >>> approach? >> >> Worth trying since it is easy. You can experiment with different smoothers on the subdomains using the -sub_pc_type etc options and set different prefixes for different subdomains. >> >>> So far it looks much faster than my previous approach of >>> extracting each submatrix. >> >> ASM just uses MatGetSubMatrices() so shouldn't be faster or slower than a custom code that does the same thing. >> >>> >>> Also, please let me ask you one more thing. 
When using ASM with >>> different subdomains on the same process, is the order in which the >>> domains are solved the same as the one in which they are stored in the >>> IS array passed to PCASMSetLocalSubdomains()? I would be interested in >>> controlling this in order to build a downstream marching smoother. >> >> It is only additive, there is no order as Jed noted. Doing multiplicative in general is tricky because you want to just update the parts of the residual that need to be updated. >> >>> >>> Looking at the references, I've noticed you have worked on multigrid. >>> What I'm trying to do is close to what is described in Diskin, Thomas, >>> Mineck, "Textbook Multigrid Efficiency for Leading Edge Stagnation", >>> in case you already know the paper. >>> http://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/20040081104_2004082284.pdf >>> >>> Again, thanks a lot. >>> >>> Gianluca >>> >>> On 3 January 2011 17:43, Barry Smith wrote: >>>> >>>> Gianluca, >>>> >>>> The expected use is with the VecScatter object. First you create a VecScatter object with VecScatterCreate() then each time you need the "subvector" you call VecScatterBegin() followed by VecScatterEnd() Note that usually the VecScatter object is retained and used many times. >>>> >>>> Barry >>>> >>>> >>>> >>>> On Jan 3, 2011, at 5:22 AM, Gianluca Meneghello wrote: >>>> >>>>> Hi, >>>>> >>>>> I'm new to PETSc, so that this can be a very simple question: >>>>> >>>>> I'm looking for something like VecGetSubVector, which I've seen it >>>>> exists in the dev version but not in the released one. >>>>> >>>>> I need to write a smoother for a multigrid algorithm (something like a >>>>> block Gauss Seidel) which can be written in matlab as >>>>> >>>>> for j = 1:ny >>>>> P = ; >>>>> du(P) = L(P,P) \ ( rhs(P) - L(P,:)*du + D2(P,P)*du(P) ); >>>>> end >>>>> >>>>> where L is a matrix (in my case the linearized Navier Stokes). >>>>> >>>>> I was thinking about using IS for declaring P, so that D2(P,P) can be >>>>> obtained using MatGetSubMatrix. I would need the same for the vector >>>>> du. >>>>> >>>>> Is there a way to do that without using the developer version? (I >>>>> really don't feel like being "experienced with building, using and >>>>> debugging PETSc). >>>>> >>>>> Thanks in advance >>>>> >>>>> Gianluca >>>> >>>> >> > From bsmith at mcs.anl.gov Thu Jan 13 13:23:21 2011 From: bsmith at mcs.anl.gov (Barry Smith) Date: Thu, 13 Jan 2011 13:23:21 -0600 Subject: [petsc-users] Using PETSc from MATLAB code, experimental In-Reply-To: References: <28CA197A-02C8-41FD-BF2D-E344FBC3CF97@mcs.anl.gov> Message-ID: Michel Lisandro is exactly right. The key is building the .dll. If you are interested in pursing this we are eager to help but are not Windows experts in any way. 1) are you using the GNU compiler or Microsoft? 2) if you are using GNU you can try building with the same options as indicated in the docs. including the --with-shared-libraries flag. what happens? What is generated? Let us know and we'll go from there, Barry On Jan 13, 2011, at 11:18 AM, Lisandro Dalcin wrote: > On 13 January 2011 08:38, Michel Cancelliere wrote: >> Hi Barry, >> is Matlab under Windows with petsc-cygwin supported? >> > > Unlikely, we would need to build PETSc as a DLL... If anyone can > manage to do that, then Barry's work should work out of the box. > > Or perhaps PETSc do build as a DLL under cygwin? 
> >> Thanks, >> Michel >> On Sun, Dec 26, 2010 at 5:17 AM, Barry Smith wrote: >>> >>> PETSc users, >>> >>> It is now possible to write MATLAB programs (sequential) that use PETSc >>> KSP, SNES, and TS solvers directly in MATLAB. The code is still experimental >>> and incomplete. But if you are interested in trying it out, get the >>> development release of PETSc >>> http://www.mcs.anl.gov/petsc/petsc-as/developers/index.html join the >>> development mailing list petsc-dev >>> http://www.mcs.anl.gov/petsc/petsc-as/miscellaneous/mailing-lists.html, read >>> bin/matlab/classes/PetscInitialize.m, configure and make PETSc and join the >>> fun. We are definitely in need of more developers for this code. >>> >>> Barry >>> >>> >> >> > > > > -- > Lisandro Dalcin > --------------- > CIMEC (INTEC/CONICET-UNL) > Predio CONICET-Santa Fe > Colectora RN 168 Km 472, Paraje El Pozo > Tel: +54-342-4511594 (ext 1011) > Tel/Fax: +54-342-4511169 From balay at mcs.anl.gov Thu Jan 13 13:26:01 2011 From: balay at mcs.anl.gov (Satish Balay) Date: Thu, 13 Jan 2011 13:26:01 -0600 (CST) Subject: [petsc-users] Using PETSc from MATLAB code, experimental In-Reply-To: References: <28CA197A-02C8-41FD-BF2D-E344FBC3CF97@mcs.anl.gov> Message-ID: I suspect Matlab would work with MS dlls - and not cygwin/gnu dlls.. satish On Thu, 13 Jan 2011, Barry Smith wrote: > > > Michel > > Lisandro is exactly right. The key is building the .dll. If you are interested in pursing this we are eager to help but are not Windows experts in any way. > > 1) are you using the GNU compiler or Microsoft? > 2) if you are using GNU you can try building with the same options as indicated in the docs. including the --with-shared-libraries flag. what happens? > What is generated? > > Let us know and we'll go from there, > > > Barry > > > > On Jan 13, 2011, at 11:18 AM, Lisandro Dalcin wrote: > > > On 13 January 2011 08:38, Michel Cancelliere wrote: > >> Hi Barry, > >> is Matlab under Windows with petsc-cygwin supported? > >> > > > > Unlikely, we would need to build PETSc as a DLL... If anyone can > > manage to do that, then Barry's work should work out of the box. > > > > Or perhaps PETSc do build as a DLL under cygwin? > > > >> Thanks, > >> Michel > >> On Sun, Dec 26, 2010 at 5:17 AM, Barry Smith wrote: > >>> > >>> PETSc users, > >>> > >>> It is now possible to write MATLAB programs (sequential) that use PETSc > >>> KSP, SNES, and TS solvers directly in MATLAB. The code is still experimental > >>> and incomplete. But if you are interested in trying it out, get the > >>> development release of PETSc > >>> http://www.mcs.anl.gov/petsc/petsc-as/developers/index.html join the > >>> development mailing list petsc-dev > >>> http://www.mcs.anl.gov/petsc/petsc-as/miscellaneous/mailing-lists.html, read > >>> bin/matlab/classes/PetscInitialize.m, configure and make PETSc and join the > >>> fun. We are definitely in need of more developers for this code. 
> >>> > >>> Barry > >>> > >>> > >> > >> > > > > > > > > -- > > Lisandro Dalcin > > --------------- > > CIMEC (INTEC/CONICET-UNL) > > Predio CONICET-Santa Fe > > Colectora RN 168 Km 472, Paraje El Pozo > > Tel: +54-342-4511594 (ext 1011) > > Tel/Fax: +54-342-4511169 > > From bsmith at mcs.anl.gov Thu Jan 13 13:27:22 2011 From: bsmith at mcs.anl.gov (Barry Smith) Date: Thu, 13 Jan 2011 13:27:22 -0600 Subject: [petsc-users] Using PETSc from MATLAB code, experimental In-Reply-To: References: <28CA197A-02C8-41FD-BF2D-E344FBC3CF97@mcs.anl.gov> Message-ID: On Jan 13, 2011, at 1:26 PM, Satish Balay wrote: > I suspect Matlab would work with MS dlls - and not cygwin/gnu dlls.. Yes, but it is much easier for checking than MS dlls so we have to start somewhere. Barry > > satish > > On Thu, 13 Jan 2011, Barry Smith wrote: > >> >> >> Michel >> >> Lisandro is exactly right. The key is building the .dll. If you are interested in pursing this we are eager to help but are not Windows experts in any way. >> >> 1) are you using the GNU compiler or Microsoft? >> 2) if you are using GNU you can try building with the same options as indicated in the docs. including the --with-shared-libraries flag. what happens? >> What is generated? >> >> Let us know and we'll go from there, >> >> >> Barry >> >> >> >> On Jan 13, 2011, at 11:18 AM, Lisandro Dalcin wrote: >> >>> On 13 January 2011 08:38, Michel Cancelliere wrote: >>>> Hi Barry, >>>> is Matlab under Windows with petsc-cygwin supported? >>>> >>> >>> Unlikely, we would need to build PETSc as a DLL... If anyone can >>> manage to do that, then Barry's work should work out of the box. >>> >>> Or perhaps PETSc do build as a DLL under cygwin? >>> >>>> Thanks, >>>> Michel >>>> On Sun, Dec 26, 2010 at 5:17 AM, Barry Smith wrote: >>>>> >>>>> PETSc users, >>>>> >>>>> It is now possible to write MATLAB programs (sequential) that use PETSc >>>>> KSP, SNES, and TS solvers directly in MATLAB. The code is still experimental >>>>> and incomplete. But if you are interested in trying it out, get the >>>>> development release of PETSc >>>>> http://www.mcs.anl.gov/petsc/petsc-as/developers/index.html join the >>>>> development mailing list petsc-dev >>>>> http://www.mcs.anl.gov/petsc/petsc-as/miscellaneous/mailing-lists.html, read >>>>> bin/matlab/classes/PetscInitialize.m, configure and make PETSc and join the >>>>> fun. We are definitely in need of more developers for this code. >>>>> >>>>> Barry >>>>> >>>>> >>>> >>>> >>> >>> >>> >>> -- >>> Lisandro Dalcin >>> --------------- >>> CIMEC (INTEC/CONICET-UNL) >>> Predio CONICET-Santa Fe >>> Colectora RN 168 Km 472, Paraje El Pozo >>> Tel: +54-342-4511594 (ext 1011) >>> Tel/Fax: +54-342-4511169 >> >> > From gianmail at gmail.com Thu Jan 13 14:09:00 2011 From: gianmail at gmail.com (Gianluca Meneghello) Date: Thu, 13 Jan 2011 21:09:00 +0100 Subject: [petsc-users] VecGetSubVector In-Reply-To: <71F508EE-171E-4967-8143-96D515BF1759@mcs.anl.gov> References: <76A8F83B-2110-4056-98EF-AC998FA9EE1C@mcs.anl.gov> <71F508EE-171E-4967-8143-96D515BF1759@mcs.anl.gov> Message-ID: Barry, I'm afraid I didn't explain myself well in my first question. If I use PCFIELDSPLIT with PCFieldSplitSetType(pc,PC_COMPOSITE_ADDITIVE), is it the same as using PCASM? Concerning the MATSHELL to be used with PCFIELDSPLIT, is there an example from where to start from? 
I guess I'm one of the people for which it is non-trivial :-) Thanks Gianluca Il giorno 13/gen/2011, alle ore 20.14, Barry Smith ha scritto: > > On Jan 13, 2011, at 12:00 PM, Gianluca Meneghello wrote: > >> Dear Barry and Jed, >> >> thanks again for your answers. I'm at the moment trying to understand more about how ASM and FieldSplit works. I've started reading Barry's book "Domain Decomposition, Parallel Multilevel Methods for Elliptic Partial Differential Equations". I hope it's the right starting point. >> >> Please let me know if you have further suggested readings, taking into account that I know nothing on domain decomposition in general --- but something in multigrid. >> >> Let me ask you a couple of questions for the moment: >> Is PCFIELDSPLIT additive the same as PCASM (I guess no as it does not use overlap)? > > It can be additive or multiplicative depending on what you set with PCFieldSplitSetType() > >> Or is it a Substructuring Method (I haven't yet arrived to that chapter of the book!). > > No, nothing to do with substructuring. > >> What's the difference between multiplicative and symmetric-multiplicative for PCFIELDSPLIT? Does symmetric-multiplicative refers to eq 1.15 of "Domain Decomposition"? > > Yes > >> >> Is it possible use ASM and/or FieldSplit with a matrix-free method? > > Yes, BUT the algorithms are coded around MatGetSubMatrix() and or MatGetSubMatrices() so to do matrix free you need to have code that applies "part" of the operator at a time (that is you cannot just have a matrix vector product that applies the entire operator to the entire vector. Once you have the ability to apply "part" of the operator at a time you need to code up a MATSHELL that responds appropriately to MatGetSubMatrix() and or MatGetSubMatrices() and returns new matrix-free shell matrices that apply only "their" part of the operator. This is non-trivial for many people but possible. > > Barry > > >> >> Thanks >> >> Gianluca >> >> Il giorno 06/gen/2011, alle ore 21.55, Barry Smith ha scritto: >> >>> >>> On Jan 6, 2011, at 6:01 AM, Gianluca Meneghello wrote: >>> >>>> Dear Barry, >>>> >>>> thanks a lot for your answer. >>>> >>>> I tried to do some experiments with MatGetSubMatrix, but I guess I'm >>>> doing something wrong as it looks like being painfully slow. >>> >>> Hmm, we've always found the getsubmatrix takes a few percent of the time. Perhaps you are calling it repeatedly for the each domain, rather than once and reusing it? Also use MatGetSubMatrices() and get all the submatrices in one call rather than one at a time. >>> >>>> >>>> I changed approach and now I'm using the ASM preconditioner. What I'm >>>> actually trying to do is to split the domain in different parts --- >>>> like interior and boundaries --- and relax (solve) each one with a >>>> different smoother (solver). In your opinion, is this the right >>>> approach? >>> >>> Worth trying since it is easy. You can experiment with different smoothers on the subdomains using the -sub_pc_type etc options and set different prefixes for different subdomains. >>> >>>> So far it looks much faster than my previous approach of >>>> extracting each submatrix. >>> >>> ASM just uses MatGetSubMatrices() so shouldn't be faster or slower than a custom code that does the same thing. >>> >>>> >>>> Also, please let me ask you one more thing. 
When using ASM with >>>> different subdomains on the same process, is the order in which the >>>> domains are solved the same as the one in which they are stored in the >>>> IS array passed to PCASMSetLocalSubdomains()? I would be interested in >>>> controlling this in order to build a downstream marching smoother. >>> >>> It is only additive, there is no order as Jed noted. Doing multiplicative in general is tricky because you want to just update the parts of the residual that need to be updated. >>> >>>> >>>> Looking at the references, I've noticed you have worked on multigrid. >>>> What I'm trying to do is close to what is described in Diskin, Thomas, >>>> Mineck, "Textbook Multigrid Efficiency for Leading Edge Stagnation", >>>> in case you already know the paper. >>>> http://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/20040081104_2004082284.pdf >>>> >>>> Again, thanks a lot. >>>> >>>> Gianluca >>>> >>>> On 3 January 2011 17:43, Barry Smith wrote: >>>>> >>>>> Gianluca, >>>>> >>>>> The expected use is with the VecScatter object. First you create a VecScatter object with VecScatterCreate() then each time you need the "subvector" you call VecScatterBegin() followed by VecScatterEnd() Note that usually the VecScatter object is retained and used many times. >>>>> >>>>> Barry >>>>> >>>>> >>>>> >>>>> On Jan 3, 2011, at 5:22 AM, Gianluca Meneghello wrote: >>>>> >>>>>> Hi, >>>>>> >>>>>> I'm new to PETSc, so that this can be a very simple question: >>>>>> >>>>>> I'm looking for something like VecGetSubVector, which I've seen it >>>>>> exists in the dev version but not in the released one. >>>>>> >>>>>> I need to write a smoother for a multigrid algorithm (something like a >>>>>> block Gauss Seidel) which can be written in matlab as >>>>>> >>>>>> for j = 1:ny >>>>>> P = ; >>>>>> du(P) = L(P,P) \ ( rhs(P) - L(P,:)*du + D2(P,P)*du(P) ); >>>>>> end >>>>>> >>>>>> where L is a matrix (in my case the linearized Navier Stokes). >>>>>> >>>>>> I was thinking about using IS for declaring P, so that D2(P,P) can be >>>>>> obtained using MatGetSubMatrix. I would need the same for the vector >>>>>> du. >>>>>> >>>>>> Is there a way to do that without using the developer version? (I >>>>>> really don't feel like being "experienced with building, using and >>>>>> debugging PETSc). >>>>>> >>>>>> Thanks in advance >>>>>> >>>>>> Gianluca >>>>> >>>>> >>> >> > From bsmith at mcs.anl.gov Thu Jan 13 15:16:44 2011 From: bsmith at mcs.anl.gov (Barry Smith) Date: Thu, 13 Jan 2011 15:16:44 -0600 Subject: [petsc-users] VecGetSubVector In-Reply-To: References: <76A8F83B-2110-4056-98EF-AC998FA9EE1C@mcs.anl.gov> <71F508EE-171E-4967-8143-96D515BF1759@mcs.anl.gov> Message-ID: On Jan 13, 2011, at 2:09 PM, Gianluca Meneghello wrote: > Barry, > > I'm afraid I didn't explain myself well in my first question. > > If I use PCFIELDSPLIT with PCFieldSplitSetType(pc,PC_COMPOSITE_ADDITIVE), is it the same as using PCASM? Same in what sense? It solves a bunch of subproblems independently and adds together all the solutions. There can be overlapping in the fields or not depending how what you choose. The decomposition in ASM is by "geometry" while the decomposition in the PCFIELDSPLIT is between different "fields" or "types of variables". So yes they have many similarities. > > Concerning the MATSHELL to be used with PCFIELDSPLIT, is there an example from where to start from? I guess I'm one of the people for which it is non-trivial :-) Not really. 
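As a rough starting point only (this is not from the thread or from the PETSc examples; MyCtx, MyMatMult and CreateMyShell are made-up names), the basic MATSHELL machinery looks something like this:

#include "petscmat.h"

typedef struct {
  /* whatever data the matrix-free operator needs */
  PetscInt n;
} MyCtx;

/* Applies the operator matrix-free: y = A x */
PetscErrorCode MyMatMult(Mat A, Vec x, Vec y)
{
  MyCtx          *ctx;
  PetscErrorCode ierr;

  PetscFunctionBegin;
  ierr = MatShellGetContext(A, (void**)&ctx);CHKERRQ(ierr);
  /* ... compute y from x using ctx; VecCopy is only a placeholder action ... */
  ierr = VecCopy(x, y);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

PetscErrorCode CreateMyShell(MPI_Comm comm, PetscInt nlocal, PetscInt N, MyCtx *ctx, Mat *A)
{
  PetscErrorCode ierr;

  PetscFunctionBegin;
  ierr = MatCreateShell(comm, nlocal, nlocal, N, N, ctx, A);CHKERRQ(ierr);
  ierr = MatShellSetOperation(*A, MATOP_MULT, (void(*)(void))MyMatMult);CHKERRQ(ierr);
  /* For use inside PCFIELDSPLIT or PCASM one would additionally have to provide
     the get-submatrix operations (MATOP_GET_SUBMATRIX / MATOP_GET_SUBMATRICES)
     so that each split can obtain its own, matrix-free, piece of the operator,
     as discussed earlier in this thread */
  PetscFunctionReturn(0);
}

Each split would then need its own shell matrix of this kind, returned through the get-submatrix operation, so that the preconditioner only ever applies "its" part of the operator.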
Barry > > Thanks > > Gianluca > > Il giorno 13/gen/2011, alle ore 20.14, Barry Smith ha scritto: > >> >> On Jan 13, 2011, at 12:00 PM, Gianluca Meneghello wrote: >> >>> Dear Barry and Jed, >>> >>> thanks again for your answers. I'm at the moment trying to understand more about how ASM and FieldSplit works. I've started reading Barry's book "Domain Decomposition, Parallel Multilevel Methods for Elliptic Partial Differential Equations". I hope it's the right starting point. >>> >>> Please let me know if you have further suggested readings, taking into account that I know nothing on domain decomposition in general --- but something in multigrid. >>> >>> Let me ask you a couple of questions for the moment: >>> Is PCFIELDSPLIT additive the same as PCASM (I guess no as it does not use overlap)? >> >> It can be additive or multiplicative depending on what you set with PCFieldSplitSetType() >> >>> Or is it a Substructuring Method (I haven't yet arrived to that chapter of the book!). >> >> No, nothing to do with substructuring. >> >>> What's the difference between multiplicative and symmetric-multiplicative for PCFIELDSPLIT? Does symmetric-multiplicative refers to eq 1.15 of "Domain Decomposition"? >> >> Yes >> >>> >>> Is it possible use ASM and/or FieldSplit with a matrix-free method? >> >> Yes, BUT the algorithms are coded around MatGetSubMatrix() and or MatGetSubMatrices() so to do matrix free you need to have code that applies "part" of the operator at a time (that is you cannot just have a matrix vector product that applies the entire operator to the entire vector. Once you have the ability to apply "part" of the operator at a time you need to code up a MATSHELL that responds appropriately to MatGetSubMatrix() and or MatGetSubMatrices() and returns new matrix-free shell matrices that apply only "their" part of the operator. This is non-trivial for many people but possible. >> >> Barry >> >> >>> >>> Thanks >>> >>> Gianluca >>> >>> Il giorno 06/gen/2011, alle ore 21.55, Barry Smith ha scritto: >>> >>>> >>>> On Jan 6, 2011, at 6:01 AM, Gianluca Meneghello wrote: >>>> >>>>> Dear Barry, >>>>> >>>>> thanks a lot for your answer. >>>>> >>>>> I tried to do some experiments with MatGetSubMatrix, but I guess I'm >>>>> doing something wrong as it looks like being painfully slow. >>>> >>>> Hmm, we've always found the getsubmatrix takes a few percent of the time. Perhaps you are calling it repeatedly for the each domain, rather than once and reusing it? Also use MatGetSubMatrices() and get all the submatrices in one call rather than one at a time. >>>> >>>>> >>>>> I changed approach and now I'm using the ASM preconditioner. What I'm >>>>> actually trying to do is to split the domain in different parts --- >>>>> like interior and boundaries --- and relax (solve) each one with a >>>>> different smoother (solver). In your opinion, is this the right >>>>> approach? >>>> >>>> Worth trying since it is easy. You can experiment with different smoothers on the subdomains using the -sub_pc_type etc options and set different prefixes for different subdomains. >>>> >>>>> So far it looks much faster than my previous approach of >>>>> extracting each submatrix. >>>> >>>> ASM just uses MatGetSubMatrices() so shouldn't be faster or slower than a custom code that does the same thing. >>>> >>>>> >>>>> Also, please let me ask you one more thing. 
When using ASM with >>>>> different subdomains on the same process, is the order in which the >>>>> domains are solved the same as the one in which they are stored in the >>>>> IS array passed to PCASMSetLocalSubdomains()? I would be interested in >>>>> controlling this in order to build a downstream marching smoother. >>>> >>>> It is only additive, there is no order as Jed noted. Doing multiplicative in general is tricky because you want to just update the parts of the residual that need to be updated. >>>> >>>>> >>>>> Looking at the references, I've noticed you have worked on multigrid. >>>>> What I'm trying to do is close to what is described in Diskin, Thomas, >>>>> Mineck, "Textbook Multigrid Efficiency for Leading Edge Stagnation", >>>>> in case you already know the paper. >>>>> http://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/20040081104_2004082284.pdf >>>>> >>>>> Again, thanks a lot. >>>>> >>>>> Gianluca >>>>> >>>>> On 3 January 2011 17:43, Barry Smith wrote: >>>>>> >>>>>> Gianluca, >>>>>> >>>>>> The expected use is with the VecScatter object. First you create a VecScatter object with VecScatterCreate() then each time you need the "subvector" you call VecScatterBegin() followed by VecScatterEnd() Note that usually the VecScatter object is retained and used many times. >>>>>> >>>>>> Barry >>>>>> >>>>>> >>>>>> >>>>>> On Jan 3, 2011, at 5:22 AM, Gianluca Meneghello wrote: >>>>>> >>>>>>> Hi, >>>>>>> >>>>>>> I'm new to PETSc, so that this can be a very simple question: >>>>>>> >>>>>>> I'm looking for something like VecGetSubVector, which I've seen it >>>>>>> exists in the dev version but not in the released one. >>>>>>> >>>>>>> I need to write a smoother for a multigrid algorithm (something like a >>>>>>> block Gauss Seidel) which can be written in matlab as >>>>>>> >>>>>>> for j = 1:ny >>>>>>> P = ; >>>>>>> du(P) = L(P,P) \ ( rhs(P) - L(P,:)*du + D2(P,P)*du(P) ); >>>>>>> end >>>>>>> >>>>>>> where L is a matrix (in my case the linearized Navier Stokes). >>>>>>> >>>>>>> I was thinking about using IS for declaring P, so that D2(P,P) can be >>>>>>> obtained using MatGetSubMatrix. I would need the same for the vector >>>>>>> du. >>>>>>> >>>>>>> Is there a way to do that without using the developer version? (I >>>>>>> really don't feel like being "experienced with building, using and >>>>>>> debugging PETSc). >>>>>>> >>>>>>> Thanks in advance >>>>>>> >>>>>>> Gianluca >>>>>> >>>>>> >>>> >>> >> > From SJ_Ormiston at UManitoba.ca Fri Jan 14 14:14:59 2011 From: SJ_Ormiston at UManitoba.ca (Ormiston, Scott J.) Date: Fri, 14 Jan 2011 14:14:59 -0600 Subject: [petsc-users] Getting a copy of a PETSC vector into a user sequential vector Message-ID: <4D30AEC3.1050900@UManitoba.ca> We have been trying to use scatter to collect the entries of a PETSc solution vector into an ordinary sequential vector. We need this for our own post-processing and other calculations. We were able to get ex15f.F to create and solve a matrix. Then we added code from ex30f.F and we were able to get it to work. When we put the code into our own mainline, it fails with the run-time error message: 0: Subscript out of range for array xx_v (main-petsc.f95: 1649) subscript=261373973, lower bound=1, upper bound=1, dimension=1 -------------------------------------------------------------------------- mpiexec has exited due to process rank 0 with PID 23105 on node mecfd01 exiting without calling "finalize". This may have caused other processes in the application to be terminated by signals sent by mpiexec (as reported here). 
-------------------------------------------------------------------------- Any ideas of what might be wrong? We believe the link libraries are the same between the two make files, and we just inserted the working example 15/30 code into our existing, working mainline. For reference, an excerpt of the key code segments and the full relevent output are appended. Scott Ormiston =========================CODE EXCERPT=============================== !... REAL*8 sol(NILU) !NILU is the length of the solution vector ! #include "finclude/petscsys.h" #include "finclude/petscvec.h" #include "finclude/petscmat.h" #include "finclude/petscpc.h" #include "finclude/petscksp.h" #include "finclude/petscis.h" #define xx_a(ib) xx_v(xx_i + (ib)) PetscOffset xx_i PetscScalar xx_v(1) #include "finclude/petscvec.h90" PetscScalar, pointer :: xp_v(:) Vec seqV VecScatter scat1 IS fromis, tois PetscInt from(12), to(12) Vec Vx,Vb,Vu Mat A PC pc KSP ksp PetscScalar PSv,one,neg_one double precision norm,tol PetscErrorCode ierr PetscInt PETSCi,PETSCj,II,JJ,Istart PetscInt Iend,m,n,i1,its,five PetscMPIInt rank PetscTruth user_defined_pc,Pflg external SampleShellPCSetUp, SampleShellPCApply, SampleShellPCDestroy ! Common block to store data for user-provided preconditioner common /myshellpc/ diag Vec diag ! call PetscInitialize(PETSC_NULL_CHARACTER,ierr) ! ex15f.F code ... ! call KSPSolve(ksp,Vb,Vx,ierr) ! !####################################################### !-- put the solution from parallel vector vx into sol(*) do i=1,m*n from(i)=i-1 to(i) =i-1 end do call VecCreateSeq(PETSC_COMM_SELF,m*n,seqV,ierr) call VecSet(seqV,neg_one,ierr) if (rank.eq.0) then print *, "seqV OLD" call VecView(seqV,PETSC_VIEWER_STDOUT_SELF,ierr) end if call ISCreateGeneral(PETSC_COMM_SELF,m*n,from,fromis,ierr) call ISCreateGeneral(PETSC_COMM_SELF,m*n,to,tois,ierr) call VecScatterCreate(Vx,fromis,seqV,tois,scat1,ierr) call ISDestroy(fromis,ierr) call ISDestroy(tois,ierr) call VecScatterBegin(scat1,Vx,seqV,INSERT_VALUES,SCATTER_FORWARD,ierr) call VecScatterEnd(scat1,Vx,seqV,INSERT_VALUES,SCATTER_FORWARD,ierr) if (rank.eq.0) then print *, "seqV NEW" call VecView(seqV,PETSC_VIEWER_STDOUT_SELF,ierr) ! !------*.F version of VecGetArray call VecGetArray(seqV,xx_v,xx_i,ierr) write(6,*) 'after VecGetArray' write(6,*) 'n=',n, ' m=',m do PETSCi=1,m*n xx_a(PETSCi) = 100.0*PETSCi sol(i) = xx_a(i) + 1.55 sol(i) = xx_v( xx_i + (i)) end do call VecRestoreArray(seqV,xx_v,xx_i,ierr) ! !------*.f95 version of VecGetArray call VecGetArrayF90(seqV,xp_v,ierr) print*,ierr do i=1,m*n sol(i)= xp_v(i) - 5.0 xp_v(i) = 100.0*i end do call VecRestoreArrayF90(seqV,xp_v,ierr) print*,'the f95 version of test solution vector is:' do i=1,m*n print*,sol(i) end do end if !... ! Free work space. All PETSc objects should be destroyed when they ! are no longer needed. call KSPDestroy(ksp,ierr) call VecDestroy(Vu,ierr) call VecDestroy(Vx,ierr) call VecDestroy(Vb,ierr) call MatDestroy(A,ierr) ! Always call PetscFinalize() before exiting a program. ! 
call PetscFinalize(ierr) =========================OUTPUT===================================== KSP Object: type: gmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=1e-07, absolute=1e-50, divergence=10000 left preconditioning using PRECONDITIONED norm type for convergence test PC Object: type: jacobi linear system matrix = precond matrix: Matrix Object: type=seqaij, rows=12, cols=12 total: nonzeros=46, allocated nonzeros=60 not using I-node routines seqV OLD -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 seqV NEW 1 1 1 1 1 1 1 1 1 1 1 1 after VecGetArray n= 4 m= 3 0: Subscript out of range for array xx_v (main-petsc.f95: 1649) subscript=261373973, lower bound=1, upper bound=1, dimension=1 -------------------------------------------------------------------------- mpiexec has exited due to process rank 0 with PID 23105 on node mecfd01 exiting without calling "finalize". This may have caused other processes in the application to be terminated by signals sent by mpiexec (as reported here). -------------------------------------------------------------------------- -------------- next part -------------- A non-text attachment was scrubbed... Name: SJ_Ormiston.vcf Type: text/x-vcard Size: 321 bytes Desc: not available URL: From 5202222ghj at 163.com Fri Jan 14 14:27:00 2011 From: 5202222ghj at 163.com (yuxuan) Date: Sat, 15 Jan 2011 04:27:00 +0800 (CST) Subject: [petsc-users] unstructured parallel Message-ID: <2ae1baa6.250.12d8635add0.Coremail.5202222ghj@163.com> I have implemented PETSc for my unstructured FVM problem by external package Sundials, I want to move to a parallel version. I just read the examplehttp://www.mcs.anl.gov/petsc/petsc-2/snapshots/petsc-current/src/dm/ao/examples/tutorials/ex2.c.html I want know if my understanding is right: DA is only for structured grid; There is not too much difference betweenhttp://www.mcs.anl.gov/petsc/petsc-2/snapshots/petsc-current/src/ts/examples/tutorials/ex3.c.html andhttp://www.mcs.anl.gov/petsc/petsc-2/snapshots/petsc-current/src/ts/examples/tutorials/ex4.c.html, because DA can manage most parallel related staff. At last is there any example I can learn to get a parallel version of my problem. Thanks! -------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Fri Jan 14 16:13:44 2011 From: balay at mcs.anl.gov (Satish Balay) Date: Fri, 14 Jan 2011 16:13:44 -0600 (CST) Subject: [petsc-users] Getting a copy of a PETSC vector into a user sequential vector In-Reply-To: <4D30AEC3.1050900@UManitoba.ca> References: <4D30AEC3.1050900@UManitoba.ca> Message-ID: VecGetArray() is defined to always look out of bounds - so if you use a compiler flag to check for out of bound error - you will always get this error. Alternative is to use VecGetArrayF90() - and to use this correctly - you should include petscvec.h90 Satish On Fri, 14 Jan 2011, Ormiston, Scott J. wrote: > We have been trying to use scatter to collect the entries of a PETSc solution > vector into an ordinary sequential vector. We need this for our own > post-processing and other calculations. > > We were able to get ex15f.F to create and solve a matrix. Then we added code > from ex30f.F and we were able to get it to work. 
When we put the code into > our own mainline, it fails with the run-time error message: > > 0: Subscript out of range for array xx_v (main-petsc.f95: 1649) > subscript=261373973, lower bound=1, upper bound=1, dimension=1 > -------------------------------------------------------------------------- > mpiexec has exited due to process rank 0 with PID 23105 on > node mecfd01 exiting without calling "finalize". This may > have caused other processes in the application to be > terminated by signals sent by mpiexec (as reported here). > -------------------------------------------------------------------------- > > Any ideas of what might be wrong? We believe the link libraries are the same > between the two make files, and we just inserted the working example 15/30 > code into our existing, working mainline. > > For reference, an excerpt of the key code segments and the full relevent > output are appended. > > Scott Ormiston > =========================CODE EXCERPT=============================== > !... > REAL*8 sol(NILU) !NILU is the length of the solution vector > ! > #include "finclude/petscsys.h" > #include "finclude/petscvec.h" > #include "finclude/petscmat.h" > #include "finclude/petscpc.h" > #include "finclude/petscksp.h" > #include "finclude/petscis.h" > > #define xx_a(ib) xx_v(xx_i + (ib)) > PetscOffset xx_i > PetscScalar xx_v(1) > > #include "finclude/petscvec.h90" > PetscScalar, pointer :: xp_v(:) > Vec seqV > VecScatter scat1 > IS fromis, tois > PetscInt from(12), to(12) > > Vec Vx,Vb,Vu > Mat A > PC pc > KSP ksp > PetscScalar PSv,one,neg_one > double precision norm,tol > PetscErrorCode ierr > PetscInt PETSCi,PETSCj,II,JJ,Istart > PetscInt Iend,m,n,i1,its,five > PetscMPIInt rank > PetscTruth user_defined_pc,Pflg > > external SampleShellPCSetUp, SampleShellPCApply, SampleShellPCDestroy > > ! Common block to store data for user-provided preconditioner > common /myshellpc/ diag > Vec diag > ! > call PetscInitialize(PETSC_NULL_CHARACTER,ierr) > > ! ex15f.F code ... > ! > call KSPSolve(ksp,Vb,Vx,ierr) > ! > !####################################################### > !-- put the solution from parallel vector vx into sol(*) > do i=1,m*n > from(i)=i-1 > to(i) =i-1 > end do > > call VecCreateSeq(PETSC_COMM_SELF,m*n,seqV,ierr) > call VecSet(seqV,neg_one,ierr) > if (rank.eq.0) then > print *, "seqV OLD" > call VecView(seqV,PETSC_VIEWER_STDOUT_SELF,ierr) > end if > > call ISCreateGeneral(PETSC_COMM_SELF,m*n,from,fromis,ierr) > call ISCreateGeneral(PETSC_COMM_SELF,m*n,to,tois,ierr) > > call VecScatterCreate(Vx,fromis,seqV,tois,scat1,ierr) > > call ISDestroy(fromis,ierr) > call ISDestroy(tois,ierr) > > call VecScatterBegin(scat1,Vx,seqV,INSERT_VALUES,SCATTER_FORWARD,ierr) > call VecScatterEnd(scat1,Vx,seqV,INSERT_VALUES,SCATTER_FORWARD,ierr) > > if (rank.eq.0) then > print *, "seqV NEW" > call VecView(seqV,PETSC_VIEWER_STDOUT_SELF,ierr) > ! > !------*.F version of VecGetArray > call VecGetArray(seqV,xx_v,xx_i,ierr) > write(6,*) 'after VecGetArray' > write(6,*) 'n=',n, ' m=',m > do PETSCi=1,m*n > xx_a(PETSCi) = 100.0*PETSCi > sol(i) = xx_a(i) + 1.55 > sol(i) = xx_v( xx_i + (i)) > end do > call VecRestoreArray(seqV,xx_v,xx_i,ierr) > ! > !------*.f95 version of VecGetArray > call VecGetArrayF90(seqV,xp_v,ierr) > print*,ierr > do i=1,m*n > sol(i)= xp_v(i) - 5.0 > xp_v(i) = 100.0*i > end do > call VecRestoreArrayF90(seqV,xp_v,ierr) > print*,'the f95 version of test solution vector is:' > do i=1,m*n > print*,sol(i) > end do > end if > !... > ! Free work space. 
All PETSc objects should be destroyed when they > ! are no longer needed. > > call KSPDestroy(ksp,ierr) > call VecDestroy(Vu,ierr) > call VecDestroy(Vx,ierr) > call VecDestroy(Vb,ierr) > call MatDestroy(A,ierr) > > ! Always call PetscFinalize() before exiting a program. > ! > call PetscFinalize(ierr) > > =========================OUTPUT===================================== > KSP Object: > type: gmres > GMRES: restart=30, using Classical (unmodified) Gram-Schmidt > Orthogonalization with no iterative refinement > GMRES: happy breakdown tolerance 1e-30 > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-07, absolute=1e-50, divergence=10000 > left preconditioning > using PRECONDITIONED norm type for convergence test > PC Object: > type: jacobi > linear system matrix = precond matrix: > Matrix Object: > type=seqaij, rows=12, cols=12 > total: nonzeros=46, allocated nonzeros=60 > not using I-node routines > seqV OLD > -1 > -1 > -1 > -1 > -1 > -1 > -1 > -1 > -1 > -1 > -1 > -1 > seqV NEW > 1 > 1 > 1 > 1 > 1 > 1 > 1 > 1 > 1 > 1 > 1 > 1 > after VecGetArray > n= 4 m= 3 > 0: Subscript out of range for array xx_v (main-petsc.f95: 1649) > subscript=261373973, lower bound=1, upper bound=1, dimension=1 > -------------------------------------------------------------------------- > mpiexec has exited due to process rank 0 with PID 23105 on > node mecfd01 exiting without calling "finalize". This may > have caused other processes in the application to be > terminated by signals sent by mpiexec (as reported here). > -------------------------------------------------------------------------- > > From hzhang at mcs.anl.gov Fri Jan 14 16:30:36 2011 From: hzhang at mcs.anl.gov (Hong Zhang) Date: Fri, 14 Jan 2011 16:30:36 -0600 Subject: [petsc-users] unstructured parallel In-Reply-To: <2ae1baa6.250.12d8635add0.Coremail.5202222ghj@163.com> References: <2ae1baa6.250.12d8635add0.Coremail.5202222ghj@163.com> Message-ID: > I have implemented PETSc for my unstructured FVM problem by external package > Sundials, I want to move to a parallel version. I just read the example > http://www.mcs.anl.gov/petsc/petsc-2/snapshots/petsc-current/src/dm/ao/examples/tutorials/ex2.c.html > > I want know if my understanding is right: > > DA is only for structured grid; Yes, only for logically structured grids. > > There is not too much difference between > http://www.mcs.anl.gov/petsc/petsc-2/snapshots/petsc-current/src/ts/examples/tutorials/ex3.c.html > and > http://www.mcs.anl.gov/petsc/petsc-2/snapshots/petsc-current/src/ts/examples/tutorials/ex4.c.html, > because DA can manage most parallel related staff. They solve same problem, but illustrate different TS solvers and approaches. Although ex4.c uses DA in main() to create vectors and is intended to illustrate a non-linear TS solver, its function evaluation does use DA. ex4.c is not a good example for using DA. Almost all petsc objects and solvers are parallel, which manage parallel data distribution and implementation. DA is one of objects. Hong > > At last is there any example I can learn to get a parallel version of my > problem. > > > > Thanks! > > From hzhang at mcs.anl.gov Fri Jan 14 16:32:18 2011 From: hzhang at mcs.anl.gov (Hong Zhang) Date: Fri, 14 Jan 2011 16:32:18 -0600 Subject: [petsc-users] unstructured parallel In-Reply-To: References: <2ae1baa6.250.12d8635add0.Coremail.5202222ghj@163.com> Message-ID: Sorry, I mean > They solve same problem, but illustrate different TS solvers and approaches. 
> Although ex4.c uses DA in main() to create vectors and is intended to > illustrate a non-linear TS solver, > its function evaluation does use DA. ex4.c is not a good example for using DA. dose NOT use DA. Hong > > Almost all petsc objects and solvers are parallel, which manage > parallel data distribution and implementation. > DA is one of objects. > > Hong > >> >> At last is there any example I can learn to get a parallel version of my >> problem. >> >> >> >> Thanks! >> >> > From gianmail at gmail.com Sat Jan 15 08:23:19 2011 From: gianmail at gmail.com (Gianluca Meneghello) Date: Sat, 15 Jan 2011 15:23:19 +0100 Subject: [petsc-users] VecGetSubVector In-Reply-To: References: <76A8F83B-2110-4056-98EF-AC998FA9EE1C@mcs.anl.gov> <71F508EE-171E-4967-8143-96D515BF1759@mcs.anl.gov> Message-ID: Dear Barry In order to use matrix-free methods with PCFIELDSPLIT, is it correct to start from MatShellSetOperation(mat,MATOP_MATGETSUBMATRIX, (void(*)(void)) PetscErrorCode (*UserMatGetSubMatrix)(......)); or does it require to dive deeper inside PETSc code? Thanks Gianluca On 13 January 2011 22:16, Barry Smith wrote: > > On Jan 13, 2011, at 2:09 PM, Gianluca Meneghello wrote: > >> Barry, >> >> I'm afraid I didn't explain myself well in my first question. >> >> If I use PCFIELDSPLIT with PCFieldSplitSetType(pc,PC_COMPOSITE_ADDITIVE), is it the same as using PCASM? > > ? Same in what sense? It solves a bunch of subproblems independently and adds together all the solutions. There can be overlapping in the fields or not depending how what you choose. The decomposition in ASM is by "geometry" while the decomposition in the PCFIELDSPLIT is between different "fields" or "types of variables". So yes they have many similarities. > > >> >> Concerning the MATSHELL to be used with PCFIELDSPLIT, is there an example from where to start from? I ?guess I'm one of the people for which it is non-trivial :-) > > ?Not really. > > ? Barry > >> >> Thanks >> >> Gianluca >> >> Il giorno 13/gen/2011, alle ore 20.14, Barry Smith ha scritto: >> >>> >>> On Jan 13, 2011, at 12:00 PM, Gianluca Meneghello wrote: >>> >>>> Dear Barry and Jed, >>>> >>>> thanks again for your answers. I'm at the moment trying to understand more about how ASM and FieldSplit works. I've started reading Barry's book "Domain Decomposition, Parallel Multilevel Methods for Elliptic Partial Differential Equations". I hope it's the right starting point. >>>> >>>> Please let me know if you have further suggested readings, taking into account that I know nothing on domain decomposition in general --- but something in multigrid. >>>> >>>> Let me ask you a couple of questions for the moment: >>>> Is PCFIELDSPLIT additive the same as PCASM (I guess no as it does not use overlap)? >>> >>> ?It can be additive or multiplicative depending on what you set with PCFieldSplitSetType() >>> >>>> Or is it a Substructuring Method (I haven't yet arrived to that chapter of the book!). >>> >>> ?No, nothing to do with substructuring. >>> >>>> What's the difference between multiplicative and symmetric-multiplicative for PCFIELDSPLIT? ?Does symmetric-multiplicative refers to eq 1.15 of "Domain Decomposition"? >>> >>> Yes >>> >>>> >>>> Is it possible use ASM and/or FieldSplit with a matrix-free method? 
>>> >>> ?Yes, BUT the algorithms are coded around MatGetSubMatrix() and or MatGetSubMatrices() so to do matrix free you need to have code that applies "part" of the operator at a time (that is you cannot just have a matrix vector product that applies the entire operator to the entire vector. Once you have the ability to apply "part" of the operator at a time you need to code up a MATSHELL that responds appropriately to MatGetSubMatrix() and or MatGetSubMatrices() and returns new matrix-free shell matrices that apply only "their" part of the operator. This is non-trivial for many people but possible. >>> >>> ?Barry >>> >>> >>>> >>>> Thanks >>>> >>>> Gianluca >>>> >>>> Il giorno 06/gen/2011, alle ore 21.55, Barry Smith ha scritto: >>>> >>>>> >>>>> On Jan 6, 2011, at 6:01 AM, Gianluca Meneghello wrote: >>>>> >>>>>> Dear Barry, >>>>>> >>>>>> thanks a lot for your answer. >>>>>> >>>>>> I tried to do some experiments with MatGetSubMatrix, but I guess I'm >>>>>> doing something wrong as it looks like being painfully slow. >>>>> >>>>> Hmm, we've always found the getsubmatrix takes a few percent of the time. Perhaps you are calling it repeatedly for the each domain, rather than once and reusing it? Also use MatGetSubMatrices() and get all the submatrices in one call rather than one at a time. >>>>> >>>>>> >>>>>> I changed approach and now I'm using the ASM preconditioner. What I'm >>>>>> actually trying to do is to split the domain in different parts --- >>>>>> like interior and boundaries --- and relax (solve) each one with a >>>>>> different smoother (solver). In your opinion, is this the right >>>>>> approach? >>>>> >>>>> Worth trying since it is easy. You can experiment with different smoothers on the subdomains using the -sub_pc_type etc options and set different prefixes for different subdomains. >>>>> >>>>>> So far it looks much faster than my previous approach of >>>>>> extracting each submatrix. >>>>> >>>>> ASM just uses MatGetSubMatrices() so shouldn't be faster or slower than a custom code that does the same thing. >>>>> >>>>>> >>>>>> Also, please let me ask you one more thing. When using ASM with >>>>>> different subdomains on the same process, is the order in which the >>>>>> domains are solved the same as the one in which they are stored in the >>>>>> IS array passed to PCASMSetLocalSubdomains()? I would be interested in >>>>>> controlling this in order to build a downstream marching smoother. >>>>> >>>>> It is only additive, there is no order as Jed noted. Doing multiplicative in general is tricky because you want to just update the parts of the residual that need to be updated. >>>>> >>>>>> >>>>>> Looking at the references, I've noticed you have worked on multigrid. >>>>>> What I'm trying to do is close to what is described in Diskin, Thomas, >>>>>> Mineck, "Textbook Multigrid Efficiency for Leading Edge Stagnation", >>>>>> in case you already know the paper. >>>>>> http://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/20040081104_2004082284.pdf >>>>>> >>>>>> Again, thanks a lot. >>>>>> >>>>>> Gianluca >>>>>> >>>>>> On 3 January 2011 17:43, Barry Smith wrote: >>>>>>> >>>>>>> Gianluca, >>>>>>> >>>>>>> The expected use is with the VecScatter object. First you create a VecScatter object with VecScatterCreate() then each time you need the "subvector" you call VecScatterBegin() followed by VecScatterEnd() Note that usually the VecScatter object is retained and used many times. 
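A minimal C sketch of the create-once, reuse-many-times VecScatter pattern described just above, assuming the petsc-3.1-era API used elsewhere in this thread; the helper name GatherToSequential and all variable names are illustrative only, and VecScatterCreateToAll() is used so that no index sets have to be assembled by hand.

#include "petscvec.h"

/* Gather a parallel PETSc vector onto every process as a sequential copy.
   The scatter is created once and can be reused for every new solve. */
PetscErrorCode GatherToSequential(Vec x)
{
  VecScatter     scatter;
  Vec            xseq;
  PetscScalar    *a;
  PetscInt       i,n;
  PetscErrorCode ierr;

  ierr = VecScatterCreateToAll(x,&scatter,&xseq);CHKERRQ(ierr);

  /* these two calls can be repeated as often as needed with the same scatter */
  ierr = VecScatterBegin(scatter,x,xseq,INSERT_VALUES,SCATTER_FORWARD);CHKERRQ(ierr);
  ierr = VecScatterEnd(scatter,x,xseq,INSERT_VALUES,SCATTER_FORWARD);CHKERRQ(ierr);

  ierr = VecGetSize(xseq,&n);CHKERRQ(ierr);
  ierr = VecGetArray(xseq,&a);CHKERRQ(ierr);
  for (i=0; i<n; i++) {
    /* a[i] now holds entry i of the global vector; copy it into user storage here */
  }
  ierr = VecRestoreArray(xseq,&a);CHKERRQ(ierr);

  /* petsc-3.1 destroy calls take the object itself; newer releases take a pointer */
  ierr = VecScatterDestroy(scatter);CHKERRQ(ierr);
  ierr = VecDestroy(xseq);CHKERRQ(ierr);
  return 0;
}

If only rank 0 needs the values, for example to write a post-processing file, VecScatterCreateToZero() can be used in place of VecScatterCreateToAll() with the rest of the sketch unchanged.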
>>>>>>> >>>>>>> Barry >>>>>>> >>>>>>> >>>>>>> >>>>>>> On Jan 3, 2011, at 5:22 AM, Gianluca Meneghello wrote: >>>>>>> >>>>>>>> Hi, >>>>>>>> >>>>>>>> I'm new to PETSc, so that this can be a very simple question: >>>>>>>> >>>>>>>> I'm looking for something like VecGetSubVector, which I've seen it >>>>>>>> exists in the dev version but not in the released one. >>>>>>>> >>>>>>>> I need to write a smoother for a multigrid algorithm (something like a >>>>>>>> block Gauss Seidel) which can be written in matlab as >>>>>>>> >>>>>>>> for j = 1:ny >>>>>>>> P = ; >>>>>>>> du(P) = L(P,P) \ ( ?rhs(P) - L(P,:)*du + D2(P,P)*du(P) ); >>>>>>>> end >>>>>>>> >>>>>>>> where L is a matrix (in my case the linearized Navier Stokes). >>>>>>>> >>>>>>>> I was thinking about using IS for declaring P, so that D2(P,P) can be >>>>>>>> obtained using MatGetSubMatrix. I would need the same for the vector >>>>>>>> du. >>>>>>>> >>>>>>>> Is there a way to do that without using the developer version? (I >>>>>>>> really don't feel like being "experienced with building, using and >>>>>>>> debugging PETSc). >>>>>>>> >>>>>>>> Thanks in advance >>>>>>>> >>>>>>>> Gianluca >>>>>>> >>>>>>> >>>>> >>>> >>> >> > > From knepley at gmail.com Sat Jan 15 09:43:07 2011 From: knepley at gmail.com (Matthew Knepley) Date: Sat, 15 Jan 2011 09:43:07 -0600 Subject: [petsc-users] VecGetSubVector In-Reply-To: References: <76A8F83B-2110-4056-98EF-AC998FA9EE1C@mcs.anl.gov> <71F508EE-171E-4967-8143-96D515BF1759@mcs.anl.gov> Message-ID: On Sat, Jan 15, 2011 at 8:23 AM, Gianluca Meneghello wrote: > Dear Barry > > In order to use matrix-free methods with PCFIELDSPLIT, is it correct > to start from > > MatShellSetOperation(mat,MATOP_MATGETSUBMATRIX, (void(*)(void)) > PetscErrorCode (*UserMatGetSubMatrix)(......)); > > or does it require to dive deeper inside PETSc code? > That is fine. You must return a Mat object, probably a MatShell in your case, which applys the requested block of the operator. Matt > Thanks > > Gianluca > > On 13 January 2011 22:16, Barry Smith wrote: > > > > On Jan 13, 2011, at 2:09 PM, Gianluca Meneghello wrote: > > > >> Barry, > >> > >> I'm afraid I didn't explain myself well in my first question. > >> > >> If I use PCFIELDSPLIT with > PCFieldSplitSetType(pc,PC_COMPOSITE_ADDITIVE), is it the same as using > PCASM? > > > > Same in what sense? It solves a bunch of subproblems independently and > adds together all the solutions. There can be overlapping in the fields or > not depending how what you choose. The decomposition in ASM is by "geometry" > while the decomposition in the PCFIELDSPLIT is between different "fields" or > "types of variables". So yes they have many similarities. > > > > > >> > >> Concerning the MATSHELL to be used with PCFIELDSPLIT, is there an > example from where to start from? I guess I'm one of the people for which > it is non-trivial :-) > > > > Not really. > > > > Barry > > > >> > >> Thanks > >> > >> Gianluca > >> > >> Il giorno 13/gen/2011, alle ore 20.14, Barry Smith ha scritto: > >> > >>> > >>> On Jan 13, 2011, at 12:00 PM, Gianluca Meneghello wrote: > >>> > >>>> Dear Barry and Jed, > >>>> > >>>> thanks again for your answers. I'm at the moment trying to understand > more about how ASM and FieldSplit works. I've started reading Barry's book > "Domain Decomposition, Parallel Multilevel Methods for Elliptic Partial > Differential Equations". I hope it's the right starting point. 
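A rough sketch of the kind of shell sub-matrix Matt describes above for matrix-free PCFIELDSPLIT: the parent matrix-free operator hands back another MATSHELL whose MATOP_MULT applies only the requested block. This is only an outline under several assumptions: BlockCtx, BlockMult() and CreateBlockShell() are hypothetical names, and the MatOperation name and callback signature for the get-submatrix hook itself differ between PETSc releases (MATOP_GET_SUBMATRIX in 3.1-era headers, MATOP_CREATE_SUBMATRIX much later), so the exact registration should be checked against petscmat.h for the installed version.

#include "petscmat.h"

typedef struct {
  PetscInt blockid;   /* whatever is needed to select one block of the full operator (hypothetical) */
} BlockCtx;

/* Applies y = A_block * x for one block of the operator. */
PetscErrorCode BlockMult(Mat B,Vec x,Vec y)
{
  BlockCtx       *ctx;
  PetscErrorCode ierr;
  ierr = MatShellGetContext(B,(void**)&ctx);CHKERRQ(ierr);
  ierr = VecCopy(x,y);CHKERRQ(ierr);  /* placeholder: replace with the real action of block ctx->blockid */
  return 0;
}

/* Builds the matrix-free sub-matrix that the parent shell returns when
   PCFIELDSPLIT asks for a block; m,n are the local and M,N the global sizes
   of that block. */
PetscErrorCode CreateBlockShell(MPI_Comm comm,PetscInt m,PetscInt n,PetscInt M,PetscInt N,BlockCtx *ctx,Mat *B)
{
  PetscErrorCode ierr;
  ierr = MatCreateShell(comm,m,n,M,N,ctx,B);CHKERRQ(ierr);
  ierr = MatShellSetOperation(*B,MATOP_MULT,(void(*)(void))BlockMult);CHKERRQ(ierr);
  return 0;
}

/* The parent shell must additionally register a get-submatrix operation that
   calls CreateBlockShell() for the index sets it is handed; once that is in
   place the per-field solvers can still be configured from the options
   database as usual. */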
> >>>> > >>>> Please let me know if you have further suggested readings, taking into > account that I know nothing on domain decomposition in general --- but > something in multigrid. > >>>> > >>>> Let me ask you a couple of questions for the moment: > >>>> Is PCFIELDSPLIT additive the same as PCASM (I guess no as it does not > use overlap)? > >>> > >>> It can be additive or multiplicative depending on what you set with > PCFieldSplitSetType() > >>> > >>>> Or is it a Substructuring Method (I haven't yet arrived to that > chapter of the book!). > >>> > >>> No, nothing to do with substructuring. > >>> > >>>> What's the difference between multiplicative and > symmetric-multiplicative for PCFIELDSPLIT? Does symmetric-multiplicative > refers to eq 1.15 of "Domain Decomposition"? > >>> > >>> Yes > >>> > >>>> > >>>> Is it possible use ASM and/or FieldSplit with a matrix-free method? > >>> > >>> Yes, BUT the algorithms are coded around MatGetSubMatrix() and or > MatGetSubMatrices() so to do matrix free you need to have code that applies > "part" of the operator at a time (that is you cannot just have a matrix > vector product that applies the entire operator to the entire vector. Once > you have the ability to apply "part" of the operator at a time you need to > code up a MATSHELL that responds appropriately to MatGetSubMatrix() and or > MatGetSubMatrices() and returns new matrix-free shell matrices that apply > only "their" part of the operator. This is non-trivial for many people but > possible. > >>> > >>> Barry > >>> > >>> > >>>> > >>>> Thanks > >>>> > >>>> Gianluca > >>>> > >>>> Il giorno 06/gen/2011, alle ore 21.55, Barry Smith ha scritto: > >>>> > >>>>> > >>>>> On Jan 6, 2011, at 6:01 AM, Gianluca Meneghello wrote: > >>>>> > >>>>>> Dear Barry, > >>>>>> > >>>>>> thanks a lot for your answer. > >>>>>> > >>>>>> I tried to do some experiments with MatGetSubMatrix, but I guess I'm > >>>>>> doing something wrong as it looks like being painfully slow. > >>>>> > >>>>> Hmm, we've always found the getsubmatrix takes a few percent of the > time. Perhaps you are calling it repeatedly for the each domain, rather than > once and reusing it? Also use MatGetSubMatrices() and get all the > submatrices in one call rather than one at a time. > >>>>> > >>>>>> > >>>>>> I changed approach and now I'm using the ASM preconditioner. What > I'm > >>>>>> actually trying to do is to split the domain in different parts --- > >>>>>> like interior and boundaries --- and relax (solve) each one with a > >>>>>> different smoother (solver). In your opinion, is this the right > >>>>>> approach? > >>>>> > >>>>> Worth trying since it is easy. You can experiment with different > smoothers on the subdomains using the -sub_pc_type etc options and set > different prefixes for different subdomains. > >>>>> > >>>>>> So far it looks much faster than my previous approach of > >>>>>> extracting each submatrix. > >>>>> > >>>>> ASM just uses MatGetSubMatrices() so shouldn't be faster or slower > than a custom code that does the same thing. > >>>>> > >>>>>> > >>>>>> Also, please let me ask you one more thing. When using ASM with > >>>>>> different subdomains on the same process, is the order in which the > >>>>>> domains are solved the same as the one in which they are stored in > the > >>>>>> IS array passed to PCASMSetLocalSubdomains()? I would be interested > in > >>>>>> controlling this in order to build a downstream marching smoother. > >>>>> > >>>>> It is only additive, there is no order as Jed noted. 
Doing > multiplicative in general is tricky because you want to just update the > parts of the residual that need to be updated. > >>>>> > >>>>>> > >>>>>> Looking at the references, I've noticed you have worked on > multigrid. > >>>>>> What I'm trying to do is close to what is described in Diskin, > Thomas, > >>>>>> Mineck, "Textbook Multigrid Efficiency for Leading Edge Stagnation", > >>>>>> in case you already know the paper. > >>>>>> > http://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/20040081104_2004082284.pdf > >>>>>> > >>>>>> Again, thanks a lot. > >>>>>> > >>>>>> Gianluca > >>>>>> > >>>>>> On 3 January 2011 17:43, Barry Smith wrote: > >>>>>>> > >>>>>>> Gianluca, > >>>>>>> > >>>>>>> The expected use is with the VecScatter object. First you create a > VecScatter object with VecScatterCreate() then each time you need the > "subvector" you call VecScatterBegin() followed by VecScatterEnd() Note that > usually the VecScatter object is retained and used many times. > >>>>>>> > >>>>>>> Barry > >>>>>>> > >>>>>>> > >>>>>>> > >>>>>>> On Jan 3, 2011, at 5:22 AM, Gianluca Meneghello wrote: > >>>>>>> > >>>>>>>> Hi, > >>>>>>>> > >>>>>>>> I'm new to PETSc, so that this can be a very simple question: > >>>>>>>> > >>>>>>>> I'm looking for something like VecGetSubVector, which I've seen it > >>>>>>>> exists in the dev version but not in the released one. > >>>>>>>> > >>>>>>>> I need to write a smoother for a multigrid algorithm (something > like a > >>>>>>>> block Gauss Seidel) which can be written in matlab as > >>>>>>>> > >>>>>>>> for j = 1:ny > >>>>>>>> P = ; > >>>>>>>> du(P) = L(P,P) \ ( rhs(P) - L(P,:)*du + D2(P,P)*du(P) ); > >>>>>>>> end > >>>>>>>> > >>>>>>>> where L is a matrix (in my case the linearized Navier Stokes). > >>>>>>>> > >>>>>>>> I was thinking about using IS for declaring P, so that D2(P,P) can > be > >>>>>>>> obtained using MatGetSubMatrix. I would need the same for the > vector > >>>>>>>> du. > >>>>>>>> > >>>>>>>> Is there a way to do that without using the developer version? (I > >>>>>>>> really don't feel like being "experienced with building, using and > >>>>>>>> debugging PETSc). > >>>>>>>> > >>>>>>>> Thanks in advance > >>>>>>>> > >>>>>>>> Gianluca > >>>>>>> > >>>>>>> > >>>>> > >>>> > >>> > >> > > > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at 59A2.org Sat Jan 15 13:46:40 2011 From: jed at 59A2.org (Jed Brown) Date: Sat, 15 Jan 2011 16:46:40 -0300 Subject: [petsc-users] VecGetSubVector In-Reply-To: References: <76A8F83B-2110-4056-98EF-AC998FA9EE1C@mcs.anl.gov> <71F508EE-171E-4967-8143-96D515BF1759@mcs.anl.gov> Message-ID: On Thu, Jan 13, 2011 at 18:16, Barry Smith wrote: > If I use PCFIELDSPLIT with PCFieldSplitSetType(pc,PC_COMPOSITE_ADDITIVE), > is it the same as using PCASM? > > Same in what sense? It solves a bunch of subproblems independently and > adds together all the solutions. There can be overlapping in the fields or > not depending how what you choose. The decomposition in ASM is by "geometry" > while the decomposition in the PCFIELDSPLIT is between different "fields" or > "types of variables". So yes they have many similarities. To put it differently, PCASM exposes as much concurrency as possible, making it efficient to use with spatially "local" subdomains. 
In contrast, PCFIELDSPLIT decomposes a multi-physics problem into sub-problems that are hopefully "better understood" so that efficient solvers are available. While you can make PCFIELDSPLIT use the same decomposition as PCASM uses, so that the algorithm is functionally equivalent, it does not expose the same concurrency, so would not scale in parallel. -------------- next part -------------- An HTML attachment was scrubbed... URL: From gaurish108 at gmail.com Sun Jan 16 15:21:15 2011 From: gaurish108 at gmail.com (Gaurish Telang) Date: Sun, 16 Jan 2011 16:21:15 -0500 Subject: [petsc-users] loading a matrix in PETSc : Problems Message-ID: Hi, Given matrix A and column vector b. I am trying to do a Least squares problem Ax=b on PETSc. A has dimenstion 2683x1274 I have A given to me in a text file in the form of three columns (Row, Column,NonZeroElement). E.g. If a matrix of dimension 2x3 such as 0 5 3 5 6 0 would be given in the text file as: 1 2 5 1 3 3 2 1 5 2 2 6 The third column just lists the non-zero elements and the corresponding first two columsn the position,. Now I am trying to input this textfile data into PETSc matrix A. But my matrix takes forever to load. The same code on small matrices of dimension say 4x3 works perfectly. I have pasted my PETSc code below. Any suggestions on improving it will be helpful. ***Explanation of my code****: For my code I start with two text files: N1.mat and b1.mat----> N1.mat containing the matrix A in (Row,Column,Nonzeroelement) format and b1.mat containing the right hand side b. Then I read in the three columns of N1.mat as a matrix B which I use to re-create the matrix A. The dimension of B is lenB x 3 where lenB is the number of lines in the text file containing B(39051 in my case). b1.mat gets read in directly as a the vector b. %========================================================================= %========================================================================= static char help[]="Reading in a matrix\n"; #include #include #include #include "petscvec.h" /* This enables us to use vectors. */ #include "petscmat.h" /* This enables us to use Matrices. It includes the petscvec header file*/ #include "petscksp.h" /* Now we can solve linear systems. Solvers used are KSP. */ int main(int argc, char **argv) { /* Declaration of Matrix A and some vectors x*/ Vec b; int tempi,tempj; Mat A; FILE *fp; MPI_Comm comm; PetscInt n=2683,m=1274,index,lenB=39051; PetscScalar scalar,rhs[n],B[lenB][3]; PetscErrorCode ierr; PetscInt i,j; comm = MPI_COMM_SELF; /* This part is needed for the help flag supplied at run-time*/ ierr = PetscInitialize(&argc,&argv,(char*)0,help);CHKERRQ(ierr); ierr = PetscOptionsGetInt(PETSC_NULL,"-n",&n,PETSC_NULL);CHKERRQ(ierr); /* Use options from the terminal to create a vector that is type shared or mpi. */ ierr = VecCreate(PETSC_COMM_WORLD,&b);CHKERRQ(ierr); /* Vector creation */ ierr = VecSetSizes(b,PETSC_DECIDE,n);CHKERRQ(ierr); /* Setting the vector size */ ierr = VecSetFromOptions(b);CHKERRQ(ierr); /* Setting the vector type (shared, mpi etc) */ /* The second argument is a PETSc scalar value.*/ ierr = VecSet(b,0);CHKERRQ(ierr); /* Reading in the RHS vector. 
*/ /* Reading in the matrix from the file matrix.txt */ fp=fopen("b1.mat","r"); if (fp==NULL) { fprintf(stderr, "Cannot open file"); exit(1); } for (i = 0; i < n; i++) { if (fscanf(fp,"%lf", &rhs[i]) != 1) { fprintf(stderr, "Failed to read rhs vector[%d]\n", i); exit(1); } } index=0; /*Putting x into final form */ for (i=0; i From a.mesgarnejad at gmail.com Sun Jan 16 15:47:03 2011 From: a.mesgarnejad at gmail.com (Ataollah Mesgarnejad) Date: Sun, 16 Jan 2011 15:47:03 -0600 Subject: [petsc-users] info file Message-ID: <586AF98A-1415-4259-BC0E-BA69123CE01D@gmail.com> Dear All, I use VecView with a binary PETSc viewer to output my data it always creates bunch of blank info files. It's not a big inconvenience but is there a way to write my data without creating these info files? Best, Ata M From confirmacao at peticaopublica.com Sun Jan 16 15:48:52 2011 From: confirmacao at peticaopublica.com (Peticao Publica) Date: 16 Jan 2011 16:48:52 -0500 Subject: [petsc-users] =?iso-8859-1?q?Billy_enviou-lhe_a_seguinte_Peti=E7?= =?iso-8859-1?q?=E3o=2E?= Message-ID: <6544F6DD-E7B7-43E8-A3E1-DD4F3D4F9688@mail.peticaopublica.com> An HTML attachment was scrubbed... URL: From knepley at gmail.com Sun Jan 16 15:51:39 2011 From: knepley at gmail.com (Matthew Knepley) Date: Sun, 16 Jan 2011 15:51:39 -0600 Subject: [petsc-users] loading a matrix in PETSc : Problems In-Reply-To: References: Message-ID: You need to preallocate the matrix memory. I would read through the file once, count the number of nonzeros in each row, preallocate the matrix, then read the file again to put in the data. Matt On Sun, Jan 16, 2011 at 3:21 PM, Gaurish Telang wrote: > Hi, > > > Given matrix A and column vector b. I am trying to do a Least squares > problem Ax=b on PETSc. > > A has dimenstion 2683x1274 > > I have A given to me in a text file in the form of three columns (Row, > Column,NonZeroElement). > > E.g. If a matrix of dimension 2x3 such as > > 0 5 3 > 5 6 0 > would be given in the text file as: > > 1 2 5 > 1 3 3 > 2 1 5 > 2 2 6 > > The third column just lists the non-zero elements and the corresponding > first two columsn the position,. > > Now I am trying to input this textfile data into PETSc matrix A. But my > matrix takes forever to load. > > The same code on small matrices of dimension say 4x3 works perfectly. > > I have pasted my PETSc code below. Any suggestions on improving it will be > helpful. > > ***Explanation of my code****: > > For my code I start with two text files: N1.mat and b1.mat----> > > N1.mat containing the matrix A in (Row,Column,Nonzeroelement) format and > b1.mat containing the right hand side b. > > Then I read in the three columns of N1.mat as a matrix B which I use to > re-create the matrix A. The dimension of B is lenB x 3 where lenB is the > number of lines in the text file containing B(39051 in my case). > b1.mat gets read in directly as a the vector b. > > %========================================================================= > %========================================================================= > static char help[]="Reading in a matrix\n"; > > #include > > #include > > #include > > #include "petscvec.h" /* This enables us to use vectors. */ > > #include "petscmat.h" /* This enables us to use Matrices. It includes the > petscvec header file*/ > > #include "petscksp.h" /* Now we can solve linear systems. Solvers used are > KSP. 
*/ > > int main(int argc, char **argv) > > { > > /* Declaration of Matrix A and some vectors x*/ > > Vec b; > > int tempi,tempj; > > Mat A; > > FILE *fp; > > MPI_Comm comm; > > PetscInt n=2683,m=1274,index,lenB=39051; > > PetscScalar scalar,rhs[n],B[lenB][3]; > > PetscErrorCode ierr; > > PetscInt i,j; > > comm = MPI_COMM_SELF; > > /* This part is needed for the help flag supplied at run-time*/ > > ierr = PetscInitialize(&argc,&argv,(char*)0,help);CHKERRQ(ierr); > > ierr = PetscOptionsGetInt(PETSC_NULL,"-n",&n,PETSC_NULL);CHKERRQ(ierr); > > /* Use options from the terminal to create a vector that is type shared or > mpi. */ > > ierr = VecCreate(PETSC_COMM_WORLD,&b);CHKERRQ(ierr); /* Vector creation */ > > ierr = VecSetSizes(b,PETSC_DECIDE,n);CHKERRQ(ierr); /* Setting the vector > size */ > > ierr = VecSetFromOptions(b);CHKERRQ(ierr); /* Setting the vector type > (shared, mpi etc) */ > > /* The second argument is a PETSc scalar value.*/ > > ierr = VecSet(b,0);CHKERRQ(ierr); > > /* Reading in the RHS vector. */ > > /* Reading in the matrix from the file matrix.txt */ > > fp=fopen("b1.mat","r"); > > if (fp==NULL) > > { > > fprintf(stderr, "Cannot open file"); > > exit(1); > > } > > for (i = 0; i < n; i++) > > { > > if (fscanf(fp,"%lf", &rhs[i]) != 1) > > { > > fprintf(stderr, "Failed to read rhs vector[%d]\n", i); > > exit(1); > > } > > } > > index=0; > > /*Putting x into final form */ > > for (i=0; i > { > > ierr= VecSetValues(b,1,&index,&rhs[i],INSERT_VALUES);CHKERRQ(ierr); /* One > insertion per step. Thats what the 1 in second argument stands for */ > > index=index+1; > > } /* The third and fourth arguments are addresses. The fifth argument is > IORA */ > > /* Assembling the vector. */ > > ierr= VecAssemblyBegin(b);CHKERRQ(ierr); > > ierr=VecAssemblyEnd(b);CHKERRQ(ierr); > > /* Viewing the changed vector. */ > > ierr=PetscPrintf(PETSC_COMM_WORLD,"Vector b:\n");CHKERRQ(ierr); > > ierr=VecView(b,PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr); > > /* > ------------------------------------------------------------------------------------------------------------------------------------------------------ > */ > > ierr=MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr); > > ierr = MatSetSizes(A,n,m,n,m);CHKERRQ(ierr); /* Setting the matrix size */ > > ierr = MatSetFromOptions(A);CHKERRQ(ierr); /* Setting the matrix type > (shared, mpi etc) */ > > /* Reading in the matrix from the file matrix.txt */ > > fp=fopen("N1.mat","r"); > > if (fp==NULL) > > { > > fprintf(stderr, "Cannot open file"); > > exit(1); > > } > > for (i = 0; i < lenB; i++) > > { > > for (j = 0; j < 3; j++) > > { > > if (fscanf(fp,"%lf", &B[i][j]) != 1) > > { > > fprintf(stderr, "Failed to read matrix[%d][%d]\n", i, j); > > exit(1); > > } > > } > > } > > /* Initializing the matrix A to zero matrix */ > > for(i=0;i > { > > for(j=0;j > { > > scalar=0.0; > > ierr=MatSetValues(A,1,&i,1,&j,&scalar,INSERT_VALUES);CHKERRQ(ierr); > > } > > } > > ierr=MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr); > > ierr=MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr); > > /* Inserting the non-zero elements. */ > > for(i=0;i > { > > tempi=(int)B[i][0]-1; > > tempj=(int)B[i][1]-1; > > > ierr=MatSetValues(A,1,&tempi,1,&tempj,&B[i][2],INSERT_VALUES);CHKERRQ(ierr); > > } > > ierr=MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr); > > ierr=MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr); > > //ierr=MatView(A,PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr); > > /* > > Destroy any objects created. 
> > */ > > ierr=VecDestroy(b);CHKERRQ(ierr); > > ierr=MatDestroy(A);CHKERRQ(ierr); > > ierr=PetscFinalize();CHKERRQ(ierr); > > return 0; > > } > > > > > > > > > > > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Sun Jan 16 15:53:01 2011 From: knepley at gmail.com (Matthew Knepley) Date: Sun, 16 Jan 2011 15:53:01 -0600 Subject: [petsc-users] info file In-Reply-To: <586AF98A-1415-4259-BC0E-BA69123CE01D@gmail.com> References: <586AF98A-1415-4259-BC0E-BA69123CE01D@gmail.com> Message-ID: http://www.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-current/docs/manualpages/Viewer/PetscViewerBinarySkipInfo.html Matt On Sun, Jan 16, 2011 at 3:47 PM, Ataollah Mesgarnejad < a.mesgarnejad at gmail.com> wrote: > Dear All, > > I use VecView with a binary PETSc viewer to output my data it always > creates bunch of blank info files. It's not a big inconvenience but is there > a way to write my data without creating these info files? > > Best, > Ata M -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From a.mesgarnejad at gmail.com Sun Jan 16 17:09:00 2011 From: a.mesgarnejad at gmail.com (Ataollah Mesgarnejad) Date: Sun, 16 Jan 2011 17:09:00 -0600 Subject: [petsc-users] info file In-Reply-To: References: <586AF98A-1415-4259-BC0E-BA69123CE01D@gmail.com> Message-ID: <3FEEBE00-1CA8-4960-AC26-C20EE8CCD15E@gmail.com> Neat. Thanks Matt. On Jan 16, 2011, at 3:53 PM, Matthew Knepley wrote: > http://www.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-current/docs/manualpages/Viewer/PetscViewerBinarySkipInfo.html > > Matt > > On Sun, Jan 16, 2011 at 3:47 PM, Ataollah Mesgarnejad wrote: > Dear All, > > I use VecView with a binary PETSc viewer to output my data it always creates bunch of blank info files. It's not a big inconvenience but is there a way to write my data without creating these info files? > > Best, > Ata M > > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From bouloumag at gmail.com Sun Jan 16 17:59:56 2011 From: bouloumag at gmail.com (Darcoux Christine) Date: Sun, 16 Jan 2011 18:59:56 -0500 Subject: [petsc-users] matrix-free preconditionning of FGMRES Message-ID: I am new to PETSc and I am interested to use the nonlinear solver in a CFD code for low speed compressible fluid. According to some papers, it seems that GMRES (or any other Krylov method) could be used to precondition FGMRES in a way that is completely matrix-free. Is it something possible with the fgmres implementation provided by PETSc ? Christine -------------- next part -------------- An HTML attachment was scrubbed... URL: From gaurish108 at gmail.com Sun Jan 16 18:23:19 2011 From: gaurish108 at gmail.com (Gaurish Telang) Date: Sun, 16 Jan 2011 19:23:19 -0500 Subject: [petsc-users] Creating random matrices and doing Least squares in PETSc Message-ID: Hi, this might be a vague question but is it possible to create a random rectangular matrix in PETSc of arbitrary specified dimension? 
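Returning to the preallocation advice a few messages back: a compressed C sketch of the two-pass approach, assuming the same one-based (row, column, value) text format described there and the petsc-3.1-era API; the file name, sizes and helper name are illustrative only.

#include <stdio.h>
#include "petscmat.h"

/* Two-pass load of an m x n matrix stored as 1-based (row, col, value) triples:
   pass 1 counts nonzeros per row, pass 2 inserts the values into a
   preallocated sequential AIJ matrix. */
PetscErrorCode LoadTripletMatrix(const char *fname,PetscInt m,PetscInt n,Mat *A)
{
  FILE           *fp;
  PetscInt       *nnz,k,row,col;
  PetscScalar    val;
  int            ri,ci;
  double         v;
  PetscErrorCode ierr;

  ierr = PetscMalloc(m*sizeof(PetscInt),&nnz);CHKERRQ(ierr);
  for (k=0; k<m; k++) nnz[k] = 0;

  fp = fopen(fname,"r");
  if (!fp) { PetscPrintf(PETSC_COMM_SELF,"Cannot open %s\n",fname); return 1; }

  while (fscanf(fp,"%d %d %lf",&ri,&ci,&v) == 3) nnz[ri-1]++;      /* pass 1: count per row */
  rewind(fp);

  ierr = MatCreateSeqAIJ(PETSC_COMM_SELF,m,n,0,nnz,A);CHKERRQ(ierr); /* exact preallocation */

  while (fscanf(fp,"%d %d %lf",&ri,&ci,&v) == 3) {                 /* pass 2: insert values */
    row = ri-1; col = ci-1; val = v;
    ierr = MatSetValues(*A,1,&row,1,&col,&val,INSERT_VALUES);CHKERRQ(ierr);
  }
  fclose(fp);

  ierr = MatAssemblyBegin(*A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(*A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = PetscFree(nnz);CHKERRQ(ierr);
  return 0;
}

A similar counting pass, split by ownership range, produces the d_nnz/o_nnz arrays that MatMPIAIJSetPreallocation() expects if the matrix is later assembled in parallel.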
More specifically, I would like the rectangular matrix to be sparse and have 1.14% non-zeros in the matrix. If there is no such direct way, is there a way for MATLAB to do this and then possibly output it to a text file so that it could be fed to PETSc? The thing is I want to know how fast PETSc does least squares problems |Ax-b| for matrices A of dimensions like 2683x1274 where A is sparse with about 1.14% non-zeros. If you have any kind of timing studies / weblinks describing how fast PETSc does least square problems that would be really helpful. Sincere thanks, Gaurish. -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Sun Jan 16 18:27:39 2011 From: bsmith at mcs.anl.gov (Barry Smith) Date: Sun, 16 Jan 2011 18:27:39 -0600 Subject: [petsc-users] Creating random matrices and doing Least squares in PETSc In-Reply-To: References: Message-ID: On Jan 16, 2011, at 6:23 PM, Gaurish Telang wrote: > Hi, this might be a vague question but is it possible to create a random rectangular matrix in PETSc of arbitrary specified dimension? More specifically, I would like the rectangular matrix to be sparse and have 1.14% non-zeros in the matrix. > > If there is no such direct way, is there a way for MATLAB to do this and then possibly output it to a text file so that it could be fed to PETSc? use $PETSC_DIR/bin/matlab/PetscBinaryWrite.m to save the matrix to a fast binary format then use MatLoad() to load it into PETSc Barry > > The thing is I want to know how fast PETSc does least squares problems |Ax-b| for matrices A of dimensions like 2683x1274 where A is sparse with about 1.14% non-zeros. > > If you have any kind of timing studies / weblinks describing how fast PETSc does least square problems that would be really helpful. > > Sincere thanks, > > Gaurish. From bsmith at mcs.anl.gov Sun Jan 16 18:35:01 2011 From: bsmith at mcs.anl.gov (Barry Smith) Date: Sun, 16 Jan 2011 18:35:01 -0600 Subject: [petsc-users] matrix-free preconditionning of FGMRES In-Reply-To: References: Message-ID: On Jan 16, 2011, at 5:59 PM, Darcoux Christine wrote: > I am new to PETSc and I am interested to use the nonlinear solver in a CFD code for low speed compressible fluid. > > According to some papers, it seems that GMRES (or any other Krylov method) could be used to precondition FGMRES in a way that is completely matrix-free. Is it something possible with the fgmres implementation provided by PETSc ? Yes, but like anything with "matrix-free" is the question how you provide a decent preconditioner without forming any matrices. If you can do that then you are all set. KSPSetOperators(ksp,A,B,...) or SNESSetJacobian(snes,A,B, ....) where A is a a MATSHELL that does matrix vector products or use MatCreateMFFD() -ksp_type fgmres -pc_type ksp -ksp_ksp_type gmres -ksp_view If B is some approximate representation of A then B will be used to construct the preconditioner, if you never have a matrix-representation but have a function/subroutine that is supposedly a good preconditioner then you would use PCSHELL to provide it. So there are several possibilities depending on what you have and what you want to do. 
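For readers who want the option sequence above expressed in code: a hedged C sketch of FGMRES preconditioned by an inner GMRES through PCKSP, with the operator supplied as a MATSHELL. It assumes the petsc-3.1-era calling sequences used in this thread (for example the four-argument KSPSetOperators()); MyMatMult and the size arguments are placeholders for the user's own matrix-free product.

#include "petscksp.h"

/* User routine that applies y = A*x without ever forming A (placeholder). */
extern PetscErrorCode MyMatMult(Mat A,Vec x,Vec y);

PetscErrorCode SolveMatrixFree(MPI_Comm comm,PetscInt nlocal,PetscInt nglobal,void *userctx,Vec b,Vec x)
{
  Mat            A;
  KSP            ksp,inner;
  PC             pc,innerpc;
  PetscErrorCode ierr;

  /* matrix-free operator: only a mat-vec product is provided */
  ierr = MatCreateShell(comm,nlocal,nlocal,nglobal,nglobal,userctx,&A);CHKERRQ(ierr);
  ierr = MatShellSetOperation(A,MATOP_MULT,(void(*)(void))MyMatMult);CHKERRQ(ierr);

  ierr = KSPCreate(comm,&ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp,A,A,DIFFERENT_NONZERO_PATTERN);CHKERRQ(ierr); /* 3.1-era signature */
  ierr = KSPSetType(ksp,KSPFGMRES);CHKERRQ(ierr);

  /* preconditioner = a few iterations of unpreconditioned GMRES on the same shell */
  ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
  ierr = PCSetType(pc,PCKSP);CHKERRQ(ierr);
  ierr = PCKSPGetKSP(pc,&inner);CHKERRQ(ierr);
  ierr = KSPSetType(inner,KSPGMRES);CHKERRQ(ierr);
  ierr = KSPGetPC(inner,&innerpc);CHKERRQ(ierr);
  ierr = PCSetType(innerpc,PCNONE);CHKERRQ(ierr);
  ierr = KSPSetTolerances(inner,1.e-2,PETSC_DEFAULT,PETSC_DEFAULT,10);CHKERRQ(ierr); /* loose inner solve */

  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
  ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);

  ierr = KSPDestroy(ksp);CHKERRQ(ierr);  /* 3.1-era destroy calls take the object, not a pointer */
  ierr = MatDestroy(A);CHKERRQ(ierr);
  return 0;
}

The inner solve is deliberately loose (at most ten iterations); because it changes from one outer iteration to the next, the outer Krylov method must be a flexible one such as FGMRES.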
Barry > Christine > From bouloumag at gmail.com Sun Jan 16 19:03:44 2011 From: bouloumag at gmail.com (Darcoux Christine) Date: Sun, 16 Jan 2011 20:03:44 -0500 Subject: [petsc-users] matrix-free preconditionning of FGMRES In-Reply-To: References: Message-ID: 2011/1/16 Barry Smith > > On Jan 16, 2011, at 5:59 PM, Darcoux Christine wrote: > > > I am new to PETSc and I am interested to use the nonlinear solver in a > CFD code for low speed compressible fluid. > > > > According to some papers, it seems that GMRES (or any other Krylov > method) could be used to precondition FGMRES in a way that is completely > matrix-free. Is it something possible with the fgmres implementation > provided by PETSc ? > > Yes, but like anything with "matrix-free" is the question how you > provide a decent preconditioner without forming any matrices. If you can do > that then you are all set. > > KSPSetOperators(ksp,A,B,...) or SNESSetJacobian(snes,A,B, ....) where A > is a a MATSHELL that does matrix vector products or use MatCreateMFFD() > > -ksp_type fgmres -pc_type ksp -ksp_ksp_type gmres -ksp_view > > If B is some approximate representation of A then B will be used to > construct the preconditioner, if you never have a matrix-representation but > have a function/subroutine that is supposedly a good preconditioner then you > would use PCSHELL to provide it. So there are several possibilities > depending on what you have and what you want to do. > > Barry > > > Christine > > > Thank you for the detailled explanations. My idea is to simply precondition fGMRES with GMRES to avoid creating a B that is an approximate representation of A. If A is represented by a mat-vec routine, I think that the preconditionner M (approximation of the inverse of A) could be defined by the action of GMRES. Here's a code pseudo-code showing how I would like to define such an M. 1. Define matrix A as a mat-vec operation (matrix free), right-hand-side b and x0 a zero vector 2. The tricky part is now to define M by a mat-vec function like this function precon_matvec(x) { //Basically, GMRES approximates inverse(A) with a polynomial // P(A) so that x = x0 + P(A)*r, where r is the residual. Thus in the // call to GMRES, x0 is a zero vector and the right-hand-side is x. return gmres(A, x, x0, tolerance=gmres_tol, maxiter=10) } M = CreateMatrixFreePreconditionner (precon_matvec) 3. Now call fGMRES with the above M preconditionner Is it possible to acheive something like this with PETSc ? -------------- next part -------------- An HTML attachment was scrubbed... URL: From bouloumag at gmail.com Sun Jan 16 19:05:51 2011 From: bouloumag at gmail.com (Darcoux Christine) Date: Sun, 16 Jan 2011 20:05:51 -0500 Subject: [petsc-users] matrix-free preconditionning of FGMRES In-Reply-To: References: Message-ID: 2011/1/16 Barry Smith > > On Jan 16, 2011, at 5:59 PM, Darcoux Christine wrote: > > > I am new to PETSc and I am interested to use the nonlinear solver in a > CFD code for low speed compressible fluid. > > > > According to some papers, it seems that GMRES (or any other Krylov > method) could be used to precondition FGMRES in a way that is completely > matrix-free. Is it something possible with the fgmres implementation > provided by PETSc ? > > Yes, but like anything with "matrix-free" is the question how you > provide a decent preconditioner without forming any matrices. If you can do > that then you are all set. > > KSPSetOperators(ksp,A,B,...) or SNESSetJacobian(snes,A,B, ....) 
where A > is a a MATSHELL that does matrix vector products or use MatCreateMFFD() > > -ksp_type fgmres -pc_type ksp -ksp_ksp_type gmres -ksp_view > > If B is some approximate representation of A then B will be used to > construct the preconditioner, if you never have a matrix-representation but > have a function/subroutine that is supposedly a good preconditioner then you > would use PCSHELL to provide it. So there are several possibilities > depending on what you have and what you want to do. > > Barry > > > Christine > > > >
Thank you for the detailed explanations. My idea is to simply precondition fGMRES with GMRES to avoid creating a B that is an approximate representation of A. If A is represented by a mat-vec routine, I think that the preconditioner M (an approximation of the inverse of A) could be defined by the action of GMRES. Here's some pseudo-code showing how I would like to define such an M.

1. Define matrix A as a mat-vec operation (matrix free), right-hand-side b and x0 a zero vector
2. The tricky part is now to define M by a mat-vec function like this

function precon_matvec(x) {
  // Basically, GMRES approximates inverse(A) with a polynomial
  // P(A) so that x = x0 + P(A)*r, where r is the residual. Thus in the
  // call to GMRES, x0 is a zero vector and the right-hand-side is x.
  return gmres(A, x, x0, tolerance=gmres_tol, maxiter=10)
}

M = CreateMatrixFreePreconditioner(precon_matvec)

3. Now call fGMRES with the above preconditioner M

Is it possible to achieve something like this with PETSc? Cheers, Christine -------------- next part -------------- An HTML attachment was scrubbed... URL:
From bsmith at mcs.anl.gov Sun Jan 16 19:23:59 2011 From: bsmith at mcs.anl.gov (Barry Smith) Date: Sun, 16 Jan 2011 19:23:59 -0600 Subject: [petsc-users] matrix-free preconditionning of FGMRES In-Reply-To: References: Message-ID: -ksp_type fgmres -pc_type ksp -ksp_ksp_type gmres -ksp_pc_type none But this is just nesting unpreconditioned GMRES inside FGMRES. It is trivial to do but I doubt that it buys you anything over just -ksp_type gmres -pc_type none Barry On Jan 16, 2011, at 7:05 PM, Darcoux Christine wrote: > 2011/1/16 Barry Smith > > On Jan 16, 2011, at 5:59 PM, Darcoux Christine wrote: > > > I am new to PETSc and I am interested to use the nonlinear solver in a CFD code for low speed compressible fluid. > > > > According to some papers, it seems that GMRES (or any other Krylov method) could be used to precondition FGMRES in a way that is completely matrix-free. Is it something possible with the fgmres implementation provided by PETSc ? > > Yes, but like anything with "matrix-free" is the question how you provide a decent preconditioner without forming any matrices. If you can do that then you are all set. > > KSPSetOperators(ksp,A,B,...) or SNESSetJacobian(snes,A,B, ....) where A is a a MATSHELL that does matrix vector products or use MatCreateMFFD() > > -ksp_type fgmres -pc_type ksp -ksp_ksp_type gmres -ksp_view > > If B is some approximate representation of A then B will be used to construct the preconditioner, if you never have a matrix-representation but have a function/subroutine that is supposedly a good preconditioner then you would use PCSHELL to provide it. So there are several possibilities depending on what you have and what you want to do. > > Barry > > > Christine > > > > > Thank you for the detailled explanations. My idea is to simply precondition fGMRES with GMRES to avoid creating a B that is an approximate representation of A.
If A is represented by a mat-vec routine, I think that the preconditionner M (approximation of the inverse of A) could be defined by the action of GMRES. Here's a code pseudo-code showing how I would like to define such an M. > > 1. Define matrix A as a mat-vec operation (matrix free), right-hand-side b and x0 a zero vector > 2. The tricky part is now to define M by a mat-vec function like this > > function precon_matvec(x) { > //Basically, GMRES approximates inverse(A) with a polynomial > // P(A) so that x = x0 + P(A)*r, where r is the residual. Thus in the > // call to GMRES, x0 is a zero vector and the right-hand-side is x. > return gmres(A, x, x0, tolerance=gmres_tol, maxiter=10) > } > > M = CreateMatrixFreePreconditionne > r (precon_matvec) > > 3. Now call fGMRES with the above M preconditionner > > Is it possible to acheive something like this with PETSc ? > > Cheers, > > Christine > From gaurish108 at gmail.com Mon Jan 17 02:15:42 2011 From: gaurish108 at gmail.com (Gaurish Telang) Date: Mon, 17 Jan 2011 03:15:42 -0500 Subject: [petsc-users] using KSPLSQR and SKPCGNE Message-ID: Hi, I am new to PETSc and I wanted to solve some least squares problems with it. On searching the net I found that KSPLSQR() and KSPCGNE() solve the least squares system |Ax-b| But I don't really know how to use these functions to get my answer. This manual page did not help: http://www.mcs.anl.gov/petsc/petsc-2/snapshots/petsc-current/docs/manualpages/KSP/KSPLSQR.html Apparently no tutorial code uses these functions. If anyone could could give a small code snippet of how to use these functions (assuming A and b are given) then it would be really helpful. Thanks, Gaurish -------------- next part -------------- An HTML attachment was scrubbed... URL: From wumengda at gmail.com Mon Jan 17 02:43:54 2011 From: wumengda at gmail.com (Mengda Wu) Date: Mon, 17 Jan 2011 00:43:54 -0800 Subject: [petsc-users] ksp/examples/tutorials/Ex2.c: good with with-debugging but error without-debugging Message-ID: Hi all, I just compiled the debugged and optimized versions of petsc-3.1-p7. Both are successful. I am running on Windows Vista 64bit machine. The C/C++ compiler is cl.exe from visual studio 2005 (64 bit) and no FORTRAN compiler is used. BLAS/LAPACK support comes from Intel MKL-10.1.3.028 (under em64t\lib). No MPI is used. The debugged petsc was configured with: $ ./config/configure.py --with-cc='cl' --with-fc=0 --with-cxx='cl' --with-mpi= 0 --with-blas-lapack-lib=[mkl_intel_lp64.lib,mkl_intel_thread.lib,mkl_core.lib ,libiomp5mt.lib] -CFLAGS='-MDd -W3 -Z7' -CXXFLAGS='-MDd -W3 -Z7' The optimized petsc was configured with: $ ./config/configure.py --with-cc='cl' --with-fc=0 --with-cxx='cl' --with-mpi= 0 --with-debugging=0 --with-blas-lapack-lib=[mkl_intel_lp64.lib,mkl_intel_threa d.lib,mkl_core.lib,libiomp5mt.lib] -CFLAGS='-MD -wd4996 -O2' -CXXFLAGS='-MD -w d4996 -O2' When I run ksp/examples/tutorials/Ex2.c. The result with debugged petsc is ================================================================= Norm of error 0.000156044 iterations 6 ================================================================= However, there are errors with the optimized petsc with the output as follows: ================================================================= [0]PETSC ERROR: --------------------- Error Message ---------------------------- -------- [0]PETSC ERROR: Nonconforming object sizes! [0]PETSC ERROR: Mat mat,Vec y: global dim 56 57! 
[0]PETSC ERROR: ---------------------------------------------------------------- -------- [0]PETSC ERROR: Petsc Release Version 3.1.0, Patch 7, Mon Dec 20 14:26:37 CST 20 10 [0]PETSC ERROR: See docs/changes/index.html for recent updates. [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [0]PETSC ERROR: See docs/index.html for manual pages. [0]PETSC ERROR: ---------------------------------------------------------------- -------- [0]PETSC ERROR: D:\Develop\Test\PETSc\petsc-3.1-p7\src\ksp\ksp\examples\tutorial s\ex2.exe on a cygwin-c- named CVBRL-38 by mengda Mon Jan 17 00:35:15 2011 [0]PETSC ERROR: Libraries linked from /cygdrive/d/Develop/Test/PETSc/petsc-3.1-p 7/cygwin-c-opt/lib [0]PETSC ERROR: Configure run at Sun Jan 16 23:34:25 2011 [0]PETSC ERROR: Configure options --with-cc=cl --with-fc=0 --with-cxx=cl --with- mpi=0 --with-debugging=0 --with-blas-lapack-lib="[mkl_intel_lp64.lib,mkl_intel_t hread.lib,mkl_core.lib,libiomp5mt.lib]" -CFLAGS="-MD -wd4996 -O2" -CXXFLAGS="-MD -wd4996 -O2" --useThreads=0 [0]PETSC ERROR: ---------------------------------------------------------------- -------- [0]PETSC ERROR: MatMult() line 1888 in src/mat/interface/D:\Develop\Test\PETSc\P ETSC-~1.1-P\src\mat\INTERF~1\matrix.c [0]PETSC ERROR: main() line 146 in src/ksp/ksp/examples/tutorials/D:\Develop\Tes t\PETSc\PETSC-~1.1-P\src\ksp\ksp\examples\TUTORI~1\ex2.c This application has requested the Runtime to terminate it in an unusual way. Please contact the application's support team for more information. ================================================================= I am wondering what problems may lead to the errors. Please let me know if you need more information. Thanks, Mengda -------------- next part -------------- An HTML attachment was scrubbed... URL: From hzhang at mcs.anl.gov Mon Jan 17 08:48:20 2011 From: hzhang at mcs.anl.gov (Hong Zhang) Date: Mon, 17 Jan 2011 08:48:20 -0600 Subject: [petsc-users] using KSPLSQR and SKPCGNE In-Reply-To: References: Message-ID: You can run ~petsc/src/ksp/ksp/examples/tutorials/ex2.c with runtime option petsc-3.1/src/ksp/ksp/examples/tutorials>./ex2 -ksp_type lsqr -ksp_monitor 0 KSP Residual norm 6.164414002969e+00 1 KSP Residual norm 3.193020760242e+00 ... 19 KSP Residual norm 5.405772406774e-04 Norm of error 0.000404485 iterations 19 option '-ksp_view' display the solver being used. Hong On Mon, Jan 17, 2011 at 2:15 AM, Gaurish Telang wrote: > Hi, > > I am new to PETSc and I wanted to solve some least squares problems with it. > On searching the net I found that KSPLSQR() and KSPCGNE() solve the least > squares system |Ax-b| > > But I don't really know how to use these functions to get my answer. This > manual page did not help: > http://www.mcs.anl.gov/petsc/petsc-2/snapshots/petsc-current/docs/manualpages/KSP/KSPLSQR.html > > Apparently no tutorial code uses these functions. > > If anyone could could give a small code snippet of how to use these > functions (assuming A and b are given) then it would be really helpful. > > Thanks, > > Gaurish > From bsmith at mcs.anl.gov Mon Jan 17 14:27:25 2011 From: bsmith at mcs.anl.gov (Barry Smith) Date: Mon, 17 Jan 2011 14:27:25 -0600 Subject: [petsc-users] [petsc-maint #61421] ksp/examples/tutorials/Ex2.c: good with with-debugging but error without-debugging In-Reply-To: References: Message-ID: <8B6F3968-D06D-4C9D-9269-FDD5645B444B@mcs.anl.gov> Compiler bug. Immediately before the call to MatMult() in the code add the two lines ierr = VecView(u,0); ierr = MatView(A,0); how large are the two objects? 
Given the code it is inconceivable that suddenly the vector length becomes 57. Barry On Jan 17, 2011, at 2:43 AM, Mengda Wu wrote: > Hi all, > > I just compiled the debugged and optimized versions of petsc-3.1-p7. > Both are successful. I am running on Windows Vista 64bit machine. > The C/C++ compiler is cl.exe from visual studio 2005 (64 bit) and no FORTRAN > compiler is used. BLAS/LAPACK > support comes from Intel MKL-10.1.3.028 (under em64t\lib). No MPI is used. > > The debugged petsc was configured with: > $ ./config/configure.py --with-cc='cl' --with-fc=0 --with-cxx='cl' > --with-mpi= > 0 > --with-blas-lapack-lib=[mkl_intel_lp64.lib,mkl_intel_thread.lib,mkl_core.lib > ,libiomp5mt.lib] -CFLAGS='-MDd -W3 -Z7' -CXXFLAGS='-MDd -W3 -Z7' > > The optimized petsc was configured with: > $ ./config/configure.py --with-cc='cl' --with-fc=0 --with-cxx='cl' > --with-mpi= > 0 --with-debugging=0 > --with-blas-lapack-lib=[mkl_intel_lp64.lib,mkl_intel_threa > d.lib,mkl_core.lib,libiomp5mt.lib] -CFLAGS='-MD -wd4996 -O2' -CXXFLAGS='-MD > -w > d4996 -O2' > > When I run ksp/examples/tutorials/Ex2.c. The result with debugged petsc > is > ================================================================= > Norm of error 0.000156044 iterations 6 > ================================================================= > > However, there are errors with the optimized petsc with the output as > follows: > ================================================================= > [0]PETSC ERROR: --------------------- Error Message > ---------------------------- > -------- > [0]PETSC ERROR: Nonconforming object sizes! > [0]PETSC ERROR: Mat mat,Vec y: global dim 56 57! > [0]PETSC ERROR: > ---------------------------------------------------------------- > -------- > [0]PETSC ERROR: Petsc Release Version 3.1.0, Patch 7, Mon Dec 20 14:26:37 > CST 20 > 10 > [0]PETSC ERROR: See docs/changes/index.html for recent updates. > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [0]PETSC ERROR: See docs/index.html for manual pages. > [0]PETSC ERROR: > ---------------------------------------------------------------- > -------- > [0]PETSC ERROR: > D:\Develop\Test\PETSc\petsc-3.1-p7\src\ksp\ksp\examples\tutorial > s\ex2.exe on a cygwin-c- named CVBRL-38 by mengda Mon Jan 17 00:35:15 2011 > [0]PETSC ERROR: Libraries linked from > /cygdrive/d/Develop/Test/PETSc/petsc-3.1-p > 7/cygwin-c-opt/lib > [0]PETSC ERROR: Configure run at Sun Jan 16 23:34:25 2011 > [0]PETSC ERROR: Configure options --with-cc=cl --with-fc=0 --with-cxx=cl > --with- > mpi=0 --with-debugging=0 > --with-blas-lapack-lib="[mkl_intel_lp64.lib,mkl_intel_t > hread.lib,mkl_core.lib,libiomp5mt.lib]" -CFLAGS="-MD -wd4996 -O2" > -CXXFLAGS="-MD > -wd4996 -O2" --useThreads=0 > [0]PETSC ERROR: > ---------------------------------------------------------------- > -------- > [0]PETSC ERROR: MatMult() line 1888 in > src/mat/interface/D:\Develop\Test\PETSc\P > ETSC-~1.1-P\src\mat\INTERF~1\matrix.c > [0]PETSC ERROR: main() line 146 in > src/ksp/ksp/examples/tutorials/D:\Develop\Tes > t\PETSc\PETSC-~1.1-P\src\ksp\ksp\examples\TUTORI~1\ex2.c > > This application has requested the Runtime to terminate it in an unusual > way. > Please contact the application's support team for more information. > ================================================================= > > I am wondering what problems may lead to the errors. Please let me know > if you need more > information. > > Thanks, > Mengda > > Hi all, > > I just compiled the debugged and optimized versions of petsc-3.1-p7. 
Both are successful. I am running on Windows Vista 64bit machine. > The C/C++ compiler is cl.exe from visual studio 2005 (64 bit) and no FORTRAN compiler is used. BLAS/LAPACK > support comes from Intel MKL-10.1.3.028 (under em64t\lib). No MPI is used. > > The debugged petsc was configured with: > $ ./config/configure.py --with-cc='cl' --with-fc=0 --with-cxx='cl' --with-mpi= > 0 --with-blas-lapack-lib=[mkl_intel_lp64.lib,mkl_intel_thread.lib,mkl_core.lib > ,libiomp5mt.lib] -CFLAGS='-MDd -W3 -Z7' -CXXFLAGS='-MDd -W3 -Z7' > > The optimized petsc was configured with: > $ ./config/configure.py --with-cc='cl' --with-fc=0 --with-cxx='cl' --with-mpi= > 0 --with-debugging=0 --with-blas-lapack-lib=[mkl_intel_lp64.lib,mkl_intel_threa > d.lib,mkl_core.lib,libiomp5mt.lib] -CFLAGS='-MD -wd4996 -O2' -CXXFLAGS='-MD -w > d4996 -O2' > > When I run ksp/examples/tutorials/Ex2.c. The result with debugged petsc is > ================================================================= > Norm of error 0.000156044 iterations 6 > ================================================================= > > However, there are errors with the optimized petsc with the output as follows: > ================================================================= > [0]PETSC ERROR: --------------------- Error Message ---------------------------- > -------- > [0]PETSC ERROR: Nonconforming object sizes! > [0]PETSC ERROR: Mat mat,Vec y: global dim 56 57! > [0]PETSC ERROR: ---------------------------------------------------------------- > -------- > [0]PETSC ERROR: Petsc Release Version 3.1.0, Patch 7, Mon Dec 20 14:26:37 CST 20 > 10 > [0]PETSC ERROR: See docs/changes/index.html for recent updates. > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [0]PETSC ERROR: See docs/index.html for manual pages. > [0]PETSC ERROR: ---------------------------------------------------------------- > -------- > [0]PETSC ERROR: D:\Develop\Test\PETSc\petsc-3.1-p7\src\ksp\ksp\examples\tutorial > s\ex2.exe on a cygwin-c- named CVBRL-38 by mengda Mon Jan 17 00:35:15 2011 > [0]PETSC ERROR: Libraries linked from /cygdrive/d/Develop/Test/PETSc/petsc-3.1-p > 7/cygwin-c-opt/lib > [0]PETSC ERROR: Configure run at Sun Jan 16 23:34:25 2011 > [0]PETSC ERROR: Configure options --with-cc=cl --with-fc=0 --with-cxx=cl --with- > mpi=0 --with-debugging=0 --with-blas-lapack-lib="[mkl_intel_lp64.lib,mkl_intel_t > hread.lib,mkl_core.lib,libiomp5mt.lib]" -CFLAGS="-MD -wd4996 -O2" -CXXFLAGS="-MD > -wd4996 -O2" --useThreads=0 > [0]PETSC ERROR: ---------------------------------------------------------------- > -------- > [0]PETSC ERROR: MatMult() line 1888 in src/mat/interface/D:\Develop\Test\PETSc\P > ETSC-~1.1-P\src\mat\INTERF~1\matrix.c > [0]PETSC ERROR: main() line 146 in src/ksp/ksp/examples/tutorials/D:\Develop\Tes > t\PETSc\PETSC-~1.1-P\src\ksp\ksp\examples\TUTORI~1\ex2.c > > This application has requested the Runtime to terminate it in an unusual way. > Please contact the application's support team for more information. > ================================================================= > > I am wondering what problems may lead to the errors. Please let me know if you need more > information. 
> > Thanks, > Mengda From wumengda at gmail.com Mon Jan 17 15:43:40 2011 From: wumengda at gmail.com (Mengda Wu) Date: Mon, 17 Jan 2011 13:43:40 -0800 Subject: [petsc-users] [petsc-maint #61421] ksp/examples/tutorials/Ex2.c: good with with-debugging but error without-debugging In-Reply-To: <8B6F3968-D06D-4C9D-9269-FDD5645B444B@mcs.anl.gov> References: <8B6F3968-D06D-4C9D-9269-FDD5645B444B@mcs.anl.gov> Message-ID: This indeed is caused by a bug in Visual c++ 2005 64bit compiler when using optimization. The result is correct after installing the hotfix: http://support.microsoft.com/kb/976617/ Thanks a lot! Mengda On Mon, Jan 17, 2011 at 12:27 PM, Barry Smith wrote: > > Compiler bug. Immediately before the call to MatMult() in the code add the > two lines > > ierr = VecView(u,0); > ierr = MatView(A,0); > > how large are the two objects? Given the code it is inconceivable that > suddenly the vector length becomes 57. > > Barry > > > On Jan 17, 2011, at 2:43 AM, Mengda Wu wrote: > > > Hi all, > > > > I just compiled the debugged and optimized versions of petsc-3.1-p7. > > Both are successful. I am running on Windows Vista 64bit machine. > > The C/C++ compiler is cl.exe from visual studio 2005 (64 bit) and no > FORTRAN > > compiler is used. BLAS/LAPACK > > support comes from Intel MKL-10.1.3.028 (under em64t\lib). No MPI is > used. > > > > The debugged petsc was configured with: > > $ ./config/configure.py --with-cc='cl' --with-fc=0 --with-cxx='cl' > > --with-mpi= > > 0 > > > --with-blas-lapack-lib=[mkl_intel_lp64.lib,mkl_intel_thread.lib,mkl_core.lib > > ,libiomp5mt.lib] -CFLAGS='-MDd -W3 -Z7' -CXXFLAGS='-MDd -W3 -Z7' > > > > The optimized petsc was configured with: > > $ ./config/configure.py --with-cc='cl' --with-fc=0 --with-cxx='cl' > > --with-mpi= > > 0 --with-debugging=0 > > --with-blas-lapack-lib=[mkl_intel_lp64.lib,mkl_intel_threa > > d.lib,mkl_core.lib,libiomp5mt.lib] -CFLAGS='-MD -wd4996 -O2' > -CXXFLAGS='-MD > > -w > > d4996 -O2' > > > > When I run ksp/examples/tutorials/Ex2.c. The result with debugged petsc > > is > > ================================================================= > > Norm of error 0.000156044 iterations 6 > > ================================================================= > > > > However, there are errors with the optimized petsc with the output as > > follows: > > ================================================================= > > [0]PETSC ERROR: --------------------- Error Message > > ---------------------------- > > -------- > > [0]PETSC ERROR: Nonconforming object sizes! > > [0]PETSC ERROR: Mat mat,Vec y: global dim 56 57! > > [0]PETSC ERROR: > > ---------------------------------------------------------------- > > -------- > > [0]PETSC ERROR: Petsc Release Version 3.1.0, Patch 7, Mon Dec 20 14:26:37 > > CST 20 > > 10 > > [0]PETSC ERROR: See docs/changes/index.html for recent updates. > > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > > [0]PETSC ERROR: See docs/index.html for manual pages. 
> > [0]PETSC ERROR: > > ---------------------------------------------------------------- > > -------- > > [0]PETSC ERROR: > > D:\Develop\Test\PETSc\petsc-3.1-p7\src\ksp\ksp\examples\tutorial > > s\ex2.exe on a cygwin-c- named CVBRL-38 by mengda Mon Jan 17 00:35:15 > 2011 > > [0]PETSC ERROR: Libraries linked from > > /cygdrive/d/Develop/Test/PETSc/petsc-3.1-p > > 7/cygwin-c-opt/lib > > [0]PETSC ERROR: Configure run at Sun Jan 16 23:34:25 2011 > > [0]PETSC ERROR: Configure options --with-cc=cl --with-fc=0 --with-cxx=cl > > --with- > > mpi=0 --with-debugging=0 > > --with-blas-lapack-lib="[mkl_intel_lp64.lib,mkl_intel_t > > hread.lib,mkl_core.lib,libiomp5mt.lib]" -CFLAGS="-MD -wd4996 -O2" > > -CXXFLAGS="-MD > > -wd4996 -O2" --useThreads=0 > > [0]PETSC ERROR: > > ---------------------------------------------------------------- > > -------- > > [0]PETSC ERROR: MatMult() line 1888 in > > src/mat/interface/D:\Develop\Test\PETSc\P > > ETSC-~1.1-P\src\mat\INTERF~1\matrix.c > > [0]PETSC ERROR: main() line 146 in > > src/ksp/ksp/examples/tutorials/D:\Develop\Tes > > t\PETSc\PETSC-~1.1-P\src\ksp\ksp\examples\TUTORI~1\ex2.c > > > > This application has requested the Runtime to terminate it in an unusual > > way. > > Please contact the application's support team for more information. > > ================================================================= > > > > I am wondering what problems may lead to the errors. Please let me know > > if you need more > > information. > > > > Thanks, > > Mengda > > > > Hi all, > > > > I just compiled the debugged and optimized versions of petsc-3.1-p7. > Both are successful. I am running on Windows Vista 64bit machine. > > The C/C++ compiler is cl.exe from visual studio 2005 (64 bit) and no > FORTRAN compiler is used. BLAS/LAPACK > > support comes from Intel MKL-10.1.3.028 (under em64t\lib). No MPI is > used. > > > > The debugged petsc was configured with: > > $ ./config/configure.py --with-cc='cl' --with-fc=0 --with-cxx='cl' > --with-mpi= > > 0 > --with-blas-lapack-lib=[mkl_intel_lp64.lib,mkl_intel_thread.lib,mkl_core.lib > > ,libiomp5mt.lib] -CFLAGS='-MDd -W3 -Z7' -CXXFLAGS='-MDd -W3 -Z7' > > > > The optimized petsc was configured with: > > $ ./config/configure.py --with-cc='cl' --with-fc=0 --with-cxx='cl' > --with-mpi= > > 0 --with-debugging=0 > --with-blas-lapack-lib=[mkl_intel_lp64.lib,mkl_intel_threa > > d.lib,mkl_core.lib,libiomp5mt.lib] -CFLAGS='-MD -wd4996 -O2' > -CXXFLAGS='-MD -w > > d4996 -O2' > > > > When I run ksp/examples/tutorials/Ex2.c. The result with debugged > petsc is > > ================================================================= > > Norm of error 0.000156044 iterations 6 > > ================================================================= > > > > However, there are errors with the optimized petsc with the output as > follows: > > ================================================================= > > [0]PETSC ERROR: --------------------- Error Message > ---------------------------- > > -------- > > [0]PETSC ERROR: Nonconforming object sizes! > > [0]PETSC ERROR: Mat mat,Vec y: global dim 56 57! > > [0]PETSC ERROR: > ---------------------------------------------------------------- > > -------- > > [0]PETSC ERROR: Petsc Release Version 3.1.0, Patch 7, Mon Dec 20 14:26:37 > CST 20 > > 10 > > [0]PETSC ERROR: See docs/changes/index.html for recent updates. > > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > > [0]PETSC ERROR: See docs/index.html for manual pages. 
> > [0]PETSC ERROR: > ---------------------------------------------------------------- > > -------- > > [0]PETSC ERROR: > D:\Develop\Test\PETSc\petsc-3.1-p7\src\ksp\ksp\examples\tutorial > > s\ex2.exe on a cygwin-c- named CVBRL-38 by mengda Mon Jan 17 00:35:15 > 2011 > > [0]PETSC ERROR: Libraries linked from > /cygdrive/d/Develop/Test/PETSc/petsc-3.1-p > > 7/cygwin-c-opt/lib > > [0]PETSC ERROR: Configure run at Sun Jan 16 23:34:25 2011 > > [0]PETSC ERROR: Configure options --with-cc=cl --with-fc=0 --with-cxx=cl > --with- > > mpi=0 --with-debugging=0 > --with-blas-lapack-lib="[mkl_intel_lp64.lib,mkl_intel_t > > hread.lib,mkl_core.lib,libiomp5mt.lib]" -CFLAGS="-MD -wd4996 -O2" > -CXXFLAGS="-MD > > -wd4996 -O2" --useThreads=0 > > [0]PETSC ERROR: > ---------------------------------------------------------------- > > -------- > > [0]PETSC ERROR: MatMult() line 1888 in > src/mat/interface/D:\Develop\Test\PETSc\P > > ETSC-~1.1-P\src\mat\INTERF~1\matrix.c > > [0]PETSC ERROR: main() line 146 in > src/ksp/ksp/examples/tutorials/D:\Develop\Tes > > t\PETSc\PETSC-~1.1-P\src\ksp\ksp\examples\TUTORI~1\ex2.c > > > > This application has requested the Runtime to terminate it in an unusual > way. > > Please contact the application's support team for more information. > > ================================================================= > > > > I am wondering what problems may lead to the errors. Please let me > know if you need more > > information. > > > > Thanks, > > Mengda > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From petsc-maint at mcs.anl.gov Mon Jan 17 15:55:48 2011 From: petsc-maint at mcs.anl.gov (Satish Balay) Date: Mon, 17 Jan 2011 15:55:48 -0600 (CST) Subject: [petsc-users] [petsc-maint #61421] ksp/examples/tutorials/Ex2.c: good with with-debugging but error without-debugging In-Reply-To: References: <8B6F3968-D06D-4C9D-9269-FDD5645B444B@mcs.anl.gov> Message-ID: Thanks for confirming its a compiler bug. BTW: The url below doesn't show the actual hotfix download. Is there a different location for this download? Satish On Mon, 17 Jan 2011, Mengda Wu wrote: > This indeed is caused by a bug in Visual c++ 2005 64bit compiler when using > optimization. > The result is correct after installing the hotfix: > http://support.microsoft.com/kb/976617/ > Thanks a lot! > > Mengda > > On Mon, Jan 17, 2011 at 12:27 PM, Barry Smith wrote: > > > > > Compiler bug. Immediately before the call to MatMult() in the code add the > > two lines > > > > ierr = VecView(u,0); > > ierr = MatView(A,0); > > > > how large are the two objects? Given the code it is inconceivable that > > suddenly the vector length becomes 57. > > > > Barry > > > > > > On Jan 17, 2011, at 2:43 AM, Mengda Wu wrote: > > > > > Hi all, > > > > > > I just compiled the debugged and optimized versions of petsc-3.1-p7. > > > Both are successful. I am running on Windows Vista 64bit machine. > > > The C/C++ compiler is cl.exe from visual studio 2005 (64 bit) and no > > FORTRAN > > > compiler is used. BLAS/LAPACK > > > support comes from Intel MKL-10.1.3.028 (under em64t\lib). No MPI is > > used. 
> > > > > > The debugged petsc was configured with: > > > $ ./config/configure.py --with-cc='cl' --with-fc=0 --with-cxx='cl' > > > --with-mpi= > > > 0 > > > > > --with-blas-lapack-lib=[mkl_intel_lp64.lib,mkl_intel_thread.lib,mkl_core.lib > > > ,libiomp5mt.lib] -CFLAGS='-MDd -W3 -Z7' -CXXFLAGS='-MDd -W3 -Z7' > > > > > > The optimized petsc was configured with: > > > $ ./config/configure.py --with-cc='cl' --with-fc=0 --with-cxx='cl' > > > --with-mpi= > > > 0 --with-debugging=0 > > > --with-blas-lapack-lib=[mkl_intel_lp64.lib,mkl_intel_threa > > > d.lib,mkl_core.lib,libiomp5mt.lib] -CFLAGS='-MD -wd4996 -O2' > > -CXXFLAGS='-MD > > > -w > > > d4996 -O2' > > > > > > When I run ksp/examples/tutorials/Ex2.c. The result with debugged petsc > > > is > > > ================================================================= > > > Norm of error 0.000156044 iterations 6 > > > ================================================================= > > > > > > However, there are errors with the optimized petsc with the output as > > > follows: > > > ================================================================= > > > [0]PETSC ERROR: --------------------- Error Message > > > ---------------------------- > > > -------- > > > [0]PETSC ERROR: Nonconforming object sizes! > > > [0]PETSC ERROR: Mat mat,Vec y: global dim 56 57! > > > [0]PETSC ERROR: > > > ---------------------------------------------------------------- > > > -------- > > > [0]PETSC ERROR: Petsc Release Version 3.1.0, Patch 7, Mon Dec 20 14:26:37 > > > CST 20 > > > 10 > > > [0]PETSC ERROR: See docs/changes/index.html for recent updates. > > > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > > > [0]PETSC ERROR: See docs/index.html for manual pages. > > > [0]PETSC ERROR: > > > ---------------------------------------------------------------- > > > -------- > > > [0]PETSC ERROR: > > > D:\Develop\Test\PETSc\petsc-3.1-p7\src\ksp\ksp\examples\tutorial > > > s\ex2.exe on a cygwin-c- named CVBRL-38 by mengda Mon Jan 17 00:35:15 > > 2011 > > > [0]PETSC ERROR: Libraries linked from > > > /cygdrive/d/Develop/Test/PETSc/petsc-3.1-p > > > 7/cygwin-c-opt/lib > > > [0]PETSC ERROR: Configure run at Sun Jan 16 23:34:25 2011 > > > [0]PETSC ERROR: Configure options --with-cc=cl --with-fc=0 --with-cxx=cl > > > --with- > > > mpi=0 --with-debugging=0 > > > --with-blas-lapack-lib="[mkl_intel_lp64.lib,mkl_intel_t > > > hread.lib,mkl_core.lib,libiomp5mt.lib]" -CFLAGS="-MD -wd4996 -O2" > > > -CXXFLAGS="-MD > > > -wd4996 -O2" --useThreads=0 > > > [0]PETSC ERROR: > > > ---------------------------------------------------------------- > > > -------- > > > [0]PETSC ERROR: MatMult() line 1888 in > > > src/mat/interface/D:\Develop\Test\PETSc\P > > > ETSC-~1.1-P\src\mat\INTERF~1\matrix.c > > > [0]PETSC ERROR: main() line 146 in > > > src/ksp/ksp/examples/tutorials/D:\Develop\Tes > > > t\PETSc\PETSC-~1.1-P\src\ksp\ksp\examples\TUTORI~1\ex2.c > > > > > > This application has requested the Runtime to terminate it in an unusual > > > way. > > > Please contact the application's support team for more information. > > > ================================================================= > > > > > > I am wondering what problems may lead to the errors. Please let me know > > > if you need more > > > information. > > > > > > Thanks, > > > Mengda > > > > > > Hi all, > > > > > > I just compiled the debugged and optimized versions of petsc-3.1-p7. > > Both are successful. I am running on Windows Vista 64bit machine. 
> > > The C/C++ compiler is cl.exe from visual studio 2005 (64 bit) and no > > FORTRAN compiler is used. BLAS/LAPACK > > > support comes from Intel MKL-10.1.3.028 (under em64t\lib). No MPI is > > used. > > > > > > The debugged petsc was configured with: > > > $ ./config/configure.py --with-cc='cl' --with-fc=0 --with-cxx='cl' > > --with-mpi= > > > 0 > > --with-blas-lapack-lib=[mkl_intel_lp64.lib,mkl_intel_thread.lib,mkl_core.lib > > > ,libiomp5mt.lib] -CFLAGS='-MDd -W3 -Z7' -CXXFLAGS='-MDd -W3 -Z7' > > > > > > The optimized petsc was configured with: > > > $ ./config/configure.py --with-cc='cl' --with-fc=0 --with-cxx='cl' > > --with-mpi= > > > 0 --with-debugging=0 > > --with-blas-lapack-lib=[mkl_intel_lp64.lib,mkl_intel_threa > > > d.lib,mkl_core.lib,libiomp5mt.lib] -CFLAGS='-MD -wd4996 -O2' > > -CXXFLAGS='-MD -w > > > d4996 -O2' > > > > > > When I run ksp/examples/tutorials/Ex2.c. The result with debugged > > petsc is > > > ================================================================= > > > Norm of error 0.000156044 iterations 6 > > > ================================================================= > > > > > > However, there are errors with the optimized petsc with the output as > > follows: > > > ================================================================= > > > [0]PETSC ERROR: --------------------- Error Message > > ---------------------------- > > > -------- > > > [0]PETSC ERROR: Nonconforming object sizes! > > > [0]PETSC ERROR: Mat mat,Vec y: global dim 56 57! > > > [0]PETSC ERROR: > > ---------------------------------------------------------------- > > > -------- > > > [0]PETSC ERROR: Petsc Release Version 3.1.0, Patch 7, Mon Dec 20 14:26:37 > > CST 20 > > > 10 > > > [0]PETSC ERROR: See docs/changes/index.html for recent updates. > > > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > > > [0]PETSC ERROR: See docs/index.html for manual pages. > > > [0]PETSC ERROR: > > ---------------------------------------------------------------- > > > -------- > > > [0]PETSC ERROR: > > D:\Develop\Test\PETSc\petsc-3.1-p7\src\ksp\ksp\examples\tutorial > > > s\ex2.exe on a cygwin-c- named CVBRL-38 by mengda Mon Jan 17 00:35:15 > > 2011 > > > [0]PETSC ERROR: Libraries linked from > > /cygdrive/d/Develop/Test/PETSc/petsc-3.1-p > > > 7/cygwin-c-opt/lib > > > [0]PETSC ERROR: Configure run at Sun Jan 16 23:34:25 2011 > > > [0]PETSC ERROR: Configure options --with-cc=cl --with-fc=0 --with-cxx=cl > > --with- > > > mpi=0 --with-debugging=0 > > --with-blas-lapack-lib="[mkl_intel_lp64.lib,mkl_intel_t > > > hread.lib,mkl_core.lib,libiomp5mt.lib]" -CFLAGS="-MD -wd4996 -O2" > > -CXXFLAGS="-MD > > > -wd4996 -O2" --useThreads=0 > > > [0]PETSC ERROR: > > ---------------------------------------------------------------- > > > -------- > > > [0]PETSC ERROR: MatMult() line 1888 in > > src/mat/interface/D:\Develop\Test\PETSc\P > > > ETSC-~1.1-P\src\mat\INTERF~1\matrix.c > > > [0]PETSC ERROR: main() line 146 in > > src/ksp/ksp/examples/tutorials/D:\Develop\Tes > > > t\PETSc\PETSC-~1.1-P\src\ksp\ksp\examples\TUTORI~1\ex2.c > > > > > > This application has requested the Runtime to terminate it in an unusual > > way. > > > Please contact the application's support team for more information. > > > ================================================================= > > > > > > I am wondering what problems may lead to the errors. Please let me > > know if you need more > > > information. 
> > > > > > Thanks, > > > Mengda > > > > > > From bsmith at mcs.anl.gov Mon Jan 17 16:01:29 2011 From: bsmith at mcs.anl.gov (Barry Smith) Date: Mon, 17 Jan 2011 16:01:29 -0600 Subject: [petsc-users] [petsc-maint #61421] ksp/examples/tutorials/Ex2.c: good with with-debugging but error without-debugging In-Reply-To: References: <8B6F3968-D06D-4C9D-9269-FDD5645B444B@mcs.anl.gov> Message-ID: Mengda, Thanks for the report. At the website it says "You perform arithmetic on 64-bit pointers and then pass the results to an inline function that expects an "int" data type." Do you know specifically where this is happening in PETSc? We don't intend to in PETSc " perform arithmetic on 64-bit pointers and then pass the results to an inline function that expects an "int" data type. Thanks Barry On Jan 17, 2011, at 3:43 PM, Mengda Wu wrote: > This indeed is caused by a bug in Visual c++ 2005 64bit compiler when using > optimization. > The result is correct after installing the hotfix: > http://support.microsoft.com/kb/976617/ > Thanks a lot! > > Mengda > > On Mon, Jan 17, 2011 at 12:27 PM, Barry Smith wrote: > >> >> Compiler bug. Immediately before the call to MatMult() in the code add the >> two lines >> >> ierr = VecView(u,0); >> ierr = MatView(A,0); >> >> how large are the two objects? Given the code it is inconceivable that >> suddenly the vector length becomes 57. >> >> Barry >> >> >> On Jan 17, 2011, at 2:43 AM, Mengda Wu wrote: >> >>> Hi all, >>> >>> I just compiled the debugged and optimized versions of petsc-3.1-p7. >>> Both are successful. I am running on Windows Vista 64bit machine. >>> The C/C++ compiler is cl.exe from visual studio 2005 (64 bit) and no >> FORTRAN >>> compiler is used. BLAS/LAPACK >>> support comes from Intel MKL-10.1.3.028 (under em64t\lib). No MPI is >> used. >>> >>> The debugged petsc was configured with: >>> $ ./config/configure.py --with-cc='cl' --with-fc=0 --with-cxx='cl' >>> --with-mpi= >>> 0 >>> >> --with-blas-lapack-lib=[mkl_intel_lp64.lib,mkl_intel_thread.lib,mkl_core.lib >>> ,libiomp5mt.lib] -CFLAGS='-MDd -W3 -Z7' -CXXFLAGS='-MDd -W3 -Z7' >>> >>> The optimized petsc was configured with: >>> $ ./config/configure.py --with-cc='cl' --with-fc=0 --with-cxx='cl' >>> --with-mpi= >>> 0 --with-debugging=0 >>> --with-blas-lapack-lib=[mkl_intel_lp64.lib,mkl_intel_threa >>> d.lib,mkl_core.lib,libiomp5mt.lib] -CFLAGS='-MD -wd4996 -O2' >> -CXXFLAGS='-MD >>> -w >>> d4996 -O2' >>> >>> When I run ksp/examples/tutorials/Ex2.c. The result with debugged petsc >>> is >>> ================================================================= >>> Norm of error 0.000156044 iterations 6 >>> ================================================================= >>> >>> However, there are errors with the optimized petsc with the output as >>> follows: >>> ================================================================= >>> [0]PETSC ERROR: --------------------- Error Message >>> ---------------------------- >>> -------- >>> [0]PETSC ERROR: Nonconforming object sizes! >>> [0]PETSC ERROR: Mat mat,Vec y: global dim 56 57! >>> [0]PETSC ERROR: >>> ---------------------------------------------------------------- >>> -------- >>> [0]PETSC ERROR: Petsc Release Version 3.1.0, Patch 7, Mon Dec 20 14:26:37 >>> CST 20 >>> 10 >>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>> [0]PETSC ERROR: See docs/index.html for manual pages. 
>>> [0]PETSC ERROR: >>> ---------------------------------------------------------------- >>> -------- >>> [0]PETSC ERROR: >>> D:\Develop\Test\PETSc\petsc-3.1-p7\src\ksp\ksp\examples\tutorial >>> s\ex2.exe on a cygwin-c- named CVBRL-38 by mengda Mon Jan 17 00:35:15 >> 2011 >>> [0]PETSC ERROR: Libraries linked from >>> /cygdrive/d/Develop/Test/PETSc/petsc-3.1-p >>> 7/cygwin-c-opt/lib >>> [0]PETSC ERROR: Configure run at Sun Jan 16 23:34:25 2011 >>> [0]PETSC ERROR: Configure options --with-cc=cl --with-fc=0 --with-cxx=cl >>> --with- >>> mpi=0 --with-debugging=0 >>> --with-blas-lapack-lib="[mkl_intel_lp64.lib,mkl_intel_t >>> hread.lib,mkl_core.lib,libiomp5mt.lib]" -CFLAGS="-MD -wd4996 -O2" >>> -CXXFLAGS="-MD >>> -wd4996 -O2" --useThreads=0 >>> [0]PETSC ERROR: >>> ---------------------------------------------------------------- >>> -------- >>> [0]PETSC ERROR: MatMult() line 1888 in >>> src/mat/interface/D:\Develop\Test\PETSc\P >>> ETSC-~1.1-P\src\mat\INTERF~1\matrix.c >>> [0]PETSC ERROR: main() line 146 in >>> src/ksp/ksp/examples/tutorials/D:\Develop\Tes >>> t\PETSc\PETSC-~1.1-P\src\ksp\ksp\examples\TUTORI~1\ex2.c >>> >>> This application has requested the Runtime to terminate it in an unusual >>> way. >>> Please contact the application's support team for more information. >>> ================================================================= >>> >>> I am wondering what problems may lead to the errors. Please let me know >>> if you need more >>> information. >>> >>> Thanks, >>> Mengda >>> >>> Hi all, >>> >>> I just compiled the debugged and optimized versions of petsc-3.1-p7. >> Both are successful. I am running on Windows Vista 64bit machine. >>> The C/C++ compiler is cl.exe from visual studio 2005 (64 bit) and no >> FORTRAN compiler is used. BLAS/LAPACK >>> support comes from Intel MKL-10.1.3.028 (under em64t\lib). No MPI is >> used. >>> >>> The debugged petsc was configured with: >>> $ ./config/configure.py --with-cc='cl' --with-fc=0 --with-cxx='cl' >> --with-mpi= >>> 0 >> --with-blas-lapack-lib=[mkl_intel_lp64.lib,mkl_intel_thread.lib,mkl_core.lib >>> ,libiomp5mt.lib] -CFLAGS='-MDd -W3 -Z7' -CXXFLAGS='-MDd -W3 -Z7' >>> >>> The optimized petsc was configured with: >>> $ ./config/configure.py --with-cc='cl' --with-fc=0 --with-cxx='cl' >> --with-mpi= >>> 0 --with-debugging=0 >> --with-blas-lapack-lib=[mkl_intel_lp64.lib,mkl_intel_threa >>> d.lib,mkl_core.lib,libiomp5mt.lib] -CFLAGS='-MD -wd4996 -O2' >> -CXXFLAGS='-MD -w >>> d4996 -O2' >>> >>> When I run ksp/examples/tutorials/Ex2.c. The result with debugged >> petsc is >>> ================================================================= >>> Norm of error 0.000156044 iterations 6 >>> ================================================================= >>> >>> However, there are errors with the optimized petsc with the output as >> follows: >>> ================================================================= >>> [0]PETSC ERROR: --------------------- Error Message >> ---------------------------- >>> -------- >>> [0]PETSC ERROR: Nonconforming object sizes! >>> [0]PETSC ERROR: Mat mat,Vec y: global dim 56 57! >>> [0]PETSC ERROR: >> ---------------------------------------------------------------- >>> -------- >>> [0]PETSC ERROR: Petsc Release Version 3.1.0, Patch 7, Mon Dec 20 14:26:37 >> CST 20 >>> 10 >>> [0]PETSC ERROR: See docs/changes/index.html for recent updates. >>> [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >>> [0]PETSC ERROR: See docs/index.html for manual pages. 
>>> [0]PETSC ERROR: >> ---------------------------------------------------------------- >>> -------- >>> [0]PETSC ERROR: >> D:\Develop\Test\PETSc\petsc-3.1-p7\src\ksp\ksp\examples\tutorial >>> s\ex2.exe on a cygwin-c- named CVBRL-38 by mengda Mon Jan 17 00:35:15 >> 2011 >>> [0]PETSC ERROR: Libraries linked from >> /cygdrive/d/Develop/Test/PETSc/petsc-3.1-p >>> 7/cygwin-c-opt/lib >>> [0]PETSC ERROR: Configure run at Sun Jan 16 23:34:25 2011 >>> [0]PETSC ERROR: Configure options --with-cc=cl --with-fc=0 --with-cxx=cl >> --with- >>> mpi=0 --with-debugging=0 >> --with-blas-lapack-lib="[mkl_intel_lp64.lib,mkl_intel_t >>> hread.lib,mkl_core.lib,libiomp5mt.lib]" -CFLAGS="-MD -wd4996 -O2" >> -CXXFLAGS="-MD >>> -wd4996 -O2" --useThreads=0 >>> [0]PETSC ERROR: >> ---------------------------------------------------------------- >>> -------- >>> [0]PETSC ERROR: MatMult() line 1888 in >> src/mat/interface/D:\Develop\Test\PETSc\P >>> ETSC-~1.1-P\src\mat\INTERF~1\matrix.c >>> [0]PETSC ERROR: main() line 146 in >> src/ksp/ksp/examples/tutorials/D:\Develop\Tes >>> t\PETSc\PETSC-~1.1-P\src\ksp\ksp\examples\TUTORI~1\ex2.c >>> >>> This application has requested the Runtime to terminate it in an unusual >> way. >>> Please contact the application's support team for more information. >>> ================================================================= >>> >>> I am wondering what problems may lead to the errors. Please let me >> know if you need more >>> information. >>> >>> Thanks, >>> Mengda >> >> > > This indeed is caused by a bug in Visual c++ 2005 64bit compiler when using optimization. > The result is correct after installing the hotfix: http://support.microsoft.com/kb/976617/ > Thanks a lot! > > Mengda > > On Mon, Jan 17, 2011 at 12:27 PM, Barry Smith wrote: > > Compiler bug. Immediately before the call to MatMult() in the code add the two lines > > ierr = VecView(u,0); > ierr = MatView(A,0); > > how large are the two objects? Given the code it is inconceivable that suddenly the vector length becomes 57. > > Barry > > > On Jan 17, 2011, at 2:43 AM, Mengda Wu wrote: > > > Hi all, > > > > I just compiled the debugged and optimized versions of petsc-3.1-p7. > > Both are successful. I am running on Windows Vista 64bit machine. > > The C/C++ compiler is cl.exe from visual studio 2005 (64 bit) and no FORTRAN > > compiler is used. BLAS/LAPACK > > support comes from Intel MKL-10.1.3.028 (under em64t\lib). No MPI is used. > > > > The debugged petsc was configured with: > > $ ./config/configure.py --with-cc='cl' --with-fc=0 --with-cxx='cl' > > --with-mpi= > > 0 > > --with-blas-lapack-lib=[mkl_intel_lp64.lib,mkl_intel_thread.lib,mkl_core.lib > > ,libiomp5mt.lib] -CFLAGS='-MDd -W3 -Z7' -CXXFLAGS='-MDd -W3 -Z7' > > > > The optimized petsc was configured with: > > $ ./config/configure.py --with-cc='cl' --with-fc=0 --with-cxx='cl' > > --with-mpi= > > 0 --with-debugging=0 > > --with-blas-lapack-lib=[mkl_intel_lp64.lib,mkl_intel_threa > > d.lib,mkl_core.lib,libiomp5mt.lib] -CFLAGS='-MD -wd4996 -O2' -CXXFLAGS='-MD > > -w > > d4996 -O2' > > > > When I run ksp/examples/tutorials/Ex2.c. 
The result with debugged petsc > > is > > ================================================================= > > Norm of error 0.000156044 iterations 6 > > ================================================================= > > > > However, there are errors with the optimized petsc with the output as > > follows: > > ================================================================= > > [0]PETSC ERROR: --------------------- Error Message > > ---------------------------- > > -------- > > [0]PETSC ERROR: Nonconforming object sizes! > > [0]PETSC ERROR: Mat mat,Vec y: global dim 56 57! > > [0]PETSC ERROR: > > ---------------------------------------------------------------- > > -------- > > [0]PETSC ERROR: Petsc Release Version 3.1.0, Patch 7, Mon Dec 20 14:26:37 > > CST 20 > > 10 > > [0]PETSC ERROR: See docs/changes/index.html for recent updates. > > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > > [0]PETSC ERROR: See docs/index.html for manual pages. > > [0]PETSC ERROR: > > ---------------------------------------------------------------- > > -------- > > [0]PETSC ERROR: > > D:\Develop\Test\PETSc\petsc-3.1-p7\src\ksp\ksp\examples\tutorial > > s\ex2.exe on a cygwin-c- named CVBRL-38 by mengda Mon Jan 17 00:35:15 2011 > > [0]PETSC ERROR: Libraries linked from > > /cygdrive/d/Develop/Test/PETSc/petsc-3.1-p > > 7/cygwin-c-opt/lib > > [0]PETSC ERROR: Configure run at Sun Jan 16 23:34:25 2011 > > [0]PETSC ERROR: Configure options --with-cc=cl --with-fc=0 --with-cxx=cl > > --with- > > mpi=0 --with-debugging=0 > > --with-blas-lapack-lib="[mkl_intel_lp64.lib,mkl_intel_t > > hread.lib,mkl_core.lib,libiomp5mt.lib]" -CFLAGS="-MD -wd4996 -O2" > > -CXXFLAGS="-MD > > -wd4996 -O2" --useThreads=0 > > [0]PETSC ERROR: > > ---------------------------------------------------------------- > > -------- > > [0]PETSC ERROR: MatMult() line 1888 in > > src/mat/interface/D:\Develop\Test\PETSc\P > > ETSC-~1.1-P\src\mat\INTERF~1\matrix.c > > [0]PETSC ERROR: main() line 146 in > > src/ksp/ksp/examples/tutorials/D:\Develop\Tes > > t\PETSc\PETSC-~1.1-P\src\ksp\ksp\examples\TUTORI~1\ex2.c > > > > This application has requested the Runtime to terminate it in an unusual > > way. > > Please contact the application's support team for more information. > > ================================================================= > > > > I am wondering what problems may lead to the errors. Please let me know > > if you need more > > information. > > > > Thanks, > > Mengda > > > > Hi all, > > > > I just compiled the debugged and optimized versions of petsc-3.1-p7. Both are successful. I am running on Windows Vista 64bit machine. > > The C/C++ compiler is cl.exe from visual studio 2005 (64 bit) and no FORTRAN compiler is used. BLAS/LAPACK > > support comes from Intel MKL-10.1.3.028 (under em64t\lib). No MPI is used. > > > > The debugged petsc was configured with: > > $ ./config/configure.py --with-cc='cl' --with-fc=0 --with-cxx='cl' --with-mpi= > > 0 --with-blas-lapack-lib=[mkl_intel_lp64.lib,mkl_intel_thread.lib,mkl_core.lib > > ,libiomp5mt.lib] -CFLAGS='-MDd -W3 -Z7' -CXXFLAGS='-MDd -W3 -Z7' > > > > The optimized petsc was configured with: > > $ ./config/configure.py --with-cc='cl' --with-fc=0 --with-cxx='cl' --with-mpi= > > 0 --with-debugging=0 --with-blas-lapack-lib=[mkl_intel_lp64.lib,mkl_intel_threa > > d.lib,mkl_core.lib,libiomp5mt.lib] -CFLAGS='-MD -wd4996 -O2' -CXXFLAGS='-MD -w > > d4996 -O2' > > > > When I run ksp/examples/tutorials/Ex2.c. 
The result with debugged petsc is > > ================================================================= > > Norm of error 0.000156044 iterations 6 > > ================================================================= > > > > However, there are errors with the optimized petsc with the output as follows: > > ================================================================= > > [0]PETSC ERROR: --------------------- Error Message ---------------------------- > > -------- > > [0]PETSC ERROR: Nonconforming object sizes! > > [0]PETSC ERROR: Mat mat,Vec y: global dim 56 57! > > [0]PETSC ERROR: ---------------------------------------------------------------- > > -------- > > [0]PETSC ERROR: Petsc Release Version 3.1.0, Patch 7, Mon Dec 20 14:26:37 CST 20 > > 10 > > [0]PETSC ERROR: See docs/changes/index.html for recent updates. > > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > > [0]PETSC ERROR: See docs/index.html for manual pages. > > [0]PETSC ERROR: ---------------------------------------------------------------- > > -------- > > [0]PETSC ERROR: D:\Develop\Test\PETSc\petsc-3.1-p7\src\ksp\ksp\examples\tutorial > > s\ex2.exe on a cygwin-c- named CVBRL-38 by mengda Mon Jan 17 00:35:15 2011 > > [0]PETSC ERROR: Libraries linked from /cygdrive/d/Develop/Test/PETSc/petsc-3.1-p > > 7/cygwin-c-opt/lib > > [0]PETSC ERROR: Configure run at Sun Jan 16 23:34:25 2011 > > [0]PETSC ERROR: Configure options --with-cc=cl --with-fc=0 --with-cxx=cl --with- > > mpi=0 --with-debugging=0 --with-blas-lapack-lib="[mkl_intel_lp64.lib,mkl_intel_t > > hread.lib,mkl_core.lib,libiomp5mt.lib]" -CFLAGS="-MD -wd4996 -O2" -CXXFLAGS="-MD > > -wd4996 -O2" --useThreads=0 > > [0]PETSC ERROR: ---------------------------------------------------------------- > > -------- > > [0]PETSC ERROR: MatMult() line 1888 in src/mat/interface/D:\Develop\Test\PETSc\P > > ETSC-~1.1-P\src\mat\INTERF~1\matrix.c > > [0]PETSC ERROR: main() line 146 in src/ksp/ksp/examples/tutorials/D:\Develop\Tes > > t\PETSc\PETSC-~1.1-P\src\ksp\ksp\examples\TUTORI~1\ex2.c > > > > This application has requested the Runtime to terminate it in an unusual way. > > Please contact the application's support team for more information. > > ================================================================= > > > > I am wondering what problems may lead to the errors. Please let me know if you need more > > information. > > > > Thanks, > > Mengda > > From gaurish108 at gmail.com Mon Jan 17 16:46:20 2011 From: gaurish108 at gmail.com (Gaurish Telang) Date: Mon, 17 Jan 2011 17:46:20 -0500 Subject: [petsc-users] Regarding printing to standard output. and possible mistake in the code comments in PetscBinaryWrite.m Message-ID: Hi. I had two questions (1) I was curious to know why the following happens with the PETSc standard output. Having created the executable 'test' when I try to run it with mpiexec -n 2 ./test the same output is printed to the terminal twice. If I use 3 processors, then the same output is printed thrice. In short the number of processors = number of times the output from PETSc is printed. Could this be a mistake with my PETSc installation??? 
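Before digging further it is worth checking how many MPI ranks PETSc itself thinks it is running on. When the mpiexec used at run time does not belong to the MPI that PETSc was built against, each launched process starts its own independent size-1 run and prints everything itself, which produces exactly this repeated output; see Barry's reply further down. A minimal sketch of such a check, placed right after PetscInitialize() (this is a stand-alone test program written for illustration, not part of the thread; it uses petsc-3.1-era conventions):

#include "petscsys.h"

int main(int argc,char **argv)
{
  PetscErrorCode ierr;
  PetscMPIInt    size;

  ierr = PetscInitialize(&argc,&argv,PETSC_NULL,PETSC_NULL);CHKERRQ(ierr);
  ierr = MPI_Comm_size(PETSC_COMM_WORLD,&size);CHKERRQ(ierr);
  /* PetscPrintf on PETSC_COMM_WORLD prints once, from rank 0 only */
  ierr = PetscPrintf(PETSC_COMM_WORLD,"PETSc was started on %d MPI process(es)\n",size);CHKERRQ(ierr);
  ierr = PetscFinalize();CHKERRQ(ierr);
  return 0;
}

With a matching mpiexec, running under mpiexec -n 2 prints the line once and reports 2 processes; with a mismatched mpiexec (for example, one MPI implementation's launcher driving a PETSc built against another) the line appears twice, each copy reporting 1 process, which matches the symptom described here.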
For example, consider the code in src/ksp/ksp/examples/tutorials/ex23.c After creating ex23 the executable and running it with two processors gives the following terminal output: gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ mpiexec -n 1 ./ex23 KSP Object: type: gmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=1e-07, absolute=1e-50, divergence=10000 left preconditioning using PRECONDITIONED norm type for convergence test PC Object: type: jacobi linear system matrix = precond matrix: Matrix Object: type=seqaij, rows=10, cols=10 total: nonzeros=28, allocated nonzeros=50 not using I-node routines Norm of error < 1.e-12, Iterations 5 gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ mpiexec -n 2 ./ex23 KSP Object: type: gmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=1e-07, absolute=1e-50, divergence=10000 left preconditioning using PRECONDITIONED norm type for convergence test PC Object: type: jacobi linear system matrix = precond matrix: Matrix Object: type=seqaij, rows=10, cols=10 total: nonzeros=28, allocated nonzeros=50 not using I-node routines Norm of error < 1.e-12, Iterations 5 KSP Object: type: gmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=1e-07, absolute=1e-50, divergence=10000 left preconditioning using PRECONDITIONED norm type for convergence test PC Object: type: jacobi linear system matrix = precond matrix: Matrix Object: type=seqaij, rows=10, cols=10 total: nonzeros=28, allocated nonzeros=50 not using I-node routines Norm of error < 1.e-12, Iterations 5 gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ (2) Also I was told yesterday on the PETSC users mailing list that the MATLAB m file PetscBinaryWrite.m converts a sparse matrix in MATLAB into Petsc Binary format. The following are the comments in the code near the heading saying that it works only for square sparse matrices . But it seems to be working quite well for rectangular sparse MATLAB matrices also. I have tested this in conjunction with PetscBinaryRead.m also, which reads in a Petsc binary file into MATLAB as a sparse matrix. Is there something I might have missed or some error that I might be making??? Comments in PetscBinaryWrite.m "-================================================ % Writes in PETSc binary file sparse matrices and vectors % if the array is multidimensional and dense it is saved % as a one dimensional array % % Only works for square sparse matrices %: .. .. .. .. .. .. . . . -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Mon Jan 17 16:53:33 2011 From: bsmith at mcs.anl.gov (Barry Smith) Date: Mon, 17 Jan 2011 16:53:33 -0600 Subject: [petsc-users] Regarding printing to standard output. 
and possible mistake in the code comments in PetscBinaryWrite.m In-Reply-To: References: Message-ID: <74003201-E961-48C3-BBC9-37FACF5136B1@mcs.anl.gov> (1) Are you sure that mpiexec is the correct mpiexec for that build of PETSc? Run mpiexec -n 2 ./ex23 -info Likely it is not and it is running the program twice and each one thinks it is the entire world and hence each of the two run sequentially and print their own thing. (2) Likely this is an outdated comment from when it only handled square sparse matrices. Barry On Jan 17, 2011, at 4:46 PM, Gaurish Telang wrote: > Hi. > > I had two questions > > (1) > > I was curious to know why the following happens with the PETSc standard output. Having created the executable 'test' when I try to run it with mpiexec -n 2 ./test > the same output is printed to the terminal twice. If I use 3 processors, then the same output is printed thrice. > > In short the number of processors = number of times the output from PETSc is printed. Could this be a mistake with my PETSc installation??? > > For example, consider the code in src/ksp/ksp/examples/tutorials/ex23.c After creating ex23 the executable and running it with two processors gives the following terminal output: > > gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ mpiexec -n 1 ./ex23 > KSP Object: > type: gmres > GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement > GMRES: happy breakdown tolerance 1e-30 > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-07, absolute=1e-50, divergence=10000 > left preconditioning > using PRECONDITIONED norm type for convergence test > PC Object: > type: jacobi > linear system matrix = precond matrix: > Matrix Object: > type=seqaij, rows=10, cols=10 > total: nonzeros=28, allocated nonzeros=50 > not using I-node routines > Norm of error < 1.e-12, Iterations 5 > gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ mpiexec -n 2 ./ex23 > KSP Object: > type: gmres > GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement > GMRES: happy breakdown tolerance 1e-30 > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-07, absolute=1e-50, divergence=10000 > left preconditioning > using PRECONDITIONED norm type for convergence test > PC Object: > type: jacobi > linear system matrix = precond matrix: > Matrix Object: > type=seqaij, rows=10, cols=10 > total: nonzeros=28, allocated nonzeros=50 > not using I-node routines > Norm of error < 1.e-12, Iterations 5 > KSP Object: > type: gmres > GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement > GMRES: happy breakdown tolerance 1e-30 > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-07, absolute=1e-50, divergence=10000 > left preconditioning > using PRECONDITIONED norm type for convergence test > PC Object: > type: jacobi > linear system matrix = precond matrix: > Matrix Object: > type=seqaij, rows=10, cols=10 > total: nonzeros=28, allocated nonzeros=50 > not using I-node routines > Norm of error < 1.e-12, Iterations 5 > gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ > > > > (2) > > Also I was told yesterday on the PETSC users mailing list that the MATLAB m file PetscBinaryWrite.m converts a sparse matrix in MATLAB 
into Petsc Binary format. > The following are the comments in the code near the heading saying that it works only for square sparse matrices . But it seems to be working quite well for rectangular sparse MATLAB matrices also. > I have tested this in conjunction with PetscBinaryRead.m also, which reads in a Petsc binary file into MATLAB as a sparse matrix. > > Is there something I might have missed or some error that I might be making??? > > Comments in PetscBinaryWrite.m > "-================================================ > % Writes in PETSc binary file sparse matrices and vectors > % if the array is multidimensional and dense it is saved > % as a one dimensional array > % > % Only works for square sparse matrices > %: > .. > .. > .. > .. > .. > .. > . > . > . > > > > From gaurish108 at gmail.com Mon Jan 17 17:20:19 2011 From: gaurish108 at gmail.com (Gaurish Telang) Date: Mon, 17 Jan 2011 18:20:19 -0500 Subject: [petsc-users] Regarding printing to standard output. and possible mistake in the code comments in PetscBinaryWrite.m In-Reply-To: References: Message-ID: This is what I get on running mpiexec -n 2 ./ex23 -info Also, using mpirun in place of mpiexec and using the -info option I get the exact same output you see below. As far as the MPI implmentation I am using, I have OpenMPI and MPICH installed on my laptop. While installing PETSc there were some external packages required. In the external packages folder I can see the following softwares: fblaslapack-3.1.1 mpich2-1.0.8 ParMetis-dev-p3 SuperLU_DIST_2.4-hg-v2 Possibly it is this mpich2 that should be used?? Please let me know what I should do. I am quite new to PETSc. gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ mpiexec -n 2 ./ex23 -info [0] PetscInitialize(): PETSc successfully started: number of processors = 1 [0] PetscGetHostName(): Rejecting domainname, likely is NIS gaurish108-laptop.(none) [0] PetscInitialize(): Running on machine: gaurish108-laptop [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 [0] PetscCommDuplicate(): returning tag 2147483647 [0] PetscCommDuplicate(): returning tag 2147483646 [0] PetscCommDuplicate(): returning tag 2147483645 [0] PetscInitialize(): PETSc successfully started: number of processors = 1 [0] PetscGetHostName(): Rejecting domainname, likely is NIS gaurish108-laptop.(none) [0] PetscInitialize(): Running on machine: gaurish108-laptop [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 [0] PetscCommDuplicate(): returning tag 2147483647 [0] PetscCommDuplicate(): returning tag 2147483646 [0] PetscCommDuplicate(): returning tag 2147483645 [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784 [0] PetscCommDuplicate(): returning tag 2147483644 [0] MatSetUpPreallocation(): Warning not preallocating matrix storage [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 10 X 10; storage space: 22 unneeded,28 used [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0 [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3 [0] Mat_CheckInode(): Found 10 nodes out of 10 rows. 
Not using Inode routines [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784 [0] PetscCommDuplicate(): returning tag 2147483643 [0] PetscCommDuplicate(): returning tag 2147483642 [0] PetscCommDuplicate(): returning tag 2147483641 [0] PetscCommDuplicate(): returning tag 2147483640 [0] PetscCommDuplicate(): returning tag 2147483639 [0] PetscCommDuplicate(): returning tag 2147483638 [0] PetscCommDuplicate(): returning tag 2147483637 [0] PCSetUp(): Setting up new PC [0] PetscCommDuplicate(): returning tag 2147483636 [0] PetscCommDuplicate(): returning tag 2147483635 [0] PetscCommDuplicate(): returning tag 2147483634 [0] PetscCommDuplicate(): returning tag 2147483633 [0] PetscCommDuplicate(): returning tag 2147483632 [0] PetscCommDuplicate(): returning tag 2147483631 [0] PetscCommDuplicate(): returning tag 2147483630 [0] PetscCommDuplicate(): returning tag 2147483629 [0] PetscCommDuplicate(): returning tag 2147483628 [0] PetscCommDuplicate(): returning tag 2147483627 [0] PetscCommDuplicate(): returning tag 2147483626 [0] KSPDefaultConverged(): Linear solver has converged. Residual norm 4.50879e-16 is less than relative tolerance 1e-07 times initial right hand side norm 0.707107 at iteration 5 [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784 [0] PetscCommDuplicate(): returning tag 2147483625 KSP Object: type: gmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=1e-07, absolute=1e-50, divergence=10000 left preconditioning using PRECONDITIONED norm type for convergence test PC Object: type: jacobi linear system matrix = precond matrix: Matrix Object: type=seqaij, rows=10, cols=10 total: nonzeros=28, allocated nonzeros=50 not using I-node routines Norm of error < 1.e-12, Iterations 5 [0] PetscFinalize(): PetscFinalize() called [0] PetscCommDuplicate(): returning tag 2147483624 [0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784 [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850688 [0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 [0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374784 [0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784 [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784 [0] PetscCommDuplicate(): returning tag 2147483644 [0] MatSetUpPreallocation(): Warning not preallocating matrix storage [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 10 X 10; storage space: 22 unneeded,28 used [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0 [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3 [0] Mat_CheckInode(): Found 10 nodes out of 10 rows. 
Not using Inode routines [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784 [0] PetscCommDuplicate(): returning tag 2147483643 [0] PetscCommDuplicate(): returning tag 2147483642 [0] PetscCommDuplicate(): returning tag 2147483641 [0] PetscCommDuplicate(): returning tag 2147483640 [0] PetscCommDuplicate(): returning tag 2147483639 [0] PetscCommDuplicate(): returning tag 2147483638 [0] PetscCommDuplicate(): returning tag 2147483637 [0] PCSetUp(): Setting up new PC [0] PetscCommDuplicate(): returning tag 2147483636 [0] PetscCommDuplicate(): returning tag 2147483635 [0] PetscCommDuplicate(): returning tag 2147483634 [0] PetscCommDuplicate(): returning tag 2147483633 [0] PetscCommDuplicate(): returning tag 2147483632 [0] PetscCommDuplicate(): returning tag 2147483631 [0] PetscCommDuplicate(): returning tag 2147483630 [0] PetscCommDuplicate(): returning tag 2147483629 [0] PetscCommDuplicate(): returning tag 2147483628 [0] PetscCommDuplicate(): returning tag 2147483627 [0] PetscCommDuplicate(): returning tag 2147483626 [0] KSPDefaultConverged(): Linear solver has converged. Residual norm 4.50879e-16 is less than relative tolerance 1e-07 times initial right hand side norm 0.707107 at iteration 5 [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784 [0] PetscCommDuplicate(): returning tag 2147483625 KSP Object: type: gmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=1e-07, absolute=1e-50, divergence=10000 left preconditioning using PRECONDITIONED norm type for convergence test PC Object: type: jacobi linear system matrix = precond matrix: Matrix Object: type=seqaij, rows=10, cols=10 total: nonzeros=28, allocated nonzeros=50 not using I-node routines Norm of error < 1.e-12, Iterations 5 [0] PetscFinalize(): PetscFinalize() called [0] PetscCommDuplicate(): returning tag 2147483624 [0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784 [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850688 [0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 [0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374784 [0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784 gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ On Mon, Jan 17, 2011 at 5:46 PM, Gaurish Telang wrote: > Hi. > > I had two questions > > (1) > > I was curious to know why the following happens with the PETSc standard > output. Having created the executable 'test' when I try to run it with > mpiexec -n 2 ./test > the same output is printed to the terminal twice. If I use 3 processors, > then the same output is printed thrice. > > In short the number of processors = number of times the output from PETSc > is printed. Could this be a mistake with my PETSc installation??? 
> > For example, consider the code in src/ksp/ksp/examples/tutorials/ex23.c > After creating ex23 the executable and running it with two processors gives > the following terminal output: > > gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ > mpiexec -n 1 ./ex23 > KSP Object: > type: gmres > GMRES: restart=30, using Classical (unmodified) Gram-Schmidt > Orthogonalization with no iterative refinement > GMRES: happy breakdown tolerance 1e-30 > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-07, absolute=1e-50, divergence=10000 > left preconditioning > using PRECONDITIONED norm type for convergence test > PC Object: > type: jacobi > linear system matrix = precond matrix: > Matrix Object: > type=seqaij, rows=10, cols=10 > total: nonzeros=28, allocated nonzeros=50 > not using I-node routines > Norm of error < 1.e-12, Iterations 5 > gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ > mpiexec -n 2 ./ex23 > KSP Object: > type: gmres > GMRES: restart=30, using Classical (unmodified) Gram-Schmidt > Orthogonalization with no iterative refinement > GMRES: happy breakdown tolerance 1e-30 > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-07, absolute=1e-50, divergence=10000 > left preconditioning > using PRECONDITIONED norm type for convergence test > PC Object: > type: jacobi > linear system matrix = precond matrix: > Matrix Object: > type=seqaij, rows=10, cols=10 > total: nonzeros=28, allocated nonzeros=50 > not using I-node routines > Norm of error < 1.e-12, Iterations 5 > KSP Object: > type: gmres > GMRES: restart=30, using Classical (unmodified) Gram-Schmidt > Orthogonalization with no iterative refinement > GMRES: happy breakdown tolerance 1e-30 > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-07, absolute=1e-50, divergence=10000 > left preconditioning > using PRECONDITIONED norm type for convergence test > PC Object: > type: jacobi > linear system matrix = precond matrix: > Matrix Object: > type=seqaij, rows=10, cols=10 > total: nonzeros=28, allocated nonzeros=50 > not using I-node routines > Norm of error < 1.e-12, Iterations 5 > gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ > > > > > (2) > > Also I was told yesterday on the PETSC users mailing list that the MATLAB m > file PetscBinaryWrite.m converts a sparse matrix in MATLAB into Petsc Binary > format. > The following are the comments in the code near the heading saying that > it works only for square sparse matrices . But it seems to be working quite > well for rectangular sparse MATLAB matrices also. > I have tested this in conjunction with PetscBinaryRead.m also, which reads > in a Petsc binary file into MATLAB as a sparse matrix. > > Is there something I might have missed or some error that I might be > making??? > > Comments in PetscBinaryWrite.m > "-================================================ > % Writes in PETSc binary file sparse matrices and vectors > % if the array is multidimensional and dense it is saved > % as a one dimensional array > % > % Only works for square sparse matrices > %: > .. > .. > .. > .. > .. > .. > . > . > . > > > > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From balay at mcs.anl.gov Mon Jan 17 17:25:39 2011 From: balay at mcs.anl.gov (Satish Balay) Date: Mon, 17 Jan 2011 17:25:39 -0600 (CST) Subject: [petsc-users] Regarding printing to standard output. and possible mistake in the code comments in PetscBinaryWrite.m In-Reply-To: References: Message-ID: Looks like you've installed petsc with --download-mpich - so you should use the corresponding mpiexec [its in PETSC_DIR/PETSC_ARCH/bin/] Since you already have 2 installs of MPI - you could have used one of them for PETSc install [by specifying the corresponding mpicc,mpif90 to PETSc configure -] instead of --download-mpich. Satish On Mon, 17 Jan 2011, Gaurish Telang wrote: > This is what I get on running mpiexec -n 2 ./ex23 -info > > Also, using mpirun in place of mpiexec and using the -info option I get the > exact same output you see below. > > As far as the MPI implmentation I am using, I have OpenMPI and MPICH > installed on my laptop. > > While installing PETSc there were some external packages required. In the > external packages folder I can see the following softwares: > > fblaslapack-3.1.1 mpich2-1.0.8 ParMetis-dev-p3 SuperLU_DIST_2.4-hg-v2 > > Possibly it is this mpich2 that should be used?? > Please let me know what I should do. I am quite new to PETSc. > > gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ > mpiexec -n 2 ./ex23 -info > [0] PetscInitialize(): PETSc successfully started: number of processors = 1 > [0] PetscGetHostName(): Rejecting domainname, likely is NIS > gaurish108-laptop.(none) > [0] PetscInitialize(): Running on machine: gaurish108-laptop > [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 > max tags = 2147483647 > [0] PetscCommDuplicate(): returning tag 2147483647 > [0] PetscCommDuplicate(): returning tag 2147483646 > [0] PetscCommDuplicate(): returning tag 2147483645 > [0] PetscInitialize(): PETSc successfully started: number of processors = 1 > [0] PetscGetHostName(): Rejecting domainname, likely is NIS > gaurish108-laptop.(none) > [0] PetscInitialize(): Running on machine: gaurish108-laptop > [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 > max tags = 2147483647 > [0] PetscCommDuplicate(): returning tag 2147483647 > [0] PetscCommDuplicate(): returning tag 2147483646 > [0] PetscCommDuplicate(): returning tag 2147483645 > [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 > -2080374784 > [0] PetscCommDuplicate(): returning tag 2147483644 > [0] MatSetUpPreallocation(): Warning not preallocating matrix storage > [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 10 X 10; storage space: 22 > unneeded,28 used > [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0 > [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3 > [0] Mat_CheckInode(): Found 10 nodes out of 10 rows. 
Not using Inode > routines > [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 > -2080374784 > [0] PetscCommDuplicate(): returning tag 2147483643 > [0] PetscCommDuplicate(): returning tag 2147483642 > [0] PetscCommDuplicate(): returning tag 2147483641 > [0] PetscCommDuplicate(): returning tag 2147483640 > [0] PetscCommDuplicate(): returning tag 2147483639 > [0] PetscCommDuplicate(): returning tag 2147483638 > [0] PetscCommDuplicate(): returning tag 2147483637 > [0] PCSetUp(): Setting up new PC > [0] PetscCommDuplicate(): returning tag 2147483636 > [0] PetscCommDuplicate(): returning tag 2147483635 > [0] PetscCommDuplicate(): returning tag 2147483634 > [0] PetscCommDuplicate(): returning tag 2147483633 > [0] PetscCommDuplicate(): returning tag 2147483632 > [0] PetscCommDuplicate(): returning tag 2147483631 > [0] PetscCommDuplicate(): returning tag 2147483630 > [0] PetscCommDuplicate(): returning tag 2147483629 > [0] PetscCommDuplicate(): returning tag 2147483628 > [0] PetscCommDuplicate(): returning tag 2147483627 > [0] PetscCommDuplicate(): returning tag 2147483626 > [0] KSPDefaultConverged(): Linear solver has converged. Residual norm > 4.50879e-16 is less than relative tolerance 1e-07 times initial right hand > side norm 0.707107 at iteration 5 > [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 > -2080374784 > [0] PetscCommDuplicate(): returning tag 2147483625 > KSP Object: > type: gmres > GMRES: restart=30, using Classical (unmodified) Gram-Schmidt > Orthogonalization with no iterative refinement > GMRES: happy breakdown tolerance 1e-30 > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-07, absolute=1e-50, divergence=10000 > left preconditioning > using PRECONDITIONED norm type for convergence test > PC Object: > type: jacobi > linear system matrix = precond matrix: > Matrix Object: > type=seqaij, rows=10, cols=10 > total: nonzeros=28, allocated nonzeros=50 > not using I-node routines > Norm of error < 1.e-12, Iterations 5 > [0] PetscFinalize(): PetscFinalize() called > [0] PetscCommDuplicate(): returning tag 2147483624 > [0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784 > [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm > 1140850688 > [0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 > [0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 > [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm > -2080374784 > [0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784 > [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 > -2080374784 > [0] PetscCommDuplicate(): returning tag 2147483644 > [0] MatSetUpPreallocation(): Warning not preallocating matrix storage > [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 10 X 10; storage space: 22 > unneeded,28 used > [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0 > [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3 > [0] Mat_CheckInode(): Found 10 nodes out of 10 rows. 
Not using Inode > routines > [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 > -2080374784 > [0] PetscCommDuplicate(): returning tag 2147483643 > [0] PetscCommDuplicate(): returning tag 2147483642 > [0] PetscCommDuplicate(): returning tag 2147483641 > [0] PetscCommDuplicate(): returning tag 2147483640 > [0] PetscCommDuplicate(): returning tag 2147483639 > [0] PetscCommDuplicate(): returning tag 2147483638 > [0] PetscCommDuplicate(): returning tag 2147483637 > [0] PCSetUp(): Setting up new PC > [0] PetscCommDuplicate(): returning tag 2147483636 > [0] PetscCommDuplicate(): returning tag 2147483635 > [0] PetscCommDuplicate(): returning tag 2147483634 > [0] PetscCommDuplicate(): returning tag 2147483633 > [0] PetscCommDuplicate(): returning tag 2147483632 > [0] PetscCommDuplicate(): returning tag 2147483631 > [0] PetscCommDuplicate(): returning tag 2147483630 > [0] PetscCommDuplicate(): returning tag 2147483629 > [0] PetscCommDuplicate(): returning tag 2147483628 > [0] PetscCommDuplicate(): returning tag 2147483627 > [0] PetscCommDuplicate(): returning tag 2147483626 > [0] KSPDefaultConverged(): Linear solver has converged. Residual norm > 4.50879e-16 is less than relative tolerance 1e-07 times initial right hand > side norm 0.707107 at iteration 5 > [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 > -2080374784 > [0] PetscCommDuplicate(): returning tag 2147483625 > KSP Object: > type: gmres > GMRES: restart=30, using Classical (unmodified) Gram-Schmidt > Orthogonalization with no iterative refinement > GMRES: happy breakdown tolerance 1e-30 > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-07, absolute=1e-50, divergence=10000 > left preconditioning > using PRECONDITIONED norm type for convergence test > PC Object: > type: jacobi > linear system matrix = precond matrix: > Matrix Object: > type=seqaij, rows=10, cols=10 > total: nonzeros=28, allocated nonzeros=50 > not using I-node routines > Norm of error < 1.e-12, Iterations 5 > [0] PetscFinalize(): PetscFinalize() called > [0] PetscCommDuplicate(): returning tag 2147483624 > [0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784 > [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm > 1140850688 > [0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 > [0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 > [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm > -2080374784 > [0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784 > gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ > > > On Mon, Jan 17, 2011 at 5:46 PM, Gaurish Telang wrote: > > > Hi. > > > > I had two questions > > > > (1) > > > > I was curious to know why the following happens with the PETSc standard > > output. Having created the executable 'test' when I try to run it with > > mpiexec -n 2 ./test > > the same output is printed to the terminal twice. If I use 3 processors, > > then the same output is printed thrice. > > > > In short the number of processors = number of times the output from PETSc > > is printed. Could this be a mistake with my PETSc installation??? 
> > > > For example, consider the code in src/ksp/ksp/examples/tutorials/ex23.c > > After creating ex23 the executable and running it with two processors gives > > the following terminal output: > > > > gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ > > mpiexec -n 1 ./ex23 > > KSP Object: > > type: gmres > > GMRES: restart=30, using Classical (unmodified) Gram-Schmidt > > Orthogonalization with no iterative refinement > > GMRES: happy breakdown tolerance 1e-30 > > maximum iterations=10000, initial guess is zero > > tolerances: relative=1e-07, absolute=1e-50, divergence=10000 > > left preconditioning > > using PRECONDITIONED norm type for convergence test > > PC Object: > > type: jacobi > > linear system matrix = precond matrix: > > Matrix Object: > > type=seqaij, rows=10, cols=10 > > total: nonzeros=28, allocated nonzeros=50 > > not using I-node routines > > Norm of error < 1.e-12, Iterations 5 > > gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ > > mpiexec -n 2 ./ex23 > > KSP Object: > > type: gmres > > GMRES: restart=30, using Classical (unmodified) Gram-Schmidt > > Orthogonalization with no iterative refinement > > GMRES: happy breakdown tolerance 1e-30 > > maximum iterations=10000, initial guess is zero > > tolerances: relative=1e-07, absolute=1e-50, divergence=10000 > > left preconditioning > > using PRECONDITIONED norm type for convergence test > > PC Object: > > type: jacobi > > linear system matrix = precond matrix: > > Matrix Object: > > type=seqaij, rows=10, cols=10 > > total: nonzeros=28, allocated nonzeros=50 > > not using I-node routines > > Norm of error < 1.e-12, Iterations 5 > > KSP Object: > > type: gmres > > GMRES: restart=30, using Classical (unmodified) Gram-Schmidt > > Orthogonalization with no iterative refinement > > GMRES: happy breakdown tolerance 1e-30 > > maximum iterations=10000, initial guess is zero > > tolerances: relative=1e-07, absolute=1e-50, divergence=10000 > > left preconditioning > > using PRECONDITIONED norm type for convergence test > > PC Object: > > type: jacobi > > linear system matrix = precond matrix: > > Matrix Object: > > type=seqaij, rows=10, cols=10 > > total: nonzeros=28, allocated nonzeros=50 > > not using I-node routines > > Norm of error < 1.e-12, Iterations 5 > > gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ > > > > > > > > > > (2) > > > > Also I was told yesterday on the PETSC users mailing list that the MATLAB m > > file PetscBinaryWrite.m converts a sparse matrix in MATLAB into Petsc Binary > > format. > > The following are the comments in the code near the heading saying that > > it works only for square sparse matrices . But it seems to be working quite > > well for rectangular sparse MATLAB matrices also. > > I have tested this in conjunction with PetscBinaryRead.m also, which reads > > in a Petsc binary file into MATLAB as a sparse matrix. > > > > Is there something I might have missed or some error that I might be > > making??? > > > > Comments in PetscBinaryWrite.m > > "-================================================ > > % Writes in PETSc binary file sparse matrices and vectors > > % if the array is multidimensional and dense it is saved > > % as a one dimensional array > > % > > % Only works for square sparse matrices > > %: > > .. > > .. > > .. > > .. > > .. > > .. > > . > > . > > . 
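To make the check above concrete, here is a minimal C sketch (not taken from this thread; the file name mpicheck.c and the MPICH2_VERSION / OPEN_MPI macros are assumptions about the particular MPI installs involved). Compiled with the mpicc belonging to the PETSc build and launched with each candidate mpiexec, it shows directly whether the launcher and the library agree: a matching pair reports size 2 from ranks 0 and 1, while a mismatched mpiexec starts two independent programs that each report rank 0 of size 1.

/* mpicheck.c - minimal sketch, petsc-3.1 era calling sequences */
#include "petsc.h"
#include <stdio.h>

int main(int argc, char **argv)
{
  PetscErrorCode ierr;
  PetscMPIInt    rank, size;

  ierr = PetscInitialize(&argc, &argv, (char*)0, (char*)0);CHKERRQ(ierr);
  ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRQ(ierr);
  ierr = MPI_Comm_size(PETSC_COMM_WORLD, &size);CHKERRQ(ierr);

  /* Every process reports its own view of the world. */
  printf("[%d] I see %d process(es) in PETSC_COMM_WORLD\n", rank, size);

  /* Which mpi.h was this binary compiled against? These macros are
     implementation-specific (assumed here for MPICH2 and Open MPI). */
#if defined(MPICH2_VERSION)
  printf("[%d] compiled against MPICH2 %s\n", rank, MPICH2_VERSION);
#elif defined(OPEN_MPI)
  printf("[%d] compiled against Open MPI\n", rank);
#endif

  ierr = PetscFinalize();CHKERRQ(ierr);
  return 0;
}

Run with the launcher that matches the build, e.g. $PETSC_DIR/$PETSC_ARCH/bin/mpiexec -n 2 ./mpicheck, it should print two distinct ranks and size 2.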
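On the PetscBinaryWrite.m question, Barry's reading above is that the "square matrices only" restriction in the comment is simply outdated. A quick way to confirm from the PETSc side is to load the file the MATLAB script produced and print its dimensions. The sketch below is not from the thread: the file name matrix.bin is an assumption, and the MatLoad / PetscViewerDestroy / MatDestroy calling sequences are the petsc-3.1 ones (later releases changed MatLoad to take the Mat first and the destroy routines to take pointers).

/* loadcheck.c - minimal sketch, petsc-3.1 calling sequences assumed */
#include "petscmat.h"

int main(int argc, char **argv)
{
  PetscErrorCode ierr;
  PetscViewer    viewer;
  Mat            A;
  PetscInt       m, n;

  ierr = PetscInitialize(&argc, &argv, (char*)0, (char*)0);CHKERRQ(ierr);

  /* matrix.bin is assumed to have been written from MATLAB, e.g. with
     PetscBinaryWrite('matrix.bin', sparse(rand(7,3))). */
  ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD, "matrix.bin", FILE_MODE_READ, &viewer);CHKERRQ(ierr);
  ierr = MatLoad(viewer, MATAIJ, &A);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(viewer);CHKERRQ(ierr);

  ierr = MatGetSize(A, &m, &n);CHKERRQ(ierr);
  ierr = PetscPrintf(PETSC_COMM_WORLD, "Loaded a %D x %D matrix\n", m, n);CHKERRQ(ierr);

  ierr = MatDestroy(A);CHKERRQ(ierr);
  ierr = PetscFinalize();CHKERRQ(ierr);
  return 0;
}

Seeing the expected rectangular dimensions (7 x 3 in the assumed example) confirms the MATLAB-to-PETSc round trip preserved the matrix shape.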
> > > > > > > > > > > From gaurish108 at gmail.com Mon Jan 17 18:36:19 2011 From: gaurish108 at gmail.com (Gaurish Telang) Date: Mon, 17 Jan 2011 19:36:19 -0500 Subject: [petsc-users] Regarding printing to standard output. and possible mistake in the code comments in PetscBinaryWrite.m In-Reply-To: References: Message-ID: Thank you that seems to have worked! Just to confirm, I have posted the output at the end of this message. I hope this is the way the generic output should look like. I have still have a few questions though. (1) So all I need to do, is to specify the location of the correct executable mpiexec which is $PETSC_DIR/$PETSC_ARCH/bin/mpiexec while running the program, right? The contents of my $PETSC_DIR/$PETSC_ARCH/bin are gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/linux-gnu-c-debug/bin$ ls mpicc mpich2version mpiexec mpif77 mpif90 parkill (2) Do I need to make any changes in the makefiles of the PETSc programs that I have written? And hence recompile my codes by using the "new" mpiexec I mean, since $PETSC_DIR/$PETSC_ARCH/bin/ also contains mpicc (as seen above), I want to be sure that the correct mpicc is being used during execution. (3) Should I run the mpd daemon before using mpiexec??? On the MPICH2 that I had installed prior to my PETSc it required me type "mpd &" before program execution. But it seems for my PETSc mpiexec I don;t need mpd. But should I type it in ?? I mean I am not sure if this affects program performance Sincere thanks, Gaurish gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ $PETSC_DIR/$PETSC_ARCH/bin/mpiexec -n 2 ./ex23 -info [0] PetscInitialize(): PETSc successfully started: number of processors = 2 [1] PetscInitialize(): PETSc successfully started: number of processors = 2 [1] PetscGetHostName(): Rejecting domainname, likely is NIS gaurish108-laptop.(none) [1] PetscInitialize(): Running on machine: gaurish108-laptop [0] PetscGetHostName(): Rejecting domainname, likely is NIS gaurish108-laptop.(none) [0] PetscInitialize(): Running on machine: gaurish108-laptop [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 [1] PetscCommDuplicate(): returning tag 2147483647 [0] PetscCommDuplicate(): returning tag 2147483647 [0] PetscCommDuplicate(): returning tag 2147483642 [1] PetscCommDuplicate(): returning tag 2147483642 [0] PetscCommDuplicate(): returning tag 2147483637 [1] PetscCommDuplicate(): returning tag 2147483637 [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784 [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784 [0] PetscCommDuplicate(): returning tag 2147483632 [1] PetscCommDuplicate(): returning tag 2147483632 [0] MatSetUpPreallocation(): Warning not preallocating matrix storage [0] PetscCommDuplicate(): Duplicating a communicator 1140850689 -2080374783 max tags = 2147483647 [0] PetscCommDuplicate(): returning tag 2147483647 [1] PetscCommDuplicate(): Duplicating a communicator 1140850689 -2080374783 max tags = 2147483647 [1] PetscCommDuplicate(): returning tag 2147483647 [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783 [0] PetscCommDuplicate(): returning tag 2147483646 [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783 [1] PetscCommDuplicate(): returning tag 2147483646 [0] 
MatStashScatterBegin_Private(): No of messages: 0 [0] MatAssemblyBegin_MPIAIJ(): Stash has 0 entries, uses 0 mallocs. [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 5 X 5; storage space: 12 unneeded,13 used [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0 [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3 [1] MatAssemblyBegin_MPIAIJ(): Stash has 0 entries, uses 0 mallocs. [1] MatAssemblyEnd_SeqAIJ(): Matrix size: 5 X 5; storage space: 12 unneeded,13 used [1] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0 [1] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3 [1] Mat_CheckInode(): Found 5 nodes out of 5 rows. Not using Inode routines [0] Mat_CheckInode(): Found 5 nodes out of 5 rows. Not using Inode routines [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783 [0] PetscCommDuplicate(): returning tag 2147483645 [0] MatSetUpMultiply_MPIAIJ(): Using block index set to define scatter [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783 [1] PetscCommDuplicate(): returning tag 2147483645 [1] PetscCommDuplicate(): returning tag 2147483628 [0] PetscCommDuplicate(): returning tag 2147483628 [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783 [0] PetscCommDuplicate(): returning tag 2147483644 [0] PetscCommDuplicate(): returning tag 2147483627 [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783 [1] PetscCommDuplicate(): returning tag 2147483644 [1] PetscCommDuplicate(): returning tag 2147483627 [1] PetscCommDuplicate(): returning tag 2147483622 [0] PetscCommDuplicate(): returning tag 2147483622 [0] VecScatterCreateCommon_PtoS(): Using blocksize 1 scatter [0] VecScatterCreate(): General case: MPI to Seq [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 5 X 1; storage space: 9 unneeded,1 used [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0 [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 1 [0] Mat_CheckCompressedRow(): Found the ratio (num_zerorows 4)/(num_localrows 5) > 0.6. Use CompressedRow routines. [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784 [1] MatAssemblyEnd_SeqAIJ(): Matrix size: 5 X 1; storage space: 9 unneeded,1 used [1] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0 [1] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 1 [1] Mat_CheckCompressedRow(): Found the ratio (num_zerorows 4)/(num_localrows 5) > 0.6. Use CompressedRow routines. 
[1] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784 [1] PetscCommDuplicate(): returning tag 2147483618 [0] PetscCommDuplicate(): returning tag 2147483618 [0] PetscCommDuplicate(): returning tag 2147483617 [1] PetscCommDuplicate(): returning tag 2147483617 [1] PetscCommDuplicate(): returning tag 2147483616 [0] PetscCommDuplicate(): returning tag 2147483616 [0] PetscCommDuplicate(): returning tag 2147483611 [1] PetscCommDuplicate(): returning tag 2147483611 [1] PetscCommDuplicate(): returning tag 2147483606 [0] PetscCommDuplicate(): returning tag 2147483606 [0] PetscCommDuplicate(): returning tag 2147483601 [1] PetscCommDuplicate(): returning tag 2147483601 [1] PetscCommDuplicate(): returning tag 2147483596 [0] PetscCommDuplicate(): returning tag 2147483596 [0] PCSetUp(): Setting up new PC [1] PetscCommDuplicate(): returning tag 2147483591 [0] PetscCommDuplicate(): returning tag 2147483591 [0] PetscCommDuplicate(): returning tag 2147483586 [1] PetscCommDuplicate(): returning tag 2147483586 [0] PetscCommDuplicate(): returning tag 2147483581 [1] PetscCommDuplicate(): returning tag 2147483581 [0] PetscCommDuplicate(): returning tag 2147483576 [1] PetscCommDuplicate(): returning tag 2147483576 [0] PetscCommDuplicate(): returning tag 2147483571 [1] PetscCommDuplicate(): returning tag 2147483571 [1] PetscCommDuplicate(): returning tag 2147483566 [0] PetscCommDuplicate(): returning tag 2147483566 [0] PetscCommDuplicate(): returning tag 2147483561 [1] PetscCommDuplicate(): returning tag 2147483561 [0] PetscCommDuplicate(): returning tag 2147483556 [0] PetscCommDuplicate(): returning tag 2147483551 [0] PetscCommDuplicate(): returning tag 2147483546 [0] PetscCommDuplicate(): returning tag 2147483541 [0] KSPDefaultConverged(): Linear solver has converged. 
Residual norm 5.11279e-16 is less than relative tolerance 1e-07 times initial right hand side norm 0.707107 at iteration 5 [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784 [0] PetscCommDuplicate(): returning tag 2147483536 KSP Object: type: gmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=1e-07, absolute=1e-50, divergence=10000 left preconditioning using PRECONDITIONED norm type for convergence test PC Object: type: jacobi linear system matrix = precond matrix: Matrix Object: type=mpiaij, rows=10, cols=10 tot[1] PetscCommDuplicate(): returning tag 2147483556 [1] PetscCommDuplicate(): returning tag 2147483551 [1] PetscCommDuplicate(): returning tag 2147483546 [1] PetscCommDuplicate(): returning tag 2147483541 [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784 [1] PetscCommDuplicate(): returning tag 2147483536 [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850689 [1] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374783 [1] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374783 [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374783 al: nonzeros=28, allocated nonzeros=70 not using I-node (on process 0) routines Norm of error < 1.e-12, Iterations 5 [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850689 [0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374783 [0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374783 [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374783 [0] PetscFinalize(): PetscFinalize() called [1] PetscFinalize(): PetscFinalize() called [1] PetscCommDuplicate(): returning tag 2147483535 [1] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784 [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850688 [1] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 [0] PetscCommDuplicate(): returning tag 2147483535 [0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784 [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850688 [0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 [0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374784 [0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784 [1] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374784 [1] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784 gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ gaurish108 at 
gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ clear gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ $PETSC_DIR/$PETSC_ARCH/bin/mpich2version -n 2 ./ex23 -info Unrecognized argument -n gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ $PETSC_DIR/$PETSC_ARCH/bin/mpiexec -n 2 ./ex23 -info [0] PetscInitialize(): PETSc successfully started: number of processors = 2 [1] PetscInitialize(): PETSc successfully started: number of processors = 2 [0] PetscGetHostName(): Rejecting domainname, likely is NIS gaurish108-laptop.(none) [0] PetscInitialize(): Running on machine: gaurish108-laptop [1] PetscGetHostName(): Rejecting domainname, likely is NIS gaurish108-laptop.(none) [1] PetscInitialize(): Running on machine: gaurish108-laptop [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 max tags = 2147483647 [1] PetscCommDuplicate(): returning tag 2147483647 [0] PetscCommDuplicate(): returning tag 2147483647 [0] PetscCommDuplicate(): returning tag 2147483642 [0] PetscCommDuplicate(): returning tag 2147483637 [1] PetscCommDuplicate(): returning tag 2147483642 [1] PetscCommDuplicate(): returning tag 2147483637 [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784 [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784 [1] PetscCommDuplicate(): returning tag 2147483632 [0] PetscCommDuplicate(): returning tag 2147483632 [0] MatSetUpPreallocation(): Warning not preallocating matrix storage [0] PetscCommDuplicate(): Duplicating a communicator 1140850689 -2080374783 max tags = 2147483647 [0] PetscCommDuplicate(): returning tag 2147483647 [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783 [0] PetscCommDuplicate(): returning tag 2147483646 [1] PetscCommDuplicate(): Duplicating a communicator 1140850689 -2080374783 max tags = 2147483647 [1] PetscCommDuplicate(): returning tag 2147483647 [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783 [1] PetscCommDuplicate(): returning tag 2147483646 [0] MatStashScatterBegin_Private(): No of messages: 0 [0] MatAssemblyBegin_MPIAIJ(): Stash has 0 entries, uses 0 mallocs. [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 5 X 5; storage space: 12 unneeded,13 used [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0 [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3 [0] Mat_CheckInode(): Found 5 nodes out of 5 rows. Not using Inode routines [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783 [0] PetscCommDuplicate(): returning tag 2147483645 [0] MatSetUpMultiply_MPIAIJ(): Using block index set to define scatter [0] PetscCommDuplicate(): returning tag 2147483628 [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783 [0] PetscCommDuplicate(): returning tag 2147483644 [0] PetscCommDuplicate(): returning tag 2147483627 [0] PetscCommDuplicate(): returning tag 2147483622 [1] MatAssemblyBegin_MPIAIJ(): Stash has 0 entries, uses 0 mallocs. 
[1] MatAssemblyEnd_SeqAIJ(): Matrix size: 5 X 5; storage space: 12 unneeded,13 used [1] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0 [1] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3 [1] Mat_CheckInode(): Found 5 nodes out of 5 rows. Not using Inode routines [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783 [1] PetscCommDuplicate(): returning tag 2147483645 [1] PetscCommDuplicate(): returning tag 2147483628 [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 -2080374783 [1] PetscCommDuplicate(): returning tag 2147483644 [1] PetscCommDuplicate(): returning tag 2147483627 [1] PetscCommDuplicate(): returning tag 2147483622 [0] VecScatterCreateCommon_PtoS(): Using blocksize 1 scatter [0] VecScatterCreate(): General case: MPI to Seq [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 5 X 1; storage space: 9 unneeded,1 used [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0 [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 1 [0] Mat_CheckCompressedRow(): Found the ratio (num_zerorows 4)/(num_localrows 5) > 0.6. Use CompressedRow routines. [1] MatAssemblyEnd_SeqAIJ(): Matrix size: 5 X 1; storage space: 9 unneeded,1 used [1] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0 [1] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 1 [1] Mat_CheckCompressedRow(): Found the ratio (num_zerorows 4)/(num_localrows 5) > 0.6. Use CompressedRow routines. [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784 [1] PetscCommDuplicate(): returning tag 2147483618 [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784 [0] PetscCommDuplicate(): returning tag 2147483618 [0] PetscCommDuplicate(): returning tag 2147483617 [0] PetscCommDuplicate(): returning tag 2147483616 [1] PetscCommDuplicate(): returning tag 2147483617 [1] PetscCommDuplicate(): returning tag 2147483616 [1] PetscCommDuplicate(): returning tag 2147483611 [1] PetscCommDuplicate(): returning tag 2147483606 [0] PetscCommDuplicate(): returning tag 2147483611 [0] PetscCommDuplicate(): returning tag 2147483606 [0] PetscCommDuplicate(): returning tag 2147483601 [0] PetscCommDuplicate(): returning tag 2147483596 [0] PCSetUp(): Setting up new PC [1] PetscCommDuplicate(): returning tag 2147483601 [1] PetscCommDuplicate(): returning tag 2147483596 [1] PetscCommDuplicate(): returning tag 2147483591 [0] PetscCommDuplicate(): returning tag 2147483591 [0] PetscCommDuplicate(): returning tag 2147483586 [0] PetscCommDuplicate(): returning tag 2147483581 [0] PetscCommDuplicate(): returning tag 2147483576 [1] PetscCommDuplicate(): returning tag 2147483586 [1] PetscCommDuplicate(): returning tag 2147483581 [1] PetscCommDuplicate(): returning tag 2147483576 [1] PetscCommDuplicate(): returning tag 2147483571 [0] PetscCommDuplicate(): returning tag 2147483571 [0] PetscCommDuplicate(): returning tag 2147483566 [0] PetscCommDuplicate(): returning tag 2147483561 [1] PetscCommDuplicate(): returning tag 2147483566 [1] PetscCommDuplicate(): returning tag 2147483561 [1] PetscCommDuplicate(): returning tag 2147483556 [0] PetscCommDuplicate(): returning tag 2147483556 [0] PetscCommDuplicate(): returning tag 2147483551 [1] PetscCommDuplicate(): returning tag 2147483551 [1] PetscCommDuplicate(): returning tag 2147483546 [0] PetscCommDuplicate(): returning tag 2147483546 [0] PetscCommDuplicate(): returning tag 2147483541 [1] PetscCommDuplicate(): returning tag 2147483541 [0] 
KSPDefaultConverged(): Linear solver has converged. Residual norm 5.11279e-16 is less than relative tolerance 1e-07 times initial right hand side norm 0.707107 at iteration 5 [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784 [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 -2080374784 [0] PetscCommDuplicate(): returning tag 2147483536 KSP Object: type: gmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=1e-07, absolute=1e-50, divergence=10000 left preconditioning using PRECONDITIONED norm type for convergence test PC Object: type: jacobi linear system matrix = precond matrix: Matrix Object: type=mpiaij, rows=10, cols=10 [1] PetscCommDuplicate(): returning tag 2147483536 total: nonzeros=28, allocated nonzeros=70 not using I-node (on process 0) routines Norm of error < 1.e-12, Iterations 5 [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850689 [0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374783 [0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374783 [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374783 [0] PetscFinalize(): PetscFinalize() called [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850689 [1] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374783 [1] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374783 [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374783 [1] PetscFinalize(): PetscFinalize() called [1] PetscCommDuplicate(): returning tag 2147483535 [1] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784 [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850688 [1] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 [1] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374784 [1] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784 [0] PetscCommDuplicate(): returning tag 2147483535 [0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784 [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm 1140850688 [0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 [0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user MPI_Comm -2080374784 [0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784 gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ On Mon, Jan 17, 2011 at 6:20 PM, Gaurish Telang wrote: > This is what I get on running mpiexec -n 2 ./ex23 -info > > Also, using mpirun in place of mpiexec and using the -info option I get > the exact same output you see below. > > As far as the MPI implmentation I am using, I have OpenMPI and MPICH > installed on my laptop. > > While installing PETSc there were some external packages required. In the > external packages folder I can see the following softwares: > > fblaslapack-3.1.1 mpich2-1.0.8 ParMetis-dev-p3 SuperLU_DIST_2.4-hg-v2 > > Possibly it is this mpich2 that should be used?? > Please let me know what I should do. I am quite new to PETSc. 
> > gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ > mpiexec -n 2 ./ex23 -info > [0] PetscInitialize(): PETSc successfully started: number of processors = 1 > [0] PetscGetHostName(): Rejecting domainname, likely is NIS > gaurish108-laptop.(none) > [0] PetscInitialize(): Running on machine: gaurish108-laptop > [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 > max tags = 2147483647 > [0] PetscCommDuplicate(): returning tag 2147483647 > [0] PetscCommDuplicate(): returning tag 2147483646 > [0] PetscCommDuplicate(): returning tag 2147483645 > [0] PetscInitialize(): PETSc successfully started: number of processors = 1 > [0] PetscGetHostName(): Rejecting domainname, likely is NIS > gaurish108-laptop.(none) > [0] PetscInitialize(): Running on machine: gaurish108-laptop > [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 > max tags = 2147483647 > [0] PetscCommDuplicate(): returning tag 2147483647 > [0] PetscCommDuplicate(): returning tag 2147483646 > [0] PetscCommDuplicate(): returning tag 2147483645 > [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 > -2080374784 > [0] PetscCommDuplicate(): returning tag 2147483644 > [0] MatSetUpPreallocation(): Warning not preallocating matrix storage > [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 10 X 10; storage space: 22 > unneeded,28 used > [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0 > [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3 > [0] Mat_CheckInode(): Found 10 nodes out of 10 rows. Not using Inode > routines > [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 > -2080374784 > [0] PetscCommDuplicate(): returning tag 2147483643 > [0] PetscCommDuplicate(): returning tag 2147483642 > [0] PetscCommDuplicate(): returning tag 2147483641 > [0] PetscCommDuplicate(): returning tag 2147483640 > [0] PetscCommDuplicate(): returning tag 2147483639 > [0] PetscCommDuplicate(): returning tag 2147483638 > [0] PetscCommDuplicate(): returning tag 2147483637 > [0] PCSetUp(): Setting up new PC > [0] PetscCommDuplicate(): returning tag 2147483636 > [0] PetscCommDuplicate(): returning tag 2147483635 > [0] PetscCommDuplicate(): returning tag 2147483634 > [0] PetscCommDuplicate(): returning tag 2147483633 > [0] PetscCommDuplicate(): returning tag 2147483632 > [0] PetscCommDuplicate(): returning tag 2147483631 > [0] PetscCommDuplicate(): returning tag 2147483630 > [0] PetscCommDuplicate(): returning tag 2147483629 > [0] PetscCommDuplicate(): returning tag 2147483628 > [0] PetscCommDuplicate(): returning tag 2147483627 > [0] PetscCommDuplicate(): returning tag 2147483626 > [0] KSPDefaultConverged(): Linear solver has converged. 
Residual norm > 4.50879e-16 is less than relative tolerance 1e-07 times initial right hand > side norm 0.707107 at iteration 5 > [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 > -2080374784 > [0] PetscCommDuplicate(): returning tag 2147483625 > > KSP Object: > type: gmres > GMRES: restart=30, using Classical (unmodified) Gram-Schmidt > Orthogonalization with no iterative refinement > GMRES: happy breakdown tolerance 1e-30 > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-07, absolute=1e-50, divergence=10000 > left preconditioning > using PRECONDITIONED norm type for convergence test > PC Object: > type: jacobi > linear system matrix = precond matrix: > Matrix Object: > type=seqaij, rows=10, cols=10 > total: nonzeros=28, allocated nonzeros=50 > not using I-node routines > Norm of error < 1.e-12, Iterations 5 > [0] PetscFinalize(): PetscFinalize() called > [0] PetscCommDuplicate(): returning tag 2147483624 > [0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784 > [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user > MPI_Comm 1140850688 > [0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 > [0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 > [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user > MPI_Comm -2080374784 > [0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784 > [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 > -2080374784 > [0] PetscCommDuplicate(): returning tag 2147483644 > [0] MatSetUpPreallocation(): Warning not preallocating matrix storage > [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 10 X 10; storage space: 22 > unneeded,28 used > [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0 > [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3 > [0] Mat_CheckInode(): Found 10 nodes out of 10 rows. Not using Inode > routines > [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 > -2080374784 > [0] PetscCommDuplicate(): returning tag 2147483643 > [0] PetscCommDuplicate(): returning tag 2147483642 > [0] PetscCommDuplicate(): returning tag 2147483641 > [0] PetscCommDuplicate(): returning tag 2147483640 > [0] PetscCommDuplicate(): returning tag 2147483639 > [0] PetscCommDuplicate(): returning tag 2147483638 > [0] PetscCommDuplicate(): returning tag 2147483637 > [0] PCSetUp(): Setting up new PC > [0] PetscCommDuplicate(): returning tag 2147483636 > [0] PetscCommDuplicate(): returning tag 2147483635 > [0] PetscCommDuplicate(): returning tag 2147483634 > [0] PetscCommDuplicate(): returning tag 2147483633 > [0] PetscCommDuplicate(): returning tag 2147483632 > [0] PetscCommDuplicate(): returning tag 2147483631 > [0] PetscCommDuplicate(): returning tag 2147483630 > [0] PetscCommDuplicate(): returning tag 2147483629 > [0] PetscCommDuplicate(): returning tag 2147483628 > [0] PetscCommDuplicate(): returning tag 2147483627 > [0] PetscCommDuplicate(): returning tag 2147483626 > [0] KSPDefaultConverged(): Linear solver has converged. 
Residual norm > 4.50879e-16 is less than relative tolerance 1e-07 times initial right hand > side norm 0.707107 at iteration 5 > [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 > -2080374784 > [0] PetscCommDuplicate(): returning tag 2147483625 > > KSP Object: > type: gmres > GMRES: restart=30, using Classical (unmodified) Gram-Schmidt > Orthogonalization with no iterative refinement > GMRES: happy breakdown tolerance 1e-30 > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-07, absolute=1e-50, divergence=10000 > left preconditioning > using PRECONDITIONED norm type for convergence test > PC Object: > type: jacobi > linear system matrix = precond matrix: > Matrix Object: > type=seqaij, rows=10, cols=10 > total: nonzeros=28, allocated nonzeros=50 > not using I-node routines > Norm of error < 1.e-12, Iterations 5 > [0] PetscFinalize(): PetscFinalize() called > [0] PetscCommDuplicate(): returning tag 2147483624 > [0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784 > [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user > MPI_Comm 1140850688 > [0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 > [0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 > [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user > MPI_Comm -2080374784 > [0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784 > > gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ > > > On Mon, Jan 17, 2011 at 5:46 PM, Gaurish Telang wrote: > >> Hi. >> >> I had two questions >> >> (1) >> >> I was curious to know why the following happens with the PETSc standard >> output. Having created the executable 'test' when I try to run it with >> mpiexec -n 2 ./test >> the same output is printed to the terminal twice. If I use 3 processors, >> then the same output is printed thrice. >> >> In short the number of processors = number of times the output from PETSc >> is printed. Could this be a mistake with my PETSc installation??? 
>> >> For example, consider the code in src/ksp/ksp/examples/tutorials/ex23.c >> After creating ex23 the executable and running it with two processors gives >> the following terminal output: >> >> gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ >> mpiexec -n 1 ./ex23 >> KSP Object: >> type: gmres >> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt >> Orthogonalization with no iterative refinement >> GMRES: happy breakdown tolerance 1e-30 >> maximum iterations=10000, initial guess is zero >> tolerances: relative=1e-07, absolute=1e-50, divergence=10000 >> left preconditioning >> using PRECONDITIONED norm type for convergence test >> PC Object: >> type: jacobi >> linear system matrix = precond matrix: >> Matrix Object: >> type=seqaij, rows=10, cols=10 >> total: nonzeros=28, allocated nonzeros=50 >> not using I-node routines >> Norm of error < 1.e-12, Iterations 5 >> gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ >> mpiexec -n 2 ./ex23 >> KSP Object: >> type: gmres >> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt >> Orthogonalization with no iterative refinement >> GMRES: happy breakdown tolerance 1e-30 >> maximum iterations=10000, initial guess is zero >> tolerances: relative=1e-07, absolute=1e-50, divergence=10000 >> left preconditioning >> using PRECONDITIONED norm type for convergence test >> PC Object: >> type: jacobi >> linear system matrix = precond matrix: >> Matrix Object: >> type=seqaij, rows=10, cols=10 >> total: nonzeros=28, allocated nonzeros=50 >> not using I-node routines >> Norm of error < 1.e-12, Iterations 5 >> KSP Object: >> type: gmres >> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt >> Orthogonalization with no iterative refinement >> GMRES: happy breakdown tolerance 1e-30 >> maximum iterations=10000, initial guess is zero >> tolerances: relative=1e-07, absolute=1e-50, divergence=10000 >> left preconditioning >> using PRECONDITIONED norm type for convergence test >> PC Object: >> type: jacobi >> linear system matrix = precond matrix: >> Matrix Object: >> type=seqaij, rows=10, cols=10 >> total: nonzeros=28, allocated nonzeros=50 >> not using I-node routines >> Norm of error < 1.e-12, Iterations 5 >> gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ >> >> >> >> >> (2) >> >> Also I was told yesterday on the PETSC users mailing list that the MATLAB >> m file PetscBinaryWrite.m converts a sparse matrix in MATLAB into Petsc >> Binary format. >> The following are the comments in the code near the heading saying >> that it works only for square sparse matrices . But it seems to be working >> quite well for rectangular sparse MATLAB matrices also. >> I have tested this in conjunction with PetscBinaryRead.m also, which reads >> in a Petsc binary file into MATLAB as a sparse matrix. >> >> Is there something I might have missed or some error that I might be >> making??? >> >> Comments in PetscBinaryWrite.m >> "-================================================ >> % Writes in PETSc binary file sparse matrices and vectors >> % if the array is multidimensional and dense it is saved >> % as a one dimensional array >> % >> % Only works for square sparse matrices >> %: >> .. >> .. >> .. >> .. >> .. >> .. >> . >> . >> . >> >> >> >> >> > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From knepley at gmail.com Mon Jan 17 18:43:35 2011 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 17 Jan 2011 18:43:35 -0600 Subject: [petsc-users] Regarding printing to standard output. and possible mistake in the code comments in PetscBinaryWrite.m In-Reply-To: References: Message-ID: On Mon, Jan 17, 2011 at 6:36 PM, Gaurish Telang wrote: > Thank you that seems to have worked! Just to confirm, I have posted the > output at the end of this message. I hope this is the way the generic output > should look like. > > I have still have a few questions though. > > (1) > > So all I need to do, is to specify the location of the correct executable > mpiexec which is $PETSC_DIR/$PETSC_ARCH/bin/mpiexec > while running the program, right? > Yes > The contents of my $PETSC_DIR/$PETSC_ARCH/bin are > > gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/linux-gnu-c-debug/bin$ > ls > mpicc mpich2version mpiexec mpif77 mpif90 parkill > > > (2) > Do I need to make any changes in the makefiles of the PETSc programs that I > have written? And hence recompile my codes by using the "new" mpiexec > mpiexec is only for execution, not compiling or linking. > I mean, since $PETSC_DIR/$PETSC_ARCH/bin/ also contains mpicc (as seen > above), I want to be sure that the correct mpicc is being used during > execution. > > (3) Should I run the mpd daemon before using mpiexec??? On the MPICH2 that > I had installed prior to my PETSc it required me type "mpd &" > before program execution. > > But it seems for my PETSc mpiexec I don;t need mpd. But should I type it in > ?? I mean I am not sure if this affects program performance > The new version of MPICH uses hydra, not mpd, to manage the startup. Matt > Sincere thanks, > > Gaurish > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From gaurish108 at gmail.com Mon Jan 17 18:47:30 2011 From: gaurish108 at gmail.com (Gaurish Telang) Date: Mon, 17 Jan 2011 19:47:30 -0500 Subject: [petsc-users] Regarding printing to standard output. and possible mistake in the code comments in PetscBinaryWrite.m In-Reply-To: References: Message-ID: So hydra begins automatically while using mpiexec right? Meaning I don't have to manually enter in "hydra &" at the terminal?? Thanks, gaurish On Mon, Jan 17, 2011 at 7:36 PM, Gaurish Telang wrote: > Thank you that seems to have worked! Just to confirm, I have posted the > output at the end of this message. I hope this is the way the generic output > should look like. > > I have still have a few questions though. > > (1) > > So all I need to do, is to specify the location of the correct executable > mpiexec which is $PETSC_DIR/$PETSC_ARCH/bin/mpiexec > while running the program, right? > > The contents of my $PETSC_DIR/$PETSC_ARCH/bin are > > gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/linux-gnu-c-debug/bin$ > ls > mpicc mpich2version mpiexec mpif77 mpif90 parkill > > > (2) > Do I need to make any changes in the makefiles of the PETSc programs that I > have written? And hence recompile my codes by using the "new" mpiexec > > I mean, since $PETSC_DIR/$PETSC_ARCH/bin/ also contains mpicc (as seen > above), I want to be sure that the correct mpicc is being used during > execution. > > (3) Should I run the mpd daemon before using mpiexec??? 
On the MPICH2 that > I had installed prior to my PETSc it required me type "mpd &" > before program execution. > > But it seems for my PETSc mpiexec I don;t need mpd. But should I type it in > ?? I mean I am not sure if this affects program performance > > > Sincere thanks, > > Gaurish > > > gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ > $PETSC_DIR/$PETSC_ARCH/bin/mpiexec -n 2 ./ex23 -info > [0] PetscInitialize(): PETSc successfully started: number of processors = 2 > [1] PetscInitialize(): PETSc successfully started: number of processors = 2 > [1] PetscGetHostName(): Rejecting domainname, likely is NIS > gaurish108-laptop.(none) > [1] PetscInitialize(): Running on machine: gaurish108-laptop > > [0] PetscGetHostName(): Rejecting domainname, likely is NIS > gaurish108-laptop.(none) > [0] PetscInitialize(): Running on machine: gaurish108-laptop > [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 > max tags = 2147483647 > [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 > max tags = 2147483647 > [1] PetscCommDuplicate(): returning tag 2147483647 > [0] PetscCommDuplicate(): returning tag 2147483647 > > [0] PetscCommDuplicate(): returning tag 2147483642 > [1] PetscCommDuplicate(): returning tag 2147483642 > > [0] PetscCommDuplicate(): returning tag 2147483637 > [1] PetscCommDuplicate(): returning tag 2147483637 > > [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 > -2080374784 > [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 > -2080374784 > [0] PetscCommDuplicate(): returning tag 2147483632 > [1] PetscCommDuplicate(): returning tag 2147483632 > > [0] MatSetUpPreallocation(): Warning not preallocating matrix storage > [0] PetscCommDuplicate(): Duplicating a communicator 1140850689 -2080374783 > max tags = 2147483647 > > [0] PetscCommDuplicate(): returning tag 2147483647 > [1] PetscCommDuplicate(): Duplicating a communicator 1140850689 -2080374783 > max tags = 2147483647 > [1] PetscCommDuplicate(): returning tag 2147483647 > [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 > -2080374783 > > [0] PetscCommDuplicate(): returning tag 2147483646 > [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 > -2080374783 > [1] PetscCommDuplicate(): returning tag 2147483646 > [0] MatStashScatterBegin_Private(): No of messages: 0 > [0] MatAssemblyBegin_MPIAIJ(): Stash has 0 entries, uses 0 mallocs. > [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 5 X 5; storage space: 12 > unneeded,13 used > > [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0 > [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3 > [1] MatAssemblyBegin_MPIAIJ(): Stash has 0 entries, uses 0 mallocs. > [1] MatAssemblyEnd_SeqAIJ(): Matrix size: 5 X 5; storage space: 12 > unneeded,13 used > [1] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0 > [1] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3 > [1] Mat_CheckInode(): Found 5 nodes out of 5 rows. Not using Inode routines > [0] Mat_CheckInode(): Found 5 nodes out of 5 rows. 
Not using Inode routines > [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 > -2080374783 > > [0] PetscCommDuplicate(): returning tag 2147483645 > [0] MatSetUpMultiply_MPIAIJ(): Using block index set to define scatter > [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 > -2080374783 > [1] PetscCommDuplicate(): returning tag 2147483645 > [1] PetscCommDuplicate(): returning tag 2147483628 > [0] PetscCommDuplicate(): returning tag 2147483628 > [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 > -2080374783 > > [0] PetscCommDuplicate(): returning tag 2147483644 > [0] PetscCommDuplicate(): returning tag 2147483627 > [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 > -2080374783 > [1] PetscCommDuplicate(): returning tag 2147483644 > [1] PetscCommDuplicate(): returning tag 2147483627 > [1] PetscCommDuplicate(): returning tag 2147483622 > [0] PetscCommDuplicate(): returning tag 2147483622 > [0] VecScatterCreateCommon_PtoS(): Using blocksize 1 scatter > [0] VecScatterCreate(): General case: MPI to Seq > [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 5 X 1; storage space: 9 > unneeded,1 used > > [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0 > [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 1 > [0] Mat_CheckCompressedRow(): Found the ratio (num_zerorows > 4)/(num_localrows 5) > 0.6. Use CompressedRow routines. > > [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 > -2080374784 > [1] MatAssemblyEnd_SeqAIJ(): Matrix size: 5 X 1; storage space: 9 > unneeded,1 used > [1] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0 > [1] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 1 > [1] Mat_CheckCompressedRow(): Found the ratio (num_zerorows > 4)/(num_localrows 5) > 0.6. Use CompressedRow routines. 
> [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 > -2080374784 > [1] PetscCommDuplicate(): returning tag 2147483618 > [0] PetscCommDuplicate(): returning tag 2147483618 > [0] PetscCommDuplicate(): returning tag 2147483617 > [1] PetscCommDuplicate(): returning tag 2147483617 > [1] PetscCommDuplicate(): returning tag 2147483616 > [0] PetscCommDuplicate(): returning tag 2147483616 > [0] PetscCommDuplicate(): returning tag 2147483611 > [1] PetscCommDuplicate(): returning tag 2147483611 > [1] PetscCommDuplicate(): returning tag 2147483606 > [0] PetscCommDuplicate(): returning tag 2147483606 > [0] PetscCommDuplicate(): returning tag 2147483601 > [1] PetscCommDuplicate(): returning tag 2147483601 > [1] PetscCommDuplicate(): returning tag 2147483596 > [0] PetscCommDuplicate(): returning tag 2147483596 > > [0] PCSetUp(): Setting up new PC > [1] PetscCommDuplicate(): returning tag 2147483591 > [0] PetscCommDuplicate(): returning tag 2147483591 > [0] PetscCommDuplicate(): returning tag 2147483586 > [1] PetscCommDuplicate(): returning tag 2147483586 > [0] PetscCommDuplicate(): returning tag 2147483581 > [1] PetscCommDuplicate(): returning tag 2147483581 > [0] PetscCommDuplicate(): returning tag 2147483576 > [1] PetscCommDuplicate(): returning tag 2147483576 > > [0] PetscCommDuplicate(): returning tag 2147483571 > [1] PetscCommDuplicate(): returning tag 2147483571 > [1] PetscCommDuplicate(): returning tag 2147483566 > [0] PetscCommDuplicate(): returning tag 2147483566 > > [0] PetscCommDuplicate(): returning tag 2147483561 > [1] PetscCommDuplicate(): returning tag 2147483561 > [0] PetscCommDuplicate(): returning tag 2147483556 > > [0] PetscCommDuplicate(): returning tag 2147483551 > [0] PetscCommDuplicate(): returning tag 2147483546 > [0] PetscCommDuplicate(): returning tag 2147483541 > [0] KSPDefaultConverged(): Linear solver has converged. 
Residual norm > 5.11279e-16 is less than relative tolerance 1e-07 times initial right hand > side norm 0.707107 at iteration 5 > > [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 > -2080374784 > [0] PetscCommDuplicate(): returning tag 2147483536 > > KSP Object: > type: gmres > GMRES: restart=30, using Classical (unmodified) Gram-Schmidt > Orthogonalization with no iterative refinement > GMRES: happy breakdown tolerance 1e-30 > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-07, absolute=1e-50, divergence=10000 > left preconditioning > using PRECONDITIONED norm type for convergence test > PC Object: > type: jacobi > linear system matrix = precond matrix: > Matrix Object: > type=mpiaij, rows=10, cols=10 > tot[1] PetscCommDuplicate(): returning tag 2147483556 > [1] PetscCommDuplicate(): returning tag 2147483551 > [1] PetscCommDuplicate(): returning tag 2147483546 > [1] PetscCommDuplicate(): returning tag 2147483541 > [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 > -2080374784 > [1] PetscCommDuplicate(): returning tag 2147483536 > [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user > MPI_Comm 1140850689 > [1] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374783 > [1] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374783 > [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user > MPI_Comm -2080374783 > al: nonzeros=28, allocated nonzeros=70 > not using I-node (on process 0) routines > > Norm of error < 1.e-12, Iterations 5 > [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user > MPI_Comm 1140850689 > [0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374783 > [0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374783 > [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user > MPI_Comm -2080374783 > [0] PetscFinalize(): PetscFinalize() called > [1] PetscFinalize(): PetscFinalize() called > [1] PetscCommDuplicate(): returning tag 2147483535 > [1] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784 > [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user > MPI_Comm 1140850688 > [1] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 > [0] PetscCommDuplicate(): returning tag 2147483535 > > [0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784 > [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user > MPI_Comm 1140850688 > [0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 > [0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 > [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user > MPI_Comm -2080374784 > [0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784 > [1] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 > [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user > MPI_Comm -2080374784 > [1] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784 > > gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ > > gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ > > gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ > > gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ > > gaurish108 at 
gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ > > gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ > clear > > gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ > $PETSC_DIR/$PETSC_ARCH/bin/mpich2version -n 2 ./ex23 -info > Unrecognized argument -n > gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ > $PETSC_DIR/$PETSC_ARCH/bin/mpiexec -n 2 ./ex23 -info > [0] PetscInitialize(): PETSc successfully started: number of processors = 2 > [1] PetscInitialize(): PETSc successfully started: number of processors = 2 > > [0] PetscGetHostName(): Rejecting domainname, likely is NIS > gaurish108-laptop.(none) > [0] PetscInitialize(): Running on machine: gaurish108-laptop > [1] PetscGetHostName(): Rejecting domainname, likely is NIS > gaurish108-laptop.(none) > [1] PetscInitialize(): Running on machine: gaurish108-laptop > > [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 > max tags = 2147483647 > [1] PetscCommDuplicate(): Duplicating a communicator 1140850688 -2080374784 > max tags = 2147483647 > [1] PetscCommDuplicate(): returning tag 2147483647 > [0] PetscCommDuplicate(): returning tag 2147483647 > > [0] PetscCommDuplicate(): returning tag 2147483642 > [0] PetscCommDuplicate(): returning tag 2147483637 > [1] PetscCommDuplicate(): returning tag 2147483642 > [1] PetscCommDuplicate(): returning tag 2147483637 > > [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 > -2080374784 > [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 > -2080374784 > [1] PetscCommDuplicate(): returning tag 2147483632 > [0] PetscCommDuplicate(): returning tag 2147483632 > > [0] MatSetUpPreallocation(): Warning not preallocating matrix storage > [0] PetscCommDuplicate(): Duplicating a communicator 1140850689 -2080374783 > max tags = 2147483647 > > [0] PetscCommDuplicate(): returning tag 2147483647 > [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 > -2080374783 > > [0] PetscCommDuplicate(): returning tag 2147483646 > [1] PetscCommDuplicate(): Duplicating a communicator 1140850689 -2080374783 > max tags = 2147483647 > [1] PetscCommDuplicate(): returning tag 2147483647 > [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 > -2080374783 > [1] PetscCommDuplicate(): returning tag 2147483646 > [0] MatStashScatterBegin_Private(): No of messages: 0 > [0] MatAssemblyBegin_MPIAIJ(): Stash has 0 entries, uses 0 mallocs. > [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 5 X 5; storage space: 12 > unneeded,13 used > > [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0 > [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3 > [0] Mat_CheckInode(): Found 5 nodes out of 5 rows. 
Not using Inode routines > [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 > -2080374783 > > [0] PetscCommDuplicate(): returning tag 2147483645 > [0] MatSetUpMultiply_MPIAIJ(): Using block index set to define scatter > > [0] PetscCommDuplicate(): returning tag 2147483628 > [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 > -2080374783 > > [0] PetscCommDuplicate(): returning tag 2147483644 > [0] PetscCommDuplicate(): returning tag 2147483627 > [0] PetscCommDuplicate(): returning tag 2147483622 > [1] MatAssemblyBegin_MPIAIJ(): Stash has 0 entries, uses 0 mallocs. > [1] MatAssemblyEnd_SeqAIJ(): Matrix size: 5 X 5; storage space: 12 > unneeded,13 used > [1] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0 > [1] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3 > [1] Mat_CheckInode(): Found 5 nodes out of 5 rows. Not using Inode routines > [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 > -2080374783 > [1] PetscCommDuplicate(): returning tag 2147483645 > [1] PetscCommDuplicate(): returning tag 2147483628 > [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850689 > -2080374783 > [1] PetscCommDuplicate(): returning tag 2147483644 > [1] PetscCommDuplicate(): returning tag 2147483627 > [1] PetscCommDuplicate(): returning tag 2147483622 > [0] VecScatterCreateCommon_PtoS(): Using blocksize 1 scatter > [0] VecScatterCreate(): General case: MPI to Seq > [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 5 X 1; storage space: 9 > unneeded,1 used > > [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0 > [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 1 > [0] Mat_CheckCompressedRow(): Found the ratio (num_zerorows > 4)/(num_localrows 5) > 0.6. Use CompressedRow routines. > [1] MatAssemblyEnd_SeqAIJ(): Matrix size: 5 X 1; storage space: 9 > unneeded,1 used > [1] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0 > [1] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 1 > [1] Mat_CheckCompressedRow(): Found the ratio (num_zerorows > 4)/(num_localrows 5) > 0.6. Use CompressedRow routines. 
> [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 > -2080374784 > [1] PetscCommDuplicate(): returning tag 2147483618 > > [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 > -2080374784 > [0] PetscCommDuplicate(): returning tag 2147483618 > [0] PetscCommDuplicate(): returning tag 2147483617 > [0] PetscCommDuplicate(): returning tag 2147483616 > [1] PetscCommDuplicate(): returning tag 2147483617 > [1] PetscCommDuplicate(): returning tag 2147483616 > [1] PetscCommDuplicate(): returning tag 2147483611 > [1] PetscCommDuplicate(): returning tag 2147483606 > [0] PetscCommDuplicate(): returning tag 2147483611 > [0] PetscCommDuplicate(): returning tag 2147483606 > [0] PetscCommDuplicate(): returning tag 2147483601 > [0] PetscCommDuplicate(): returning tag 2147483596 > > [0] PCSetUp(): Setting up new PC > [1] PetscCommDuplicate(): returning tag 2147483601 > [1] PetscCommDuplicate(): returning tag 2147483596 > [1] PetscCommDuplicate(): returning tag 2147483591 > [0] PetscCommDuplicate(): returning tag 2147483591 > [0] PetscCommDuplicate(): returning tag 2147483586 > [0] PetscCommDuplicate(): returning tag 2147483581 > [0] PetscCommDuplicate(): returning tag 2147483576 > [1] PetscCommDuplicate(): returning tag 2147483586 > [1] PetscCommDuplicate(): returning tag 2147483581 > [1] PetscCommDuplicate(): returning tag 2147483576 > [1] PetscCommDuplicate(): returning tag 2147483571 > [0] PetscCommDuplicate(): returning tag 2147483571 > > [0] PetscCommDuplicate(): returning tag 2147483566 > [0] PetscCommDuplicate(): returning tag 2147483561 > [1] PetscCommDuplicate(): returning tag 2147483566 > [1] PetscCommDuplicate(): returning tag 2147483561 > [1] PetscCommDuplicate(): returning tag 2147483556 > > [0] PetscCommDuplicate(): returning tag 2147483556 > [0] PetscCommDuplicate(): returning tag 2147483551 > [1] PetscCommDuplicate(): returning tag 2147483551 > [1] PetscCommDuplicate(): returning tag 2147483546 > > [0] PetscCommDuplicate(): returning tag 2147483546 > [0] PetscCommDuplicate(): returning tag 2147483541 > [1] PetscCommDuplicate(): returning tag 2147483541 > [0] KSPDefaultConverged(): Linear solver has converged. 
Residual norm > 5.11279e-16 is less than relative tolerance 1e-07 times initial right hand > side norm 0.707107 at iteration 5 > > [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 > -2080374784 > [1] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 > -2080374784 > [0] PetscCommDuplicate(): returning tag 2147483536 > > KSP Object: > type: gmres > GMRES: restart=30, using Classical (unmodified) Gram-Schmidt > Orthogonalization with no iterative refinement > GMRES: happy breakdown tolerance 1e-30 > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-07, absolute=1e-50, divergence=10000 > left preconditioning > using PRECONDITIONED norm type for convergence test > PC Object: > type: jacobi > linear system matrix = precond matrix: > Matrix Object: > type=mpiaij, rows=10, cols=10 > [1] PetscCommDuplicate(): returning tag 2147483536 > total: nonzeros=28, allocated nonzeros=70 > not using I-node (on process 0) routines > > Norm of error < 1.e-12, Iterations 5 > [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user > MPI_Comm 1140850689 > [0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374783 > [0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374783 > [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user > MPI_Comm -2080374783 > [0] PetscFinalize(): PetscFinalize() called > [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user > MPI_Comm 1140850689 > [1] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374783 > [1] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374783 > [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user > MPI_Comm -2080374783 > [1] PetscFinalize(): PetscFinalize() called > [1] PetscCommDuplicate(): returning tag 2147483535 > [1] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784 > [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user > MPI_Comm 1140850688 > [1] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 > [1] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 > [1] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user > MPI_Comm -2080374784 > [1] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784 > [0] PetscCommDuplicate(): returning tag 2147483535 > > [0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784 > [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user > MPI_Comm 1140850688 > [0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 > [0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 > [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user > MPI_Comm -2080374784 > [0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784 > gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > On Mon, Jan 17, 2011 at 6:20 PM, Gaurish Telang wrote: > >> This is what I get on running mpiexec -n 2 ./ex23 -info >> >> Also, using mpirun in place of mpiexec and using the -info option I get >> the exact same output you see below. >> >> As far as the MPI implmentation I am using, I have OpenMPI and MPICH >> installed on my laptop. >> >> While installing PETSc there were some external packages required. 
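Having OpenMPI and MPICH installed side by side is exactly the situation in which the launcher found on the PATH and the MPI library PETSc was linked against can disagree. Below is a minimal standalone check, illustrative only and not one of the tutorial examples, written against the petsc-3.1 calling sequences used in this thread; it prints the rank and size of the communicator PETSc actually ends up with:

    static char help[] = "Prints the rank and size PETSc sees.\n";

    #include <petscsys.h>

    int main(int argc, char **argv)
    {
      PetscErrorCode ierr;
      PetscMPIInt    rank, size;

      ierr = PetscInitialize(&argc, &argv, (char *)0, help);CHKERRQ(ierr);
      ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRQ(ierr);
      ierr = MPI_Comm_size(PETSC_COMM_WORLD, &size);CHKERRQ(ierr);
      /* Each process contributes one line; the output is gathered and shown in rank order. */
      ierr = PetscSynchronizedPrintf(PETSC_COMM_WORLD, "rank %d of %d\n", rank, size);CHKERRQ(ierr);
      ierr = PetscSynchronizedFlush(PETSC_COMM_WORLD);CHKERRQ(ierr); /* petsc-3.1 takes only the communicator */
      ierr = PetscFinalize();
      return 0;
    }

Launched with the mpiexec that matches the PETSc build, "-n 2" prints two distinct ranks; with a mismatched launcher every process reports itself as rank 0 of a one-process world, which is why the complete solver output appears once per process.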
In the >> external packages folder I can see the following softwares: >> >> fblaslapack-3.1.1 mpich2-1.0.8 ParMetis-dev-p3 SuperLU_DIST_2.4-hg-v2 >> >> Possibly it is this mpich2 that should be used?? >> Please let me know what I should do. I am quite new to PETSc. >> >> gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ >> mpiexec -n 2 ./ex23 -info >> [0] PetscInitialize(): PETSc successfully started: number of processors = >> 1 >> [0] PetscGetHostName(): Rejecting domainname, likely is NIS >> gaurish108-laptop.(none) >> [0] PetscInitialize(): Running on machine: gaurish108-laptop >> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 >> -2080374784 max tags = 2147483647 >> [0] PetscCommDuplicate(): returning tag 2147483647 >> [0] PetscCommDuplicate(): returning tag 2147483646 >> [0] PetscCommDuplicate(): returning tag 2147483645 >> [0] PetscInitialize(): PETSc successfully started: number of processors = >> 1 >> [0] PetscGetHostName(): Rejecting domainname, likely is NIS >> gaurish108-laptop.(none) >> [0] PetscInitialize(): Running on machine: gaurish108-laptop >> [0] PetscCommDuplicate(): Duplicating a communicator 1140850688 >> -2080374784 max tags = 2147483647 >> [0] PetscCommDuplicate(): returning tag 2147483647 >> [0] PetscCommDuplicate(): returning tag 2147483646 >> [0] PetscCommDuplicate(): returning tag 2147483645 >> [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 >> -2080374784 >> [0] PetscCommDuplicate(): returning tag 2147483644 >> [0] MatSetUpPreallocation(): Warning not preallocating matrix storage >> [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 10 X 10; storage space: 22 >> unneeded,28 used >> [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0 >> [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3 >> [0] Mat_CheckInode(): Found 10 nodes out of 10 rows. Not using Inode >> routines >> [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 >> -2080374784 >> [0] PetscCommDuplicate(): returning tag 2147483643 >> [0] PetscCommDuplicate(): returning tag 2147483642 >> [0] PetscCommDuplicate(): returning tag 2147483641 >> [0] PetscCommDuplicate(): returning tag 2147483640 >> [0] PetscCommDuplicate(): returning tag 2147483639 >> [0] PetscCommDuplicate(): returning tag 2147483638 >> [0] PetscCommDuplicate(): returning tag 2147483637 >> [0] PCSetUp(): Setting up new PC >> [0] PetscCommDuplicate(): returning tag 2147483636 >> [0] PetscCommDuplicate(): returning tag 2147483635 >> [0] PetscCommDuplicate(): returning tag 2147483634 >> [0] PetscCommDuplicate(): returning tag 2147483633 >> [0] PetscCommDuplicate(): returning tag 2147483632 >> [0] PetscCommDuplicate(): returning tag 2147483631 >> [0] PetscCommDuplicate(): returning tag 2147483630 >> [0] PetscCommDuplicate(): returning tag 2147483629 >> [0] PetscCommDuplicate(): returning tag 2147483628 >> [0] PetscCommDuplicate(): returning tag 2147483627 >> [0] PetscCommDuplicate(): returning tag 2147483626 >> [0] KSPDefaultConverged(): Linear solver has converged. 
Residual norm >> 4.50879e-16 is less than relative tolerance 1e-07 times initial right hand >> side norm 0.707107 at iteration 5 >> [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 >> -2080374784 >> [0] PetscCommDuplicate(): returning tag 2147483625 >> >> KSP Object: >> type: gmres >> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt >> Orthogonalization with no iterative refinement >> GMRES: happy breakdown tolerance 1e-30 >> maximum iterations=10000, initial guess is zero >> tolerances: relative=1e-07, absolute=1e-50, divergence=10000 >> left preconditioning >> using PRECONDITIONED norm type for convergence test >> PC Object: >> type: jacobi >> linear system matrix = precond matrix: >> Matrix Object: >> type=seqaij, rows=10, cols=10 >> total: nonzeros=28, allocated nonzeros=50 >> not using I-node routines >> Norm of error < 1.e-12, Iterations 5 >> [0] PetscFinalize(): PetscFinalize() called >> [0] PetscCommDuplicate(): returning tag 2147483624 >> [0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784 >> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user >> MPI_Comm 1140850688 >> [0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 >> [0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 >> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user >> MPI_Comm -2080374784 >> [0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784 >> [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 >> -2080374784 >> [0] PetscCommDuplicate(): returning tag 2147483644 >> [0] MatSetUpPreallocation(): Warning not preallocating matrix storage >> [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 10 X 10; storage space: 22 >> unneeded,28 used >> [0] MatAssemblyEnd_SeqAIJ(): Number of mallocs during MatSetValues() is 0 >> [0] MatAssemblyEnd_SeqAIJ(): Maximum nonzeros in any row is 3 >> [0] Mat_CheckInode(): Found 10 nodes out of 10 rows. Not using Inode >> routines >> [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 >> -2080374784 >> [0] PetscCommDuplicate(): returning tag 2147483643 >> [0] PetscCommDuplicate(): returning tag 2147483642 >> [0] PetscCommDuplicate(): returning tag 2147483641 >> [0] PetscCommDuplicate(): returning tag 2147483640 >> [0] PetscCommDuplicate(): returning tag 2147483639 >> [0] PetscCommDuplicate(): returning tag 2147483638 >> [0] PetscCommDuplicate(): returning tag 2147483637 >> [0] PCSetUp(): Setting up new PC >> [0] PetscCommDuplicate(): returning tag 2147483636 >> [0] PetscCommDuplicate(): returning tag 2147483635 >> [0] PetscCommDuplicate(): returning tag 2147483634 >> [0] PetscCommDuplicate(): returning tag 2147483633 >> [0] PetscCommDuplicate(): returning tag 2147483632 >> [0] PetscCommDuplicate(): returning tag 2147483631 >> [0] PetscCommDuplicate(): returning tag 2147483630 >> [0] PetscCommDuplicate(): returning tag 2147483629 >> [0] PetscCommDuplicate(): returning tag 2147483628 >> [0] PetscCommDuplicate(): returning tag 2147483627 >> [0] PetscCommDuplicate(): returning tag 2147483626 >> [0] KSPDefaultConverged(): Linear solver has converged. 
Residual norm >> 4.50879e-16 is less than relative tolerance 1e-07 times initial right hand >> side norm 0.707107 at iteration 5 >> [0] PetscCommDuplicate(): Using internal PETSc communicator 1140850688 >> -2080374784 >> [0] PetscCommDuplicate(): returning tag 2147483625 >> >> KSP Object: >> type: gmres >> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt >> Orthogonalization with no iterative refinement >> GMRES: happy breakdown tolerance 1e-30 >> maximum iterations=10000, initial guess is zero >> tolerances: relative=1e-07, absolute=1e-50, divergence=10000 >> left preconditioning >> using PRECONDITIONED norm type for convergence test >> PC Object: >> type: jacobi >> linear system matrix = precond matrix: >> Matrix Object: >> type=seqaij, rows=10, cols=10 >> total: nonzeros=28, allocated nonzeros=50 >> not using I-node routines >> Norm of error < 1.e-12, Iterations 5 >> [0] PetscFinalize(): PetscFinalize() called >> [0] PetscCommDuplicate(): returning tag 2147483624 >> [0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784 >> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user >> MPI_Comm 1140850688 >> [0] PetscCommDestroy(): Deleting PETSc MPI_Comm -2080374784 >> [0] Petsc_DelCounter(): Deleting counter data in an MPI_Comm -2080374784 >> [0] Petsc_DelComm(): Deleting PETSc communicator imbedded in a user >> MPI_Comm -2080374784 >> [0] Petsc_DelViewer(): Deleting viewer data in an MPI_Comm -2080374784 >> >> gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ >> >> >> On Mon, Jan 17, 2011 at 5:46 PM, Gaurish Telang wrote: >> >>> Hi. >>> >>> I had two questions >>> >>> (1) >>> >>> I was curious to know why the following happens with the PETSc standard >>> output. Having created the executable 'test' when I try to run it with >>> mpiexec -n 2 ./test >>> the same output is printed to the terminal twice. If I use 3 processors, >>> then the same output is printed thrice. >>> >>> In short the number of processors = number of times the output from PETSc >>> is printed. Could this be a mistake with my PETSc installation??? 
>>> >>> For example, consider the code in src/ksp/ksp/examples/tutorials/ex23.c >>> After creating ex23 the executable and running it with two processors gives >>> the following terminal output: >>> >>> gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ >>> mpiexec -n 1 ./ex23 >>> KSP Object: >>> type: gmres >>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt >>> Orthogonalization with no iterative refinement >>> GMRES: happy breakdown tolerance 1e-30 >>> maximum iterations=10000, initial guess is zero >>> tolerances: relative=1e-07, absolute=1e-50, divergence=10000 >>> left preconditioning >>> using PRECONDITIONED norm type for convergence test >>> PC Object: >>> type: jacobi >>> linear system matrix = precond matrix: >>> Matrix Object: >>> type=seqaij, rows=10, cols=10 >>> total: nonzeros=28, allocated nonzeros=50 >>> not using I-node routines >>> Norm of error < 1.e-12, Iterations 5 >>> gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ >>> mpiexec -n 2 ./ex23 >>> KSP Object: >>> type: gmres >>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt >>> Orthogonalization with no iterative refinement >>> GMRES: happy breakdown tolerance 1e-30 >>> maximum iterations=10000, initial guess is zero >>> tolerances: relative=1e-07, absolute=1e-50, divergence=10000 >>> left preconditioning >>> using PRECONDITIONED norm type for convergence test >>> PC Object: >>> type: jacobi >>> linear system matrix = precond matrix: >>> Matrix Object: >>> type=seqaij, rows=10, cols=10 >>> total: nonzeros=28, allocated nonzeros=50 >>> not using I-node routines >>> Norm of error < 1.e-12, Iterations 5 >>> KSP Object: >>> type: gmres >>> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt >>> Orthogonalization with no iterative refinement >>> GMRES: happy breakdown tolerance 1e-30 >>> maximum iterations=10000, initial guess is zero >>> tolerances: relative=1e-07, absolute=1e-50, divergence=10000 >>> left preconditioning >>> using PRECONDITIONED norm type for convergence test >>> PC Object: >>> type: jacobi >>> linear system matrix = precond matrix: >>> Matrix Object: >>> type=seqaij, rows=10, cols=10 >>> total: nonzeros=28, allocated nonzeros=50 >>> not using I-node routines >>> Norm of error < 1.e-12, Iterations 5 >>> gaurish108 at gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ >>> >>> >>> >>> >>> (2) >>> >>> Also I was told yesterday on the PETSC users mailing list that the MATLAB >>> m file PetscBinaryWrite.m converts a sparse matrix in MATLAB into Petsc >>> Binary format. >>> The following are the comments in the code near the heading saying >>> that it works only for square sparse matrices . But it seems to be working >>> quite well for rectangular sparse MATLAB matrices also. >>> I have tested this in conjunction with PetscBinaryRead.m also, which >>> reads in a Petsc binary file into MATLAB as a sparse matrix. >>> >>> Is there something I might have missed or some error that I might be >>> making??? >>> >>> Comments in PetscBinaryWrite.m >>> "-================================================ >>> % Writes in PETSc binary file sparse matrices and vectors >>> % if the array is multidimensional and dense it is saved >>> % as a one dimensional array >>> % >>> % Only works for square sparse matrices >>> %: >>> .. >>> .. >>> .. >>> .. >>> .. >>> .. >>> . >>> . >>> . 
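Whether a rectangular matrix written by PetscBinaryWrite.m really survives the round trip can also be checked from the PETSc side. The following is only a sketch, not an existing example: the file name matrix.dat is made up, it is meant to be run on a single process, and it uses the petsc-3.1 calling sequences matching the rest of this thread (MatLoad() and the Destroy() routines take different arguments in later releases):

    static char help[] = "Loads a binary matrix and prints its size.\n";

    #include <petscmat.h>

    int main(int argc, char **argv)
    {
      PetscErrorCode ierr;
      PetscViewer    viewer;
      Mat            A;
      PetscInt       M, N;

      ierr = PetscInitialize(&argc, &argv, (char *)0, help);CHKERRQ(ierr);
      /* Open the binary file produced by PetscBinaryWrite.m (hypothetical name). */
      ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD, "matrix.dat", FILE_MODE_READ, &viewer);CHKERRQ(ierr);
      ierr = MatLoad(viewer, MATSEQAIJ, &A);CHKERRQ(ierr);   /* petsc-3.1 signature */
      ierr = MatGetSize(A, &M, &N);CHKERRQ(ierr);
      ierr = PetscPrintf(PETSC_COMM_WORLD, "loaded a %D x %D matrix\n", M, N);CHKERRQ(ierr);
      ierr = MatDestroy(A);CHKERRQ(ierr);                    /* petsc-3.1: takes the object itself */
      ierr = PetscViewerDestroy(viewer);CHKERRQ(ierr);
      ierr = PetscFinalize();
      return 0;
    }

If the reported dimensions match the rectangular MATLAB matrix, the write/read pair handled it, and the "only works for square sparse matrices" line in the m-file comments may simply be out of date; that is something for the maintainers to confirm.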
>>> >>> >>> >>> >>> >> >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Mon Jan 17 19:02:18 2011 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 17 Jan 2011 19:02:18 -0600 Subject: [petsc-users] Regarding printing to standard output. and possible mistake in the code comments in PetscBinaryWrite.m In-Reply-To: References: Message-ID: On Mon, Jan 17, 2011 at 6:47 PM, Gaurish Telang wrote: > So hydra begins automatically while using mpiexec right? Meaning I don't > have to manually enter in "hydra &" at the terminal?? > Yes Matt > Thanks, > > gaurish > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Mon Jan 17 19:35:12 2011 From: balay at mcs.anl.gov (Satish Balay) Date: Mon, 17 Jan 2011 19:35:12 -0600 (CST) Subject: [petsc-users] Regarding printing to standard output. and possible mistake in the code comments in PetscBinaryWrite.m In-Reply-To: References: Message-ID: On Mon, 17 Jan 2011, Matthew Knepley wrote: > > (3) Should I run the mpd daemon before using mpiexec??? On the MPICH2 that > > I had installed prior to my PETSc it required me type "mpd &" > > before program execution. > > > > But it seems for my PETSc mpiexec I don;t need mpd. But should I type it in > > ?? I mean I am not sure if this affects program performance > > > > The new version of MPICH uses hydra, not mpd, to manage the startup. Actually with petsc-3.1 download-mpich uses pm=gforker [not hydra], which limits MPI to a single node [with fork for all the mpi jobs] If you need something more specific with MPI config you should build your own MPI appropriately - and then configure PETSc with it - or specify additional configure options to --download-mpich ./configure --help |grep mpich With petsc-dev we default to '--with-pm=hydra --with-hydra-bss=fork,ssh' so again mpiexec will again default to fork [i.e 1 node] But one can use ' mpiexec -bootstrap ssh' to switch to multi-node hydra. PETSc defaults cater to easy software deveopment [this default works for most users]. Performance runs with mult-nodes are usually done on clusters/high performance machines - which usually have a tuned MPI installed on it anyway.. Satish From wumengda at gmail.com Mon Jan 17 21:22:32 2011 From: wumengda at gmail.com (Mengda Wu) Date: Mon, 17 Jan 2011 19:22:32 -0800 Subject: [petsc-users] [petsc-maint #61421] ksp/examples/tutorials/Ex2.c: good with with-debugging but error without-debugging In-Reply-To: References: <8B6F3968-D06D-4C9D-9269-FDD5645B444B@mcs.anl.gov> Message-ID: Yes. I obtained the hotfix download from a third-party website: http://thehotfixshare.net/board/index.php?autocom=downloads&showfile=11462 I asked for this hotfix on msdn social but it seems not easy to get. http://social.msdn.microsoft.com/Forums/en-US/vclanguage/thread/87ae2101-22ed-416c-aeb5-763b2f1ee1b1 Thanks, Mengda On Mon, Jan 17, 2011 at 1:55 PM, Satish Balay wrote: > Thanks for confirming its a compiler bug. > > BTW: The url below doesn't show the actual hotfix download. Is > there a different location for this download? > > Satish > > On Mon, 17 Jan 2011, Mengda Wu wrote: > > > This indeed is caused by a bug in Visual c++ 2005 64bit compiler when > using > > optimization. 
> > The result is correct after installing the hotfix: > > http://support.microsoft.com/kb/976617/ > > Thanks a lot! > > > > Mengda > > > > On Mon, Jan 17, 2011 at 12:27 PM, Barry Smith > wrote: > > > > > > > > Compiler bug. Immediately before the call to MatMult() in the code add > the > > > two lines > > > > > > ierr = VecView(u,0); > > > ierr = MatView(A,0); > > > > > > how large are the two objects? Given the code it is inconceivable that > > > suddenly the vector length becomes 57. > > > > > > Barry > > > > > > > > > On Jan 17, 2011, at 2:43 AM, Mengda Wu wrote: > > > > > > > Hi all, > > > > > > > > I just compiled the debugged and optimized versions of > petsc-3.1-p7. > > > > Both are successful. I am running on Windows Vista 64bit machine. > > > > The C/C++ compiler is cl.exe from visual studio 2005 (64 bit) and no > > > FORTRAN > > > > compiler is used. BLAS/LAPACK > > > > support comes from Intel MKL-10.1.3.028 (under em64t\lib). No MPI is > > > used. > > > > > > > > The debugged petsc was configured with: > > > > $ ./config/configure.py --with-cc='cl' --with-fc=0 --with-cxx='cl' > > > > --with-mpi= > > > > 0 > > > > > > > > --with-blas-lapack-lib=[mkl_intel_lp64.lib,mkl_intel_thread.lib,mkl_core.lib > > > > ,libiomp5mt.lib] -CFLAGS='-MDd -W3 -Z7' -CXXFLAGS='-MDd -W3 -Z7' > > > > > > > > The optimized petsc was configured with: > > > > $ ./config/configure.py --with-cc='cl' --with-fc=0 --with-cxx='cl' > > > > --with-mpi= > > > > 0 --with-debugging=0 > > > > --with-blas-lapack-lib=[mkl_intel_lp64.lib,mkl_intel_threa > > > > d.lib,mkl_core.lib,libiomp5mt.lib] -CFLAGS='-MD -wd4996 -O2' > > > -CXXFLAGS='-MD > > > > -w > > > > d4996 -O2' > > > > > > > > When I run ksp/examples/tutorials/Ex2.c. The result with debugged > petsc > > > > is > > > > ================================================================= > > > > Norm of error 0.000156044 iterations 6 > > > > ================================================================= > > > > > > > > However, there are errors with the optimized petsc with the output > as > > > > follows: > > > > ================================================================= > > > > [0]PETSC ERROR: --------------------- Error Message > > > > ---------------------------- > > > > -------- > > > > [0]PETSC ERROR: Nonconforming object sizes! > > > > [0]PETSC ERROR: Mat mat,Vec y: global dim 56 57! > > > > [0]PETSC ERROR: > > > > ---------------------------------------------------------------- > > > > -------- > > > > [0]PETSC ERROR: Petsc Release Version 3.1.0, Patch 7, Mon Dec 20 > 14:26:37 > > > > CST 20 > > > > 10 > > > > [0]PETSC ERROR: See docs/changes/index.html for recent updates. > > > > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > > > > [0]PETSC ERROR: See docs/index.html for manual pages. 
> > > > [0]PETSC ERROR: > > > > ---------------------------------------------------------------- > > > > -------- > > > > [0]PETSC ERROR: > > > > D:\Develop\Test\PETSc\petsc-3.1-p7\src\ksp\ksp\examples\tutorial > > > > s\ex2.exe on a cygwin-c- named CVBRL-38 by mengda Mon Jan 17 00:35:15 > > > 2011 > > > > [0]PETSC ERROR: Libraries linked from > > > > /cygdrive/d/Develop/Test/PETSc/petsc-3.1-p > > > > 7/cygwin-c-opt/lib > > > > [0]PETSC ERROR: Configure run at Sun Jan 16 23:34:25 2011 > > > > [0]PETSC ERROR: Configure options --with-cc=cl --with-fc=0 > --with-cxx=cl > > > > --with- > > > > mpi=0 --with-debugging=0 > > > > --with-blas-lapack-lib="[mkl_intel_lp64.lib,mkl_intel_t > > > > hread.lib,mkl_core.lib,libiomp5mt.lib]" -CFLAGS="-MD -wd4996 -O2" > > > > -CXXFLAGS="-MD > > > > -wd4996 -O2" --useThreads=0 > > > > [0]PETSC ERROR: > > > > ---------------------------------------------------------------- > > > > -------- > > > > [0]PETSC ERROR: MatMult() line 1888 in > > > > src/mat/interface/D:\Develop\Test\PETSc\P > > > > ETSC-~1.1-P\src\mat\INTERF~1\matrix.c > > > > [0]PETSC ERROR: main() line 146 in > > > > src/ksp/ksp/examples/tutorials/D:\Develop\Tes > > > > t\PETSc\PETSC-~1.1-P\src\ksp\ksp\examples\TUTORI~1\ex2.c > > > > > > > > This application has requested the Runtime to terminate it in an > unusual > > > > way. > > > > Please contact the application's support team for more information. > > > > ================================================================= > > > > > > > > I am wondering what problems may lead to the errors. Please let me > know > > > > if you need more > > > > information. > > > > > > > > Thanks, > > > > Mengda > > > > > > > > Hi all, > > > > > > > > I just compiled the debugged and optimized versions of > petsc-3.1-p7. > > > Both are successful. I am running on Windows Vista 64bit machine. > > > > The C/C++ compiler is cl.exe from visual studio 2005 (64 bit) and no > > > FORTRAN compiler is used. BLAS/LAPACK > > > > support comes from Intel MKL-10.1.3.028 (under em64t\lib). No MPI is > > > used. > > > > > > > > The debugged petsc was configured with: > > > > $ ./config/configure.py --with-cc='cl' --with-fc=0 > --with-cxx='cl' > > > --with-mpi= > > > > 0 > > > > --with-blas-lapack-lib=[mkl_intel_lp64.lib,mkl_intel_thread.lib,mkl_core.lib > > > > ,libiomp5mt.lib] -CFLAGS='-MDd -W3 -Z7' -CXXFLAGS='-MDd -W3 -Z7' > > > > > > > > The optimized petsc was configured with: > > > > $ ./config/configure.py --with-cc='cl' --with-fc=0 > --with-cxx='cl' > > > --with-mpi= > > > > 0 --with-debugging=0 > > > --with-blas-lapack-lib=[mkl_intel_lp64.lib,mkl_intel_threa > > > > d.lib,mkl_core.lib,libiomp5mt.lib] -CFLAGS='-MD -wd4996 -O2' > > > -CXXFLAGS='-MD -w > > > > d4996 -O2' > > > > > > > > When I run ksp/examples/tutorials/Ex2.c. The result with debugged > > > petsc is > > > > ================================================================= > > > > Norm of error 0.000156044 iterations 6 > > > > ================================================================= > > > > > > > > However, there are errors with the optimized petsc with the output > as > > > follows: > > > > ================================================================= > > > > [0]PETSC ERROR: --------------------- Error Message > > > ---------------------------- > > > > -------- > > > > [0]PETSC ERROR: Nonconforming object sizes! > > > > [0]PETSC ERROR: Mat mat,Vec y: global dim 56 57! 
> > > > [0]PETSC ERROR: > > > ---------------------------------------------------------------- > > > > -------- > > > > [0]PETSC ERROR: Petsc Release Version 3.1.0, Patch 7, Mon Dec 20 > 14:26:37 > > > CST 20 > > > > 10 > > > > [0]PETSC ERROR: See docs/changes/index.html for recent updates. > > > > [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > > > > [0]PETSC ERROR: See docs/index.html for manual pages. > > > > [0]PETSC ERROR: > > > ---------------------------------------------------------------- > > > > -------- > > > > [0]PETSC ERROR: > > > D:\Develop\Test\PETSc\petsc-3.1-p7\src\ksp\ksp\examples\tutorial > > > > s\ex2.exe on a cygwin-c- named CVBRL-38 by mengda Mon Jan 17 00:35:15 > > > 2011 > > > > [0]PETSC ERROR: Libraries linked from > > > /cygdrive/d/Develop/Test/PETSc/petsc-3.1-p > > > > 7/cygwin-c-opt/lib > > > > [0]PETSC ERROR: Configure run at Sun Jan 16 23:34:25 2011 > > > > [0]PETSC ERROR: Configure options --with-cc=cl --with-fc=0 > --with-cxx=cl > > > --with- > > > > mpi=0 --with-debugging=0 > > > --with-blas-lapack-lib="[mkl_intel_lp64.lib,mkl_intel_t > > > > hread.lib,mkl_core.lib,libiomp5mt.lib]" -CFLAGS="-MD -wd4996 -O2" > > > -CXXFLAGS="-MD > > > > -wd4996 -O2" --useThreads=0 > > > > [0]PETSC ERROR: > > > ---------------------------------------------------------------- > > > > -------- > > > > [0]PETSC ERROR: MatMult() line 1888 in > > > src/mat/interface/D:\Develop\Test\PETSc\P > > > > ETSC-~1.1-P\src\mat\INTERF~1\matrix.c > > > > [0]PETSC ERROR: main() line 146 in > > > src/ksp/ksp/examples/tutorials/D:\Develop\Tes > > > > t\PETSc\PETSC-~1.1-P\src\ksp\ksp\examples\TUTORI~1\ex2.c > > > > > > > > This application has requested the Runtime to terminate it in an > unusual > > > way. > > > > Please contact the application's support team for more information. > > > > ================================================================= > > > > > > > > I am wondering what problems may lead to the errors. Please let me > > > know if you need more > > > > information. > > > > > > > > Thanks, > > > > Mengda > > > > > > > > > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From wumengda at gmail.com Mon Jan 17 21:29:13 2011 From: wumengda at gmail.com (Mengda Wu) Date: Mon, 17 Jan 2011 19:29:13 -0800 Subject: [petsc-users] [petsc-maint #61421] ksp/examples/tutorials/Ex2.c: good with with-debugging but error without-debugging In-Reply-To: References: <8B6F3968-D06D-4C9D-9269-FDD5645B444B@mcs.anl.gov> Message-ID: Sorry but I am new to PETSc. This is out of my ability to find why this happens in PETSc. I am happy to do some tests to look for the tests if you can instruct me how to do that. There is an option in config/configure.py, which I do not use and may be relevant to check. --with-64-bit-pointers= : Use 64 bit compilers and libraries Thanks, Mengda On Mon, Jan 17, 2011 at 2:01 PM, Barry Smith wrote: > > Mengda, > > Thanks for the report. At the website it says "You perform arithmetic on > 64-bit pointers and then pass the results to an inline function that expects > an "int" data type." Do you know specifically where this is happening in > PETSc? We don't intend to in PETSc " perform arithmetic on 64-bit pointers > and then pass the results to an inline function that expects an "int" data > type. > > Thanks > > Barry > > > > On Jan 17, 2011, at 3:43 PM, Mengda Wu wrote: > > > This indeed is caused by a bug in Visual c++ 2005 64bit compiler when > using > > optimization. 
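Barry's suggestion above, viewing the objects right before the MatMult() call, can be reduced to a plain size check. A minimal sketch, assuming the objects are named A, u and b as in the ex2.c tutorial; the declaration would go with the existing ones at the top of main() and the calls immediately before MatMult():

    PetscInt M, N, nu, nb;   /* add alongside the existing declarations */

    ierr = MatGetSize(A, &M, &N);CHKERRQ(ierr);
    ierr = VecGetSize(u, &nu);CHKERRQ(ierr);
    ierr = VecGetSize(b, &nb);CHKERRQ(ierr);
    /* A healthy build reports matching global dimensions here; the miscompiled
       build produces the inconsistent sizes behind the "global dim 56 57" error. */
    ierr = PetscPrintf(PETSC_COMM_WORLD, "A: %D x %D   u: %D   b: %D\n", M, N, nu, nb);CHKERRQ(ierr);

On the optimized build affected by the compiler bug the printed sizes disagree, and they agree again once the hotfix is installed.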
-------------- next part -------------- An HTML attachment was scrubbed... 
URL: From chetan.jhurani at gmail.com Mon Jan 17 21:42:03 2011 From: chetan.jhurani at gmail.com (Chetan Jhurani) Date: Mon, 17 Jan 2011 20:42:03 -0700 Subject: [petsc-users] using KSPLSQR and SKPCGNE In-Reply-To: References: Message-ID: <03F2101115EC462786CDD943770030FA@spiff> See the emails in this thread "KSPLSQR convergence criterion" for explicit calls for LSQR and some more info. http://lists.mcs.anl.gov/pipermail/petsc-users/2010-August/thread.html#6784 Chetan _____ From: petsc-users-bounces at mcs.anl.gov [mailto:petsc-users-bounces at mcs.anl.gov] On Behalf Of Gaurish Telang Sent: Monday, January 17, 2011 01:16 AM To: petsc-users at mcs.anl.gov Subject: [petsc-users] using KSPLSQR and SKPCGNE Hi, I am new to PETSc and I wanted to solve some least squares problems with it. On searching the net I found that KSPLSQR() and KSPCGNE() solve the least squares system |Ax-b| But I don't really know how to use these functions to get my answer. This manual page did not help: http://www.mcs.anl.gov/petsc/petsc-2/snapshots/petsc-current/docs/manualpage s/KSP/KSPLSQR.html Apparently no tutorial code uses these functions. If anyone could could give a small code snippet of how to use these functions (assuming A and b are given) then it would be really helpful. Thanks, Gaurish -------------- next part -------------- An HTML attachment was scrubbed... URL: From gaurish108 at gmail.com Mon Jan 17 22:56:35 2011 From: gaurish108 at gmail.com (Gaurish Telang) Date: Mon, 17 Jan 2011 23:56:35 -0500 Subject: [petsc-users] Trouble with solving 5x5 system... Ay=x on 2 or more processors Message-ID: Hi, I am trying to solve a linear system where Ay=x and A is a 5x5 matrix stored in a binary file called 'square' and x=[1;1;1;1;1] I am trying to display the matrix A, and the vectors x(rhs) and y(soln) in that order to standard output. On running my code on a single processor the answer returned is accurate. But on using 2 processors I get weird error messages PART of which says [0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message ------------------------------------ [1]PETSC ERROR: Nonconforming object sizes! [1]PETSC ERROR: Mat mat,Vec x: global dim 5 10! Also somehow the vector x gets displayed TWICE when run on two processes. A however gets displayed ONCE (as it should!!) I am attaching the output I get when I run on 1 process and the when I run the same code on 2 processes. please let me know where I could be going wrong. 
(1) This is what I get on running on ONE process: (Here system is solved successfully) gaurish108 at gaurish108-laptop:~/Desktop$ $PETSC_DIR/$PETSC_ARCH/bin/mpiexec -n 1 ./ex4 -f square 1.5761308167754828e-01 1.4188633862721534e-01 6.5574069915658684e-01 7.5774013057833345e-01 7.0604608801960878e-01 9.7059278176061570e-01 4.2176128262627499e-01 3.5711678574189554e-02 7.4313246812491618e-01 3.1832846377420676e-02 9.5716694824294557e-01 9.1573552518906709e-01 8.4912930586877711e-01 3.9222701953416816e-01 2.7692298496088996e-01 4.8537564872284122e-01 7.9220732955955442e-01 9.3399324775755055e-01 6.5547789017755664e-01 4.6171390631153941e-02 8.0028046888880011e-01 9.5949242639290300e-01 6.7873515485777347e-01 1.7118668781156177e-01 9.7131781235847536e-02 Process [0] 1 1 1 1 1 KSPGetIterationNumber 5 KSPGetResidualNorm 0.000000 Process [0] -0.810214 2.33178 -1.31131 1.09323 1.17322 gaurish108 at gaurish108-laptop:~/Desktop$ %-------------------------------------------------------------------- This is what I get on running on TWO processes: (2) gaurish108 at gaurish108-laptop:~/Desktop$ $PETSC_DIR/$PETSC_ARCH/bin/mpiexec -n 2 ./ex4 -f square 1.5761308167754828e-01 1.4188633862721534e-01 6.5574069915658684e-01 7.5774013057833345e-01 7.0604608801960878e-01 9.7059278176061570e-01 4.2176128262627499e-01 3.5711678574189554e-02 7.4313246812491618e-01 3.1832846377420676e-02 9.5716694824294557e-01 9.1573552518906709e-01 8.4912930586877711e-01 3.9222701953416816e-01 2.7692298496088996e-01 4.8537564872284122e-01 7.9220732955955442e-01 9.3399324775755055e-01 6.5547789017755664e-01 4.6171390631153941e-02 8.0028046888880011e-01 9.5949242639290300e-01 6.7873515485777347e-01 1.7118668781156177e-01 9.7131781235847536e-02 Process [0] 1 1 1 1 1 Process [1] 1 1 1 1 1 [0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message ------------------------------------ [1]PETSC ERROR: Nonconforming object sizes! [1]PETSC ERROR: Mat mat,Vec x: global dim 5 10! [1]PETSC ERROR: ------------------------------------------------------------------------ [1]PETSC ERROR: Petsc Release Version 3.1.0, Patch 5, Mon Sep 27 11:51:54 CDT 2010 [1]PETSC ERROR: See docs/changes/index.html for recent updates. [1]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [1]PETSC ERROR: --------------------- Error Message ------------------------------------ [0]PETSC ERROR: Nonconforming object sizes! [0]PETSC ERROR: Mat mat,Vec x: global dim 5 10! [0]PETSC ERROR: See docs/index.html for manual pages. 
[1]PETSC ERROR: ------------------------------------------------------------------------ [1]PETSC ERROR: ./ex4 on a linux-gnu named gaurish108-laptop by gaurish108 Mon Jan 17 23:49:18 2011 [1]PETSC ERROR: Libraries linked from /home/gaurish108/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/linux-gnu-c-debug/lib [1]PETSC ERROR: Configure run at Sat Nov 13 20:34:38 2010 [1]PETSC ERROR: Configure options --with-cc=gcc --with-fc=gfortran --download-f-blas-lapack=1 --download-mpich=1 --download-superlu_dist=1 --download-parmetis=1 --with-superlu_dist=1 --with-parmetis=1 [1]PETSC ERROR: ------------------------------------------------------------------------ [1]PETSC ERROR: MatMultTranspose() line 1947 in src/mat/interface/matrix.c [1]PETSC ERROR: KSPSolve_CGNE() line 103 in src/ksp/ksp/impls/cg/cgne/cgne.c [1]PETSC ERROR: KSPSolve() line 396 in src/ksp/ksp/interface/itfunc.c [1]PETSC ERROR: main() line 78 in src/mat/examples/tutorials/ex4.c application called MPI_Abort(MPI_COMM_WORLD, 60) - process 1[cli_1]: aborting job: application called MPI_Abort(MPI_COMM_WORLD, 60) - process 1 [0]0:Return code = 0, signaled with Interrupt [0]1:Return code = 60 gaurish108 at gaurish108-laptop:~/Desktop$ -------------- next part -------------- An HTML attachment was scrubbed... URL: From abhyshr at mcs.anl.gov Tue Jan 18 00:21:35 2011 From: abhyshr at mcs.anl.gov (Shri) Date: Tue, 18 Jan 2011 00:21:35 -0600 (CST) Subject: [petsc-users] Trouble with solving 5x5 system... Ay=x on 2 or more processors In-Reply-To: Message-ID: <1281068921.73804.1295331695590.JavaMail.root@zimbra.anl.gov> You are not setting the 'global' and the 'local' sizes of vector x correctly as the error message says. The global size of the matrix is 5X5 while the global size of the vector x is 10 !!! ----- Original Message ----- Hi, I am trying to solve a linear system where Ay=x and A is a 5x5 matrix stored in a binary file called 'square' and x=[1;1;1;1;1] I am trying to display the matrix A, and the vectors x(rhs) and y(soln) in that order to standard output. On running my code on a single processor the answer returned is accurate. But on using 2 processors I get weird error messages PART of which says [0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message ------------------------------------ [1]PETSC ERROR: Nonconforming object sizes! [1]PETSC ERROR: Mat mat,Vec x: global dim 5 10! Also somehow the vector x gets displayed TWICE when run on two processes. A however gets displayed ONCE (as it should!!) I am attaching the output I get when I run on 1 process and the when I run the same code on 2 processes. please let me know where I could be going wrong. 
(1) This is what I get on running on ONE process: (Here system is solved successfully) gaurish108 at gaurish108-laptop:~/Desktop$ $PETSC_DIR/$PETSC_ARCH/bin/mpiexec -n 1 ./ex4 -f square 1.5761308167754828e-01 1.4188633862721534e-01 6.5574069915658684e-01 7.5774013057833345e-01 7.0604608801960878e-01 9.7059278176061570e-01 4.2176128262627499e-01 3.5711678574189554e-02 7.4313246812491618e-01 3.1832846377420676e-02 9.5716694824294557e-01 9.1573552518906709e-01 8.4912930586877711e-01 3.9222701953416816e-01 2.7692298496088996e-01 4.8537564872284122e-01 7.9220732955955442e-01 9.3399324775755055e-01 6.5547789017755664e-01 4.6171390631153941e-02 8.0028046888880011e-01 9.5949242639290300e-01 6.7873515485777347e-01 1.7118668781156177e-01 9.7131781235847536e-02 Process [0] 1 1 1 1 1 KSPGetIterationNumber 5 KSPGetResidualNorm 0.000000 Process [0] -0.810214 2.33178 -1.31131 1.09323 1.17322 gaurish108 at gaurish108-laptop:~/Desktop$ %-------------------------------------------------------------------- This is what I get on running on TWO processes: (2) gaurish108 at gaurish108-laptop:~/Desktop$ $PETSC_DIR/$PETSC_ARCH/bin/mpiexec -n 2 ./ex4 -f square 1.5761308167754828e-01 1.4188633862721534e-01 6.5574069915658684e-01 7.5774013057833345e-01 7.0604608801960878e-01 9.7059278176061570e-01 4.2176128262627499e-01 3.5711678574189554e-02 7.4313246812491618e-01 3.1832846377420676e-02 9.5716694824294557e-01 9.1573552518906709e-01 8.4912930586877711e-01 3.9222701953416816e-01 2.7692298496088996e-01 4.8537564872284122e-01 7.9220732955955442e-01 9.3399324775755055e-01 6.5547789017755664e-01 4.6171390631153941e-02 8.0028046888880011e-01 9.5949242639290300e-01 6.7873515485777347e-01 1.7118668781156177e-01 9.7131781235847536e-02 Process [0] 1 1 1 1 1 Process [1] 1 1 1 1 1 [0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message ------------------------------------ [1]PETSC ERROR: Nonconforming object sizes! [1]PETSC ERROR: Mat mat,Vec x: global dim 5 10! [1]PETSC ERROR: ------------------------------------------------------------------------ [1]PETSC ERROR: Petsc Release Version 3.1.0, Patch 5, Mon Sep 27 11:51:54 CDT 2010 [1]PETSC ERROR: See docs/changes/index.html for recent updates. [1]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [1]PETSC ERROR: --------------------- Error Message ------------------------------------ [0]PETSC ERROR: Nonconforming object sizes! [0]PETSC ERROR: Mat mat,Vec x: global dim 5 10! [0]PETSC ERROR: See docs/index.html for manual pages. 
[1]PETSC ERROR: ------------------------------------------------------------------------ [1]PETSC ERROR: ./ex4 on a linux-gnu named gaurish108-laptop by gaurish108 Mon Jan 17 23:49:18 2011 [1]PETSC ERROR: Libraries linked from /home/gaurish108/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/linux-gnu-c-debug/lib [1]PETSC ERROR: Configure run at Sat Nov 13 20:34:38 2010 [1]PETSC ERROR: Configure options --with-cc=gcc --with-fc=gfortran --download-f-blas-lapack=1 --download-mpich=1 --download-superlu_dist=1 --download-parmetis=1 --with-superlu_dist=1 --with-parmetis=1 [1]PETSC ERROR: ------------------------------------------------------------------------ [1]PETSC ERROR: MatMultTranspose() line 1947 in src/mat/interface/matrix.c [1]PETSC ERROR: KSPSolve_CGNE() line 103 in src/ksp/ksp/impls/cg/cgne/cgne.c [1]PETSC ERROR: KSPSolve() line 396 in src/ksp/ksp/interface/itfunc.c [1]PETSC ERROR: main() line 78 in src/mat/examples/tutorials/ex4.c application called MPI_Abort(MPI_COMM_WORLD, 60) - process 1[cli_1]: aborting job: application called MPI_Abort(MPI_COMM_WORLD, 60) - process 1 [0]0:Return code = 0, signaled with Interrupt [0]1:Return code = 60 gaurish108 at gaurish108-laptop:~/Desktop$ -------------- next part -------------- An HTML attachment was scrubbed... URL: From gaurish108 at gmail.com Tue Jan 18 01:23:06 2011 From: gaurish108 at gmail.com (Gaurish Telang) Date: Tue, 18 Jan 2011 02:23:06 -0500 Subject: [petsc-users] Trouble with solving 5x5 system... Ay=x on 2 or more processors In-Reply-To: References: Message-ID: Hmm,,,,, Then how come the code worked perfectly on 1 processor? Any way here is my code. I cant really understand where I went wrong. Possibly with VecCreate(). I think when more than one processor is involved I am not using it correctly static char help[] = "'*x.\n\ -f : file to load \n\n"; /* Include "petscmat.h" so that we can use matrices. automatically includes: petscsys.h - base PETSc routines petscvec.h - vectors petscmat.h - matrices petscis.h - index sets petscviewer.h - viewers */ #include "petscmat.h" #include "petscvec.h" #include "petscksp.h" /* For the iterative solvers */ //extern PetscErrorCode LowRankUpdate(Mat,Mat,Vec,Vec,Vec,Vec,PetscInt); //#include #include #include #undef __FUNCT__ #define __FUNCT__ "main" int main(int argc,char **args) { Mat U; /* matrix */ PetscViewer fd; /* viewer */ char file[PETSC_MAX_PATH_LEN]; /* input file name */ PetscErrorCode ierr; PetscTruth flg; Vec x,y; PetscInt i,n,m; PetscScalar *xx; KSP ksp; PC pc; PetscMPIInt size; PetscInt num_iters; PetscReal rnorm; KSPConvergedReason reason; PetscInitialize(&argc,&args,(char *)0,help); /* Determine file from which we read the matrix */ ierr = PetscOptionsGetString(PETSC_NULL,"-f",file,PETSC_MAX_PATH_LEN-1,&flg);CHKERRQ(ierr); if (!flg) SETERRQ(1,"Must indicate binary file with the -f option"); /* Open binary file. Note that we use FILE_MODE_READ to indicate reading from this file. 
*/ ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,file,FILE_MODE_READ,&fd);CHKERRQ(ierr); ierr = MatLoad(fd,MATMPIDENSE,&U);CHKERRQ(ierr); ierr = PetscViewerDestroy(fd);CHKERRQ(ierr); ierr = MatGetSize(U,&m,&n);CHKERRQ(ierr); ierr = VecCreateMPI(PETSC_COMM_WORLD,m,PETSC_DETERMINE,&x);CHKERRQ(ierr); ierr = VecCreateMPI(PETSC_COMM_WORLD,m,PETSC_DETERMINE,&y);CHKERRQ(ierr); ierr=MatView(U,PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr); //ierr=MatView(U,PETSC_VIEWER_DRAW_WORLD);CHKERRQ(ierr); /* -------------------------------------------------------------------------------------- */ ierr=VecSet(x,1);CHKERRQ(ierr); ierr = VecView(x,PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr); /* ------------------------------------------------------------------------------------ */ ierr = KSPCreate(PETSC_COMM_WORLD, &ksp); CHKERRQ(ierr); ierr = KSPSetType(ksp, KSPCGNE); CHKERRQ(ierr); ierr = KSPSetOperators(ksp, U, U, DIFFERENT_NONZERO_PATTERN); CHKERRQ(ierr); ierr = KSPSetFromOptions(ksp); CHKERRQ(ierr); ierr = KSPSolve(ksp, x, y); CHKERRQ(ierr); ierr = KSPGetIterationNumber(ksp, &num_iters); CHKERRQ(ierr); ierr = KSPGetResidualNorm(ksp, &rnorm); CHKERRQ(ierr); ierr = KSPGetConvergedReason(ksp, &reason); CHKERRQ(ierr); printf ("KSPGetIterationNumber %i \n ", num_iters); printf("KSPGetResidualNorm %f \n" , rnorm ); // printf("KSPConvergedReason %f \n ",reason); ierr=VecView(y,PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr); /* Free work space. All PETSc objects should be destroyed when they are no longer needed. */ ierr = MatDestroy(U);CHKERRQ(ierr); ierr = VecDestroy(x);CHKERRQ(ierr); ierr = VecDestroy(y);CHKERRQ(ierr); ierr = PetscFinalize();CHKERRQ(ierr); return 0; } On Mon, Jan 17, 2011 at 11:56 PM, Gaurish Telang wrote: > Hi, > > I am trying to solve a linear system where Ay=x and A is a 5x5 matrix > stored in a binary file called 'square' and x=[1;1;1;1;1] > > I am trying to display the matrix A, and the vectors x(rhs) and y(soln) in > that order to standard output. > > On running my code on a single processor the answer returned is accurate. > But on using 2 processors I get weird error messages PART of which says > > [0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message > ------------------------------------ > [1]PETSC ERROR: Nonconforming object sizes! > [1]PETSC ERROR: Mat mat,Vec x: global dim 5 10! > > Also somehow the vector x gets displayed TWICE when run on two processes. > A however gets displayed ONCE (as it should!!) > > > I am attaching the output I get when I run on 1 process and the when I run > the same code on 2 processes. > please let me know where I could be going wrong. 
> > (1) > > This is what I get on running on ONE process: (Here system is solved > successfully) > > gaurish108 at gaurish108-laptop:~/Desktop$ $PETSC_DIR/$PETSC_ARCH/bin/mpiexec > -n 1 ./ex4 -f square > > 1.5761308167754828e-01 1.4188633862721534e-01 6.5574069915658684e-01 > 7.5774013057833345e-01 7.0604608801960878e-01 > 9.7059278176061570e-01 4.2176128262627499e-01 3.5711678574189554e-02 > 7.4313246812491618e-01 3.1832846377420676e-02 > 9.5716694824294557e-01 9.1573552518906709e-01 8.4912930586877711e-01 > 3.9222701953416816e-01 2.7692298496088996e-01 > 4.8537564872284122e-01 7.9220732955955442e-01 9.3399324775755055e-01 > 6.5547789017755664e-01 4.6171390631153941e-02 > 8.0028046888880011e-01 9.5949242639290300e-01 6.7873515485777347e-01 > 1.7118668781156177e-01 9.7131781235847536e-02 > Process [0] > 1 > 1 > 1 > 1 > 1 > KSPGetIterationNumber 5 > KSPGetResidualNorm 0.000000 > Process [0] > -0.810214 > 2.33178 > -1.31131 > 1.09323 > 1.17322 > gaurish108 at gaurish108-laptop:~/Desktop$ > > %-------------------------------------------------------------------- > This is what I get on running on TWO processes: > > (2) > > gaurish108 at gaurish108-laptop:~/Desktop$ $PETSC_DIR/$PETSC_ARCH/bin/mpiexec > -n 2 ./ex4 -f square > 1.5761308167754828e-01 1.4188633862721534e-01 6.5574069915658684e-01 > 7.5774013057833345e-01 7.0604608801960878e-01 > 9.7059278176061570e-01 4.2176128262627499e-01 3.5711678574189554e-02 > 7.4313246812491618e-01 3.1832846377420676e-02 > 9.5716694824294557e-01 9.1573552518906709e-01 8.4912930586877711e-01 > 3.9222701953416816e-01 2.7692298496088996e-01 > 4.8537564872284122e-01 7.9220732955955442e-01 9.3399324775755055e-01 > 6.5547789017755664e-01 4.6171390631153941e-02 > 8.0028046888880011e-01 9.5949242639290300e-01 6.7873515485777347e-01 > 1.7118668781156177e-01 9.7131781235847536e-02 > Process [0] > 1 > 1 > 1 > 1 > 1 > Process [1] > 1 > 1 > 1 > 1 > 1 > [0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message > ------------------------------------ > [1]PETSC ERROR: Nonconforming object sizes! > [1]PETSC ERROR: Mat mat,Vec x: global dim 5 10! > [1]PETSC ERROR: > ------------------------------------------------------------------------ > [1]PETSC ERROR: Petsc Release Version 3.1.0, Patch 5, Mon Sep 27 11:51:54 > CDT 2010 > [1]PETSC ERROR: See docs/changes/index.html for recent updates. > [1]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [1]PETSC ERROR: --------------------- Error Message > ------------------------------------ > [0]PETSC ERROR: Nonconforming object sizes! > [0]PETSC ERROR: Mat mat,Vec x: global dim 5 10! > [0]PETSC ERROR: See docs/index.html for manual pages. 
> [1]PETSC ERROR: > ------------------------------------------------------------------------ > [1]PETSC ERROR: ./ex4 on a linux-gnu named gaurish108-laptop by gaurish108 > Mon Jan 17 23:49:18 2011 > [1]PETSC ERROR: Libraries linked from > /home/gaurish108/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/linux-gnu-c-debug/lib > [1]PETSC ERROR: Configure run at Sat Nov 13 20:34:38 2010 > [1]PETSC ERROR: Configure options --with-cc=gcc --with-fc=gfortran > --download-f-blas-lapack=1 --download-mpich=1 --download-superlu_dist=1 > --download-parmetis=1 --with-superlu_dist=1 --with-parmetis=1 > [1]PETSC ERROR: > ------------------------------------------------------------------------ > [1]PETSC ERROR: MatMultTranspose() line 1947 in src/mat/interface/matrix.c > [1]PETSC ERROR: KSPSolve_CGNE() line 103 in > src/ksp/ksp/impls/cg/cgne/cgne.c > [1]PETSC ERROR: KSPSolve() line 396 in src/ksp/ksp/interface/itfunc.c > [1]PETSC ERROR: main() line 78 in src/mat/examples/tutorials/ex4.c > application called MPI_Abort(MPI_COMM_WORLD, 60) - process 1[cli_1]: > aborting job: > application called MPI_Abort(MPI_COMM_WORLD, 60) - process 1 > [0]0:Return code = 0, signaled with Interrupt > [0]1:Return code = 60 > gaurish108 at gaurish108-laptop:~/Desktop$ > > > > > > > > > > > > > > > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From aron.ahmadia at kaust.edu.sa Tue Jan 18 01:30:43 2011 From: aron.ahmadia at kaust.edu.sa (Aron Ahmadia) Date: Tue, 18 Jan 2011 10:30:43 +0300 Subject: [petsc-users] Trouble with solving 5x5 system... Ay=x on 2 or more processors In-Reply-To: References: Message-ID: Gaurish, I would suggest you spend some time reading the PETSc user's manual available here: http://www.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-current/docs/manual.pdf These two lines are incorrect. ierr = VecCreateMPI(PETSC_COMM_WORLD,m,PETSC_DETERMINE,&x);CHKERRQ(ierr); ierr = VecCreateMPI(PETSC_COMM_WORLD,m,PETSC_DETERMINE,&y);CHKERRQ(ierr); The problem, as indicated by Shri, is that you are determining your global size from your local size. In the case of a single-process run, these two are the same. For any other code, global_size = sum(local_size[]), corresponding to the local size on each process. You can verify this by running with 3 processes, you will see a vector of length 15 instead of 10. A On Tue, Jan 18, 2011 at 10:23 AM, Gaurish Telang wrote: > Hmm,,,,, > > Then how come the code worked perfectly on 1 processor? > > Any way here is my code. > > I cant really understand where I went wrong. Possibly with VecCreate(). I > think when more than one processor is involved I am not using it correctly > > > > static char help[] = "'*x.\n\ > -f : file to load \n\n"; > > /* > Include "petscmat.h" so that we can use matrices. 
> automatically includes: > petscsys.h - base PETSc routines petscvec.h - vectors > petscmat.h - matrices > petscis.h - index sets petscviewer.h - > viewers > */ > #include "petscmat.h" > #include "petscvec.h" > #include "petscksp.h" /* For the iterative solvers */ > //extern PetscErrorCode LowRankUpdate(Mat,Mat,Vec,Vec,Vec,Vec,PetscInt); > //#include > #include > #include > > #undef __FUNCT__ > #define __FUNCT__ "main" > int main(int argc,char **args) > { > Mat U; /* matrix */ > PetscViewer fd; /* viewer */ > char file[PETSC_MAX_PATH_LEN]; /* input file name */ > PetscErrorCode ierr; > PetscTruth flg; > Vec x,y; > PetscInt i,n,m; > PetscScalar *xx; > > KSP ksp; > PC pc; > PetscMPIInt size; > > PetscInt num_iters; > PetscReal rnorm; > KSPConvergedReason reason; > > PetscInitialize(&argc,&args,(char *)0,help); > > /* > Determine file from which we read the matrix > > */ > ierr = > PetscOptionsGetString(PETSC_NULL,"-f",file,PETSC_MAX_PATH_LEN-1,&flg);CHKERRQ(ierr); > if (!flg) SETERRQ(1,"Must indicate binary file with the -f option"); > > > /* > Open binary file. Note that we use FILE_MODE_READ to indicate > reading from this file. > */ > ierr = > PetscViewerBinaryOpen(PETSC_COMM_WORLD,file,FILE_MODE_READ,&fd);CHKERRQ(ierr); > > ierr = MatLoad(fd,MATMPIDENSE,&U);CHKERRQ(ierr); > ierr = PetscViewerDestroy(fd);CHKERRQ(ierr); > > ierr = MatGetSize(U,&m,&n);CHKERRQ(ierr); > > ierr = VecCreateMPI(PETSC_COMM_WORLD,m,PETSC_DETERMINE,&x);CHKERRQ(ierr); > > ierr = VecCreateMPI(PETSC_COMM_WORLD,m,PETSC_DETERMINE,&y);CHKERRQ(ierr); > ierr=MatView(U,PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr); > > //ierr=MatView(U,PETSC_VIEWER_DRAW_WORLD);CHKERRQ(ierr); > /* > -------------------------------------------------------------------------------------- > */ > ierr=VecSet(x,1);CHKERRQ(ierr); > > ierr = VecView(x,PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr); > /* > ------------------------------------------------------------------------------------ > */ > ierr = KSPCreate(PETSC_COMM_WORLD, &ksp); CHKERRQ(ierr); > ierr = KSPSetType(ksp, KSPCGNE); CHKERRQ(ierr); > ierr = KSPSetOperators(ksp, U, U, DIFFERENT_NONZERO_PATTERN); > CHKERRQ(ierr); > ierr = KSPSetFromOptions(ksp); CHKERRQ(ierr); > > ierr = KSPSolve(ksp, x, y); CHKERRQ(ierr); > > ierr = KSPGetIterationNumber(ksp, &num_iters); CHKERRQ(ierr); > > > ierr = KSPGetResidualNorm(ksp, &rnorm); CHKERRQ(ierr); > > > ierr = KSPGetConvergedReason(ksp, &reason); CHKERRQ(ierr); > > printf ("KSPGetIterationNumber %i \n ", num_iters); > printf("KSPGetResidualNorm %f \n" , rnorm ); > // printf("KSPConvergedReason %f \n ",reason); > > ierr=VecView(y,PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr); > > /* > Free work space. All PETSc objects should be destroyed when they > are no longer needed. > */ > ierr = MatDestroy(U);CHKERRQ(ierr); > ierr = VecDestroy(x);CHKERRQ(ierr); > ierr = VecDestroy(y);CHKERRQ(ierr); > ierr = PetscFinalize();CHKERRQ(ierr); > return 0; > > } > > > > > > > > On Mon, Jan 17, 2011 at 11:56 PM, Gaurish Telang wrote: > >> Hi, >> >> I am trying to solve a linear system where Ay=x and A is a 5x5 matrix >> stored in a binary file called 'square' and x=[1;1;1;1;1] >> >> I am trying to display the matrix A, and the vectors x(rhs) and y(soln) >> in that order to standard output. >> >> On running my code on a single processor the answer returned is accurate. 
>> But on using 2 processors I get weird error messages PART of which says >> >> [0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message >> ------------------------------------ >> [1]PETSC ERROR: Nonconforming object sizes! >> [1]PETSC ERROR: Mat mat,Vec x: global dim 5 10! >> >> Also somehow the vector x gets displayed TWICE when run on two processes. >> A however gets displayed ONCE (as it should!!) >> >> >> I am attaching the output I get when I run on 1 process and the when I run >> the same code on 2 processes. >> please let me know where I could be going wrong. >> >> (1) >> >> This is what I get on running on ONE process: (Here system is solved >> successfully) >> >> gaurish108 at gaurish108-laptop:~/Desktop$ >> $PETSC_DIR/$PETSC_ARCH/bin/mpiexec -n 1 ./ex4 -f square >> >> 1.5761308167754828e-01 1.4188633862721534e-01 6.5574069915658684e-01 >> 7.5774013057833345e-01 7.0604608801960878e-01 >> 9.7059278176061570e-01 4.2176128262627499e-01 3.5711678574189554e-02 >> 7.4313246812491618e-01 3.1832846377420676e-02 >> 9.5716694824294557e-01 9.1573552518906709e-01 8.4912930586877711e-01 >> 3.9222701953416816e-01 2.7692298496088996e-01 >> 4.8537564872284122e-01 7.9220732955955442e-01 9.3399324775755055e-01 >> 6.5547789017755664e-01 4.6171390631153941e-02 >> 8.0028046888880011e-01 9.5949242639290300e-01 6.7873515485777347e-01 >> 1.7118668781156177e-01 9.7131781235847536e-02 >> Process [0] >> 1 >> 1 >> 1 >> 1 >> 1 >> KSPGetIterationNumber 5 >> KSPGetResidualNorm 0.000000 >> Process [0] >> -0.810214 >> 2.33178 >> -1.31131 >> 1.09323 >> 1.17322 >> gaurish108 at gaurish108-laptop:~/Desktop$ >> >> %-------------------------------------------------------------------- >> This is what I get on running on TWO processes: >> >> (2) >> >> gaurish108 at gaurish108-laptop:~/Desktop$ >> $PETSC_DIR/$PETSC_ARCH/bin/mpiexec -n 2 ./ex4 -f square >> 1.5761308167754828e-01 1.4188633862721534e-01 6.5574069915658684e-01 >> 7.5774013057833345e-01 7.0604608801960878e-01 >> 9.7059278176061570e-01 4.2176128262627499e-01 3.5711678574189554e-02 >> 7.4313246812491618e-01 3.1832846377420676e-02 >> 9.5716694824294557e-01 9.1573552518906709e-01 8.4912930586877711e-01 >> 3.9222701953416816e-01 2.7692298496088996e-01 >> 4.8537564872284122e-01 7.9220732955955442e-01 9.3399324775755055e-01 >> 6.5547789017755664e-01 4.6171390631153941e-02 >> 8.0028046888880011e-01 9.5949242639290300e-01 6.7873515485777347e-01 >> 1.7118668781156177e-01 9.7131781235847536e-02 >> Process [0] >> 1 >> 1 >> 1 >> 1 >> 1 >> Process [1] >> 1 >> 1 >> 1 >> 1 >> 1 >> [0]PETSC ERROR: [1]PETSC ERROR: --------------------- Error Message >> ------------------------------------ >> [1]PETSC ERROR: Nonconforming object sizes! >> [1]PETSC ERROR: Mat mat,Vec x: global dim 5 10! >> [1]PETSC ERROR: >> ------------------------------------------------------------------------ >> [1]PETSC ERROR: Petsc Release Version 3.1.0, Patch 5, Mon Sep 27 11:51:54 >> CDT 2010 >> [1]PETSC ERROR: See docs/changes/index.html for recent updates. >> [1]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >> [1]PETSC ERROR: --------------------- Error Message >> ------------------------------------ >> [0]PETSC ERROR: Nonconforming object sizes! >> [0]PETSC ERROR: Mat mat,Vec x: global dim 5 10! >> [0]PETSC ERROR: See docs/index.html for manual pages. 
>> [1]PETSC ERROR: >> ------------------------------------------------------------------------ >> [1]PETSC ERROR: ./ex4 on a linux-gnu named gaurish108-laptop by gaurish108 >> Mon Jan 17 23:49:18 2011 >> [1]PETSC ERROR: Libraries linked from >> /home/gaurish108/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/linux-gnu-c-debug/lib >> [1]PETSC ERROR: Configure run at Sat Nov 13 20:34:38 2010 >> [1]PETSC ERROR: Configure options --with-cc=gcc --with-fc=gfortran >> --download-f-blas-lapack=1 --download-mpich=1 --download-superlu_dist=1 >> --download-parmetis=1 --with-superlu_dist=1 --with-parmetis=1 >> [1]PETSC ERROR: >> ------------------------------------------------------------------------ >> [1]PETSC ERROR: MatMultTranspose() line 1947 in src/mat/interface/matrix.c >> [1]PETSC ERROR: KSPSolve_CGNE() line 103 in >> src/ksp/ksp/impls/cg/cgne/cgne.c >> [1]PETSC ERROR: KSPSolve() line 396 in src/ksp/ksp/interface/itfunc.c >> [1]PETSC ERROR: main() line 78 in src/mat/examples/tutorials/ex4.c >> application called MPI_Abort(MPI_COMM_WORLD, 60) - process 1[cli_1]: >> aborting job: >> application called MPI_Abort(MPI_COMM_WORLD, 60) - process 1 >> [0]0:Return code = 0, signaled with Interrupt >> [0]1:Return code = 60 >> gaurish108 at gaurish108-laptop:~/Desktop$ >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From burckhardt at itis.ethz.ch Tue Jan 18 02:06:06 2011 From: burckhardt at itis.ethz.ch (Kathrin Burckhardt) Date: Tue, 18 Jan 2011 09:06:06 +0100 Subject: [petsc-users] BiCGStab and right preconditioning Message-ID: <20110118090606.hs874f0g4kswkgc0@mail.oetiker.ch> Dear all, I have a problem using BiCGStab with right preconditioning. Solving my system using gmres everything looks ok: 0 KSP preconditioned resid norm 4.490868831255e-04 true resid norm 4.490868831255e-04 ||Ae||/||Ax|| 6.402973612108e-01 1 KSP preconditioned resid norm 2.641616638766e-04 true resid norm 2.641616638766e-04 ||Ae||/||Ax|| 3.766353965541e-01 Solving it with bcgs, however, I got: 0 KSP preconditioned resid norm 4.490868831255e-04 true resid norm 1.222737120188e+01 ||Ae||/||Ax|| 1.743349407273e+04 1 KSP preconditioned resid norm 1.801799542022e-04 true resid norm 1.222736696074e+01 ||Ae||/||Ax|| 1.743348802582e+04 What could be wrong? Looking forward to your help, Kathrin From knepley at gmail.com Tue Jan 18 08:57:17 2011 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 18 Jan 2011 08:57:17 -0600 Subject: [petsc-users] BiCGStab and right preconditioning In-Reply-To: <20110118090606.hs874f0g4kswkgc0@mail.oetiker.ch> References: <20110118090606.hs874f0g4kswkgc0@mail.oetiker.ch> Message-ID: On Tue, Jan 18, 2011 at 2:06 AM, Kathrin Burckhardt wrote: > Dear all, > > I have a problem using BiCGStab with right preconditioning. > > Solving my system using gmres everything looks ok: > > 0 KSP preconditioned resid norm 4.490868831255e-04 true resid norm > 4.490868831255e-04 ||Ae||/||Ax|| 6.402973612108e-01 > 1 KSP preconditioned resid norm 2.641616638766e-04 true resid norm > 2.641616638766e-04 ||Ae||/||Ax|| 3.766353965541e-01 > > > Solving it with bcgs, however, I got: > > 0 KSP preconditioned resid norm 4.490868831255e-04 true resid norm > 1.222737120188e+01 ||Ae||/||Ax|| 1.743349407273e+04 > 1 KSP preconditioned resid norm 1.801799542022e-04 true resid norm > 1.222736696074e+01 ||Ae||/||Ax|| 1.743348802582e+04 > The first one is the most disturbing since the initial residual should be the true residual with right preconditioning. 
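For reference, requesting BiCGStab with right preconditioning explicitly in code looks roughly like the minimal sketch below (my own sketch, assuming a KSP object ksp whose operators are already set with KSPSetOperators and vectors b and x of matching sizes; these are the petsc-3.1 call names):

   ierr = KSPSetType(ksp,KSPBCGS);CHKERRQ(ierr);                 /* BiCGStab */
   ierr = KSPSetPreconditionerSide(ksp,PC_RIGHT);CHKERRQ(ierr);  /* right preconditioning */
   ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);                  /* command line options still apply */
   ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);

If that is what your code is doing, the -ksp_view output should confirm bcgs with right preconditioning.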
Can you run with -ksp_view -ksp_monitor and send it just to make sure you are running with what you think. Thanks, Matt > What could be wrong? > > Looking forward to your help, > Kathrin > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Tue Jan 18 11:25:07 2011 From: bsmith at mcs.anl.gov (Barry Smith) Date: Tue, 18 Jan 2011 11:25:07 -0600 Subject: [petsc-users] BiCGStab and right preconditioning In-Reply-To: <20110118090606.hs874f0g4kswkgc0@mail.oetiker.ch> References: <20110118090606.hs874f0g4kswkgc0@mail.oetiker.ch> Message-ID: <1F0E0E7C-2A73-42EE-AF7B-A7FF45D85961@mcs.anl.gov> Are you running a PETSc example? Please send exactly what options you are using to run the code. On a test case it is working correctly for me, we need your help to reproduce the problem. Barry On Jan 18, 2011, at 2:06 AM, Kathrin Burckhardt wrote: > Dear all, > > I have a problem using BiCGStab with right preconditioning. > > Solving my system using gmres everything looks ok: > > 0 KSP preconditioned resid norm 4.490868831255e-04 true resid norm 4.490868831255e-04 ||Ae||/||Ax|| 6.402973612108e-01 > 1 KSP preconditioned resid norm 2.641616638766e-04 true resid norm 2.641616638766e-04 ||Ae||/||Ax|| 3.766353965541e-01 > > > Solving it with bcgs, however, I got: > > 0 KSP preconditioned resid norm 4.490868831255e-04 true resid norm 1.222737120188e+01 ||Ae||/||Ax|| 1.743349407273e+04 > 1 KSP preconditioned resid norm 1.801799542022e-04 true resid norm 1.222736696074e+01 ||Ae||/||Ax|| 1.743348802582e+04 > > What could be wrong? > > Looking forward to your help, > Kathrin From gaetank at gmail.com Tue Jan 18 22:38:44 2011 From: gaetank at gmail.com (Gaetan Kenway) Date: Tue, 18 Jan 2011 23:38:44 -0500 Subject: [petsc-users] PCSetup() changes between 3.0 and 3.1 Message-ID: <1295411924.17413.11.camel@E6600> Hello I've run across a somewhat curious problem. I have included a snipped of code below, that is from a Newton-Krylov flow solver. The code is from the "FormJacobian" function for a snes. The code works fine with PETSc 3.1 but when I tried it with Petsc 3.0, I get a PETSc Error code 73: "object in argument is in wrong state, e.g. unassembled mat " on the PCSetup(pc,ierr) call. The subroutine setupNK_KSP_PC(dRdwPre) assembled the preconditioner matrix (dRdwPre) and performs the MatAssemblyBegin/End functions. I do realize, it is possible to just use PETSc 3.1, but I would like to know if its just a fluke that it works, or I've done something incorrect. I also realize, it is not typical to set options like this directly in code, but is is necessary for our code. I apologize I don't have a minimum representative example, but this section of code is quite buried and an example would have to be coded from scratch. Any suggestions are greatly appreciated Gaetan Kenway *********** BEGIN CODE **************** ! Dummy assembly begin/end calls for the matrix-free Matrx call MatAssemblyBegin(dRdw,MAT_FINAL_ASSEMBLY,ierr) call EChk(ierr,__FILE__,__LINE__) call MatAssemblyEnd(dRdw,MAT_FINAL_ASSEMBLY,ierr) call EChk(ierr,__FILE__,__LINE__) ! Assemble the approximate PC call setupNK_KSP_PC(dRdwPre) ! 
Setup the required options for the KSP solver call SNESGetKSP(snes,ksp,ierr); call KSPSetType(ksp,ksp_solver_type,ierr); call KSPGMRESSetRestart(ksp, ksp_subspace,ierr); call KSPSetPreconditionerSide(ksp,PC_RIGHT,ierr); ! Setup the required options for the Global PC call KSPGetPC(ksp,pc,ierr); call PCSetType(pc,global_pc_type,ierr); if (trim(global_pc_type) == 'asm') then call PCASMSetOverlap(pc,asm_overlap,ierr); call PCSetup(pc,ierr); call PCASMGetSubKSP( pc, nlocal, first, subksp, ierr ); end if ************** END CODE **************** From knepley at gmail.com Tue Jan 18 22:43:44 2011 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 18 Jan 2011 22:43:44 -0600 Subject: [petsc-users] PCSetup() changes between 3.0 and 3.1 In-Reply-To: <1295411924.17413.11.camel@E6600> References: <1295411924.17413.11.camel@E6600> Message-ID: On Tue, Jan 18, 2011 at 10:38 PM, Gaetan Kenway wrote: > Hello > > I've run across a somewhat curious problem. I have included a snipped of > code below, that is from a Newton-Krylov flow solver. The code is from > the "FormJacobian" function for a snes. The code works fine with PETSc > 3.1 but when I tried it with Petsc 3.0, I get a PETSc Error code 73: > > "object in argument is in wrong state, e.g. unassembled mat " > Its impossible to know what is going on here without the full stack from the error message. Please send that. Thanks, Matt > on the PCSetup(pc,ierr) call. The subroutine setupNK_KSP_PC(dRdwPre) > assembled the preconditioner matrix (dRdwPre) and performs the > MatAssemblyBegin/End functions. I do realize, it is possible to just use > PETSc 3.1, but I would like to know if its just a fluke that it works, > or I've done something incorrect. I also realize, it is not typical to > set options like this directly in code, but is is necessary for our > code. I apologize I don't have a minimum representative example, but > this section of code is quite buried and an example would have to be > coded from scratch. > > Any suggestions are greatly appreciated > > Gaetan Kenway > > > > *********** BEGIN CODE **************** > > ! Dummy assembly begin/end calls for the matrix-free Matrx > call MatAssemblyBegin(dRdw,MAT_FINAL_ASSEMBLY,ierr) > call EChk(ierr,__FILE__,__LINE__) > call MatAssemblyEnd(dRdw,MAT_FINAL_ASSEMBLY,ierr) > call EChk(ierr,__FILE__,__LINE__) > > ! Assemble the approximate PC > call setupNK_KSP_PC(dRdwPre) > > ! Setup the required options for the KSP solver > call SNESGetKSP(snes,ksp,ierr); > call KSPSetType(ksp,ksp_solver_type,ierr); > call KSPGMRESSetRestart(ksp, ksp_subspace,ierr); > call KSPSetPreconditionerSide(ksp,PC_RIGHT,ierr); > > ! Setup the required options for the Global PC > call KSPGetPC(ksp,pc,ierr); > call PCSetType(pc,global_pc_type,ierr); > > if (trim(global_pc_type) == 'asm') then > call PCASMSetOverlap(pc,asm_overlap,ierr); > call PCSetup(pc,ierr); > call PCASMGetSubKSP( pc, nlocal, first, subksp, ierr ); > end if > > ************** END CODE **************** > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From zonexo at gmail.com Wed Jan 19 03:40:41 2011 From: zonexo at gmail.com (TAY wee-beng) Date: Wed, 19 Jan 2011 10:40:41 +0100 Subject: [petsc-users] Errors when changing from ifort to gfortran in cygwin In-Reply-To: References: <804ab5d40612031941p36cbc2bdla94931835bdb06e0@mail.gmail.com> <804ab5d40612032300t7c2fa2ealaf4741197d7aad5d@mail.gmail.com> <804ab5d40612042158t5a30f7cfgbd04e77708ea1973@mail.gmail.com> Message-ID: <4D36B199.7030601@gmail.com> Hi, I am switching from ifort to gfortran in cygwin. My code compiles with ifort and PETSc and now I'm changing to gfortran. However, there are lots of error msg. I understand this is a PETSc mailing list but I hope someone with experience in using gfortran can help as well. I have attached the makefile_debug The errors are: Initially using -Wall: lots of "Warning: Nonconforming tab character at (1)" Changing to -Werror: /cygdrive/d/wtay/Lib/MPICH2_cygwin/include/mpif.h:503.7: Included at /cygdrive/d/wtay/Lib/petsc-3.1-p7_gcc_debug/include/finclude/petscsys.h:11: Included at /cygdrive/d/wtay/Lib/petsc-3.1-p7_gcc_debug/include/finclude/petsc.h:6: Included at /cygdrive/d/wtay/Lib/petsc-3.1-p7_gcc_debug/include/finclude/petsc.h90:5: Included at global_cywin.F90:8: SAVE /MPIFCMB1/,/MPIFCMB2/ 1 Error: Unclassifiable statement at (1) /cygdrive/d/wtay/Lib/MPICH2_cygwin/include/mpif.h:504.7: Included at /cygdrive/d/wtay/Lib/petsc-3.1-p7_gcc_debug/include/finclude/petscsys.h:11: Included at /cygdrive/d/wtay/Lib/petsc-3.1-p7_gcc_debug/include/finclude/petsc.h:6: Included at /cygdrive/d/wtay/Lib/petsc-3.1-p7_gcc_debug/include/finclude/petsc.h90:5: Included at global_cywin.F90:8: SAVE /MPIFCMB3/,/MPIFCMB4/,/MPIFCMB5/,/MPIFCMB6/ 1 Error: Unclassifiable statement at (1) /cygdrive/d/wtay/Lib/MPICH2_cygwin/include/mpif.h:505.7: Included at /cygdrive/d/wtay/Lib/petsc-3.1-p7_gcc_debug/include/finclude/petscsys.h:11: Included at /cygdrive/d/wtay/Lib/petsc-3.1-p7_gcc_debug/include/finclude/petsc.h:6: Included at /cygdrive/d/wtay/Lib/petsc-3.1-p7_gcc_debug/include/finclude/petsc.h90:5: Included at global_cywin.F90:8: SAVE /MPIFCMB7/,/MPIFCMB8/ 1 Error: Unclassifiable statement at (1) global_cywin.F90:581.132: DECIDE,total_k,total_k,5,PETSC_NULL_INTEGER,5,PETSC_NULL_INTEGER,A_mat,ierr 1 Error: Syntax error in argument list at (1) global_cywin.F90:581.132: DECIDE,total_k,total_k,5,PETSC_NULL_INTEGER,5,PETSC_NULL_INTEGER,A_mat,ierr 1 Warning: Line truncated at (1) global_cywin.F90:1226.132: OF PNPOLY.RESULTS INVALID') 1 Warning: Line truncated at (1) I tried to shorten parts of the code by adding & but I still get the "Error: Unclassifiable statement at (1)" Anyone have any ideas? Thanks Yours sincerely, TAY wee-beng -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: makefile_debug URL: From zonexo at gmail.com Wed Jan 19 05:09:41 2011 From: zonexo at gmail.com (TAY wee-beng) Date: Wed, 19 Jan 2011 12:09:41 +0100 Subject: [petsc-users] Makefile on Photran with PETSc in linux In-Reply-To: References: <1295411924.17413.11.camel@E6600> Message-ID: <4D36C675.3000005@gmail.com> Hi, I can compile with PETSc on linux with the makefile provided by the PETSc team. I'm now trying to use photran. I have made a makefile which is very basic and it basically list out all the commands to run. It can be used to compile on photran. However, I hope to use the makefile provided by the PETSc team because it is much more flexible. 
However when I use it, it says: **** Build of configuration Default for project ibm2d_high_Re **** make all make: *** No rule to make target `all'. Stop. I have attached the 2 makefiles: Makefile - from PETSc team Makefile_old - my own makefile Yours sincerely, TAY wee-beng -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: Makefile URL: -------------- next part -------------- An embedded and charset-unspecified text was scrubbed... Name: Makefile_old URL: From bsmith at mcs.anl.gov Wed Jan 19 07:22:02 2011 From: bsmith at mcs.anl.gov (Barry Smith) Date: Wed, 19 Jan 2011 07:22:02 -0600 Subject: [petsc-users] PCSetup() changes between 3.0 and 3.1 In-Reply-To: <1295411924.17413.11.camel@E6600> References: <1295411924.17413.11.camel@E6600> Message-ID: <6F9E6A97-F36F-4CCA-9140-151BEA56D682@mcs.anl.gov> On Jan 18, 2011, at 10:38 PM, Gaetan Kenway wrote: > Hello > > I've run across a somewhat curious problem. I have included a snipped of > code below, that is from a Newton-Krylov flow solver. The code is from > the "FormJacobian" function for a snes. The code works fine with PETSc > 3.1 but when I tried it with Petsc 3.0, I get a PETSc Error code 73: > > "object in argument is in wrong state, e.g. unassembled mat " > > on the PCSetup(pc,ierr) call. The subroutine setupNK_KSP_PC(dRdwPre) > assembled the preconditioner matrix (dRdwPre) and performs the > MatAssemblyBegin/End functions. I do realize, it is possible to just use > PETSc 3.1, but I would like to know if its just a fluke that it works, > or I've done something incorrect. You haven't done anything incorrectly. The order that options can be set for all the hierarchy of solvers in PETSc is tricky and may change over time. We strive to make it more flexible as time goes by, but sometimes you can only set things once the code has gotten into a certain state, like it knows the matrices are ready. Please just stick to 3.1 Barry > I also realize, it is not typical to > set options like this directly in code, but is is necessary for our > code. I apologize I don't have a minimum representative example, but > this section of code is quite buried and an example would have to be > coded from scratch. > > Any suggestions are greatly appreciated > > Gaetan Kenway > > > > *********** BEGIN CODE **************** > > ! Dummy assembly begin/end calls for the matrix-free Matrx > call MatAssemblyBegin(dRdw,MAT_FINAL_ASSEMBLY,ierr) > call EChk(ierr,__FILE__,__LINE__) > call MatAssemblyEnd(dRdw,MAT_FINAL_ASSEMBLY,ierr) > call EChk(ierr,__FILE__,__LINE__) > > ! Assemble the approximate PC > call setupNK_KSP_PC(dRdwPre) > > ! Setup the required options for the KSP solver > call SNESGetKSP(snes,ksp,ierr); > call KSPSetType(ksp,ksp_solver_type,ierr); > call KSPGMRESSetRestart(ksp, ksp_subspace,ierr); > call KSPSetPreconditionerSide(ksp,PC_RIGHT,ierr); > > ! 
Setup the required options for the Global PC > call KSPGetPC(ksp,pc,ierr); > call PCSetType(pc,global_pc_type,ierr); > > if (trim(global_pc_type) == 'asm') then > call PCASMSetOverlap(pc,asm_overlap,ierr); > call PCSetup(pc,ierr); > call PCASMGetSubKSP( pc, nlocal, first, subksp, ierr ); > end if > > ************** END CODE **************** > From balay at mcs.anl.gov Wed Jan 19 08:41:37 2011 From: balay at mcs.anl.gov (Satish Balay) Date: Wed, 19 Jan 2011 08:41:37 -0600 (CST) Subject: [petsc-users] Errors when changing from ifort to gfortran in cygwin In-Reply-To: <4D36B199.7030601@gmail.com> References: <804ab5d40612031941p36cbc2bdla94931835bdb06e0@mail.gmail.com> <804ab5d40612032300t7c2fa2ealaf4741197d7aad5d@mail.gmail.com> <804ab5d40612042158t5a30f7cfgbd04e77708ea1973@mail.gmail.com> <4D36B199.7030601@gmail.com> Message-ID: On Wed, 19 Jan 2011, TAY wee-beng wrote: > Hi, > > I am switching from ifort to gfortran in cygwin. My code compiles with ifort > and PETSc and now I'm changing to gfortran. However, there are lots of error > msg. > > I understand this is a PETSc mailing list but I hope someone with experience > in using gfortran can help as well. > > I have attached the makefile_debug > > The errors are: > > Initially using -Wall: > > lots of "Warning: Nonconforming tab character at (1)" > > Changing to -Werror: > > /cygdrive/d/wtay/Lib/MPICH2_cygwin/include/mpif.h:503.7: > Included at > /cygdrive/d/wtay/Lib/petsc-3.1-p7_gcc_debug/include/finclude/petscsys.h:11: > Included at > /cygdrive/d/wtay/Lib/petsc-3.1-p7_gcc_debug/include/finclude/petsc.h:6: > Included at > /cygdrive/d/wtay/Lib/petsc-3.1-p7_gcc_debug/include/finclude/petsc.h90:5: > Included at global_cywin.F90:8: > > SAVE /MPIFCMB1/,/MPIFCMB2/ > 1 > Error: Unclassifiable statement at (1) I guess 'save' statements should be placed only in the declaration of variables. Check src/snes/examples/tutorials/ex33f.F for usage. > /cygdrive/d/wtay/Lib/MPICH2_cygwin/include/mpif.h:504.7: > Included at > /cygdrive/d/wtay/Lib/petsc-3.1-p7_gcc_debug/include/finclude/petscsys.h:11: > Included at > /cygdrive/d/wtay/Lib/petsc-3.1-p7_gcc_debug/include/finclude/petsc.h:6: > Included at > /cygdrive/d/wtay/Lib/petsc-3.1-p7_gcc_debug/include/finclude/petsc.h90:5: > Included at global_cywin.F90:8: > > SAVE /MPIFCMB3/,/MPIFCMB4/,/MPIFCMB5/,/MPIFCMB6/ > 1 > Error: Unclassifiable statement at (1) > /cygdrive/d/wtay/Lib/MPICH2_cygwin/include/mpif.h:505.7: > Included at > /cygdrive/d/wtay/Lib/petsc-3.1-p7_gcc_debug/include/finclude/petscsys.h:11: > Included at > /cygdrive/d/wtay/Lib/petsc-3.1-p7_gcc_debug/include/finclude/petsc.h:6: > Included at > /cygdrive/d/wtay/Lib/petsc-3.1-p7_gcc_debug/include/finclude/petsc.h90:5: > Included at global_cywin.F90:8: > > SAVE /MPIFCMB7/,/MPIFCMB8/ > 1 > Error: Unclassifiable statement at (1) > global_cywin.F90:581.132: > > DECIDE,total_k,total_k,5,PETSC_NULL_INTEGER,5,PETSC_NULL_INTEGER,A_mat,ierr > 1 > Error: Syntax error in argument list at (1) > global_cywin.F90:581.132: I guess you added a line break in the middle of a variable PETSC_DECIDE. Satish > > DECIDE,total_k,total_k,5,PETSC_NULL_INTEGER,5,PETSC_NULL_INTEGER,A_mat,ierr > 1 > Warning: Line truncated at (1) > global_cywin.F90:1226.132: > > OF PNPOLY.RESULTS INVALID') > 1 > Warning: Line truncated at (1) > > I tried to shorten parts of the code by adding & but I still get the "Error: > Unclassifiable statement at (1)" > > Anyone have any ideas? 
> > Thanks > > Yours sincerely, > > TAY wee-beng > > From balay at mcs.anl.gov Wed Jan 19 08:47:52 2011 From: balay at mcs.anl.gov (Satish Balay) Date: Wed, 19 Jan 2011 08:47:52 -0600 (CST) Subject: [petsc-users] Makefile on Photran with PETSc in linux In-Reply-To: <4D36C675.3000005@gmail.com> References: <1295411924.17413.11.camel@E6600> <4D36C675.3000005@gmail.com> Message-ID: you shouldn't start a new thread by replying to previous threads. It messes up threaded e-mail readers. On Wed, 19 Jan 2011, TAY wee-beng wrote: > Hi, > > I can compile with PETSc on linux with the makefile provided by the PETSc > team. I'm now trying to use photran. I have made a makefile which is very > basic and it basically list out all the commands to run. It can be used to > compile on photran. > > However, I hope to use the makefile provided by the PETSc team because it is > much more flexible. However when I use it, it says: > > **** Build of configuration Default for project ibm2d_high_Re **** > > make all > make: *** No rule to make target `all'. Stop. ALL: a.out I guess you've typed in 'ALL' instead of 'all' Satish > > I have attached the 2 makefiles: > > Makefile - from PETSc team > Makefile_old - my own makefile > > > Yours sincerely, > > TAY wee-beng > > > From zonexo at gmail.com Wed Jan 19 08:54:43 2011 From: zonexo at gmail.com (TAY wee-beng) Date: Wed, 19 Jan 2011 15:54:43 +0100 Subject: [petsc-users] Makefile on Photran with PETSc in linux In-Reply-To: References: <1295411924.17413.11.camel@E6600> <4D36C675.3000005@gmail.com> Message-ID: <4D36FB33.9040803@gmail.com> Oh Sorry Satish. I will be careful next time. Thanks for your help. Yours sincerely, TAY wee-beng On 19/1/2011 3:47 PM, Satish Balay wrote: > you shouldn't start a new thread by replying to previous threads. > It messes up threaded e-mail readers. > > On Wed, 19 Jan 2011, TAY wee-beng wrote: > >> Hi, >> >> I can compile with PETSc on linux with the makefile provided by the PETSc >> team. I'm now trying to use photran. I have made a makefile which is very >> basic and it basically list out all the commands to run. It can be used to >> compile on photran. >> >> However, I hope to use the makefile provided by the PETSc team because it is >> much more flexible. However when I use it, it says: >> >> **** Build of configuration Default for project ibm2d_high_Re **** >> >> make all >> make: *** No rule to make target `all'. Stop. > ALL: a.out > > I guess you've typed in 'ALL' instead of 'all' > > Satish > >> I have attached the 2 makefiles: >> >> Makefile - from PETSc team >> Makefile_old - my own makefile >> >> >> Yours sincerely, >> >> TAY wee-beng >> >> >> From amesga1 at tigers.lsu.edu Wed Jan 19 10:57:54 2011 From: amesga1 at tigers.lsu.edu (Ataollah Mesgarnejad) Date: Wed, 19 Jan 2011 10:57:54 -0600 Subject: [petsc-users] PetscViewerASCIIOpen Message-ID: Dear All, Is there a way to change the file write mode to append when using PetscViewerASCIIOpen ? Thanks, Ata M -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed Jan 19 11:14:11 2011 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 19 Jan 2011 11:14:11 -0600 Subject: [petsc-users] PetscViewerASCIIOpen In-Reply-To: References: Message-ID: On Wed, Jan 19, 2011 at 10:57 AM, Ataollah Mesgarnejad < amesga1 at tigers.lsu.edu> wrote: > Dear All, > > Is there a way to change the file write mode to append when using > PetscViewerASCIIOpen ? 
> http://www.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-current/docs/manualpages/Viewer/PetscViewerFileSetMode.html Matt > Thanks, > Ata M > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From amesga1 at tigers.lsu.edu Wed Jan 19 11:20:25 2011 From: amesga1 at tigers.lsu.edu (Ataollah Mesgarnejad) Date: Wed, 19 Jan 2011 11:20:25 -0600 Subject: [petsc-users] PetscViewerASCIIOpen In-Reply-To: References: Message-ID: Awesome. Thanks. On Wed, Jan 19, 2011 at 11:14 AM, Matthew Knepley wrote: > On Wed, Jan 19, 2011 at 10:57 AM, Ataollah Mesgarnejad < > amesga1 at tigers.lsu.edu> wrote: > >> Dear All, >> >> Is there a way to change the file write mode to append when using >> PetscViewerASCIIOpen ? >> > > > http://www.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-current/docs/manualpages/Viewer/PetscViewerFileSetMode.html > > Matt > > >> Thanks, >> Ata M >> > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -- A. Mesgarnejad PhD Student, Research Assistant Mechanical Engineering Department Louisiana State University 2203 Patrick F. Taylor Hall Baton Rouge, La 70803 -------------- next part -------------- An HTML attachment was scrubbed... URL: From gaurish108 at gmail.com Sun Jan 23 00:39:18 2011 From: gaurish108 at gmail.com (Gaurish Telang) Date: Sun, 23 Jan 2011 01:39:18 -0500 Subject: [petsc-users] Using KSPLSQR without explicitly computing U(transpose)U Message-ID: Hi, I have been trying to solve an overdetermined system of linear equations on PETSc using KSPLSQR. Now for my minimization problem |Ux-b| where U has dimension 2683x1274, U has full rank but is badly conditioned. This makes the algorithm LSQR an ideal algorithm for doing Least sqaures here, (at least according to the paper by Paige and Saunders.) I have been able to solve the normal equations , U(transpose)Ux=U(transpose)b successfully with KSPLSQR, in the usual way one solves linear systems with SQUARE matrices. But for this I had to compute U(transpose)U explicitly. The algorithm mentioned in the paper does NOT involve an explicit computation of U(transpose)U. Is there a way to avoid the explicit and expensive computation of U(transpose)U and use KSPLSQR? The details on this page http://www.mcs.anl.gov/petsc/petsc-2/snapshots/petsc-current/docs/manualpages/KSP/KSPLSQR.htmldid not really help. I have also experimented with the suggesstions in the thread http://lists.mcs.anl.gov/pipermail/petsc-users/2010-August/006784.html but I kept getting errors saying there is a dimension mismatch I use KSPSetOperators as ierr = KSPSetOperators(ksp,Prod,Prod,SAME_PRECONDITIONER);CHKERRQ(ierr); where Prod=U(Transpose)U calculated in the previous statements. Anyway, I am providing my current working code underneath just for reference. U and b are provided through files. I am guessing the answer lies in the way one uses KSPSetOperators but I am not sure. 
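For reference, a minimal sketch of the variant the KSPLSQR manual page appears to describe: hand the rectangular U to the KSP directly and switch the preconditioner off, so that U'*U is never formed. Here U, b and x are the same objects as in the code below, and whether unpreconditioned LSQR converges fast enough is a separate question, as the reply further down points out:

  ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
  ierr = KSPSetType(ksp,KSPLSQR);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp,U,U,SAME_PRECONDITIONER);CHKERRQ(ierr); /* U is m x n, rectangular */
  ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
  ierr = PCSetType(pc,PCNONE);CHKERRQ(ierr);                         /* with PCNONE no U'*U is needed */
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
  ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);                            /* b of length m, x of length n */
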
Regards, Gaurish Telang %-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- static char help[] = " Doing least squares \n\ -f : file to load \n\n"; /* Include "petscmat.h" so that we can use matrices. automatically includes: petscsys.h - base PETSc routines petscvec.h - vectors petscmat.h - matrices petscis.h - index sets petscviewer.h - viewers */ #include "petscmat.h" #include "petscvec.h" #include "petscksp.h" /* For the iterative solvers */ #include #include #undef __FUNCT__ #define __FUNCT__ "main" int main(int argc,char **args) { Mat U,U_Transpose,Prod; /* matrix */ PetscViewer fd; /* viewer */ char file[PETSC_MAX_PATH_LEN]; /* input file name */ PetscErrorCode ierr; PetscTruth flg; Vec b,x,new_rhs; PetscInt i,n,m,bsize=2683,index; /* bsize indicates the text file size of bx.mat */ PetscScalar *xx; PetscScalar rhs[bsize]; FILE *fp; KSP ksp; PC pc; PetscMPIInt size; PetscInt num_iters; PetscReal rnorm; KSPConvergedReason reason; PetscInitialize(&argc,&args,(char *)0,help); ierr = PetscOptionsGetString(PETSC_NULL,"-f",file,PETSC_MAX_PATH_LEN-1,&flg);CHKERRQ(ierr); if (!flg) SETERRQ(1,"Must indicate binary file with the -f option"); ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,file,FILE_MODE_READ,&fd);CHKERRQ(ierr); ierr = MatLoad(fd,MATMPIAIJ,&U);CHKERRQ(ierr); ierr = PetscViewerDestroy(fd);CHKERRQ(ierr); ierr = MatGetSize(U,&m,&n);CHKERRQ(ierr); PetscPrintf(PETSC_COMM_WORLD,"# matrix Rows=%i and # matrix Columns=%i \n\n\n",m,n); PetscPrintf(PETSC_COMM_WORLD,"The matrix is \n\n"); // ierr=MatView(U,PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr); ierr=VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,bsize,&b);CHKERRQ(ierr); ierr = VecSet(b,0);CHKERRQ(ierr); /* Reading in the RHS vector. */ fp=fopen("b1.mat","r"); if (fp==NULL) { fprintf(stderr, "Cannot open file"); exit(1); } for (i = 0; i < bsize; i++) { if (fscanf(fp,"%lf", &rhs[i]) != 1) { fprintf(stderr, "Failed to read rhs vector[%d]\n", i); exit(1); } } index=0; /*Putting b into final form */ for (i=0; i> */ ierr = VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,n,&x);CHKERRQ(ierr); ierr = VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,n,&new_rhs);CHKERRQ(ierr); /* Creating the U_transpose matrix and actually seting the values in it */ ierr=MatTranspose(U, MAT_INITIAL_MATRIX,&U_Transpose);CHKERRQ(ierr); ierr=PetscPrintf(PETSC_COMM_WORLD,"\n \n U Transpose:\n\n");CHKERRQ(ierr); // ierr=MatView(U_Transpose,PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr); /* Creating the matrix Prod and setting it to U_transpose*U */ ierr=MatMatMult(U_Transpose,U,MAT_INITIAL_MATRIX,PETSC_DEFAULT,&Prod);CHKERRQ(ierr); ierr=PetscPrintf(PETSC_COMM_WORLD,"\n\n Product Ut *U:\n");CHKERRQ(ierr); // ierr=MatView(Prod,PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr); /* Creating the matrix-vec product */ ierr=MatMult(U_Transpose,b,new_rhs);CHKERRQ(ierr); ierr=PetscPrintf(PETSC_COMM_WORLD,"\n\n new rhs:\n");CHKERRQ(ierr); // ierr=VecView(new_rhs,PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr); //------------------------------------------------------ ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr); /* Set operators. Here the matrix that defines the linear system also serves as the preconditioning matrix. 
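     (Note: Prod = U'*U is built explicitly in this code only so that a square
     operator can be handed to KSP; the LSQR method itself never needs this
     product. One alternative, not shown here, would be a MatShell operator
     whose MatMult applies U and then U', or the PCShell route suggested in
     the reply to this message.)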
*/ ierr = KSPSetOperators(ksp,Prod,Prod,SAME_PRECONDITIONER);CHKERRQ(ierr); ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr); ierr = PCSetType(pc,PCJACOBI);CHKERRQ(ierr); ierr = KSPSetTolerances(ksp,1.e-6,PETSC_DEFAULT,PETSC_DEFAULT,PETSC_DEFAULT);CHKERRQ(ierr); /* Set runtime options, e.g., -ksp_type -pc_type -ksp_monitor -ksp_rtol These options will override those specified above as long as KSPSetFromOptions() is called _after_ any other customization routines. */ ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr); ierr = KSPSolve(ksp,new_rhs,x);CHKERRQ(ierr); ierr = KSPView(ksp,PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr); /* ----------------------------------------------------------------------------------------- */ ierr=VecView(x,PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr); /* --------------------------------------------------------------------------------------- */ ierr = MatDestroy(U);CHKERRQ(ierr); ierr = MatDestroy(U_Transpose);CHKERRQ(ierr); ierr = MatDestroy(Prod);CHKERRQ(ierr); ierr = VecDestroy(b);CHKERRQ(ierr); ierr= VecDestroy(x);CHKERRQ(ierr); ierr= VecDestroy(new_rhs);CHKERRQ(ierr); ierr = PetscFinalize();CHKERRQ(ierr); return 0; } -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at 59A2.org Sun Jan 23 08:40:57 2011 From: jed at 59A2.org (Jed Brown) Date: Sun, 23 Jan 2011 11:40:57 -0300 Subject: [petsc-users] Using KSPLSQR without explicitly computing U(transpose)U In-Reply-To: References: Message-ID: On Sun, Jan 23, 2011 at 03:39, Gaurish Telang wrote: > But for this I had to compute U(transpose)U explicitly. The algorithm > mentioned in the paper does NOT involve an explicit computation of > U(transpose)U. > > Is there a way to avoid the explicit and expensive computation of > U(transpose)U and use KSPLSQR? The details on this page > The need for explicit U^T U is for algebraic preconditioners (like incomplete factorization). You can try using cheaper approximations to this product or use a PCShell if you want to avoid explicitly forming the product. Chances are that unpreconditioned LSQR will converge too slowly. -------------- next part -------------- An HTML attachment was scrubbed... URL: From thomas.dufaud at univ-lyon1.fr Mon Jan 24 08:09:03 2011 From: thomas.dufaud at univ-lyon1.fr (DUFAUD THOMAS) Date: Mon, 24 Jan 2011 15:09:03 +0100 Subject: [petsc-users] 2 levels of parallelism for ASM Message-ID: <08C8F20B14E34F46B1DCB7D954E608D301E679331A73@BVMBX1.univ-lyon1.fr> Hi, I noticed that the local solution of an ASM preconditioner is performed on a single processor per domain, usually setting a KSP PREONLY to perform an ILU factorization. I would like to perform those local solution with a krylov method (GMRES) among a set of processors. Is it possible, for an ASM preconditioner, to set a subgroup of processors per domain and then define parallel sub-solver over a sub-communicator? If it is the case how can I manage operation such as MatIncreaseOverlap? If it is not the case, does it exist a way to do that in PETSc? Thanks, Thomas From PRaeth at hpti.com Mon Jan 24 08:10:12 2011 From: PRaeth at hpti.com (Raeth, Peter) Date: Mon, 24 Jan 2011 14:10:12 +0000 Subject: [petsc-users] 2 levels of parallelism for ASM In-Reply-To: <08C8F20B14E34F46B1DCB7D954E608D301E679331A73@BVMBX1.univ-lyon1.fr> References: <08C8F20B14E34F46B1DCB7D954E608D301E679331A73@BVMBX1.univ-lyon1.fr> Message-ID: <3474F869C1954540B771FD9CAEBCB65704A9BFAC@CORTINA.HPTI.COM> Hello Thomas. Am only just now learning how to use PETSc. 
My particular concern is calculating the Kronecker Tensor Product. Running out of memory is the biggest roadblock, even though I think I have four times the required memory available and am destroying matrices as they are no longer needed. Would like to lurk on your thread as a learning exercise. Do you want to start a thread on PETSc's users' group? Best, Peter. Peter G. Raeth, Ph.D. Senior Staff Scientist Signal and Image Processing High Performance Technologies, Inc 937-904-5147 praeth at hpti.com ________________________________________ From: petsc-users-bounces at mcs.anl.gov [petsc-users-bounces at mcs.anl.gov] on behalf of DUFAUD THOMAS [thomas.dufaud at univ-lyon1.fr] Sent: Monday, January 24, 2011 9:09 AM To: petsc-users at mcs.anl.gov Subject: [petsc-users] 2 levels of parallelism for ASM Hi, I noticed that the local solution of an ASM preconditioner is performed on a single processor per domain, usually setting a KSP PREONLY to perform an ILU factorization. I would like to perform those local solution with a krylov method (GMRES) among a set of processors. Is it possible, for an ASM preconditioner, to set a subgroup of processors per domain and then define parallel sub-solver over a sub-communicator? If it is the case how can I manage operation such as MatIncreaseOverlap? If it is not the case, does it exist a way to do that in PETSc? Thanks, Thomas From PRaeth at hpti.com Mon Jan 24 08:13:51 2011 From: PRaeth at hpti.com (Raeth, Peter) Date: Mon, 24 Jan 2011 14:13:51 +0000 Subject: [petsc-users] 2 levels of parallelism for ASM In-Reply-To: <3474F869C1954540B771FD9CAEBCB65704A9BFAC@CORTINA.HPTI.COM> References: <08C8F20B14E34F46B1DCB7D954E608D301E679331A73@BVMBX1.univ-lyon1.fr>, <3474F869C1954540B771FD9CAEBCB65704A9BFAC@CORTINA.HPTI.COM> Message-ID: <3474F869C1954540B771FD9CAEBCB65704A9BFB9@CORTINA.HPTI.COM> * blush * Thought this was an internal list. Looking forward to the PETSC user's group discussion. Peter G. Raeth, Ph.D. Senior Staff Scientist Signal and Image Processing High Performance Technologies, Inc 937-904-5147 praeth at hpti.com ________________________________________ From: petsc-users-bounces at mcs.anl.gov [petsc-users-bounces at mcs.anl.gov] on behalf of Raeth, Peter [PRaeth at hpti.com] Sent: Monday, January 24, 2011 9:10 AM To: PETSc users list Subject: Re: [petsc-users] 2 levels of parallelism for ASM Hello Thomas. Am only just now learning how to use PETSc. My particular concern is calculating the Kronecker Tensor Product. Running out of memory is the biggest roadblock, even though I think I have four times the required memory available and am destroying matrices as they are no longer needed. Would like to lurk on your thread as a learning exercise. Do you want to start a thread on PETSc's users' group? Best, Peter. Peter G. Raeth, Ph.D. Senior Staff Scientist Signal and Image Processing High Performance Technologies, Inc 937-904-5147 praeth at hpti.com ________________________________________ From: petsc-users-bounces at mcs.anl.gov [petsc-users-bounces at mcs.anl.gov] on behalf of DUFAUD THOMAS [thomas.dufaud at univ-lyon1.fr] Sent: Monday, January 24, 2011 9:09 AM To: petsc-users at mcs.anl.gov Subject: [petsc-users] 2 levels of parallelism for ASM Hi, I noticed that the local solution of an ASM preconditioner is performed on a single processor per domain, usually setting a KSP PREONLY to perform an ILU factorization. I would like to perform those local solution with a krylov method (GMRES) among a set of processors. 
Is it possible, for an ASM preconditioner, to set a subgroup of processors per domain and then define parallel sub-solver over a sub-communicator? If it is the case how can I manage operation such as MatIncreaseOverlap? If it is not the case, does it exist a way to do that in PETSc? Thanks, Thomas From bsmith at mcs.anl.gov Mon Jan 24 12:49:24 2011 From: bsmith at mcs.anl.gov (Barry Smith) Date: Mon, 24 Jan 2011 12:49:24 -0600 Subject: [petsc-users] 2 levels of parallelism for ASM In-Reply-To: <08C8F20B14E34F46B1DCB7D954E608D301E679331A73@BVMBX1.univ-lyon1.fr> References: <08C8F20B14E34F46B1DCB7D954E608D301E679331A73@BVMBX1.univ-lyon1.fr> Message-ID: <80C9035D-C8D4-493B-A790-3078184DF07C@mcs.anl.gov> Thomas, There is no way to have parallel subdomains in PETSc 3.1 for additive Schwarz but one of us has just added support in petsc-dev for exactly this approach. You can access petsc-dev via http://www.mcs.anl.gov/petsc/petsc-as/developers/index.html Since this is a new not yet released feature please join the mailing list petsc-dev at mcs.anl.gov http://www.mcs.anl.gov/petsc/petsc-as/miscellaneous/mailing-lists.html and communicate issues regarding this top on that list. Barry On Jan 24, 2011, at 8:09 AM, DUFAUD THOMAS wrote: > Hi, > I noticed that the local solution of an ASM preconditioner is performed on a single processor per domain, usually setting a KSP PREONLY to perform an ILU factorization. > I would like to perform those local solution with a krylov method (GMRES) among a set of processors. > > Is it possible, for an ASM preconditioner, to set a subgroup of processors per domain and then define parallel sub-solver over a sub-communicator? > > If it is the case how can I manage operation such as MatIncreaseOverlap? > If it is not the case, does it exist a way to do that in PETSc? > > Thanks, > > Thomas From PRaeth at hpti.com Mon Jan 24 15:08:25 2011 From: PRaeth at hpti.com (Raeth, Peter) Date: Mon, 24 Jan 2011 21:08:25 +0000 Subject: [petsc-users] Out of memory during MatAssemblyBegin Message-ID: <3474F869C1954540B771FD9CAEBCB65704A9C080@CORTINA.HPTI.COM> Am running out of memory while using MatAssemblyBegin on a dense matrix that spans several processors. My calculations show that the matrices I am using do not require more than 25% of available memory. Different about this matrix compared to the others is that the program runs out of memory after the matrix has been populated by a single process, rather than by multiple processes. Used MatSetValues. Since the values are held in cache until MatAssemblyEnd is called (as I understand things), is it possible that using one process to populate the entire matrix is causing this problem? The data is brought in only row by row for the population process. All buffer memory is cleared before the call to MatAssemblyBegin. The error dump contains: mpirun -prefix [%g] -np 256 Peter.x [0] [0]PETSC ERROR: --------------------- Error Message ------------------------------------ [0] [0]PETSC ERROR: Out of memory. This could be due to allocating [0] [0]PETSC ERROR: too large an object or bleeding by not properly [0] [0]PETSC ERROR: destroying unneeded objects. [0] [0]PETSC ERROR: Memory allocated 1372407920 Memory used by process -122585088 [0] [0]PETSC ERROR: Try running with -malloc_dump or -malloc_log for info. [0] [0]PETSC ERROR: Memory requested 18446744071829395456! 
[0] [0]PETSC ERROR: ------------------------------------------------------------------------ [0] [0]PETSC ERROR: Petsc Release Version 3.1.0, Patch 6, Tue Nov 16 17:02:32 CST 2010 [0] [0]PETSC ERROR: See docs/changes/index.html for recent updates. [0] [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [0] [0]PETSC ERROR: See docs/index.html for manual pages. [0] [0]PETSC ERROR: ------------------------------------------------------------------------ [0] [0]PETSC ERROR: Peter.x on a linux-int named hawk-6 by praeth Mon Jan 24 15:44:28 2011 [0] [0]PETSC ERROR: Libraries linked from /default/praeth/MATH/petsc-3.1-p6/linux-intel-g/lib [0] [0]PETSC ERROR: Configure run at Tue Dec 21 08:45:25 2010 [0] [0]PETSC ERROR: Configure options --download-superlu=1 --download-parmetis=1 --download-superlu_dist=1 --with-debugging=1 --with-error-checking=1 -PETSC_ARCH=linux-intel-g --with-fc="ifort -lmpi" --with-cc="icc -lmpi" --with-gnu-compilers=false [0] [0]PETSC ERROR: ------------------------------------------------------------------------ [0] [0]PETSC ERROR: PetscMallocAlign() line 49 in src/sys/memory/mal.c [0] [0]PETSC ERROR: PetscTrMallocDefault() line 192 in src/sys/memory/mtr.c [0] [0]PETSC ERROR: MatStashScatterBegin_Private() line 510 in src/mat/utils/matstash.c [0] [0]PETSC ERROR: MatAssemblyBegin_MPIDense() line 286 in src/mat/impls/dense/mpi/mpidense.c [0] [0]PETSC ERROR: MatAssemblyBegin() line 4564 in src/mat/interface/matrix.c [0] [0]PETSC ERROR: User provided function() line 195 in "unknowndirectory/"Peter.c [-1] MPI: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize() [-1] MPI: aborting job exit Had tried to use the suggestion to employ -malloc_dump or -malloc_log but do not see any result from the batch run. Thank you all for any insights you can offer. Best, Peter. -------------- next part -------------- An HTML attachment was scrubbed... URL: From karpeev at mcs.anl.gov Mon Jan 24 15:16:07 2011 From: karpeev at mcs.anl.gov (Dmitry Karpeev) Date: Mon, 24 Jan 2011 15:16:07 -0600 Subject: [petsc-users] 2 levels of parallelism for ASM In-Reply-To: <80C9035D-C8D4-493B-A790-3078184DF07C@mcs.anl.gov> References: <08C8F20B14E34F46B1DCB7D954E608D301E679331A73@BVMBX1.univ-lyon1.fr> <80C9035D-C8D4-493B-A790-3078184DF07C@mcs.anl.gov> Message-ID: petsc-dev has PCGASM, which is a "generalization" of PCASM that allows for subdomains that live on a subcommunicator of the PC's communicator. The API is nearly identical to ASM's, and GASM will eventually replace ASM, once we are reasonably sure it works correctly (e.g., I'm chasing down a small memory leak in GASM at the moment). The difficulty with subdomains straddling several ranks is that the user is responsible for generating these subdomains. PCGASMCreateSubdomains2D is a helper subroutine that will produce a rank-straddling partition using DA-like data. This is of limited use, since it works for structured 2D meshes only. The currently implemented partitioning "algorithm" is sufficiently naive to serialize the subdomain solves. This can be improved, but in the absence of users I have not made the time to do it. The longer-term plan is to have an interface to various mesh packages to read the subdomain partition information from them (in addition to the parallel partition). Similar functionality is required for FETI-like subdivisions, and I'm currently working on one of these mesh/partitioning hookups (initially, for MOAB). We can definitely help the particular application/user with using this functionality. 
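In the meantime, the one-process-per-subdomain case is already expressible with plain PCASM. A sketch of making every local solve GMRES+ILU instead of the default preonly+ILU, assuming a KSP named ksp whose operators are already set (the same effect is available at run time with -pc_type asm -sub_ksp_type gmres -sub_pc_type ilu):

  KSP      *subksp;
  PC        pc,subpc;
  PetscInt  nlocal,first,i;
  ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
  ierr = PCSetType(pc,PCASM);CHKERRQ(ierr);
  ierr = KSPSetUp(ksp);CHKERRQ(ierr);                     /* subdomain KSPs exist only after setup */
  ierr = PCASMGetSubKSP(pc,&nlocal,&first,&subksp);CHKERRQ(ierr);
  for (i=0; i<nlocal; i++) {
    ierr = KSPSetType(subksp[i],KSPGMRES);CHKERRQ(ierr);  /* Krylov solve on each subdomain */
    ierr = KSPGetPC(subksp[i],&subpc);CHKERRQ(ierr);
    ierr = PCSetType(subpc,PCILU);CHKERRQ(ierr);
  }

With an inner Krylov solve the outer method should usually be flexible, e.g. -ksp_type fgmres. Multi-rank subdomains still need the PCGASM route from petsc-dev described above.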
Dmitry. On Mon, Jan 24, 2011 at 12:49 PM, Barry Smith wrote: > > Thomas, > > There is no way to have parallel subdomains in PETSc 3.1 for additive > Schwarz but one of us has just added support in petsc-dev for exactly this > approach. You can access petsc-dev via > http://www.mcs.anl.gov/petsc/petsc-as/developers/index.html Since this > is a new not yet released feature please join the mailing list > petsc-dev at mcs.anl.gov > http://www.mcs.anl.gov/petsc/petsc-as/miscellaneous/mailing-lists.html and > communicate issues regarding this top on that list. > > Barry > > > > On Jan 24, 2011, at 8:09 AM, DUFAUD THOMAS wrote: > > > Hi, > > I noticed that the local solution of an ASM preconditioner is performed > on a single processor per domain, usually setting a KSP PREONLY to perform > an ILU factorization. > > I would like to perform those local solution with a krylov method (GMRES) > among a set of processors. > > > > Is it possible, for an ASM preconditioner, to set a subgroup of > processors per domain and then define parallel sub-solver over a > sub-communicator? > > > > If it is the case how can I manage operation such as MatIncreaseOverlap? > > If it is not the case, does it exist a way to do that in PETSc? > > > > Thanks, > > > > Thomas > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Mon Jan 24 15:23:37 2011 From: bsmith at mcs.anl.gov (Barry Smith) Date: Mon, 24 Jan 2011 15:23:37 -0600 Subject: [petsc-users] Out of memory during MatAssemblyBegin In-Reply-To: <3474F869C1954540B771FD9CAEBCB65704A9C080@CORTINA.HPTI.COM> References: <3474F869C1954540B771FD9CAEBCB65704A9C080@CORTINA.HPTI.COM> Message-ID: On Jan 24, 2011, at 3:08 PM, Raeth, Peter wrote: > Am running out of memory while using MatAssemblyBegin on a dense matrix that spans several processors. My calculations show that the matrices I am using do not require more than 25% of available memory. > > Different about this matrix compared to the others is that the program runs out of memory after the matrix has been populated by a single process, rather than by multiple processes. Used MatSetValues. Since the values are held in cache until MatAssemblyEnd is called (as I understand things), is it possible that using one process to populate the entire matrix is causing this problem? Yes, absolutely, this is a terrible non-scalable way of filling a parallel matrix. You can fake it by calling MatAssemblyBegin/End() repeatedly with the flag MAT_FLUSH_ASSEMBLY to keep the stash from getting too big. But you really need a much better way of setting values into the matrix. How are these "brought in row by row" matrix entries generated? Barry > The data is brought in only row by row for the population process. All buffer memory is cleared before the call to MatAssemblyBegin. > > The error dump contains: > > mpirun -prefix [%g] -np 256 Peter.x > [0] [0]PETSC ERROR: --------------------- Error Message ------------------------------------ > [0] [0]PETSC ERROR: Out of memory. This could be due to allocating > [0] [0]PETSC ERROR: too large an object or bleeding by not properly > [0] [0]PETSC ERROR: destroying unneeded objects. > [0] [0]PETSC ERROR: Memory allocated 1372407920 Memory used by process -122585088 > [0] [0]PETSC ERROR: Try running with -malloc_dump or -malloc_log for info. > [0] [0]PETSC ERROR: Memory requested 18446744071829395456! 
> [0] [0]PETSC ERROR: ------------------------------------------------------------------------ > [0] [0]PETSC ERROR: Petsc Release Version 3.1.0, Patch 6, Tue Nov 16 17:02:32 CST 2010 > [0] [0]PETSC ERROR: See docs/changes/index.html for recent updates. > [0] [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [0] [0]PETSC ERROR: See docs/index.html for manual pages. > [0] [0]PETSC ERROR: ------------------------------------------------------------------------ > [0] [0]PETSC ERROR: Peter.x on a linux-int named hawk-6 by praeth Mon Jan 24 15:44:28 2011 > [0] [0]PETSC ERROR: Libraries linked from /default/praeth/MATH/petsc-3.1-p6/linux-intel-g/lib > [0] [0]PETSC ERROR: Configure run at Tue Dec 21 08:45:25 2010 > [0] [0]PETSC ERROR: Configure options --download-superlu=1 --download-parmetis=1 --download-superlu_dist=1 --with-debugging=1 --with-error-checking=1 -PETSC_ARCH=linux-intel-g --with-fc="ifort -lmpi" --with-cc="icc -lmpi" --with-gnu-compilers=false > [0] [0]PETSC ERROR: ------------------------------------------------------------------------ > [0] [0]PETSC ERROR: PetscMallocAlign() line 49 in src/sys/memory/mal.c > [0] [0]PETSC ERROR: PetscTrMallocDefault() line 192 in src/sys/memory/mtr.c > [0] [0]PETSC ERROR: MatStashScatterBegin_Private() line 510 in src/mat/utils/matstash.c > [0] [0]PETSC ERROR: MatAssemblyBegin_MPIDense() line 286 in src/mat/impls/dense/mpi/mpidense.c > [0] [0]PETSC ERROR: MatAssemblyBegin() line 4564 in src/mat/interface/matrix.c > [0] [0]PETSC ERROR: User provided function() line 195 in "unknowndirectory/"Peter.c > [-1] MPI: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize() > [-1] MPI: aborting job > exit > > Had tried to use the suggestion to employ -malloc_dump or -malloc_log but do not see any result from the batch run. > > Thank you all for any insights you can offer. > > > Best, > > Peter. > From rongliang.chan at gmail.com Mon Jan 24 15:32:26 2011 From: rongliang.chan at gmail.com (Rongliang Chen) Date: Mon, 24 Jan 2011 14:32:26 -0700 Subject: [petsc-users] Problem on LU factorization Message-ID: Hi, I face a problem on the LU factorization. When I use the PETSC's default LU factorization, my code does not converge for KSP. When I use superlu with command line "-sub_pc_factor_mat_solver_package superlu", it said "[43]PETSC ERROR: ------------------------------------------------------------------------ [43]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range [43]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger [43]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signal[43]PETSCERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors [43]PETSC ERROR: likely location of problem given in stack below [43]PETSC ERROR: --------------------- Stack Frames ------------------------------------ [43]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, [43]PETSC ERROR: INSTEAD the line number of the start of the function [43]PETSC ERROR: is given. [43]PETSC ERROR: [43] MatLUFactorNumeric_SuperLU line 121 src/mat/impls/aij/seq/superlu/superlu.c [43]PETSC ERROR: [43] MatLUFactorNumeric line 2575 src/mat/interface/matrix.c ............................ " When I use superlu_dist, my code converges well, but I found that the compute time is very high. What's maybe the problem? Thanks. 
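For reference, the kind of option set the -sub_ prefix above implies, a sketch only (./app and the process count are placeholders, and the outer layers depend on how the code sets things up):

  mpiexec -n 64 ./app -ksp_type gmres -pc_type asm \
          -sub_pc_type lu -sub_pc_factor_mat_solver_package superlu \
          -ksp_view -ksp_monitor

Swapping superlu for superlu_dist, or for petsc (the built-in factorization), selects the other packages, and -ksp_view shows which factorization was actually used.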
Regards, Rongliang -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Mon Jan 24 15:40:47 2011 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 24 Jan 2011 15:40:47 -0600 Subject: [petsc-users] Problem on LU factorization In-Reply-To: References: Message-ID: On Mon, Jan 24, 2011 at 3:32 PM, Rongliang Chen wrote: > Hi, > > I face a problem on the LU factorization. When I use the PETSC's default LU > factorization, my code does not converge for KSP. This is no enough information. Note that if you use this in parallel, it is just BlokcJacobi LU and thus it is not surprising that it does not converge. > When I use superlu with command line "-sub_pc_factor_mat_solver_package > superlu", it said "[43]PETSC ERROR: > ------------------------------------------------------------------------ > [43]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, > probably memory access out of range > [43]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger > [43]PETSC ERROR: or see > http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signal[43]PETSCERROR: or try > http://valgrind.org on GNU/linux and Apple Mac OS X to find memory > corruption errors > [43]PETSC ERROR: likely location of problem given in stack below > [43]PETSC ERROR: --------------------- Stack Frames > ------------------------------------ > [43]PETSC ERROR: Note: The EXACT line numbers in the stack are not > available, > [43]PETSC ERROR: INSTEAD the line number of the start of the function > [43]PETSC ERROR: is given. > [43]PETSC ERROR: [43] MatLUFactorNumeric_SuperLU line 121 > src/mat/impls/aij/seq/superlu/superlu.c > [43]PETSC ERROR: [43] MatLUFactorNumeric line 2575 > src/mat/interface/matrix.c > ............................ > " > Please confirm that you have the latest patch level. If so, send the matrix in PETSc binary format to petsc-maint at mcs.anl.gov along with the precise solver options and output of -ksp_view. > When I use superlu_dist, my code converges well, but I found that the > compute time is very high. What's maybe the problem? Thanks. > This is not surprising either. Sparse LU factorization can be expensive. Matt > > Regards, > > Rongliang > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From balay at mcs.anl.gov Mon Jan 24 16:06:22 2011 From: balay at mcs.anl.gov (Satish Balay) Date: Mon, 24 Jan 2011 16:06:22 -0600 (CST) Subject: [petsc-users] Problem on LU factorization In-Reply-To: References: Message-ID: On Mon, 24 Jan 2011, Matthew Knepley wrote: > > When I use superlu with command line "-sub_pc_factor_mat_solver_package > > superlu", it said > > "[43]PETSC ERROR: > > ------------------------------------------------------------------------ > > [43]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, > > probably memory access out of range > > [43]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger > > [43]PETSC ERROR: or see > > http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signal[43]PETSCERROR: or try > > http://valgrind.org on GNU/linux and Apple Mac OS X to find memory > > corruption errors > > [43]PETSC ERROR: likely location of problem given in stack below > > [43]PETSC ERROR: --------------------- Stack Frames > > ------------------------------------ > > [43]PETSC ERROR: Note: The EXACT line numbers in the stack are not > > available, > > [43]PETSC ERROR: INSTEAD the line number of the start of the function > > [43]PETSC ERROR: is given. > > [43]PETSC ERROR: [43] MatLUFactorNumeric_SuperLU line 121 > > src/mat/impls/aij/seq/superlu/superlu.c > > [43]PETSC ERROR: [43] MatLUFactorNumeric line 2575 > > src/mat/interface/matrix.c > > ............................ > > " > > > > Please confirm that you have the latest patch level. If so, send the matrix > in PETSc binary format to petsc-maint at mcs.anl.gov > along with the precise solver options and output of -ksp_view. More likely there is memory corruption somewhere - should run this code with valgrind to weed out such issues.. Satish From PRaeth at hpti.com Tue Jan 25 07:36:45 2011 From: PRaeth at hpti.com (Raeth, Peter) Date: Tue, 25 Jan 2011 13:36:45 +0000 Subject: [petsc-users] Out of memory during MatAssemblyBegin In-Reply-To: References: <3474F869C1954540B771FD9CAEBCB65704A9C080@CORTINA.HPTI.COM>, Message-ID: <3474F869C1954540B771FD9CAEBCB65704A9C0E9@CORTINA.HPTI.COM> The matrix resides on disk. It was generated by a single-process program. Its purpose is for comparing those results with those generated by a PETSc-based multi-process program. The current approach works well for small and medium-sized matrices but not for the large matrix. What I can do is let each process determine which rows it holds locally. Then each process can read its rows and populate its part of the matrix. Just a bit more code. Not a big problem. Thank you very much Barry for your input. Let me assure you that I have no intention of faking or hacking. :) This project is too important to our transition from shared-memory machines. (See http://www.afrl.hpc.mil/hardware/hawk.php.) Best, Peter. Peter G. Raeth, Ph.D. Senior Staff Scientist Signal and Image Processing High Performance Technologies, Inc 937-904-5147 praeth at hpti.com ________________________________________ From: petsc-users-bounces at mcs.anl.gov [petsc-users-bounces at mcs.anl.gov] on behalf of Barry Smith [bsmith at mcs.anl.gov] Sent: Monday, January 24, 2011 4:23 PM To: PETSc users list Subject: Re: [petsc-users] Out of memory during MatAssemblyBegin On Jan 24, 2011, at 3:08 PM, Raeth, Peter wrote: > Am running out of memory while using MatAssemblyBegin on a dense matrix that spans several processors. My calculations show that the matrices I am using do not require more than 25% of available memory. 
> > Different about this matrix compared to the others is that the program runs out of memory after the matrix has been populated by a single process, rather than by multiple processes. Used MatSetValues. Since the values are held in cache until MatAssemblyEnd is called (as I understand things), is it possible that using one process to populate the entire matrix is causing this problem? Yes, absolutely, this is a terrible non-scalable way of filling a parallel matrix. You can fake it by calling MatAssemblyBegin/End() repeatedly with the flag MAT_FLUSH_ASSEMBLY to keep the stash from getting too big. But you really need a much better way of setting values into the matrix. How are these "brought in row by row" matrix entries generated? Barry > The data is brought in only row by row for the population process. All buffer memory is cleared before the call to MatAssemblyBegin. > > The error dump contains: > > mpirun -prefix [%g] -np 256 Peter.x > [0] [0]PETSC ERROR: --------------------- Error Message ------------------------------------ > [0] [0]PETSC ERROR: Out of memory. This could be due to allocating > [0] [0]PETSC ERROR: too large an object or bleeding by not properly > [0] [0]PETSC ERROR: destroying unneeded objects. > [0] [0]PETSC ERROR: Memory allocated 1372407920 Memory used by process -122585088 > [0] [0]PETSC ERROR: Try running with -malloc_dump or -malloc_log for info. > [0] [0]PETSC ERROR: Memory requested 18446744071829395456! > [0] [0]PETSC ERROR: ------------------------------------------------------------------------ > [0] [0]PETSC ERROR: Petsc Release Version 3.1.0, Patch 6, Tue Nov 16 17:02:32 CST 2010 > [0] [0]PETSC ERROR: See docs/changes/index.html for recent updates. > [0] [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. > [0] [0]PETSC ERROR: See docs/index.html for manual pages. > [0] [0]PETSC ERROR: ------------------------------------------------------------------------ > [0] [0]PETSC ERROR: Peter.x on a linux-int named hawk-6 by praeth Mon Jan 24 15:44:28 2011 > [0] [0]PETSC ERROR: Libraries linked from /default/praeth/MATH/petsc-3.1-p6/linux-intel-g/lib > [0] [0]PETSC ERROR: Configure run at Tue Dec 21 08:45:25 2010 > [0] [0]PETSC ERROR: Configure options --download-superlu=1 --download-parmetis=1 --download-superlu_dist=1 --with-debugging=1 --with-error-checking=1 -PETSC_ARCH=linux-intel-g --with-fc="ifort -lmpi" --with-cc="icc -lmpi" --with-gnu-compilers=false > [0] [0]PETSC ERROR: ------------------------------------------------------------------------ > [0] [0]PETSC ERROR: PetscMallocAlign() line 49 in src/sys/memory/mal.c > [0] [0]PETSC ERROR: PetscTrMallocDefault() line 192 in src/sys/memory/mtr.c > [0] [0]PETSC ERROR: MatStashScatterBegin_Private() line 510 in src/mat/utils/matstash.c > [0] [0]PETSC ERROR: MatAssemblyBegin_MPIDense() line 286 in src/mat/impls/dense/mpi/mpidense.c > [0] [0]PETSC ERROR: MatAssemblyBegin() line 4564 in src/mat/interface/matrix.c > [0] [0]PETSC ERROR: User provided function() line 195 in "unknowndirectory/"Peter.c > [-1] MPI: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize() > [-1] MPI: aborting job > exit > > Had tried to use the suggestion to employ -malloc_dump or -malloc_log but do not see any result from the batch run. > > Thank you all for any insights you can offer. > > > Best, > > Peter. 
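A sketch of the "each process fills only its own rows" idea described above, assuming A is the already-created MPIDENSE matrix of size M x N and read_row() is a placeholder for whatever actually pulls one row off the disk (plain seek+fread, or MPI-IO as suggested in the reply that follows):

  PetscErrorCode ierr;
  PetscInt       M,N,rstart,rend,i,j;
  PetscInt       *cols;
  PetscScalar    *rowvals;
  ierr = MatGetSize(A,&M,&N);CHKERRQ(ierr);
  ierr = MatGetOwnershipRange(A,&rstart,&rend);CHKERRQ(ierr);
  ierr = PetscMalloc(N*sizeof(PetscInt),&cols);CHKERRQ(ierr);
  ierr = PetscMalloc(N*sizeof(PetscScalar),&rowvals);CHKERRQ(ierr);
  for (j=0; j<N; j++) cols[j] = j;
  for (i=rstart; i<rend; i++) {                      /* only rows this process owns */
    read_row(i,rowvals);                             /* hypothetical reader, one row of length N */
    ierr = MatSetValues(A,1,&i,N,cols,rowvals,INSERT_VALUES);CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = PetscFree(cols);CHKERRQ(ierr);
  ierr = PetscFree(rowvals);CHKERRQ(ierr);

Since every value is inserted by the process that owns the row, nothing accumulates in the communication stash, which is exactly the allocation that blew up in MatStashScatterBegin_Private above; if some off-process entries were unavoidable, the MAT_FLUSH_ASSEMBLY trick mentioned earlier in the thread keeps that stash bounded.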
> From bsmith at mcs.anl.gov Tue Jan 25 08:01:23 2011 From: bsmith at mcs.anl.gov (Barry Smith) Date: Tue, 25 Jan 2011 08:01:23 -0600 Subject: [petsc-users] Out of memory during MatAssemblyBegin In-Reply-To: <3474F869C1954540B771FD9CAEBCB65704A9C0E9@CORTINA.HPTI.COM> References: <3474F869C1954540B771FD9CAEBCB65704A9C080@CORTINA.HPTI.COM>, <3474F869C1954540B771FD9CAEBCB65704A9C0E9@CORTINA.HPTI.COM> Message-ID: On Jan 25, 2011, at 7:36 AM, Raeth, Peter wrote: > The matrix resides on disk. It was generated by a single-process program. Its purpose is for comparing those results with those generated by a PETSc-based multi-process program. The current approach works well for small and medium-sized matrices but not for the large matrix. > > What I can do is let each process determine which rows it holds locally. Then each process can read its rows You don't want to do this with standard Unix IO. Having all the processes trying to access the same file will really stall out. Since it is a dense matrix you can easily use MPI IO and have each process access its piece of the matrix "in parallel". Barry > and populate its part of the matrix. Just a bit more code. Not a big problem. > > Thank you very much Barry for your input. Let me assure you that I have no intention of faking or hacking. :) This project is too important to our transition from shared-memory machines. (See http://www.afrl.hpc.mil/hardware/hawk.php.) > > > Best, > > Peter. > > Peter G. Raeth, Ph.D. > Senior Staff Scientist > Signal and Image Processing > High Performance Technologies, Inc > 937-904-5147 > praeth at hpti.com > > ________________________________________ > From: petsc-users-bounces at mcs.anl.gov [petsc-users-bounces at mcs.anl.gov] on behalf of Barry Smith [bsmith at mcs.anl.gov] > Sent: Monday, January 24, 2011 4:23 PM > To: PETSc users list > Subject: Re: [petsc-users] Out of memory during MatAssemblyBegin > > On Jan 24, 2011, at 3:08 PM, Raeth, Peter wrote: > >> Am running out of memory while using MatAssemblyBegin on a dense matrix that spans several processors. My calculations show that the matrices I am using do not require more than 25% of available memory. >> >> Different about this matrix compared to the others is that the program runs out of memory after the matrix has been populated by a single process, rather than by multiple processes. Used MatSetValues. Since the values are held in cache until MatAssemblyEnd is called (as I understand things), is it possible that using one process to populate the entire matrix is causing this problem? > > > Yes, absolutely, this is a terrible non-scalable way of filling a parallel matrix. You can fake it by calling MatAssemblyBegin/End() repeatedly with the flag MAT_FLUSH_ASSEMBLY to keep the stash from getting too big. But you really need a much better way of setting values into the matrix. How are these "brought in row by row" matrix entries generated? > > Barry > > >> The data is brought in only row by row for the population process. All buffer memory is cleared before the call to MatAssemblyBegin. >> >> The error dump contains: >> >> mpirun -prefix [%g] -np 256 Peter.x >> [0] [0]PETSC ERROR: --------------------- Error Message ------------------------------------ >> [0] [0]PETSC ERROR: Out of memory. This could be due to allocating >> [0] [0]PETSC ERROR: too large an object or bleeding by not properly >> [0] [0]PETSC ERROR: destroying unneeded objects. 
>> [0] [0]PETSC ERROR: Memory allocated 1372407920 Memory used by process -122585088 >> [0] [0]PETSC ERROR: Try running with -malloc_dump or -malloc_log for info. >> [0] [0]PETSC ERROR: Memory requested 18446744071829395456! >> [0] [0]PETSC ERROR: ------------------------------------------------------------------------ >> [0] [0]PETSC ERROR: Petsc Release Version 3.1.0, Patch 6, Tue Nov 16 17:02:32 CST 2010 >> [0] [0]PETSC ERROR: See docs/changes/index.html for recent updates. >> [0] [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >> [0] [0]PETSC ERROR: See docs/index.html for manual pages. >> [0] [0]PETSC ERROR: ------------------------------------------------------------------------ >> [0] [0]PETSC ERROR: Peter.x on a linux-int named hawk-6 by praeth Mon Jan 24 15:44:28 2011 >> [0] [0]PETSC ERROR: Libraries linked from /default/praeth/MATH/petsc-3.1-p6/linux-intel-g/lib >> [0] [0]PETSC ERROR: Configure run at Tue Dec 21 08:45:25 2010 >> [0] [0]PETSC ERROR: Configure options --download-superlu=1 --download-parmetis=1 --download-superlu_dist=1 --with-debugging=1 --with-error-checking=1 -PETSC_ARCH=linux-intel-g --with-fc="ifort -lmpi" --with-cc="icc -lmpi" --with-gnu-compilers=false >> [0] [0]PETSC ERROR: ------------------------------------------------------------------------ >> [0] [0]PETSC ERROR: PetscMallocAlign() line 49 in src/sys/memory/mal.c >> [0] [0]PETSC ERROR: PetscTrMallocDefault() line 192 in src/sys/memory/mtr.c >> [0] [0]PETSC ERROR: MatStashScatterBegin_Private() line 510 in src/mat/utils/matstash.c >> [0] [0]PETSC ERROR: MatAssemblyBegin_MPIDense() line 286 in src/mat/impls/dense/mpi/mpidense.c >> [0] [0]PETSC ERROR: MatAssemblyBegin() line 4564 in src/mat/interface/matrix.c >> [0] [0]PETSC ERROR: User provided function() line 195 in "unknowndirectory/"Peter.c >> [-1] MPI: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize() >> [-1] MPI: aborting job >> exit >> >> Had tried to use the suggestion to employ -malloc_dump or -malloc_log but do not see any result from the batch run. >> >> Thank you all for any insights you can offer. >> >> >> Best, >> >> Peter. >> > From PRaeth at hpti.com Tue Jan 25 08:03:34 2011 From: PRaeth at hpti.com (Raeth, Peter) Date: Tue, 25 Jan 2011 14:03:34 +0000 Subject: [petsc-users] Out of memory during MatAssemblyBegin In-Reply-To: References: <3474F869C1954540B771FD9CAEBCB65704A9C080@CORTINA.HPTI.COM>, <3474F869C1954540B771FD9CAEBCB65704A9C0E9@CORTINA.HPTI.COM>, Message-ID: <3474F869C1954540B771FD9CAEBCB65704A9C102@CORTINA.HPTI.COM> AH ! Good idea Barry. Thanks. Peter G. Raeth, Ph.D. Senior Staff Scientist Signal and Image Processing High Performance Technologies, Inc 937-904-5147 praeth at hpti.com ________________________________________ From: petsc-users-bounces at mcs.anl.gov [petsc-users-bounces at mcs.anl.gov] on behalf of Barry Smith [bsmith at mcs.anl.gov] Sent: Tuesday, January 25, 2011 9:01 AM To: PETSc users list Subject: Re: [petsc-users] Out of memory during MatAssemblyBegin On Jan 25, 2011, at 7:36 AM, Raeth, Peter wrote: > The matrix resides on disk. It was generated by a single-process program. Its purpose is for comparing those results with those generated by a PETSc-based multi-process program. The current approach works well for small and medium-sized matrices but not for the large matrix. > > What I can do is let each process determine which rows it holds locally. Then each process can read its rows You don't want to do this with standard Unix IO. 
Having all the processes trying to access the same file will really stall out. Since it is a dense matrix you can easily use MPI IO and have each process access its piece of the matrix "in parallel". Barry > and populate its part of the matrix. Just a bit more code. Not a big problem. > > Thank you very much Barry for your input. Let me assure you that I have no intention of faking or hacking. :) This project is too important to our transition from shared-memory machines. (See http://www.afrl.hpc.mil/hardware/hawk.php.) > > > Best, > > Peter. > > Peter G. Raeth, Ph.D. > Senior Staff Scientist > Signal and Image Processing > High Performance Technologies, Inc > 937-904-5147 > praeth at hpti.com > > ________________________________________ > From: petsc-users-bounces at mcs.anl.gov [petsc-users-bounces at mcs.anl.gov] on behalf of Barry Smith [bsmith at mcs.anl.gov] > Sent: Monday, January 24, 2011 4:23 PM > To: PETSc users list > Subject: Re: [petsc-users] Out of memory during MatAssemblyBegin > > On Jan 24, 2011, at 3:08 PM, Raeth, Peter wrote: > >> Am running out of memory while using MatAssemblyBegin on a dense matrix that spans several processors. My calculations show that the matrices I am using do not require more than 25% of available memory. >> >> Different about this matrix compared to the others is that the program runs out of memory after the matrix has been populated by a single process, rather than by multiple processes. Used MatSetValues. Since the values are held in cache until MatAssemblyEnd is called (as I understand things), is it possible that using one process to populate the entire matrix is causing this problem? > > > Yes, absolutely, this is a terrible non-scalable way of filling a parallel matrix. You can fake it by calling MatAssemblyBegin/End() repeatedly with the flag MAT_FLUSH_ASSEMBLY to keep the stash from getting too big. But you really need a much better way of setting values into the matrix. How are these "brought in row by row" matrix entries generated? > > Barry > > >> The data is brought in only row by row for the population process. All buffer memory is cleared before the call to MatAssemblyBegin. >> >> The error dump contains: >> >> mpirun -prefix [%g] -np 256 Peter.x >> [0] [0]PETSC ERROR: --------------------- Error Message ------------------------------------ >> [0] [0]PETSC ERROR: Out of memory. This could be due to allocating >> [0] [0]PETSC ERROR: too large an object or bleeding by not properly >> [0] [0]PETSC ERROR: destroying unneeded objects. >> [0] [0]PETSC ERROR: Memory allocated 1372407920 Memory used by process -122585088 >> [0] [0]PETSC ERROR: Try running with -malloc_dump or -malloc_log for info. >> [0] [0]PETSC ERROR: Memory requested 18446744071829395456! >> [0] [0]PETSC ERROR: ------------------------------------------------------------------------ >> [0] [0]PETSC ERROR: Petsc Release Version 3.1.0, Patch 6, Tue Nov 16 17:02:32 CST 2010 >> [0] [0]PETSC ERROR: See docs/changes/index.html for recent updates. >> [0] [0]PETSC ERROR: See docs/faq.html for hints about trouble shooting. >> [0] [0]PETSC ERROR: See docs/index.html for manual pages. 
>> [0] [0]PETSC ERROR: ------------------------------------------------------------------------ >> [0] [0]PETSC ERROR: Peter.x on a linux-int named hawk-6 by praeth Mon Jan 24 15:44:28 2011 >> [0] [0]PETSC ERROR: Libraries linked from /default/praeth/MATH/petsc-3.1-p6/linux-intel-g/lib >> [0] [0]PETSC ERROR: Configure run at Tue Dec 21 08:45:25 2010 >> [0] [0]PETSC ERROR: Configure options --download-superlu=1 --download-parmetis=1 --download-superlu_dist=1 --with-debugging=1 --with-error-checking=1 -PETSC_ARCH=linux-intel-g --with-fc="ifort -lmpi" --with-cc="icc -lmpi" --with-gnu-compilers=false >> [0] [0]PETSC ERROR: ------------------------------------------------------------------------ >> [0] [0]PETSC ERROR: PetscMallocAlign() line 49 in src/sys/memory/mal.c >> [0] [0]PETSC ERROR: PetscTrMallocDefault() line 192 in src/sys/memory/mtr.c >> [0] [0]PETSC ERROR: MatStashScatterBegin_Private() line 510 in src/mat/utils/matstash.c >> [0] [0]PETSC ERROR: MatAssemblyBegin_MPIDense() line 286 in src/mat/impls/dense/mpi/mpidense.c >> [0] [0]PETSC ERROR: MatAssemblyBegin() line 4564 in src/mat/interface/matrix.c >> [0] [0]PETSC ERROR: User provided function() line 195 in "unknowndirectory/"Peter.c >> [-1] MPI: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize() >> [-1] MPI: aborting job >> exit >> >> Had tried to use the suggestion to employ -malloc_dump or -malloc_log but do not see any result from the batch run. >> >> Thank you all for any insights you can offer. >> >> >> Best, >> >> Peter. >> > From gdiso at ustc.edu Tue Jan 25 21:57:45 2011 From: gdiso at ustc.edu (Gong Ding) Date: Wed, 26 Jan 2011 11:57:45 +0800 Subject: [petsc-users] How to symmetrical the pattern of an unsymmetrical matrix Message-ID: <59025EE265CF46D5B906CF32B79662DE@cogendaeda> Dear all, I have unsymmetrical jacobian matrix in MPIAIJ format (it is nearlly symmetric, I guess). I'd like to pad it to symmetrical pattern by just add 0 to corresponding matrix entry, which is required to some matrix partition step. Any comment about how to do this? Gong Ding From gaurish108 at gmail.com Tue Jan 25 22:13:45 2011 From: gaurish108 at gmail.com (Gaurish Telang) Date: Tue, 25 Jan 2011 23:13:45 -0500 Subject: [petsc-users] Using SuperLU_Dist with petsc-3.0.0-p12: Necessary to use -pc_type lu ?? Message-ID: Hi I have configured petsc-3.0.0-p12 along with SuperLU_Dist. When we use SuperLU as an external solver the manual saya that we must pass -pc_factor_mat_solver_package superlu_dist as an option. I notice that when we provide -pc_type lu the example runs smoothly. but without this option, -log_summary says WARNING! There are options you set that were not used! WARNING! could be spelling mistake, etc! Option left: name:-pc_factor_mat_solver_package value: superlu_dist Does this mean -pc_type lu must always be provided as an option when using SuperLU_Dist???? Also there was a bug when using SuperLU_Dist along with -ksp_view in petsc-3.1p5. I dont have the same problem when using petsc-3.0.0-p12 . Are there any reported problems when using petsc-3.0.0p12 with SuperLU_Dist Gaurish -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Tue Jan 25 22:34:01 2011 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 25 Jan 2011 22:34:01 -0600 Subject: [petsc-users] Using SuperLU_Dist with petsc-3.0.0-p12: Necessary to use -pc_type lu ?? 
In-Reply-To: References: Message-ID: On Tue, Jan 25, 2011 at 10:13 PM, Gaurish Telang wrote: > Hi I have configured petsc-3.0.0-p12 along with SuperLU_Dist. > > When we use SuperLU as an external solver the manual saya that we must pass > -pc_factor_mat_solver_package superlu_dist as an option. > > I notice that when we provide -pc_type lu the example runs smoothly. but > without this option, -log_summary says > > WARNING! There are options you set that were not used! > WARNING! could be spelling mistake, etc! > Option left: name:-pc_factor_mat_solver_package value: superlu_dist > > Does this mean -pc_type lu must always be provided as an option when using > SuperLU_Dist???? > You must tell the PC to do LU factorization in order for the solver package to be meaningful. One way to do that is -pc_type lu. > Also there was a bug when using SuperLU_Dist along with -ksp_view in > petsc-3.1p5. I dont have the same problem when using petsc-3.0.0-p12 . Are > there any reported problems when using petsc-3.0.0p12 > with SuperLU_Dist None that I know of. Matt > > Gaurish > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Tue Jan 25 23:08:39 2011 From: balay at mcs.anl.gov (Satish Balay) Date: Tue, 25 Jan 2011 23:08:39 -0600 (CST) Subject: [petsc-users] Using SuperLU_Dist with petsc-3.0.0-p12: Necessary to use -pc_type lu ?? In-Reply-To: References: Message-ID: On Tue, 25 Jan 2011, Matthew Knepley wrote: > > Also there was a bug when using SuperLU_Dist along with -ksp_view in > > petsc-3.1p5. I dont have the same problem when using petsc-3.0.0-p12 . Are > > there any reported problems when using petsc-3.0.0p12 > > with SuperLU_Dist > > > None that I know of. Wrt petc-3.0.0 - that version of superlu might be susceptible to hangs in certain cases. Wrt petsc-3.1 - the latest patchlevel [petsc-3.1p7] should have the fix for -ksp_view Satish From balay at mcs.anl.gov Tue Jan 25 23:19:27 2011 From: balay at mcs.anl.gov (Satish Balay) Date: Tue, 25 Jan 2011 23:19:27 -0600 (CST) Subject: [petsc-users] Problem on LU factorization In-Reply-To: References: Message-ID: I don't see anything obviously wrong with this build I guess the other thing to do is to build debug version on the machine and run in a debugger to determine the problem. [I believe there is a way to debug on bgl..] Satish On Tue, 25 Jan 2011, Rongliang Chen wrote: > Hi Balay, > > Thank you for your reply. > I have checked my code with valgrind on my own computer and there is no > problem. > But when I run my code on the IBM Blue Gene/L with > "-sub_pc_factor_mat_solver_package superlu", it has such problem. > Since there is not valgrind on IBM Blue Gene/L, I can not test my code with > valgrind on it. > > But if use the PETSC's default LU factorization, there is no such problem. > So I suspect that there is some problem with my petsc's installation. > Can you help me to check if my installation is correct? > Following is the detail of the installation and the configure.log and > make.log are attached. > > Installing Petsc on IBM Blue Gene/L: > > 1. patch -p0 < /contrib/bgl/petsc/petsc-3.0.0-p4/petsc-3.0.0-p4.patch > 2. 
./config/bgl-ibm-goto_lapack.py and the the "bgl-ibm-goto_lapack.py" is > : > ****************************************************************************************** > #!/usr/bin/env python > # > # BGL has broken 'libc' dependencies. The option 'LIBS' is used to > # workarround this problem. > # > # LIBS="-lc -lnss_files -lnss_dns -lresolv" > # > # Another workarround is to modify mpicc/mpif77 scripts and make them > # link with the corresponding compilers, and these additional > # libraries. The following tarball has the modified compiler scripts > # > # ftp://ftp.mcs.anl.gov/pub/petsc/tmp/petsc-bgl-tools.tar.gz > # > configure_options = [ > '--with-cc=/contrib/bgl/bin/mpxlc', > '--with-cxx=/contrib/bgl/bin/mpxlC', > '--with-fc=/contrib/bgl/bin/mpxlf -qnosave', > '--with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys', # required by BLACS to > get mpif.h > '--with-lapack-lib=/contrib/bgl/lib/liblapack440.a', > '--with-blas-lib=/contrib/bgl/lib/libblas440.a', > # '--with-blas-lapack-lib=-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib > -lgoto', > > '--with-is-color-value-type=short', > '--with-shared=0', > > '-COPTFLAGS=-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1', > '-CXXOPTFLAGS=-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1', > '-FOPTFLAGS=-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1', > '--with-debugging=0', > > # the following option gets automatically enabled on BGL/with IBM > compilers. > # '--with-fortran-kernels=bgl' > > '--with-x=0', > '--with-x11=0', > '--with-batch=1', > '--with-memcmp-ok', > '--sizeof-char=1', > '--sizeof-void-p=4', > '--sizeof-short=2', > '--sizeof-int=4', > '--sizeof-long=4', > '--sizeof-size-t=4', > '--sizeof-long-long=8', > '--sizeof-float=4', > '--sizeof-double=8', > '--bits-per-byte=8', > '--sizeof-MPI-Comm=4', > '--sizeof-MPI-Fint=4', > '--have-mpi-long-double=1', > > > '--download-superlu=/home/rchen/soft/petsc-3.1-p7-nodebug/externalpackages/superlu_4.0-March_7_2010.tar.gz', > > '--download-superlu_dist=/home/rchen/soft/petsc-3.1-p7-nodebug/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz', > > '--download-parmetis=/home/rchen/soft/petsc-3.1-p7-nodebug/externalpackages/ParMetis-dev-p3.tar.gz', > > '--download-scalapack=/home/rchen/soft/petsc-3.1-p7-nodebug/externalpackages/scalapack.tgz', > > '--download-blacs=/home/rchen/soft/petsc-3.1-p7-nodebug/externalpackages/blacs-dev.tar.gz', > > '--download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7-nodebug/externalpackages/fblaslapack-3.1.1.tar.gz', > > '--download-mumps=/home/rchen/soft/petsc-3.1-p7-nodebug/externalpackages/MUMPS_4.9.2.tar.gz', > > # '--download-f-blas-lapack=1', > # '--download-hypre=1', > # '--download-spooles=1', > # '--download-superlu=1', > # '--download-parmetis=1', > # '--download-superlu_dist=1', > # '--download-blacs=1', > > '-PETSC_ARCH=bgl-ibm-goto-O3_440d' > ] > > if __name__ == '__main__': > import sys,os > sys.path.insert(0,os.path.abspath('config')) > import configure > configure.petsc_configure(configure_options) > > # Extra options used for testing locally > test_options = [] > ************************************************************************ > 3. cqsub -n 1 -t 20 -O conftest -q debug ./conftest > 4. ./reconfigure.py > 5. make all > > Thank you! 
> > Best, > > Rongliang > > > ---------------------------------------------------------------------- > > > > Message: 1 > > Date: Mon, 24 Jan 2011 16:06:22 -0600 (CST) > > From: Satish Balay > > Subject: Re: [petsc-users] Problem on LU factorization > > To: PETSc users list > > Message-ID: > > > > Content-Type: TEXT/PLAIN; charset=US-ASCII > > > > On Mon, 24 Jan 2011, Matthew Knepley wrote: > > > > > > When I use superlu with command line "-sub_pc_factor_mat_solver_package > > > > superlu", it said > > > > > > "[43]PETSC ERROR: > > > > > > ------------------------------------------------------------------------ > > > > [43]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, > > > > probably memory access out of range > > > > [43]PETSC ERROR: Try option -start_in_debugger or > > -on_error_attach_debugger > > > > [43]PETSC ERROR: or see > > > > > > http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signal[43]PETSCERROR: > > or try > > > > http://valgrind.org on GNU/linux and Apple Mac OS X to find memory > > > > corruption errors > > > > [43]PETSC ERROR: likely location of problem given in stack below > > > > [43]PETSC ERROR: --------------------- Stack Frames > > > > ------------------------------------ > > > > [43]PETSC ERROR: Note: The EXACT line numbers in the stack are not > > > > available, > > > > [43]PETSC ERROR: INSTEAD the line number of the start of the > > function > > > > [43]PETSC ERROR: is given. > > > > [43]PETSC ERROR: [43] MatLUFactorNumeric_SuperLU line 121 > > > > src/mat/impls/aij/seq/superlu/superlu.c > > > > [43]PETSC ERROR: [43] MatLUFactorNumeric line 2575 > > > > src/mat/interface/matrix.c > > > > ............................ > > > > " > > > > > > > > > > Please confirm that you have the latest patch level. If so, send the > > matrix > > > in PETSc binary format to petsc-maint at mcs.anl.gov > > > along with the precise solver options and output of -ksp_view. > > > > More likely there is memory corruption somewhere - should run this > > code with valgrind to weed out such issues.. > > > > Satish > > > > > > ------------------------------ > > > > > From thomas.witkowski at tu-dresden.de Wed Jan 26 06:51:07 2011 From: thomas.witkowski at tu-dresden.de (Thomas Witkowski) Date: Wed, 26 Jan 2011 13:51:07 +0100 Subject: [petsc-users] Implementing Schur complement approach (domain decomposition) Message-ID: <4D4018BB.5060901@tu-dresden.de> I want to solve the equations in my FEM code (which already makes use of PETSc) with a Schur complement approach (iterative substructuring). Although I have some basic knowledge about PETSc, I have no good idea how to start with it. To make my question concrete, I want to solve a system of the form [A_II A_IB; A_IB^T A_BB] * [u_I; u_B] = [f_I; f_B]. A_II is a block-diagonal matrix, with each block consisting of all interior nodes of one partition. A_BB is the block consisting of all boundary nodes. A_IB is the coupling between the interior and the boundary nodes. The unknown vector u and the right-hand-side vector f are split in the same way. My first idea is not to assemble the matrices A_II, A_IB and A_BB globally, but only locally, and to define their action using a MatShell for each of these matrices. Okay, but what about the global indexing of the whole system? Up to now I have a contiguous global index for the nodes on each partition, which is required by PETSc, if I'm right. But for using a Schur complement approach I need to split the index into the interior and the boundary nodes.
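A minimal sketch of the MatShell idea just described, applying the Schur complement S = A_BB - A_IB^T * inv(A_II) * A_IB to an interface vector. This is illustrative only: the context struct and the names SchurCtx, ksp_II, t_I, s_I are made up for the sketch and are not code from this thread, and the inner A_II solve goes through a KSP that would have to be configured separately.

/* Illustrative sketch only: names and layout are hypothetical, not from this thread. */
#include "petscksp.h"

typedef struct {
  Mat A_IB;     /* interior x interface coupling block (assembled or itself a MatShell) */
  Mat A_BB;     /* interface x interface block                                          */
  KSP ksp_II;   /* inner solver set up for the block-diagonal A_II                      */
  Vec t_I, s_I; /* interior-sized work vectors                                          */
} SchurCtx;

/* y_B = S x_B = A_BB x_B - A_IB^T inv(A_II) A_IB x_B */
static PetscErrorCode SchurApply(Mat S, Vec x_B, Vec y_B)
{
  SchurCtx       *ctx;
  PetscErrorCode ierr;

  PetscFunctionBegin;
  ierr = MatShellGetContext(S, (void**)&ctx); CHKERRQ(ierr);
  ierr = MatMult(ctx->A_IB, x_B, ctx->t_I); CHKERRQ(ierr);          /* t_I = A_IB x_B      */
  ierr = KSPSolve(ctx->ksp_II, ctx->t_I, ctx->s_I); CHKERRQ(ierr);  /* s_I = inv(A_II) t_I */
  ierr = MatMultTranspose(ctx->A_IB, ctx->s_I, y_B); CHKERRQ(ierr); /* y_B = A_IB^T s_I    */
  ierr = VecScale(y_B, -1.0); CHKERRQ(ierr);
  ierr = MatMultAdd(ctx->A_BB, x_B, y_B, y_B); CHKERRQ(ierr);       /* y_B += A_BB x_B     */
  PetscFunctionReturn(0);
}

The shell matrix itself would be created on the interface layout and handed to an outer KSP that solves for u_B only, e.g. MatCreateShell(PETSC_COMM_WORLD, nB, nB, NB, NB, (void*)&ctx, &S) followed by MatShellSetOperation(S, MATOP_MULT, (void(*)(void))SchurApply), where nB/NB are the local/global numbers of interface unknowns (placeholder names); the interior/interface split of the global index can be described with two IS index sets.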
Who to circumvent this (first of my) problem? Thank you for any advise, Thomas From hzhang at mcs.anl.gov Wed Jan 26 09:39:17 2011 From: hzhang at mcs.anl.gov (Hong Zhang) Date: Wed, 26 Jan 2011 09:39:17 -0600 Subject: [petsc-users] Problem on LU factorization In-Reply-To: References: Message-ID: As Matt said, this is no enough information on what solver combination is being used. Since superlu_dist works while petsc/superlu's sequential lu fails, you might need parallel lu, or shift if zero pivot causes crash. As suggested by Satish, run code in debug version from which informative error display would be shown. Hong On Tue, Jan 25, 2011 at 11:19 PM, Satish Balay wrote: > I don't see anything obviously wrong with this build > > I guess the other thing to do is to build debug version on the machine > and run in a debugger to determine the problem. [I believe there is a > way to debug on bgl..] > > Satish > > On Tue, 25 Jan 2011, Rongliang Chen wrote: > >> Hi Balay, >> >> Thank you for your reply. >> I have checked my code with valgrind on my own computer and there is no >> problem. >> But when I run my code on the IBM Blue Gene/L with >> "-sub_pc_factor_mat_solver_package superlu", it has such problem. >> Since there is not valgrind on IBM Blue Gene/L, I can not test my code with >> valgrind on it. >> >> But if use the PETSC's default LU factorization, there is no such problem. >> So I suspect that there is some problem with my petsc's installation. >> Can you help me to check if my installation is correct? >> Following is the detail of the installation and the configure.log and >> make.log are attached. >> >> Installing Petsc on IBM Blue Gene/L: >> >> 1. patch -p0 < /contrib/bgl/petsc/petsc-3.0.0-p4/petsc-3.0.0-p4.patch >> 2. ./config/bgl-ibm-goto_lapack.py ?and the ?the "bgl-ibm-goto_lapack.py" is >> : >> ****************************************************************************************** >> #!/usr/bin/env python >> # >> # BGL has broken 'libc' dependencies. The option 'LIBS' is used to >> # workarround this problem. >> # >> # LIBS="-lc -lnss_files -lnss_dns -lresolv" >> # >> # Another workarround is to modify mpicc/mpif77 scripts and make them >> # link with the corresponding compilers, and these additional >> # libraries. The following tarball has the modified compiler scripts >> # >> # ftp://ftp.mcs.anl.gov/pub/petsc/tmp/petsc-bgl-tools.tar.gz >> # >> configure_options = [ >> ? '--with-cc=/contrib/bgl/bin/mpxlc', >> ? '--with-cxx=/contrib/bgl/bin/mpxlC', >> ? '--with-fc=/contrib/bgl/bin/mpxlf -qnosave', >> ? '--with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys', ?# required by BLACS to >> get mpif.h >> ? '--with-lapack-lib=/contrib/bgl/lib/liblapack440.a', >> ? '--with-blas-lib=/contrib/bgl/lib/libblas440.a', >> # ?'--with-blas-lapack-lib=-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib >> -lgoto', >> >> ? '--with-is-color-value-type=short', >> ? '--with-shared=0', >> >> ? '-COPTFLAGS=-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1', >> ? '-CXXOPTFLAGS=-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1', >> ? '-FOPTFLAGS=-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1', >> ? '--with-debugging=0', >> >> ? # the following option gets automatically enabled on BGL/with IBM >> compilers. >> ? # '--with-fortran-kernels=bgl' >> >> ? '--with-x=0', >> ? '--with-x11=0', >> ? '--with-batch=1', >> ? '--with-memcmp-ok', >> ? '--sizeof-char=1', >> ? '--sizeof-void-p=4', >> ? '--sizeof-short=2', >> ? '--sizeof-int=4', >> ? '--sizeof-long=4', >> ? '--sizeof-size-t=4', >> ? '--sizeof-long-long=8', >> ? 
'--sizeof-float=4', >> ? '--sizeof-double=8', >> ? '--bits-per-byte=8', >> ? '--sizeof-MPI-Comm=4', >> ? '--sizeof-MPI-Fint=4', >> ? '--have-mpi-long-double=1', >> >> >> '--download-superlu=/home/rchen/soft/petsc-3.1-p7-nodebug/externalpackages/superlu_4.0-March_7_2010.tar.gz', >> >> '--download-superlu_dist=/home/rchen/soft/petsc-3.1-p7-nodebug/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz', >> >> '--download-parmetis=/home/rchen/soft/petsc-3.1-p7-nodebug/externalpackages/ParMetis-dev-p3.tar.gz', >> >> '--download-scalapack=/home/rchen/soft/petsc-3.1-p7-nodebug/externalpackages/scalapack.tgz', >> >> '--download-blacs=/home/rchen/soft/petsc-3.1-p7-nodebug/externalpackages/blacs-dev.tar.gz', >> >> '--download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7-nodebug/externalpackages/fblaslapack-3.1.1.tar.gz', >> >> '--download-mumps=/home/rchen/soft/petsc-3.1-p7-nodebug/externalpackages/MUMPS_4.9.2.tar.gz', >> >> # ?'--download-f-blas-lapack=1', >> # ?'--download-hypre=1', >> # ?'--download-spooles=1', >> # ?'--download-superlu=1', >> # ?'--download-parmetis=1', >> # ?'--download-superlu_dist=1', >> # ?'--download-blacs=1', >> >> ? ?'-PETSC_ARCH=bgl-ibm-goto-O3_440d' >> ? ] >> >> if __name__ == '__main__': >> ? import sys,os >> ? sys.path.insert(0,os.path.abspath('config')) >> ? import configure >> ? configure.petsc_configure(configure_options) >> >> # Extra options used for testing locally >> test_options = [] >> ************************************************************************ >> 3. cqsub -n 1 -t 20 -O conftest -q debug ./conftest >> 4. ./reconfigure.py >> 5. make all >> >> Thank you! >> >> Best, >> >> Rongliang >> >> >> ---------------------------------------------------------------------- >> > >> > Message: 1 >> > Date: Mon, 24 Jan 2011 16:06:22 -0600 (CST) >> > From: Satish Balay >> > Subject: Re: [petsc-users] Problem on LU factorization >> > To: PETSc users list >> > Message-ID: >> > ? ? ? ? >> > Content-Type: TEXT/PLAIN; charset=US-ASCII >> > >> > On Mon, 24 Jan 2011, Matthew Knepley wrote: >> > >> > > > When I use superlu with command line "-sub_pc_factor_mat_solver_package >> > > > superlu", it said >> > > >> > > "[43]PETSC ERROR: >> > > > >> > ------------------------------------------------------------------------ >> > > > [43]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, >> > > > probably memory access out of range >> > > > [43]PETSC ERROR: Try option -start_in_debugger or >> > -on_error_attach_debugger >> > > > [43]PETSC ERROR: or see >> > > > >> > http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signal[43]PETSCERROR: >> > or try >> > > > http://valgrind.org on GNU/linux and Apple Mac OS X to find memory >> > > > corruption errors >> > > > [43]PETSC ERROR: likely location of problem given in stack below >> > > > [43]PETSC ERROR: --------------------- ?Stack Frames >> > > > ------------------------------------ >> > > > [43]PETSC ERROR: Note: The EXACT line numbers in the stack are not >> > > > available, >> > > > [43]PETSC ERROR: ? ? ? INSTEAD the line number of the start of the >> > function >> > > > [43]PETSC ERROR: ? ? ? is given. >> > > > [43]PETSC ERROR: [43] MatLUFactorNumeric_SuperLU line 121 >> > > > src/mat/impls/aij/seq/superlu/superlu.c >> > > > [43]PETSC ERROR: [43] MatLUFactorNumeric line 2575 >> > > > src/mat/interface/matrix.c >> > > > ............................ >> > > > ?" >> > > > >> > > >> > > Please confirm that you have the latest patch level. 
If so, send the >> > matrix >> > > in PETSc binary format to petsc-maint at mcs.anl.gov >> > > along with the precise solver options and output of -ksp_view. >> > >> > More likely there is memory corruption somewhere - should run this >> > code with valgrind to weed out such issues.. >> > >> > Satish >> > >> > >> > ------------------------------ >> > >> > >> > > From rongliang.chan at gmail.com Wed Jan 26 12:45:46 2011 From: rongliang.chan at gmail.com (Rongliang Chen) Date: Wed, 26 Jan 2011 11:45:46 -0700 Subject: [petsc-users] Problem on LU factorization Message-ID: Hello, Thank you for all your reply. I have tested my code using the debug version and there is the same problem. The error message is attached. The problem I am solving is a nonlinear system. I use inexact Newton method with preconditioned restarted GMRES (-ksp_type gnres) for the linear system. The preconditioner is ASM (-pc_type asm) and the sub problem is solved by LU (-sub_pc_type lu). Since there are zeros on the diagonal, I use the command line "-sub_pc_factor_shift_amount 1.e-10 -sub_pc_factor_shift_type nonzero" to add a small number to the zero diagonals. I found some error message in the configure.log(see bellow). Could this be a problem? ********************************************************* # Directories files, # %s (device %ld, inode %ld): impossibilities so far. impossibilities in %lu directories. # %s: could not be stat'd. # %s (device %ld, inode %ld): could not be opened. Recursive variable `%s' references itself (eventually) warning: undefined variable `%.*s' unterminated variable reference %s:%s | %s # Implicit rule search has been done. # Implicit rule search has not been done. # Modification time never checked. # File does not exist. # Last modified %s # File has been updated. # File has not been updated. # Invalid value in `update_status' member! # Not a target: # Also makes: # File is an intermediate prerequisite. # Implicit/static pattern stem: `%s' # A default or MAKEFILES makefile. # Command-line target. # Phony target (prerequisite of .PHONY). # Precious file (prerequisite of .PRECIOUS). # Failed to be updated. # Successfully updated. # Needs to be updated (-q is set). question_flag file.c print_file # Dependencies commands running (THIS IS A BUG). # File is very old. # Commands currently running (THIS IS A BUG). # Invalid value in `command_state' member! # Files # files hash-table stats: %04d-%02d-%02d %02d:%02d:%02d .%09d %s: Timestamp out of range; substituting %s Current time *** Deleting intermediate file `%s' Removing intermediate files... can't rename single-colon `%s' to double-colon `%s' Commands were specified for file `%s' at %s:%lu, but `%s' is now considered the same file as `%s'. Commands for `%s' will be ignored in favor of those for `%s'. Commands for file `%s' were found by implicit rule search, can't rename double-colon `%s' to single-colon `%s' ......... TEST checkDynamicLinker from config.setCompilers(/home/rchen/soft/petsc-3.1-p7-nodebug/config/BuildSystem/config/setCompilers.py:1257) TESTING: checkDynamicLinker from config.setCompilers(config/BuildSystem/config/setCompilers.py:1257) Check that the linker can produce dynamic libraries Checking for header: dlfcn.h sh: /contrib/bgl/bin/mpxlc -E conftest.c Executing: /contrib/bgl/bin/mpxlc -E conftest.c sh: Possible ERROR while running preprocessor: "conftest.c", line 3.10: 1506-296 (S) #include file not found. ret = 256 error message = {"conftest.c", line 3.10: 1506-296 (S) #include file not found. 
} Source: #include "confdefs.h" #include "conffix.h" #include Dynamic libraries disabled since dlfcn.h was missing ******************************************** Thank you! Regards, Rongliang -------------------------------------------------------------------------- > Message: 5 > Date: Tue, 25 Jan 2011 23:19:27 -0600 (CST) > From: Satish Balay > Subject: Re: [petsc-users] Problem on LU factorization > To: Rongliang Chen > Cc: PETSc Users mail list > Message-ID: > > Content-Type: TEXT/PLAIN; charset=US-ASCII > > I don't see anything obviously wrong with this build > > I guess the other thing to do is to build debug version on the machine > and run in a debugger to determine the problem. [I believe there is a > way to debug on bgl..] > > Satish > > > ------------------------------ > > Message: 7 > Date: Wed, 26 Jan 2011 09:39:17 -0600 > From: Hong Zhang > Subject: Re: [petsc-users] Problem on LU factorization > To: PETSc users list > Message-ID: > > Content-Type: text/plain; charset=ISO-8859-1 > > As Matt said, this is no enough information on what solver combination > is being used. > Since superlu_dist works while petsc/superlu's sequential lu fails, > you might need > parallel lu, or shift if zero pivot causes crash. As suggested by Satish, > run code in debug version from which informative error display would be > shown. > > Hong > > -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- /bin/bash: SHELL: readonly variable /bin/bash: PATH: readonly variable Can't expand MemType 1: jcol 25274 [43]PETSC ERROR: ------------------------------------------------------------------------ [43]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range [43]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger [43]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signal[43]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors [43]PETSC ERROR: likely location of problem given in stack below [43]PETSC ERROR: --------------------- Stack Frames ------------------------------------ [43]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, [43]PETSC ERROR: INSTEAD the line number of the start of the function [43]PETSC ERROR: is given. [43]PETSC ERROR: [43] MatLUFactorNumeric_SuperLU line 121 src/mat/impls/aij/seq/superlu/superlu.c [43]PETSC ERROR: [43] MatLUFactorNumeric line 2575 src/mat/interface/matrix.c [43]PETSC ERROR: --------------------- Error Message ------------------------------------ [43]PETSC ERROR: Signal received! [43]PETSC ERROR: ------------------------------------------------------------------------ [43]PETSC ERROR: Petsc Release Version 3.1.0, Patch 7, Mon Dec 20 14:26:37 CST 2010 [43]PETSC ERROR: See docs/changes/index.html for recent updates. [43]PETSC ERROR: See docs/faq.html for hints about trouble shooting. [43]PETSC ERROR: See docs/index.html for manual pages. 
[43]PETSC ERROR: ------------------------------------------------------------------------ [43]PETSC ERROR: /home/rchen/soft/fixedmesh/bypass2/codefor3.1/./joab on a bgl-ibm-g named R00M0NF by Unknown Wed Jan 26 11:03:56 2011 [43]PETSC ERROR: Libraries linked from /home/rchen/soft/petsc-3.1-p7/bgl-ibm-goto-O3_440d/lib [43]PETSC ERROR: Configure run at Sat Dec 25 23:04:15 2010 [43]PETSC ERROR: Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz [43]PETSC ERROR: ------------------------------------------------------------------------ [43]PETSC ERROR: User provided function() line 0 in unknown directory unknown file application called MPI_Abort(MPI_COMM_WORLD, 59) - process 43[64]PETSC ERROR: [65]PETSC ERROR: [66]PETSC ERROR: [67]PETSC ERROR: ------------------------------------------------------------------------ ------------------------------------------------------------------------ ------------------------------------------------------------------------ ------------------------------------------------------------------------ [64]PETSC ERROR: [65]PETSC ERROR: [66]PETSC ERROR: [67]PETSC ERROR: Caught signal number 15 Terminate: Somet process (or the batch system) has told this process to end Caught signal number 15 Terminate: Somet process (or the batch system) has told this process to end Caught signal number 15 Terminate: Somet process (or the batch system) has told this process to end Caught signal number 15 Terminate: Somet process (or the batch system) has told this process to end [64]PETSC ERROR: [65]PETSC ERROR: [66]PETSC ERROR: [67]PETSC ERROR: Try option -start_in_debugger or 
-on_error_attach_debugger
[The rest of the attached log repeats the same information, interleaved rank by rank: ranks 64-71 each report "Caught signal number 15 Terminate", the stack frames MatLUFactorNumeric_SuperLU (line 121 of src/mat/impls/aij/seq/superlu/superlu.c) and MatLUFactorNumeric (line 2575 of src/mat/interface/matrix.c), "Signal received!", the same Petsc Release Version 3.1.0 Patch 7 and configure options as rank 43, and "application called MPI_Abort(MPI_COMM_WORLD, 59)"; the remaining ranks then print the same "Caught signal number 15 Terminate" and "Try option -start_in_debugger or -on_error_attach_debugger" messages.]
-start_in_debugger or -on_error_attach_debugger Caught signal number 15 Terminate: Somet process (or the batch system) has told this process to end Try option -start_in_debugger or -on_error_attach_debugger Caught signal number 15 Terminate: Somet process (or the batch system) has told this process to end Try option -start_in_debugger or -on_error_attach_debugger Caught signal number 15 Terminate: Somet process (or the batch system) has told this process to end Try option -start_in_debugger or -on_error_attach_debugger Caught signal number 15 Terminate: Somet process (or the batch system) has told this process to end Try option -start_in_debugger or -on_error_attach_debugger Caught signal number 15 Terminate: Somet process (or the batch system) has told this process to end Try option -start_in_debugger or -on_error_attach_debugger Caught signal number 15 Terminate: Somet process (or the batch system) has told this process to end Try option -start_in_debugger or -on_error_attach_debugger Caught signal number 15 Terminate: Somet process (or the batch system) has told this process to end Try option -start_in_debugger or -on_error_attach_debugger Caught signal number 15 Terminate: Somet process (or the batch system) has told this process to end Try option -start_in_debugger or -on_error_attach_debugger Caught signal number 15 Terminate: Somet process (or the batch system) has told this process to end Try option -start_in_debugger or -on_error_attach_debugger Caught signal number 15 Terminate: Somet process (or the batch system) has told this process to end Try option -start_in_debugger or -on_error_attach_debugger Caught signal number 15 Terminate: Somet process (or the batch system) has told this process to end Try option -start_in_debugger or -on_error_attach_debugger Caught signal number 15 Terminate: Somet process (or the batch system) has told this process to end Try option -start_in_debugger or -on_error_attach_debugger Caught signal number 15 Terminate: Somet process (or the batch system) has told this process to end Try option -start_in_debugger or -on_error_attach_debugger Caught signal number 15 Terminate: Somet process (or the batch system) has told this process to end Try option -start_in_debugger or -on_error_attach_debugger Caught signal number 15 Terminate: Somet process (or the batch system) has told this process to end Try option -start_in_debugger or -on_error_attach_debugger Caught signal number 15 Terminate: Somet process (or the batch system) has told this process to end Try option -start_in_debugger or -on_error_attach_debugger [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger [1]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger [2]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger [8]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger [3]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger [9]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger [10]PETSC ERROR: [4]PETSC ERROR: [11]PETSC ERROR: [5]PETSC ERROR: [17]PETSC ERROR: [12]PETSC ERROR: [16]PETSC ERROR: [7]PETSC ERROR: [18]PETSC ERROR: [6]PETSC ERROR: [19]PETSC ERROR: [13]PETSC ERROR: [24]PETSC ERROR: [14]PETSC ERROR: [25]PETSC ERROR: [15]PETSC ERROR: [32]PETSC ERROR: [20]PETSC ERROR: [26]PETSC ERROR: [21]PETSC ERROR: [33]PETSC ERROR: [23]PETSC ERROR: [41]PETSC ERROR: [28]PETSC ERROR: [40]PETSC ERROR: [22]PETSC ERROR: [27]PETSC ERROR: [29]PETSC ERROR: [35]PETSC ERROR: [36]PETSC ERROR: 
[34]PETSC ERROR: [37]PETSC ERROR: [42]PETSC ERROR: [30]PETSC ERROR: [48]PETSC ERROR: [44]PETSC ERROR: [49]PETSC ERROR: [38]PETSC ERROR: [50]PETSC ERROR: [45]PETSC ERROR: [51]PETSC ERROR: [31]PETSC ERROR: [56]PETSC ERROR: [39]PETSC ERROR: [58]PETSC ERROR: [46]PETSC ERROR: [57]PETSC ERROR: [53]PETSC ERROR: [59]PETSC ERROR: [52]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger [47]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger [54]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger [55]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger [60]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger [62]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger [61]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger [63]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#SignalTry option -start_in_debugger or -on_error_attach_debugger or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#SignalTry option -start_in_debugger or -on_error_attach_debugger or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#SignalTry option -start_in_debugger or -on_error_attach_debugger or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#SignalTry option -start_in_debugger or -on_error_attach_debugger or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#SignalTry option -start_in_debugger or -on_error_attach_debugger or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#SignalTry option -start_in_debugger or -on_error_attach_debugger or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#SignalTry option -start_in_debugger or -on_error_attach_debugger or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#SignalTry option -start_in_debugger or -on_error_attach_debugger or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#SignalTry option -start_in_debugger or -on_error_attach_debugger or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#SignalTry option -start_in_debugger or -on_error_attach_debugger or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#SignalTry option -start_in_debugger or -on_error_attach_debugger or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#SignalTry option -start_in_debugger or -on_error_attach_debugger or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#SignalTry option -start_in_debugger or -on_error_attach_debugger or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#SignalTry option -start_in_debugger or -on_error_attach_debugger or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#SignalTry option -start_in_debugger or -on_error_attach_debugger or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#SignalTry option -start_in_debugger or -on_error_attach_debugger or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#SignalTry option -start_in_debugger or -on_error_attach_debugger or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#SignalTry option -start_in_debugger or -on_error_attach_debugger or see 
http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#SignalTry option -start_in_debugger or -on_error_attach_debugger or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#SignalTry option -start_in_debugger or -on_error_attach_debugger or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#SignalTry option -start_in_debugger or -on_error_attach_debugger or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#SignalTry option -start_in_debugger or -on_error_attach_debugger or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#SignalTry option -start_in_debugger or -on_error_attach_debugger or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signal[0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signal[1]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signal[2]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signal[8]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signal[3]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signal[9]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signal[10]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signal[11]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signal[17]PETSC ERROR: [4]PETSC ERROR: [16]PETSC ERROR: [5]PETSC ERROR: [18]PETSC ERROR: [12]PETSC ERROR: [19]PETSC ERROR: [7]PETSC ERROR: [24]PETSC ERROR: [6]PETSC ERROR: [25]PETSC ERROR: [13]PETSC ERROR: [32]PETSC ERROR: [14]PETSC ERROR: [26]PETSC ERROR: [15]PETSC ERROR: [33]PETSC ERROR: [20]PETSC ERROR: [41]PETSC ERROR: [21]PETSC ERROR: [40]PETSC ERROR: [23]PETSC ERROR: [27]PETSC ERROR: [28]PETSC ERROR: [35]PETSC ERROR: [22]PETSC ERROR: [34]PETSC ERROR: [29]PETSC ERROR: [42]PETSC ERROR: [36]PETSC ERROR: [48]PETSC ERROR: [37]PETSC ERROR: [49]PETSC ERROR: [30]PETSC ERROR: [50]PETSC ERROR: [44]PETSC ERROR: [51]PETSC ERROR: [38]PETSC ERROR: [56]PETSC ERROR: [45]PETSC ERROR: [58]PETSC ERROR: [31]PETSC ERROR: [57]PETSC ERROR: [39]PETSC ERROR: [59]PETSC ERROR: [46]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signal[53]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signal[52]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signal[47]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signal[54]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signal[55]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signal[60]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signal[62]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signal[61]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signal[63]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signalor try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors or see 
http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signalor try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signalor try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signalor try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signalor try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signalor try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signalor try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signalor try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signalor try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signalor try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signalor try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signalor try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signalor try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signalor try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signalor try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signalor try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signalor try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signalor try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signalor try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signalor try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signalor try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors 
or see http://www.mcs.anl.gov/petsc/petsc-as/documentation/troubleshooting.html#Signalor try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors [1]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors [2]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors [8]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors [3]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors [9]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors [10]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors [11]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors [17]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors [16]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors [18]PETSC ERROR: [4]PETSC ERROR: [19]PETSC ERROR: [5]PETSC ERROR: [24]PETSC ERROR: [12]PETSC ERROR: [25]PETSC ERROR: [7]PETSC ERROR: [32]PETSC ERROR: [6]PETSC ERROR: [26]PETSC ERROR: [13]PETSC ERROR: [33]PETSC ERROR: [14]PETSC ERROR: [41]PETSC ERROR: [15]PETSC ERROR: [40]PETSC ERROR: [20]PETSC ERROR: [27]PETSC ERROR: [21]PETSC ERROR: [35]PETSC ERROR: [23]PETSC ERROR: [34]PETSC ERROR: [28]PETSC ERROR: [42]PETSC ERROR: [22]PETSC ERROR: [48]PETSC ERROR: [29]PETSC ERROR: [49]PETSC ERROR: [36]PETSC ERROR: [50]PETSC ERROR: [37]PETSC ERROR: [51]PETSC ERROR: [30]PETSC ERROR: [56]PETSC ERROR: [44]PETSC ERROR: [58]PETSC ERROR: [38]PETSC ERROR: [57]PETSC ERROR: [45]PETSC ERROR: [59]PETSC ERROR: [31]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors [39]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors [46]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors [53]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors [52]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors [47]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors [54]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors [55]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors [60]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors [62]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors [61]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors [63]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors likely location of problem given in stack below or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors likely location of problem given in stack below or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors likely 
location of problem given in stack below or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors likely location of problem given in stack below or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors likely location of problem given in stack below or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors likely location of problem given in stack below or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors likely location of problem given in stack below or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors likely location of problem given in stack below or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors likely location of problem given in stack below or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors likely location of problem given in stack below or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors likely location of problem given in stack below or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors likely location of problem given in stack below or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors likely location of problem given in stack below or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors likely location of problem given in stack below or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors likely location of problem given in stack below or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors likely location of problem given in stack below or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors likely location of problem given in stack below or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors likely location of problem given in stack below or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors likely location of problem given in stack below or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors likely location of problem given in stack below [0]PETSC ERROR: likely location of problem given in stack below [1]PETSC ERROR: likely location of problem given in stack below [2]PETSC ERROR: likely location of problem given in stack below [8]PETSC ERROR: likely location of problem given in stack below [3]PETSC ERROR: likely location of problem given in stack below [9]PETSC ERROR: likely location of problem given in stack below [10]PETSC ERROR: likely location of problem given in stack below [11]PETSC ERROR: likely location of problem given in stack below [17]PETSC ERROR: likely location of problem given in stack below [16]PETSC ERROR: likely location of problem given in stack below [18]PETSC ERROR: likely location of problem given in stack below [19]PETSC ERROR: likely location of problem given in stack below [24]PETSC ERROR: [4]PETSC ERROR: [25]PETSC ERROR: [5]PETSC ERROR: [32]PETSC ERROR: [12]PETSC ERROR: [26]PETSC ERROR: [7]PETSC ERROR: [33]PETSC ERROR: [6]PETSC ERROR: [41]PETSC ERROR: [13]PETSC ERROR: [40]PETSC ERROR: [14]PETSC ERROR: [27]PETSC ERROR: [15]PETSC ERROR: [35]PETSC ERROR: [20]PETSC ERROR: [34]PETSC ERROR: [21]PETSC ERROR: [42]PETSC ERROR: 
[23]PETSC ERROR: [48]PETSC ERROR: [28]PETSC ERROR: [49]PETSC ERROR: [22]PETSC ERROR: [50]PETSC ERROR: [29]PETSC ERROR: [51]PETSC ERROR: [36]PETSC ERROR: [56]PETSC ERROR: [37]PETSC ERROR: [58]PETSC ERROR: [30]PETSC ERROR: [57]PETSC ERROR: [44]PETSC ERROR: [59]PETSC ERROR: [38]PETSC ERROR: likely location of problem given in stack below [45]PETSC ERROR: likely location of problem given in stack below [31]PETSC ERROR: likely location of problem given in stack below [39]PETSC ERROR: likely location of problem given in stack below [46]PETSC ERROR: likely location of problem given in stack below [53]PETSC ERROR: likely location of problem given in stack below [52]PETSC ERROR: likely location of problem given in stack below [47]PETSC ERROR: likely location of problem given in stack below [54]PETSC ERROR: likely location of problem given in stack below [55]PETSC ERROR: likely location of problem given in stack below [60]PETSC ERROR: likely location of problem given in stack below [62]PETSC ERROR: likely location of problem given in stack below [61]PETSC ERROR: likely location of problem given in stack below [63]PETSC ERROR: likely location of problem given in stack below --------------------- Stack Frames ------------------------------------ likely location of problem given in stack below --------------------- Stack Frames ------------------------------------ likely location of problem given in stack below --------------------- Stack Frames ------------------------------------ likely location of problem given in stack below --------------------- Stack Frames ------------------------------------ likely location of problem given in stack below --------------------- Stack Frames ------------------------------------ likely location of problem given in stack below --------------------- Stack Frames ------------------------------------ likely location of problem given in stack below --------------------- Stack Frames ------------------------------------ likely location of problem given in stack below --------------------- Stack Frames ------------------------------------ likely location of problem given in stack below --------------------- Stack Frames ------------------------------------ likely location of problem given in stack below --------------------- Stack Frames ------------------------------------ likely location of problem given in stack below --------------------- Stack Frames ------------------------------------ likely location of problem given in stack below --------------------- Stack Frames ------------------------------------ likely location of problem given in stack below --------------------- Stack Frames ------------------------------------ likely location of problem given in stack below --------------------- Stack Frames ------------------------------------ likely location of problem given in stack below --------------------- Stack Frames ------------------------------------ likely location of problem given in stack below --------------------- Stack Frames ------------------------------------ likely location of problem given in stack below --------------------- Stack Frames ------------------------------------ likely location of problem given in stack below --------------------- Stack Frames ------------------------------------ [0]PETSC ERROR: --------------------- Stack Frames ------------------------------------ [1]PETSC ERROR: --------------------- Stack Frames ------------------------------------ [2]PETSC ERROR: --------------------- Stack Frames 
------------------------------------ [8]PETSC ERROR: --------------------- Stack Frames ------------------------------------ [3]PETSC ERROR: --------------------- Stack Frames ------------------------------------ [9]PETSC ERROR: --------------------- Stack Frames ------------------------------------ [10]PETSC ERROR: --------------------- Stack Frames ------------------------------------ [11]PETSC ERROR: --------------------- Stack Frames ------------------------------------ [17]PETSC ERROR: --------------------- Stack Frames ------------------------------------ [16]PETSC ERROR: --------------------- Stack Frames ------------------------------------ [18]PETSC ERROR: --------------------- Stack Frames ------------------------------------ [19]PETSC ERROR: --------------------- Stack Frames ------------------------------------ [24]PETSC ERROR: --------------------- Stack Frames ------------------------------------ [25]PETSC ERROR: --------------------- Stack Frames ------------------------------------ [32]PETSC ERROR: [4]PETSC ERROR: [26]PETSC ERROR: [5]PETSC ERROR: [33]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, [41]PETSC ERROR: [12]PETSC ERROR: [40]PETSC ERROR: [7]PETSC ERROR: [27]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, [35]PETSC ERROR: [4]PETSC ERROR: [34]PETSC ERROR: [13]PETSC ERROR: [42]PETSC ERROR: [6]PETSC ERROR: [48]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, [49]PETSC ERROR: [14]PETSC ERROR: [50]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, [51]PETSC ERROR: [15]PETSC ERROR: [56]PETSC ERROR: [20]PETSC ERROR: [58]PETSC ERROR: [5]PETSC ERROR: [57]PETSC ERROR: [21]PETSC ERROR: [59]PETSC ERROR: INSTEAD the line number of the start of the function --------------------- Stack Frames ------------------------------------ [23]PETSC ERROR: --------------------- Stack Frames ------------------------------------ [28]PETSC ERROR: --------------------- Stack Frames ------------------------------------ Note: The EXACT line numbers in the stack are not available, --------------------- Stack Frames ------------------------------------ Note: The EXACT line numbers in the stack are not available, --------------------- Stack Frames ------------------------------------ [29]PETSC ERROR: --------------------- Stack Frames ------------------------------------ [22]PETSC ERROR: --------------------- Stack Frames ------------------------------------ [12]PETSC ERROR: --------------------- Stack Frames ------------------------------------ [37]PETSC ERROR: --------------------- Stack Frames ------------------------------------ [36]PETSC ERROR: --------------------- Stack Frames ------------------------------------ Note: The EXACT line numbers in the stack are not available, --------------------- Stack Frames ------------------------------------ [30]PETSC ERROR: --------------------- Stack Frames ------------------------------------ [7]PETSC ERROR: --------------------- Stack Frames ------------------------------------ Note: The EXACT line numbers in the stack are not available, --------------------- Stack Frames ------------------------------------ [38]PETSC ERROR: --------------------- Stack Frames ------------------------------------ [45]PETSC ERROR: --------------------- Stack Frames ------------------------------------ [44]PETSC ERROR: --------------------- Stack Frames ------------------------------------ Note: The EXACT line numbers in the stack are not available, --------------------- 
Stack Frames ------------------------------------ [31]PETSC ERROR: --------------------- Stack Frames ------------------------------------ INSTEAD the line number of the start of the function --------------------- Stack Frames ------------------------------------ [39]PETSC ERROR: --------------------- Stack Frames ------------------------------------ [46]PETSC ERROR: --------------------- Stack Frames ------------------------------------ Note: The EXACT line numbers in the stack are not available, --------------------- Stack Frames ------------------------------------ [4]PETSC ERROR: --------------------- Stack Frames ------------------------------------ [52]PETSC ERROR: --------------------- Stack Frames ------------------------------------ [53]PETSC ERROR: --------------------- Stack Frames ------------------------------------ Note: The EXACT line numbers in the stack are not available, --------------------- Stack Frames ------------------------------------ [54]PETSC ERROR: --------------------- Stack Frames ------------------------------------ Note: The EXACT line numbers in the stack are not available, --------------------- Stack Frames ------------------------------------ [47]PETSC ERROR: --------------------- Stack Frames ------------------------------------ [55]PETSC ERROR: --------------------- Stack Frames ------------------------------------ [60]PETSC ERROR: [0]PETSC ERROR: [62]PETSC ERROR: [1]PETSC ERROR: [61]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, [63]PETSC ERROR: [2]PETSC ERROR: [13]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, [6]PETSC ERROR: [3]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, [8]PETSC ERROR: INSTEAD the line number of the start of the function [9]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, [10]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, Note: The EXACT line numbers in the stack are not available, [14]PETSC ERROR: [11]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, [1]PETSC ERROR: INSTEAD the line number of the start of the function [16]PETSC ERROR: [15]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, Note: The EXACT line numbers in the stack are not available, [17]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, [18]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, INSTEAD the line number of the start of the function [20]PETSC ERROR: [19]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, [24]PETSC ERROR: [5]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, Note: The EXACT line numbers in the stack are not available, Note: The EXACT line numbers in the stack are not available, Note: The EXACT line numbers in the stack are not available, [32]PETSC ERROR: [21]PETSC ERROR: [25]PETSC ERROR: is given. 
Note: The EXACT line numbers in the stack are not available, Note: The EXACT line numbers in the stack are not available, [26]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, [33]PETSC ERROR: [23]PETSC ERROR: [2]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, [41]PETSC ERROR: [28]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, Note: The EXACT line numbers in the stack are not available, INSTEAD the line number of the start of the function Note: The EXACT line numbers in the stack are not available, [40]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, [35]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, [27]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, Note: The EXACT line numbers in the stack are not available, Note: The EXACT line numbers in the stack are not available, [42]PETSC ERROR: INSTEAD the line number of the start of the function [34]PETSC ERROR: INSTEAD the line number of the start of the function [3]PETSC ERROR: [29]PETSC ERROR: [48]PETSC ERROR: [22]PETSC ERROR: [49]PETSC ERROR: [12]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, [37]PETSC ERROR: [50]PETSC ERROR: [36]PETSC ERROR: [51]PETSC ERROR: INSTEAD the line number of the start of the function Note: The EXACT line numbers in the stack are not available, [30]PETSC ERROR: [56]PETSC ERROR: [7]PETSC ERROR: [0]PETSC ERROR: INSTEAD the line number of the start of the function [58]PETSC ERROR: [38]PETSC ERROR: [57]PETSC ERROR: [45]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, [44]PETSC ERROR: [59]PETSC ERROR: INSTEAD the line number of the start of the function Note: The EXACT line numbers in the stack are not available, [31]PETSC ERROR: [8]PETSC ERROR: is given. [9]PETSC ERROR: [39]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, [46]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, INSTEAD the line number of the start of the function [10]PETSC ERROR: [4]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, [52]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, [53]PETSC ERROR: INSTEAD the line number of the start of the function INSTEAD the line number of the start of the function Note: The EXACT line numbers in the stack are not available, [54]PETSC ERROR: [11]PETSC ERROR: INSTEAD the line number of the start of the function [1]PETSC ERROR: [47]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, [55]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, [60]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, [62]PETSC ERROR: [16]PETSC ERROR: [61]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, [63]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, [13]PETSC ERROR: INSTEAD the line number of the start of the function [6]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, INSTEAD the line number of the start of the function Note: The EXACT line numbers in the stack are not available, INSTEAD the line number of the start of the function [17]PETSC ERROR: is given. 
Note: The EXACT line numbers in the stack are not available, INSTEAD the line number of the start of the function Note: The EXACT line numbers in the stack are not available, INSTEAD the line number of the start of the function [18]PETSC ERROR: [14]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, INSTEAD the line number of the start of the function is given. is given. Note: The EXACT line numbers in the stack are not available, [15]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, INSTEAD the line number of the start of the function [19]PETSC ERROR: INSTEAD the line number of the start of the function Note: The EXACT line numbers in the stack are not available, INSTEAD the line number of the start of the function [24]PETSC ERROR: [20]PETSC ERROR: INSTEAD the line number of the start of the function INSTEAD the line number of the start of the function INSTEAD the line number of the start of the function [5]PETSC ERROR: [32]PETSC ERROR: INSTEAD the line number of the start of the function [25]PETSC ERROR: INSTEAD the line number of the start of the function INSTEAD the line number of the start of the function [21]PETSC ERROR: [26]PETSC ERROR: [4] MatLUFactorNumeric_SuperLU line 121 src/mat/impls/aij/seq/superlu/superlu.c [33]PETSC ERROR: INSTEAD the line number of the start of the function [2]PETSC ERROR: INSTEAD the line number of the start of the function [41]PETSC ERROR: [23]PETSC ERROR: INSTEAD the line number of the start of the function INSTEAD the line number of the start of the function is given. [28]PETSC ERROR: [40]PETSC ERROR: INSTEAD the line number of the start of the function [35]PETSC ERROR: INSTEAD the line number of the start of the function [27]PETSC ERROR: INSTEAD the line number of the start of the function INSTEAD the line number of the start of the function INSTEAD the line number of the start of the function [42]PETSC ERROR: INSTEAD the line number of the start of the function [34]PETSC ERROR: INSTEAD the line number of the start of the function [3]PETSC ERROR: is given. [48]PETSC ERROR: is given. [49]PETSC ERROR: [29]PETSC ERROR: INSTEAD the line number of the start of the function [22]PETSC ERROR: [50]PETSC ERROR: [12]PETSC ERROR: [51]PETSC ERROR: [37]PETSC ERROR: INSTEAD the line number of the start of the function [36]PETSC ERROR: [56]PETSC ERROR: is given. [0]PETSC ERROR: [30]PETSC ERROR: [58]PETSC ERROR: [7]PETSC ERROR: [57]PETSC ERROR: is given. INSTEAD the line number of the start of the function [38]PETSC ERROR: [59]PETSC ERROR: [45]PETSC ERROR: INSTEAD the line number of the start of the function [44]PETSC ERROR: [8]PETSC ERROR: is given. [9]PETSC ERROR: [31]PETSC ERROR: INSTEAD the line number of the start of the function [5] MatLUFactorNumeric_SuperLU line 121 src/mat/impls/aij/seq/superlu/superlu.c INSTEAD the line number of the start of the function [39]PETSC ERROR: [10]PETSC ERROR: [46]PETSC ERROR: INSTEAD the line number of the start of the function is given. INSTEAD the line number of the start of the function [4]PETSC ERROR: is given. [52]PETSC ERROR: INSTEAD the line number of the start of the function [53]PETSC ERROR: [11]PETSC ERROR: is given. [1]PETSC ERROR: [54]PETSC ERROR: INSTEAD the line number of the start of the function is given. 
INSTEAD the line number of the start of the function [47]PETSC ERROR: INSTEAD the line number of the start of the function [55]PETSC ERROR: [16]PETSC ERROR: [60]PETSC ERROR: INSTEAD the line number of the start of the function [62]PETSC ERROR: INSTEAD the line number of the start of the function [61]PETSC ERROR: is given. [63]PETSC ERROR: INSTEAD the line number of the start of the function [13]PETSC ERROR: INSTEAD the line number of the start of the function [6]PETSC ERROR: [17]PETSC ERROR: is given. INSTEAD the line number of the start of the function is given. INSTEAD the line number of the start of the function [12] MatLUFactorNumeric_SuperLU line 121 src/mat/impls/aij/seq/superlu/superlu.c [18]PETSC ERROR: is given. INSTEAD the line number of the start of the function is given. [0] MatLUFactorNumeric_SuperLU line 121 src/mat/impls/aij/seq/superlu/superlu.c [14]PETSC ERROR: INSTEAD the line number of the start of the function is given. INSTEAD the line number of the start of the function [7] MatLUFactorNumeric_SuperLU line 121 src/mat/impls/aij/seq/superlu/superlu.c [19]PETSC ERROR: [15]PETSC ERROR: INSTEAD the line number of the start of the function is given. [24]PETSC ERROR: is given. is given. is given. is given. [20]PETSC ERROR: [32]PETSC ERROR: is given. [25]PETSC ERROR: [5]PETSC ERROR: is given. is given. [26]PETSC ERROR: is given. [33]PETSC ERROR: [21]PETSC ERROR: [2]PETSC ERROR: [4] MatLUFactorNumeric line 2575 src/mat/interface/matrix.c [41]PETSC ERROR: is given. is given. is given. [1] MatLUFactorNumeric_SuperLU line 121 src/mat/impls/aij/seq/superlu/superlu.c [23]PETSC ERROR: [40]PETSC ERROR: is given. [35]PETSC ERROR: [28]PETSC ERROR: [27]PETSC ERROR: is given. is given. is given. [42]PETSC ERROR: is given. [34]PETSC ERROR: is given. [3]PETSC ERROR: is given. [48]PETSC ERROR: is given. [49]PETSC ERROR: [13] MatLUFactorNumeric_SuperLU line 121 src/mat/impls/aij/seq/superlu/superlu.c is given. [6] MatLUFactorNumeric_SuperLU line 121 src/mat/impls/aij/seq/superlu/superlu.c [50]PETSC ERROR: [29]PETSC ERROR: [51]PETSC ERROR: [22]PETSC ERROR: is given. [12]PETSC ERROR: [56]PETSC ERROR: [37]PETSC ERROR: [0]PETSC ERROR: [36]PETSC ERROR: [58]PETSC ERROR: [14] MatLUFactorNumeric_SuperLU line 121 src/mat/impls/aij/seq/superlu/superlu.c [57]PETSC ERROR: [30]PETSC ERROR: is given. [7]PETSC ERROR: [59]PETSC ERROR: [15] MatLUFactorNumeric_SuperLU line 121 src/mat/impls/aij/seq/superlu/superlu.c is given. [38]PETSC ERROR: [8]PETSC ERROR: [45]PETSC ERROR: [9]PETSC ERROR: [44]PETSC ERROR: is given. [20] MatLUFactorNumeric_SuperLU line 121 src/mat/impls/aij/seq/superlu/superlu.c is given. [31]PETSC ERROR: [10]PETSC ERROR: [5] MatLUFactorNumeric line 2575 src/mat/interface/matrix.c is given. [39]PETSC ERROR: is given. [46]PETSC ERROR: [2] MatLUFactorNumeric_SuperLU line 121 src/mat/impls/aij/seq/superlu/superlu.c [21] MatLUFactorNumeric_SuperLU line 121 src/mat/impls/aij/seq/superlu/superlu.c is given. [4]PETSC ERROR: [11]PETSC ERROR: [52]PETSC ERROR: [1]PETSC ERROR: [53]PETSC ERROR: is given. [23] MatLUFactorNumeric_SuperLU line 121 src/mat/impls/aij/seq/superlu/superlu.c is given. [54]PETSC ERROR: is given. [28] MatLUFactorNumeric_SuperLU line 121 src/mat/impls/aij/seq/superlu/superlu.c [16]PETSC ERROR: [47]PETSC ERROR: is given. [55]PETSC ERROR: is given. [60]PETSC ERROR: [3] MatLUFactorNumeric_SuperLU line 121 src/mat/impls/aij/seq/superlu/superlu.c [62]PETSC ERROR: is given. [61]PETSC ERROR: is given. [63]PETSC ERROR: [17]PETSC ERROR: [13]PETSC ERROR: is given. [6]PETSC ERROR: is given. 
[29] MatLUFactorNumeric_SuperLU line 121 src/mat/impls/aij/seq/superlu/superlu.c [18]PETSC ERROR: [22] MatLUFactorNumeric_SuperLU line 121 src/mat/impls/aij/seq/superlu/superlu.c is given. [12] MatLUFactorNumeric line 2575 src/mat/interface/matrix.c [0] MatLUFactorNumeric line 2575 src/mat/interface/matrix.c [37] MatLUFactorNumeric_SuperLU line 121 src/mat/impls/aij/seq/superlu/superlu.c is given. [36] MatLUFactorNumeric_SuperLU line 121 src/mat/impls/aij/seq/superlu/superlu.c is given. [14]PETSC ERROR: [19]PETSC ERROR: [30] MatLUFactorNumeric_SuperLU line 121 src/mat/impls/aij/seq/superlu/superlu.c is given. [7] MatLUFactorNumeric line 2575 src/mat/interface/matrix.c [24]PETSC ERROR: [15]PETSC ERROR: [8] VecSet line 492 src/vec/vec/interface/rvector.c [38] MatLUFactorNumeric_SuperLU line 121 src/mat/impls/aij/seq/superlu/superlu.c [9] MatLUFactorNumeric_SuperLU line 121 src/mat/impls/aij/seq/superlu/superlu.c [45] MatLUFactorNumeric_SuperLU line 121 src/mat/impls/aij/seq/superlu/superlu.c [32]PETSC ERROR: [44] VecSet line 492 src/vec/vec/interface/rvector.c [25]PETSC ERROR: [20]PETSC ERROR: [10] MatLUFactorNumeric_SuperLU line 121 src/mat/impls/aij/seq/superlu/superlu.c [31] MatLUFactorNumeric_SuperLU line 121 src/mat/impls/aij/seq/superlu/superlu.c [26]PETSC ERROR: [5]PETSC ERROR: [33]PETSC ERROR: [39] MatLUFactorNumeric_SuperLU line 121 src/mat/impls/aij/seq/superlu/superlu.c [2]PETSC ERROR: [46] MatLUFactorNumeric_SuperLU line 121 src/mat/impls/aij/seq/superlu/superlu.c [41]PETSC ERROR: [21]PETSC ERROR: [11] MatLUFactorNumeric_SuperLU line 121 src/mat/impls/aij/seq/superlu/superlu.c --------------------- Error Message ------------------------------------ [1] MatLUFactorNumeric line 2575 src/mat/interface/matrix.c [52] MatLUFactorNumeric_SuperLU line 121 src/mat/impls/aij/seq/superlu/superlu.c [40]PETSC ERROR: [53] VecSet line 492 src/vec/vec/interface/rvector.c [35]PETSC ERROR: [23]PETSC ERROR: [27]PETSC ERROR: [54] MatLUFactorNumeric_SuperLU line 121 src/mat/impls/aij/seq/superlu/superlu.c [16] MatLUFactorNumeric_SuperLU line 121 src/mat/impls/aij/seq/superlu/superlu.c [28]PETSC ERROR: [42]PETSC ERROR: [47] MatLUFactorNumeric_SuperLU line 121 src/mat/impls/aij/seq/superlu/superlu.c [34]PETSC ERROR: [55] MatLUFactorNumeric_SuperLU line 121 src/mat/impls/aij/seq/superlu/superlu.c [3]PETSC ERROR: [60] MatLUFactorNumeric_SuperLU line 121 src/mat/impls/aij/seq/superlu/superlu.c [48]PETSC ERROR: [62] MatLUFactorNumeric_SuperLU line 121 src/mat/impls/aij/seq/superlu/superlu.c [49]PETSC ERROR: [61] MatLUFactorNumeric_SuperLU line 121 src/mat/impls/aij/seq/superlu/superlu.c [17] VecSet line 492 src/vec/vec/interface/rvector.c [63] MatLUFactorNumeric_SuperLU line 121 src/mat/impls/aij/seq/superlu/superlu.c [50]PETSC ERROR: [13] MatLUFactorNumeric line 2575 src/mat/interface/matrix.c [51]PETSC ERROR: [6] MatLUFactorNumeric line 2575 src/mat/interface/matrix.c [18] MatLUFactorNumeric_SuperLU line 121 src/mat/impls/aij/seq/superlu/superlu.c [29]PETSC ERROR: [56]PETSC ERROR: [22]PETSC ERROR: [0]PETSC ERROR: [12]PETSC ERROR: [58]PETSC ERROR: [37]PETSC ERROR: [57]PETSC ERROR: [36]PETSC ERROR: [19] MatLUFactorNumeric_SuperLU line 121 src/mat/impls/aij/seq/superlu/superlu.c [14] MatLUFactorNumeric line 2575 src/mat/interface/matrix.c [59]PETSC ERROR: [30]PETSC ERROR: [24] MatLUFactorNumeric_SuperLU line 121 src/mat/impls/aij/seq/superlu/superlu.c [7]PETSC ERROR: [8]PETSC ERROR: [15] MatLUFactorNumeric line 2575 src/mat/interface/matrix.c [9]PETSC ERROR: [38]PETSC ERROR: [32] MatLUFactorNumeric_SuperLU 
[The same traceback was printed by every MPI rank (ranks numbered 0-63, running on nodes R00M0NE and R00M0NF); the interleaved per-rank output is collapsed to a single representative copy below.]

[0]PETSC ERROR: [0] MatLUFactorNumeric_SuperLU line 121 src/mat/impls/aij/seq/superlu/superlu.c
[0]PETSC ERROR: [0] MatLUFactorNumeric line 2575 src/mat/interface/matrix.c
[0]PETSC ERROR: --------------------- Error Message ------------------------------------
[0]PETSC ERROR: Signal received!
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Petsc Release Version 3.1.0, Patch 7, Mon Dec 20 14:26:37 CST 2010
[0]PETSC ERROR: See docs/changes/index.html for recent updates.
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[0]PETSC ERROR: See docs/index.html for manual pages.
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: /home/rchen/soft/fixedmesh/bypass2/codefor3.1/./joab on a bgl-ibm-g named R00M0NE by Unknown Wed Jan 26 11:03:56 2011
[0]PETSC ERROR: Libraries linked from /home/rchen/soft/petsc-3.1-p7/bgl-ibm-goto-O3_440d/lib
[0]PETSC ERROR: Configure run at Sat Dec 25 23:04:15 2010
[0]PETSC ERROR: Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz
--download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz Configure run at Sat Dec 25 23:04:15 2010 [15]PETSC ERROR: [56]PETSC ERROR: Configure run at Sat Dec 25 23:04:15 2010 [0]PETSC ERROR: Configure run at Sat Dec 25 23:04:15 2010 [58]PETSC ERROR: Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz [57]PETSC ERROR: [20]PETSC ERROR: Configure run at Sat Dec 25 23:04:15 2010 Configure run at Sat Dec 25 23:04:15 2010 [59]PETSC ERROR: [5]PETSC ERROR: Configure run at Sat Dec 25 23:04:15 2010 Configure run at Sat Dec 25 23:04:15 2010 [8]PETSC ERROR: Configure run at Sat Dec 25 23:04:15 2010 [9]PETSC ERROR: [21]PETSC ERROR: Configure run at Sat Dec 25 23:04:15 2010 ------------------------------------------------------------------------ Configure run at Sat Dec 25 23:04:15 2010 Configure run at Sat Dec 25 23:04:15 2010 [10]PETSC ERROR: Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 
--known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz Configure run at Sat Dec 25 23:04:15 2010 [23]PETSC ERROR: Configure run at Sat Dec 25 23:04:15 2010 Configure run at Sat Dec 25 23:04:15 2010 Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz 
--download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz [28]PETSC ERROR: Configure run at Sat Dec 25 23:04:15 2010 Configure run at Sat Dec 25 23:04:15 2010 [11]PETSC ERROR: Configure run at Sat Dec 25 23:04:15 2010 [1]PETSC ERROR: Configure run at Sat Dec 25 23:04:15 2010 Configure run at Sat Dec 25 23:04:15 2010 Configure run at Sat Dec 25 23:04:15 2010 Configure run at Sat Dec 25 23:04:15 2010 Configure run at Sat Dec 25 23:04:15 2010 Configure run at Sat Dec 25 23:04:15 2010 Configure run at Sat Dec 25 23:04:15 2010 [16]PETSC ERROR: Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz Configure run at Sat Dec 25 23:04:15 2010 Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys 
--with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz Configure run at Sat Dec 25 23:04:15 2010 [29]PETSC ERROR: Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz [22]PETSC ERROR: Configure run at Sat Dec 25 23:04:15 2010 [12]PETSC ERROR: Configure run at Sat Dec 25 23:04:15 2010 [37]PETSC ERROR: [17]PETSC ERROR: 
[36]PETSC ERROR: Configure run at Sat Dec 25 23:04:15 2010 Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz Configure run at Sat Dec 25 23:04:15 2010 [30]PETSC ERROR: [18]PETSC ERROR: [7]PETSC ERROR: Configure run at Sat Dec 25 23:04:15 2010 Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz 
--download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz ------------------------------------------------------------------------ [38]PETSC ERROR: Configure run at Sat Dec 25 23:04:15 2010 [45]PETSC ERROR: Configure run at Sat Dec 25 23:04:15 2010 [44]PETSC ERROR: [19]PETSC ERROR: Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz Configure run at Sat Dec 25 23:04:15 2010 [31]PETSC ERROR: [24]PETSC ERROR: ------------------------------------------------------------------------ ------------------------------------------------------------------------ [39]PETSC ERROR: Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc 
--with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz [46]PETSC ERROR: [32]PETSC ERROR: Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz [25]PETSC ERROR: [4]PETSC ERROR: Configure options 
--known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz [52]PETSC ERROR: [26]PETSC ERROR: [53]PETSC ERROR: [33]PETSC ERROR: Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz 
--download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz [2]PETSC ERROR: [54]PETSC ERROR: [41]PETSC ERROR: Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 
--sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz [47]PETSC ERROR: ------------------------------------------------------------------------ [55]PETSC ERROR: [40]PETSC ERROR: [60]PETSC ERROR: [35]PETSC ERROR: [62]PETSC ERROR: [27]PETSC ERROR: [61]PETSC ERROR: Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz [63]PETSC ERROR: [42]PETSC ERROR: [13]PETSC ERROR: [34]PETSC ERROR: [6]PETSC ERROR: [3]PETSC ERROR: Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 
--known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz [48]PETSC ERROR: Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz 
--download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz [49]PETSC ERROR: ------------------------------------------------------------------------ ------------------------------------------------------------------------ Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz [50]PETSC ERROR: Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 
-PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz [51]PETSC ERROR: [14]PETSC ERROR: Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" 
-CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz [56]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: [15]PETSC ERROR: [58]PETSC ERROR: Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz [57]PETSC ERROR: Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 
--known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz 
--download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz ------------------------------------------------------------------------ [59]PETSC ERROR: [20]PETSC ERROR: Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 
--have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz [8]PETSC ERROR: [5]PETSC ERROR: [9]PETSC ERROR: Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl 
-qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 
--known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz [21]PETSC ERROR: [10]PETSC ERROR: User provided function() line 0 in unknown directory unknown file Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz 
--download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz 
--download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz ------------------------------------------------------------------------ ------------------------------------------------------------------------ [23]PETSC ERROR: Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" 
-CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz [11]PETSC ERROR: [28]PETSC ERROR: [1]PETSC ERROR: Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 
--known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz 
--download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz 
--download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 
--sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz [16]PETSC ERROR: Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys 
--with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 
--known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz ------------------------------------------------------------------------ ------------------------------------------------------------------------ ------------------------------------------------------------------------ Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz 
--download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz [29]PETSC ERROR: Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz [22]PETSC ERROR: [17]PETSC ERROR: [12]PETSC ERROR: Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 
--sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz [37]PETSC ERROR: Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz [36]PETSC ERROR: [18]PETSC ERROR: ------------------------------------------------------------------------ Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" 
--with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz [30]PETSC ERROR: User provided function() line 0 in unknown directory unknown file [7]PETSC ERROR: Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz 
------------------------------------------------------------------------ Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz --download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz [38]PETSC ERROR: [19]PETSC ERROR: [45]PETSC ERROR: Configure options --known-level1-dcache-size=32768 --known-level1-dcache-linesize=32 --known-level1-dcache-assoc=0 --known-memcmp-ok=1 --known-sizeof-char=1 --known-sizeof-void-p=4 --known-sizeof-short=2 --known-sizeof-int=4 --known-sizeof-long=4 --known-sizeof-long-long=8 --known-sizeof-float=4 --known-sizeof-double=8 --known-sizeof-size_t=4 --known-bits-per-byte=8 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-mpi-long-double=1 --with-cc=/contrib/bgl/bin/mpxlc --with-cxx=/contrib/bgl/bin/mpxlC --with-fc="/contrib/bgl/bin/mpxlf -qnosave" --with-mpi-dir=/bgl/BlueLight/ppcfloor/bglsys --with-blas-lapack-lib="-L/contrib/bgl/lib -llapack440 -L/contrib/bgl/lib -lgoto" --with-is-color-value-type=short --with-shared=0 -COPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -CXXOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" -FOPTFLAGS="-O2 -qbgl -qarch=440d -qtune=440 -qmaxmem=-1" --with-debugging=1 --with-x=0 --with-x11=0 --with-batch=1 --with-memcmp-ok --sizeof-char=1 --sizeof-void-p=4 --sizeof-short=2 --sizeof-int=4 --sizeof-long=4 --sizeof-size-t=4 --sizeof-long-long=8 --sizeof-float=4 --sizeof-double=8 --bits-per-byte=8 --sizeof-MPI-Comm=4 --sizeof-MPI-Fint=4 --have-mpi-long-double=1 -PETSC_ARCH=bgl-ibm-goto-O3_440d --download-superlu=/home/rchen/soft/petsc-3.1-p7/externalpackages/superlu_4.0-March_7_2010.tar.gz --download-superlu_dist=/home/rchen/soft/petsc-3.1-p7/externalpackages/SuperLU_DIST_2.4-hg-v2.tar.gz 
--download-parmetis=/home/rchen/soft/petsc-3.1-p7/externalpackages/ParMetis-dev-p3.tar.gz --download-scalapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/scalapack.tgz --download-blacs=/home/rchen/soft/petsc-3.1-p7/externalpackages/blacs-dev.tar.gz --download-f-blas-lapack=/home/rchen/soft/petsc-3.1-p7/externalpackages/fblaslapack-3.1.1.tar.gz --download-mumps=/home/rchen/soft/petsc-3.1-p7/externalpackages/MUMPS_4.9.2.tar.gz --download-hypre=/home/rchen/soft/petsc-3.1-p7/externalpackages/hypre-2.6.0b.tar.gz [44]PETSC ERROR: [24]PETSC ERROR: ------------------------------------------------------------------------ User provided function() line 0 in unknown directory unknown file [31]PETSC ERROR: ------------------------------------------------------------------------ User provided function() line 0 in unknown directory unknown file [32]PETSC ERROR: [39]PETSC ERROR: [25]PETSC ERROR: [46]PETSC ERROR: ------------------------------------------------------------------------ ------------------------------------------------------------------------ [26]PETSC ERROR: application called MPI_Abort(MPI_COMM_WORLD, 59) - process 4[33]PETSC ERROR: [52]PETSC ERROR: [2]PETSC ERROR: [53]PETSC ERROR: [41]PETSC ERROR: ------------------------------------------------------------------------ ------------------------------------------------------------------------ [54]PETSC ERROR: User provided function() line 0 in unknown directory unknown file ------------------------------------------------------------------------ [40]PETSC ERROR: [47]PETSC ERROR: [35]PETSC ERROR: [55]PETSC ERROR: [27]PETSC ERROR: [60]PETSC ERROR: ------------------------------------------------------------------------ [62]PETSC ERROR: [42]PETSC ERROR: [61]PETSC ERROR: [34]PETSC ERROR: [63]PETSC ERROR: [3]PETSC ERROR: [13]PETSC ERROR: [48]PETSC ERROR: [6]PETSC ERROR: [49]PETSC ERROR: ------------------------------------------------------------------------ User provided function() line 0 in unknown directory unknown file ------------------------------------------------------------------------ [50]PETSC ERROR: User provided function() line 0 in unknown directory unknown file [51]PETSC ERROR: ------------------------------------------------------------------------ ------------------------------------------------------------------------ ------------------------------------------------------------------------ [56]PETSC ERROR: [14]PETSC ERROR: application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0------------------------------------------------------------------------ [58]PETSC ERROR: User provided function() line 0 in unknown directory unknown file [57]PETSC ERROR: [15]PETSC ERROR: ------------------------------------------------------------------------ ------------------------------------------------------------------------ [59]PETSC ERROR: ------------------------------------------------------------------------ ------------------------------------------------------------------------ User provided function() line 0 in unknown directory unknown file application called MPI_Abort(MPI_COMM_WORLD, 59) - process 8[20]PETSC ERROR: [9]PETSC ERROR: ------------------------------------------------------------------------ ------------------------------------------------------------------------ application called MPI_Abort(MPI_COMM_WORLD, 59) - process 5------------------------------------------------------------------------ ------------------------------------------------------------------------ [10]PETSC ERROR: 
------------------------------------------------------------------------ ------------------------------------------------------------------------ [21]PETSC ERROR: ------------------------------------------------------------------------ ------------------------------------------------------------------------ User provided function() line 0 in unknown directory unknown file User provided function() line 0 in unknown directory unknown file ------------------------------------------------------------------------ [23]PETSC ERROR: [11]PETSC ERROR: ------------------------------------------------------------------------ application called MPI_Abort(MPI_COMM_WORLD, 59) - process 1[28]PETSC ERROR: ------------------------------------------------------------------------ ------------------------------------------------------------------------ ------------------------------------------------------------------------ ------------------------------------------------------------------------ ------------------------------------------------------------------------ ------------------------------------------------------------------------ [16]PETSC ERROR: ------------------------------------------------------------------------ ------------------------------------------------------------------------ ------------------------------------------------------------------------ ------------------------------------------------------------------------ ------------------------------------------------------------------------ User provided function() line 0 in unknown directory unknown file User provided function() line 0 in unknown directory unknown file ------------------------------------------------------------------------ User provided function() line 0 in unknown directory unknown file ------------------------------------------------------------------------ [29]PETSC ERROR: application called MPI_Abort(MPI_COMM_WORLD, 59) - process 17[22]PETSC ERROR: ------------------------------------------------------------------------ application called MPI_Abort(MPI_COMM_WORLD, 59) - process 12------------------------------------------------------------------------ [37]PETSC ERROR: [18]PETSC ERROR: [36]PETSC ERROR: ------------------------------------------------------------------------ User provided function() line 0 in unknown directory unknown file ------------------------------------------------------------------------ [30]PETSC ERROR: ------------------------------------------------------------------------ application called MPI_Abort(MPI_COMM_WORLD, 59) - process 7[19]PETSC ERROR: User provided function() line 0 in unknown directory unknown file ------------------------------------------------------------------------ [38]PETSC ERROR: [24]PETSC ERROR: [45]PETSC ERROR: User provided function() line 0 in unknown directory unknown file application called MPI_Abort(MPI_COMM_WORLD, 59) - process 44[32]PETSC ERROR: User provided function() line 0 in unknown directory unknown file [25]PETSC ERROR: [31]PETSC ERROR: User provided function() line 0 in unknown directory unknown file [39]PETSC ERROR: [26]PETSC ERROR: [46]PETSC ERROR: [33]PETSC ERROR: User provided function() line 0 in unknown directory unknown file application called MPI_Abort(MPI_COMM_WORLD, 59) - process 2[52]PETSC ERROR: [41]PETSC ERROR: application called MPI_Abort(MPI_COMM_WORLD, 59) - process 53User provided function() line 0 in unknown directory unknown file User provided function() line 0 in unknown directory unknown file [40]PETSC ERROR: [54]PETSC ERROR: 
[35]PETSC ERROR: User provided function() line 0 in unknown directory unknown file [27]PETSC ERROR: [47]PETSC ERROR: User provided function() line 0 in unknown directory unknown file [55]PETSC ERROR: [42]PETSC ERROR: [60]PETSC ERROR: [34]PETSC ERROR: [62]PETSC ERROR: application called MPI_Abort(MPI_COMM_WORLD, 59) - process 3[61]PETSC ERROR: [48]PETSC ERROR: [63]PETSC ERROR: [49]PETSC ERROR: application called MPI_Abort(MPI_COMM_WORLD, 59) - process 13[50]PETSC ERROR: application called MPI_Abort(MPI_COMM_WORLD, 59) - process 6[51]PETSC ERROR: User provided function() line 0 in unknown directory unknown file User provided function() line 0 in unknown directory unknown file User provided function() line 0 in unknown directory unknown file [56]PETSC ERROR: User provided function() line 0 in unknown directory unknown file [58]PETSC ERROR: User provided function() line 0 in unknown directory unknown file [57]PETSC ERROR: application called MPI_Abort(MPI_COMM_WORLD, 59) - process 14User provided function() line 0 in unknown directory unknown file User provided function() line 0 in unknown directory unknown file [59]PETSC ERROR: application called MPI_Abort(MPI_COMM_WORLD, 59) - process 15User provided function() line 0 in unknown directory unknown file User provided function() line 0 in unknown directory unknown file application called MPI_Abort(MPI_COMM_WORLD, 59) - process 9User provided function() line 0 in unknown directory unknown file User provided function() line 0 in unknown directory unknown file application called MPI_Abort(MPI_COMM_WORLD, 59) - process 20User provided function() line 0 in unknown directory unknown file User provided function() line 0 in unknown directory unknown file application called MPI_Abort(MPI_COMM_WORLD, 59) - process 10User provided function() line 0 in unknown directory unknown file User provided function() line 0 in unknown directory unknown file User provided function() line 0 in unknown directory unknown file User provided function() line 0 in unknown directory unknown file application called MPI_Abort(MPI_COMM_WORLD, 59) - process 21User provided function() line 0 in unknown directory unknown file User provided function() line 0 in unknown directory unknown file application called MPI_Abort(MPI_COMM_WORLD, 59) - process 11application called MPI_Abort(MPI_COMM_WORLD, 59) - process 23User provided function() line 0 in unknown directory unknown file User provided function() line 0 in unknown directory unknown file User provided function() line 0 in unknown directory unknown file application called MPI_Abort(MPI_COMM_WORLD, 59) - process 28User provided function() line 0 in unknown directory unknown file User provided function() line 0 in unknown directory unknown file application called MPI_Abort(MPI_COMM_WORLD, 59) - process 16User provided function() line 0 in unknown directory unknown file User provided function() line 0 in unknown directory unknown file User provided function() line 0 in unknown directory unknown file User provided function() line 0 in unknown directory unknown file User provided function() line 0 in unknown directory unknown file User provided function() line 0 in unknown directory unknown file User provided function() line 0 in unknown directory unknown file User provided function() line 0 in unknown directory unknown file User provided function() line 0 in unknown directory unknown file User provided function() line 0 in unknown directory unknown file application called MPI_Abort(MPI_COMM_WORLD, 59) - process 29User provided 
BE_MPI (ERROR): The error message in the job record is as follows: BE_MPI (ERROR): "killed by exit(1) on node 71" BE_MPI (ERROR): The error message in the job record is as follows: BE_MPI (ERROR): "killed by exit(1) on node 71"
From bsmith at mcs.anl.gov Wed Jan 26 13:49:15 2011 From: bsmith at mcs.anl.gov (Barry Smith) Date: Wed, 26 Jan 2011 13:49:15 -0600 Subject: [petsc-users] Implementing Schur complement approach (domain decomposition) In-Reply-To: <4D4018BB.5060901@tu-dresden.de> References: <4D4018BB.5060901@tu-dresden.de> Message-ID: <56CA50B7-40ED-4D8B-9F66-26F09B41AE95@mcs.anl.gov> Thomas, There are two classes of related non-overlapping domain decomposition methods that use Schur complements. 1) the "iterative substructuring" methods. This can be used with fully assembled global stiffness matrices.
They apply the Schur complement S = A_BB - A_BI * A_II^-1 A_IB implicitly by applying first A_IB, then A_II^-1, etc. The preconditioner for S is applied by directly knowing something about the structure of S. For example, for the Laplacian the S associated with any particular edge is spectrally equivalent to l_00^{1/2} and its inverse can be applied efficiently using FFTs. This is introduced in section 4.2 of my book, with particular examples of preconditioners for edges in 4.2.3, 4.2.4, 4.2.5, 4.2.6; adding a coarse grid is discussed in 4.3.4. 2) the Neumann-Dirichlet type methods, which include FETI and balancing. These require parts of the unassembled stiffness matrix because they involve solving Neumann boundary condition problems on subdomains in the preconditioner. They are more general purpose than traditional iterative substructuring methods because they don't depend on particular properties of the interface operators like l_00^{1/2}. These are discussed in Sections 4.2.1, 4.2.2, 4.3.1, 4.3.2, 4.3.3. Note that the book doesn't discuss FETI methods directly; they are similar to the balancing that is discussed. So what methods do you want to use? From the matrix you wrote below and its decomposition, that is only appropriate for the iterative substructuring methods. Barry
On Jan 26, 2011, at 6:51 AM, Thomas Witkowski wrote: > I want to solve the equation in my FEM code (which already makes use of PETSc) with a Schur complement approach (iterative substructuring). Although I have some basic knowledge about PETSc, I have no good idea how to start with it. To concretize my question, I want to solve a system of the form > > [A_II A_IB] * [u_I] = [f_I] > [A_IB^T A_BB] [u_B] [f_B] > > A_II is a block diagonal matrix with each block consisting of all interior nodes of one partition. A_BB is the block consisting of all boundary nodes. A_IB is the connection between the interior and the boundary nodes. The same holds for the unknown vector u and the right hand side vector f. My first idea is not to assemble the matrices A_II, A_IB and A_BB in a global way, but just locally, and to define their action using a MatShell for each of these matrices. Okay, but what about the global indexing of the whole system? Till now I have a continuous global index of the nodes on each partition, which is required by PETSc, if I'm right. But for using a Schur complement approach I need to split the indices into the interior and the boundary nodes. How to circumvent this (first of my) problems? > > Thank you for any advice, > > Thomas
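A minimal sketch of the implicit Schur complement application described above, written as a PETSc MATSHELL. All names here are hypothetical, the inner A_II solve is delegated to a KSP the caller has configured, and the calls assume the petsc-3.1-era C API; this is an illustration, not code from the thread.

/* Sketch: apply S = A_BB - A_BI * inv(A_II) * A_IB without ever forming S. */
#include <petscksp.h>

typedef struct {
  Mat A_BB, A_BI, A_IB;   /* sub-blocks of the assembled matrix             */
  KSP ksp_II;             /* solver for A_II (e.g. ILU or LU per subdomain) */
  Vec w_I, z_I;           /* work vectors of interior size                  */
} SchurCtx;

static PetscErrorCode SchurApply(Mat S, Vec x_B, Vec y_B)
{
  SchurCtx       *ctx;
  PetscErrorCode ierr;

  PetscFunctionBegin;
  ierr = MatShellGetContext(S, (void**)&ctx);CHKERRQ(ierr);
  ierr = MatMult(ctx->A_IB, x_B, ctx->w_I);CHKERRQ(ierr);         /* w_I = A_IB x_B       */
  ierr = KSPSolve(ctx->ksp_II, ctx->w_I, ctx->z_I);CHKERRQ(ierr); /* z_I = inv(A_II) w_I  */
  ierr = MatMult(ctx->A_BI, ctx->z_I, y_B);CHKERRQ(ierr);         /* y_B = A_BI z_I       */
  ierr = VecScale(y_B, -1.0);CHKERRQ(ierr);
  ierr = MatMultAdd(ctx->A_BB, x_B, y_B, y_B);CHKERRQ(ierr);      /* y_B = A_BB x_B - A_BI inv(A_II) A_IB x_B */
  PetscFunctionReturn(0);
}

static PetscErrorCode SchurShellCreate(MPI_Comm comm, PetscInt nB_local, PetscInt nB_global,
                                       SchurCtx *ctx, Mat *S)
{
  PetscErrorCode ierr;

  PetscFunctionBegin;
  ierr = MatCreateShell(comm, nB_local, nB_local, nB_global, nB_global, ctx, S);CHKERRQ(ierr);
  ierr = MatShellSetOperation(*S, MATOP_MULT, (void (*)(void))SchurApply);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

A preconditioner for S (for example the edge operators mentioned above, or something simple such as the diagonal of A_BB) would be attached separately to the outer KSP that solves with this shell matrix.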
From stephen.wornom at inria.fr Thu Jan 27 03:45:52 2011 From: stephen.wornom at inria.fr (Stephen Wornom) Date: Thu, 27 Jan 2011 10:45:52 +0100 Subject: [petsc-users] pmetis Message-ID: <4D413ED0.7080007@inria.fr> I have an unstructured mesh created from a structured x,y,z Cartesian mesh. Is it possible to partition the mesh along x= constant values (partitions are slices in x of the global mesh)? Hope my question is clear. Thanks in advance, Stephen -- stephen.wornom at inria.fr 2004 route des lucioles - BP93 Sophia Antipolis 06902 CEDEX Tel: 04 92 38 50 54 Fax: 04 97 15 53 51 -------------- next part -------------- A non-text attachment was scrubbed... Name: stephen_wornom.vcf Type: text/x-vcard Size: 160 bytes Desc: not available URL:
From gdiso at ustc.edu Thu Jan 27 06:23:48 2011 From: gdiso at ustc.edu (Gong Ding) Date: Thu, 27 Jan 2011 20:23:48 +0800 Subject: [petsc-users] Is it possible to add a hook function after SNES_DIVERGED_LINEAR_SOLVE Message-ID: <73DB02720B524CB799015A6BC6AA71D1@cogendaeda> Hello, I sometimes try to use iterative solvers in the SNES. However, it may fail to converge for some difficult problems. SNES will report SNES_DIVERGED_LINEAR_SOLVE for this situation. Is it possible to add a hook function here? i.e. I'd like to change the linear solver to LU and solve the Newton step again instead of breaking the nonlinear solver. Here is the code from ls.c which reports the SNES_DIVERGED_LINEAR_SOLVE: if (kspreason < 0) { if (++snes->numLinearSolveFailures >= snes->maxLinearSolveFailures) { ierr = PetscInfo2(snes,"iter=%D, number linear solve failures %D greater than current SNES allowed, stopping solve\n",snes->iter,snes->numLinearSolveFailures);CHKERRQ(ierr); snes->reason = SNES_DIVERGED_LINEAR_SOLVE; break; } } The hook function could be added before the break. Regards, Gong Ding
From knepley at gmail.com Thu Jan 27 08:32:23 2011 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 27 Jan 2011 08:32:23 -0600 Subject: [petsc-users] Is it possible to add a hook function after SNES_DIVERGED_LINEAR_SOLVE In-Reply-To: <73DB02720B524CB799015A6BC6AA71D1@cogendaeda> References: <73DB02720B524CB799015A6BC6AA71D1@cogendaeda> Message-ID: 2011/1/27 Gong Ding > Hello, > I sometimes try to use iterative solvers in the SNES. > However, it may fail to converge for some difficult problems. > SNES will report SNES_DIVERGED_LINEAR_SOLVE for this situation. > > Is it possible to add a hook function here? > i.e. I'd like to change the linear solver to LU and solve the > Newton step again instead of breaking the nonlinear solver. > I believe the complexity is best controlled by changing the linear solver rather than adding another generic hook to the nonlinear solver. You can create your own KSP. It holds a KSP which you set from options and another which is LU. If the first returns kspreason < 0, you solve with the second. A slightly easier, but more hacky way to do this is to override the ConvergenceTest for your current KSP, so that it solves with LU on failure. Matt > Here is the code from ls.c which reports the SNES_DIVERGED_LINEAR_SOLVE > > if (kspreason < 0) { > if (++snes->numLinearSolveFailures >= snes->maxLinearSolveFailures) { > ierr = PetscInfo2(snes,"iter=%D, number linear solve failures %D > greater than current SNES allowed, stopping > solve\n",snes->iter,snes->numLinearSolveFailures);CHKERRQ(ierr); > snes->reason = SNES_DIVERGED_LINEAR_SOLVE; > break; > } > } > The hook function could be added before the break. > > Regards, > > Gong Ding > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL:
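One blunt way to get the fallback behaviour discussed above is an outer retry around SNESSolve() rather than a hook inside SNES. A sketch under that assumption follows (hypothetical routine name, petsc-3.1-era API; for a parallel MPIAIJ matrix PCLU needs an external direct solver such as MUMPS or SuperLU_DIST).

/* Sketch: retry a Newton solve with a direct factorization after a linear-solver failure. */
#include <petscsnes.h>

PetscErrorCode SolveWithLUFallback(SNES snes, Vec x)
{
  SNESConvergedReason reason;
  KSP                 ksp;
  PC                  pc;
  PetscErrorCode      ierr;

  PetscFunctionBegin;
  ierr = SNESSolve(snes, PETSC_NULL, x);CHKERRQ(ierr);
  ierr = SNESGetConvergedReason(snes, &reason);CHKERRQ(ierr);
  if (reason == SNES_DIVERGED_LINEAR_SOLVE) {
    /* switch the inner linear solver to LU and restart Newton from the last iterate in x */
    ierr = SNESGetKSP(snes, &ksp);CHKERRQ(ierr);
    ierr = KSPSetType(ksp, KSPPREONLY);CHKERRQ(ierr);
    ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
    ierr = PCSetType(pc, PCLU);CHKERRQ(ierr);
    ierr = SNESSolve(snes, PETSC_NULL, x);CHKERRQ(ierr);
  }
  PetscFunctionReturn(0);
}

The wrapper-KSP or custom convergence-test approaches described in the reply above keep the LU fallback inside the nonlinear iteration instead of restarting it, at the cost of a little more plumbing.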
From thomas.witkowski at tu-dresden.de Fri Jan 28 03:02:59 2011 From: thomas.witkowski at tu-dresden.de (Thomas Witkowski) Date: Fri, 28 Jan 2011 10:02:59 +0100 Subject: [petsc-users] Implementing Schur complement approach (domain decomposition) In-Reply-To: <56CA50B7-40ED-4D8B-9F66-26F09B41AE95@mcs.anl.gov> References: <4D4018BB.5060901@tu-dresden.de> <56CA50B7-40ED-4D8B-9F66-26F09B41AE95@mcs.anl.gov> Message-ID: <4D428643.7000109@tu-dresden.de> Barry, I want to implement the iterative substructuring method as described in point 1). You wrote that the method can be applied on fully assembled global matrices. But how would I then implement the action of the Schur complement on a vector? The matrices A_BB, A_IB, A_BI and A_II are then just submatrices of A. Is there an efficient way to access them from matrix A? I thought this is not possible because the matrix is in sparse format. My (very general) idea was to assemble the matrices only locally on each proc and to define the action of the global Schur complement as the sum of the actions of the local Schur complements. Could you make this point more clear to me? Thanks! Thomas
Barry Smith wrote: > Thomas, > > There are two classes of related non-overlapping domain decomposition methods that use Schur complements. > > 1) the "iterative substructuring" methods. This can be used will fully assembled global stiffness matrices. They apply the Schur complement S = A_BB - A_BI * A_II^-1 A_IB implicitly by applying first A_IB then A_II^-1 etc. The preconditioner for S is applied by directly knowing something about the structure of S. For example for the Laplacian the S associated with any particular edge is spectrally equivalent to l_00^{1/2} and its inverse can be applied efficiently using FFTs. This is introduced in section 4.2 of my book with particular examples of preconditioners for edges in 4.2.3, 4.2.4, 4.2.5 4.2.6 adding a coarse grid is discussed in 4.3.4 > > 2) the Neumann-Dirichlet type methods, include FEIT and balancing. These require parts of the unassembled stiffness matrix because they involve solving Neumann boundary condition problems on subdomains in the preconditioner. They are more general purpose than traditional iterative substructuring methods because they don't depend on particular properties of the interface operators like l_00^{1/2}. These are discussed in Section 4.2.1 4.2.2 4.3.1 4.3.2 4.3.3 Note that the book doesn't discuss FEIT methods directly, they are similar to the balancing that is discussed. > > So what methods do you want to use? From the matrix you wrote below and its decomposition that is only appropriate for the iterative substructuring methods. > > > Barry > > > On Jan 26, 2011, at 6:51 AM, Thomas Witkowski wrote: > > >> I want to solve the equation in my FEM code (that makes already use of PETSc) with a Schur complement approach (iterative substructuring). Although I have some basic knowledge about PETSc, I have no good idea how to start with it. To concretize my question, I want to solve a system of the form >> >> [A_II A_IB] * [u_I] = [f_I] >> [A_IB^T A_BB] [u_B] [f_B] >> >> A_II is a block diagonal matrix with each block consisting of all interior node of one partition. A_BB is the block consisting of all bounday nodes. A_IB is the connection between the interior and the bounday node. The same for the unknown vector u und the right hand side vector f. My first idea is not to assemble to matrices A_II, A_IB and A_BB in a global way, but just local and to define their action using a MatShell for each of these matrices. Okay, but what's about the global index of the whole system. Till now I have a continuous global index of the nodes on each partition, what is required by PETSC, if I'm right. But for using a Schur complement approach I need to split the index in the interior and the boundary nodes. Who to circumvent this (first of my) problem? >> >> Thank you for any advise, >> >> Thomas >> > > > >
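For reference, the extraction asked about above (pulling A_II, A_IB, A_BI and A_BB out of the assembled A) can be done with index sets. A sketch with hypothetical names follows; the exact MatGetSubMatrix calling sequence differs slightly between PETSc versions, the petsc-3.1 form is assumed here.

/* Sketch: extract the interior/boundary blocks of an assembled matrix A. */
#include <petscmat.h>

PetscErrorCode ExtractBlocks(Mat A, IS is_I, IS is_B,
                             Mat *A_II, Mat *A_IB, Mat *A_BI, Mat *A_BB)
{
  PetscErrorCode ierr;

  PetscFunctionBegin;
  /* is_I / is_B list the global row numbers owned by this process, e.g. built with
     ISCreateGeneral(); its copy-mode argument also varies between PETSc versions. */
  ierr = MatGetSubMatrix(A, is_I, is_I, MAT_INITIAL_MATRIX, A_II);CHKERRQ(ierr);
  ierr = MatGetSubMatrix(A, is_I, is_B, MAT_INITIAL_MATRIX, A_IB);CHKERRQ(ierr);
  ierr = MatGetSubMatrix(A, is_B, is_I, MAT_INITIAL_MATRIX, A_BI);CHKERRQ(ierr);
  ierr = MatGetSubMatrix(A, is_B, is_B, MAT_INITIAL_MATRIX, A_BB);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

The existing contiguous global numbering can stay exactly as it is; the interior/boundary split lives entirely in the two index sets.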
From thomas.witkowski at tu-dresden.de Fri Jan 28 07:30:16 2011 From: thomas.witkowski at tu-dresden.de (Thomas Witkowski) Date: Fri, 28 Jan 2011 14:30:16 +0100 Subject: [petsc-users] Parallel mesh partitioning Message-ID: <4D42C4E8.2000002@tu-dresden.de> Does anybody of you know a mesh/graph partitioner which works in parallel and can guarantee to create only connected parts (subdomains)? I tried ParMETiS and Zoltan, but neither has an option to ensure this. Regards, Thomas
From jed at 59A2.org Fri Jan 28 09:48:02 2011 From: jed at 59A2.org (Jed Brown) Date: Fri, 28 Jan 2011 12:48:02 -0300 Subject: [petsc-users] pmetis In-Reply-To: <4D413ED0.7080007@inria.fr> References: <4D413ED0.7080007@inria.fr> Message-ID: On Thu, Jan 27, 2011 at 06:45, Stephen Wornom wrote: > I have an unstructured mesh created from a structured x,y,z Cartesian > mesh. > What does this mean? > Is it possible to partition the mesh along x= constant values > (partitions are slice in x of the global mesh). > Sure, but the partitions may not be "good". -------------- next part -------------- An HTML attachment was scrubbed... URL:
From jed at 59A2.org Fri Jan 28 09:54:06 2011 From: jed at 59A2.org (Jed Brown) Date: Fri, 28 Jan 2011 12:54:06 -0300 Subject: [petsc-users] How to symmetrical the pattern of an unsymmetrical matrix In-Reply-To: <59025EE265CF46D5B906CF32B79662DE@cogendaeda> References: <59025EE265CF46D5B906CF32B79662DE@cogendaeda> Message-ID: 2011/1/26 Gong Ding > I have unsymmetrical jacobian matrix in MPIAIJ format (it is nearlly > symmetric, I guess). I'd like to pad it to symmetrical pattern by just add 0 > to corresponding matrix entry, which is required to some matrix partition > step. How are you obtaining the matrix? The best way is to preallocate those extra zeros. A traditional purely algebraic way is to add the transpose (zeroed in this case), but transpose is a bad operation to perform in parallel so I would try to avoid it. -------------- next part -------------- An HTML attachment was scrubbed... URL:
From stephen.wornom at inria.fr Fri Jan 28 10:19:11 2011 From: stephen.wornom at inria.fr (Stephen Wornom) Date: Fri, 28 Jan 2011 17:19:11 +0100 Subject: [petsc-users] pmetis In-Reply-To: References: <4D413ED0.7080007@inria.fr> Message-ID: <4D42EC7F.2080505@inria.fr> Jed Brown wrote: > On Thu, Jan 27, 2011 at 06:45, Stephen Wornom > wrote: > > I have an unstructured mesh created from a structured x,y,z Cartesian > mesh. > > > What does this mean? Usually one defines the mesh vertices on the exterior faces and the mesh generator determines the location of the mesh vertices inside the rectangular domain. Since I have no mesh generator, I simply created a 3D structured mesh and wrote the mesh in an unstructured format (tetrahedra, ... etc). Thus metis reads an unstructured mesh. > > > Is it possible to partition the mesh along x= constant values > (partitions are slice in x of the global mesh). > > > Sure, but the partitions may not be "good".
How do I tell metis to partition the mesh so that the partition boundaries are along x= constant lines? Hope this helps. Thanks, Stephen -- stephen.wornom at inria.fr 2004 route des lucioles - BP93 Sophia Antipolis 06902 CEDEX Tel: 04 92 38 50 54 Fax: 04 97 15 53 51 -------------- next part -------------- A non-text attachment was scrubbed... Name: stephen_wornom.vcf Type: text/x-vcard Size: 160 bytes Desc: not available URL: From jed at 59A2.org Fri Jan 28 10:29:58 2011 From: jed at 59A2.org (Jed Brown) Date: Fri, 28 Jan 2011 13:29:58 -0300 Subject: [petsc-users] pmetis In-Reply-To: <4D42EC7F.2080505@inria.fr> References: <4D413ED0.7080007@inria.fr> <4D42EC7F.2080505@inria.fr> Message-ID: On Fri, Jan 28, 2011 at 13:19, Stephen Wornom wrote: > Usually one defines the mesh vertices on the exterior faces and the mesh > generator determines the location of the mesh vertices inside the > rectangular. > Since I have no mesh generator, I simply created a 3D structured mesh and > wrote the mesh in an unstructured format (tetrahedra, ... etc). > So you are storing a structured mesh (perhaps tets) in an unstructured format? > Thus metis reads an unstructured mesh. > You don't need METIS for such a simple partitioning and METIS does not use coordinates anyway (it's just a topological graph, perhaps with edge weights). > How do I tell metis to partition the mesh so that the partition boundaries > are along x= constant lines? > I would just use your own code for partitioning. To subvert METIS, you could prescribe large weights for edges in directions that you don't want to be cut. -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Fri Jan 28 11:46:07 2011 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 28 Jan 2011 11:46:07 -0600 Subject: [petsc-users] Implementing Schur complement approach (domain decomposition) In-Reply-To: <4D428643.7000109@tu-dresden.de> References: <4D4018BB.5060901@tu-dresden.de> <56CA50B7-40ED-4D8B-9F66-26F09B41AE95@mcs.anl.gov> <4D428643.7000109@tu-dresden.de> Message-ID: You should move to petsc-dev and use the new FieldSplit preconditioner. It can apply the Schur complement as you want. Matt On Fri, Jan 28, 2011 at 3:02 AM, Thomas Witkowski < thomas.witkowski at tu-dresden.de> wrote: > Barry, > > I want to implement the iterative substructuring method as described in > point 1). You wrote that the method can be applied on fully assembled global > matrices. But how would I than implement the action of the Schur complement > on a vector. The matrices A_BB, A_IB, A_BI and A_II are than just > submatrices of A. Is there an efficient way to access them from matrix A. I > though this is not possible because the matrix is in sparse format. My (very > general) idea was to assemble the matrices only local on each proc and to > define the action of the global Schur complement as the sum of the actions > of the local Schur complements. Could you make this point more clear to me? > Thanks! > > Thomas > > > Barry Smith wrote: > >> Thomas, >> >> There are two classes of related non-overlapping domain decomposition >> methods that use Schur complements. >> >> 1) the "iterative substructuring" methods. This can be used will fully >> assembled global stiffness matrices. They apply the Schur complement S = >> A_BB - A_BI * A_II^-1 A_IB implicitly by applying first A_IB then A_II^-1 >> etc. The preconditioner for S is applied by directly knowing something about >> the structure of S. 
For example for the Laplacian the S associated with any >> particular edge is spectrally equivalent to l_00^{1/2} and its inverse can >> be applied efficiently using FFTs. This is introduced in section 4.2 of my >> book with particular examples of preconditioners for edges in 4.2.3, 4.2.4, >> 4.2.5 4.2.6 adding a coarse grid is discussed in 4.3.4 >> >> 2) the Neumann-Dirichlet type methods, include FEIT and balancing. These >> require parts of the unassembled stiffness matrix because they involve >> solving Neumann boundary condition problems on subdomains in the >> preconditioner. They are more general purpose than traditional iterative >> substructuring methods because they don't depend on particular properties of >> the interface operators like l_00^{1/2}. These are discussed in Section >> 4.2.1 4.2.2 4.3.1 4.3.2 4.3.3 Note that the book doesn't discuss FEIT >> methods directly, they are similar to the balancing that is discussed. >> >> So what methods do you want to use? From the matrix you wrote below and >> its decomposition that is only appropriate for the iterative substructuring >> methods. >> >> Barry >> >> >> On Jan 26, 2011, at 6:51 AM, Thomas Witkowski wrote: >> >> >> >>> I want to solve the equation in my FEM code (that makes already use of >>> PETSc) with a Schur complement approach (iterative substructuring). Although >>> I have some basic knowledge about PETSc, I have no good idea how to start >>> with it. To concretize my question, I want to solve a system of the form >>> >>> [A_II A_IB] * [u_I] = [f_I] >>> [A_IB^T A_BB] [u_B] [f_B] >>> >>> A_II is a block diagonal matrix with each block consisting of all >>> interior node of one partition. A_BB is the block consisting of all bounday >>> nodes. A_IB is the connection between the interior and the bounday node. The >>> same for the unknown vector u und the right hand side vector f. My first >>> idea is not to assemble to matrices A_II, A_IB and A_BB in a global way, but >>> just local and to define their action using a MatShell for each of these >>> matrices. Okay, but what's about the global index of the whole system. Till >>> now I have a continuous global index of the nodes on each partition, what is >>> required by PETSC, if I'm right. But for using a Schur complement approach I >>> need to split the index in the interior and the boundary nodes. Who to >>> circumvent this (first of my) problem? >>> >>> Thank you for any advise, >>> >>> Thomas >>> >>> >> >> >> >> >> > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Fri Jan 28 12:02:45 2011 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 28 Jan 2011 12:02:45 -0600 Subject: [petsc-users] Parallel mesh partitioning In-Reply-To: <4D42C4E8.2000002@tu-dresden.de> References: <4D42C4E8.2000002@tu-dresden.de> Message-ID: On Fri, Jan 28, 2011 at 7:30 AM, Thomas Witkowski < thomas.witkowski at tu-dresden.de> wrote: > Does anybody of you nows a mesh/graph partitioner which works in parallel > and can guaranty to create only connected parts (subdomains)? I tried > ParMETiS and Zoltan. But both have no possibility to (optionally) ensure > this. > Why do you need connected partitions? Also, Zoltan uses ParMetis. 
Thanks, Matt > Regards, > > Thomas -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL:
From Thomas.Witkowski at tu-dresden.de Fri Jan 28 12:16:35 2011 From: Thomas.Witkowski at tu-dresden.de (Thomas Witkowski) Date: Fri, 28 Jan 2011 19:16:35 +0100 Subject: [petsc-users] Parallel mesh partitioning In-Reply-To: References: <4D42C4E8.2000002@tu-dresden.de> Message-ID: <20110128191635.ap5kfwd484sccsgk@mail.zih.tu-dresden.de> Quoting Matthew Knepley: > On Fri, Jan 28, 2011 at 7:30 AM, Thomas Witkowski < > thomas.witkowski at tu-dresden.de> wrote: > >> Does anybody of you nows a mesh/graph partitioner which works in parallel >> and can guaranty to create only connected parts (subdomains)? I tried >> ParMETiS and Zoltan. But both have no possibility to (optionally) ensure >> this. >> > > Why do you need connected partitions? Because my FEM code does not support disconnected domains! > Also, Zoltan uses ParMetis. Yes and no. Zoltan can make use of ParMetis, but it has a lot of built-in algorithms for geometric, graph and hypergraph-based partitioning. Thomas > > Thanks, > > Matt > > >> Regards, >> >> Thomas > > -- > What most experimenters take for granted before they begin their experiments > is infinitely more interesting than any results to which their experiments > lead. > -- Norbert Wiener >
From Thomas.Witkowski at tu-dresden.de Fri Jan 28 12:17:06 2011 From: Thomas.Witkowski at tu-dresden.de (Thomas Witkowski) Date: Fri, 28 Jan 2011 19:17:06 +0100 Subject: [petsc-users] Implementing Schur complement approach (domain decomposition) In-Reply-To: References: <4D4018BB.5060901@tu-dresden.de> <56CA50B7-40ED-4D8B-9F66-26F09B41AE95@mcs.anl.gov> <4D428643.7000109@tu-dresden.de> Message-ID: <20110128191706.u4sfcmri0wow8cok@mail.zih.tu-dresden.de> Quoting Matthew Knepley: > You should move to petsc-dev and use the new FieldSplit preconditioner. It > can apply the Schur complement > as you want. Thanks for this hint. I will have a look at it. Thomas > > Matt > > On Fri, Jan 28, 2011 at 3:02 AM, Thomas Witkowski < > thomas.witkowski at tu-dresden.de> wrote: > >> Barry, >> >> I want to implement the iterative substructuring method as described in >> point 1). You wrote that the method can be applied on fully assembled global >> matrices. But how would I than implement the action of the Schur complement >> on a vector. The matrices A_BB, A_IB, A_BI and A_II are than just >> submatrices of A. Is there an efficient way to access them from matrix A. I >> though this is not possible because the matrix is in sparse format. My (very >> general) idea was to assemble the matrices only local on each proc and to >> define the action of the global Schur complement as the sum of the actions >> of the local Schur complements. Could you make this point more clear to me? >> Thanks! >> >> Thomas >> >> >> Barry Smith wrote: >> >>> Thomas, >>> >>> There are two classes of related non-overlapping domain decomposition >>> methods that use Schur complements. >>> >>> 1) the "iterative substructuring" methods. This can be used will fully >>> assembled global stiffness matrices. They apply the Schur complement S = >>> A_BB - A_BI * A_II^-1 A_IB implicitly by applying first A_IB then A_II^-1 >>> etc. The preconditioner for S is applied by directly knowing >>> something about >>> the structure of S. 
For example for the Laplacian the S associated with any >>> particular edge is spectrally equivalent to l_00^{1/2} and its inverse can >>> be applied efficiently using FFTs. This is introduced in section 4.2 of my >>> book with particular examples of preconditioners for edges in 4.2.3, 4.2.4, >>> 4.2.5 4.2.6 adding a coarse grid is discussed in 4.3.4 >>> >>> 2) the Neumann-Dirichlet type methods, include FEIT and balancing. These >>> require parts of the unassembled stiffness matrix because they involve >>> solving Neumann boundary condition problems on subdomains in the >>> preconditioner. They are more general purpose than traditional iterative >>> substructuring methods because they don't depend on particular >>> properties of >>> the interface operators like l_00^{1/2}. These are discussed in Section >>> 4.2.1 4.2.2 4.3.1 4.3.2 4.3.3 Note that the book doesn't discuss FEIT >>> methods directly, they are similar to the balancing that is discussed. >>> >>> So what methods do you want to use? From the matrix you wrote below and >>> its decomposition that is only appropriate for the iterative substructuring >>> methods. >>> >>> Barry >>> >>> >>> On Jan 26, 2011, at 6:51 AM, Thomas Witkowski wrote: >>> >>> >>> >>>> I want to solve the equation in my FEM code (that makes already use of >>>> PETSc) with a Schur complement approach (iterative >>>> substructuring). Although >>>> I have some basic knowledge about PETSc, I have no good idea how to start >>>> with it. To concretize my question, I want to solve a system of the form >>>> >>>> [A_II A_IB] * [u_I] = [f_I] >>>> [A_IB^T A_BB] [u_B] [f_B] >>>> >>>> A_II is a block diagonal matrix with each block consisting of all >>>> interior node of one partition. A_BB is the block consisting of >>>> all bounday >>>> nodes. A_IB is the connection between the interior and the >>>> bounday node. The >>>> same for the unknown vector u und the right hand side vector f. My first >>>> idea is not to assemble to matrices A_II, A_IB and A_BB in a >>>> global way, but >>>> just local and to define their action using a MatShell for each of these >>>> matrices. Okay, but what's about the global index of the whole >>>> system. Till >>>> now I have a continuous global index of the nodes on each >>>> partition, what is >>>> required by PETSC, if I'm right. But for using a Schur complement >>>> approach I >>>> need to split the index in the interior and the boundary nodes. Who to >>>> circumvent this (first of my) problem? >>>> >>>> Thank you for any advise, >>>> >>>> Thomas >>>> >>>> >>> >>> >>> >>> >>> >> >> > > > -- > What most experimenters take for granted before they begin their experiments > is infinitely more interesting than any results to which their experiments > lead. > -- Norbert Wiener > From bsmith at mcs.anl.gov Fri Jan 28 12:20:21 2011 From: bsmith at mcs.anl.gov (Barry Smith) Date: Fri, 28 Jan 2011 12:20:21 -0600 Subject: [petsc-users] PETSc Web services impacted References: <1911322444.112934.1296238687548.JavaMail.root@zimbra.anl.gov> Message-ID: <45C64877-AFAC-413F-84BA-8591D9C4C1BA@mcs.anl.gov> The various problems with downloading from the PETSc webservers should be resolved now. --download--package should work Please report any problems to petsc-maint at mcs.anl.gov Thanks for your patience, Barry Begin forwarded message: > From: Dan Olson > Date: January 28, 2011 12:18:07 PM CST > To: Barry Smith > Cc: mcs at mcs.anl.gov > Subject: Re: Web services impacted > > I just received a note that the rules are in place. 
> > ---- > Daniel Murphy-Olson > Systems Administrator > Mathematics & Computer Science Division > Argonne National Laboratory > 630-252-0055 > > ----- Original Message ----- > From: "Dan Olson" > To: "Barry Smith" > Cc: mcs at mcs.anl.gov > Sent: Friday, January 28, 2011 12:17:23 PM > Subject: Re: Web services impacted > > We changed the ip address of the ftp.mcs.anl.gov server yesterday. The lab wide firewall request only contained ftp rules, not http. We are pushing through a request now, it should be reactivated shortly. > > ---- > Daniel Murphy-Olson > Systems Administrator > Mathematics & Computer Science Division > Argonne National Laboratory > 630-252-0055 > > ----- Original Message ----- > From: "Barry Smith" > To: "John Roberts" > Cc: mcs at mcs.anl.gov > Sent: Friday, January 28, 2011 11:32:15 AM > Subject: Re: Web services impacted > > > Hmmm, things like > http://ftp.mcs.anl.gov/pub/petsc/externalpackages/Chaco-2.2.tar.gz > > still are not working. > > Barry > > On Jan 28, 2011, at 8:58 AM, John Roberts wrote: > >> All, >> >> Known web issues have been corrected. Please let us know if you are still experiencing a problem. >> >> Thanks! >> >> John Roberts >> Argonne National Laboratory >> MCS Division >> roberts at mcs.anl.gov >> >> On 01/28/2011 08:08 AM, Schmitz Corby wrote: >>> -----BEGIN PGP SIGNED MESSAGE----- >>> Hash: SHA1 >>> >>> All: >>> We are experiencing some web-related problems this morning. We are actively working on the problem and hope to have things back up and functional as soon as possible. Please stand by, and thank you for your patience. >>> >>> - -corby >>> >>> >>> >>> -----BEGIN PGP SIGNATURE----- >>> Version: GnuPG/MacGPG2 v2.0.11 (Darwin) >>> >>> iD8DBQFNQs3DQhpwH3ALVFERAlAFAJ9wL6T7TUBY+LGD0BfW4vTCqO+CDACfWy8Z >>> NpogZitia2A/qqisJ5zqfCw= >>> =rCmG >>> -----END PGP SIGNATURE-----
From ctibirna at giref.ulaval.ca Fri Jan 28 14:28:05 2011 From: ctibirna at giref.ulaval.ca (Cristian Tibirna) Date: Fri, 28 Jan 2011 15:28:05 -0500 Subject: [petsc-users] reuse of KSP is sometimes harmful Message-ID: <201101281528.14584.ctibirna@giref.ulaval.ca> Hello We use PETSc-2.3.3-p15 (long story...) embedded inside a large FE code from which it would be difficult for me to extract exact PETSc code. Sorry for this. But we have a rather ugly problem and I would like to know whether it is our usage or whether there might be some intrinsic problem. What we do, in principle, is
1) create a KSP, a Mat and some Vec
2) start a loop (e.g. a home-made nonlinear solver); inside the loop:
2.1) reassemble Mat through one of two variants: a) sometimes the nonzero structure is constant and only the assembled values are changing; b) sometimes the nonzero structure changes too, which brings us to create and use a new Mat
2.2) reassemble the rhs Vec
2.3) we do a KSPSetOperators with it
2.4) solve the system and get the solution (or correction)
NOTE: As you see, the KSP is created only once and then reused. NOTE: the point b) -- the main issue of this email -- consists most usually of a MatPtAP operation, which naturally gives us a new Mat with a new non-zero structure. If we do the above with variant a), all works perfectly. If we do variant b), the solution is completely aberrant. The only workaround we found was to recreate the KSP at each iteration of the loop. Then the solution vector is correctly computed. What gives? -- Cristian Tibirna (1-418-) 656-2131 / 4340 Laval University - Quebec, CAN ... http://www.giref.ulaval.ca/~ctibirna Research professional at GIREF ... 
ctibirna at giref.ulaval.ca
From knepley at gmail.com Fri Jan 28 14:31:45 2011 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 28 Jan 2011 14:31:45 -0600 Subject: [petsc-users] reuse of KSP is sometimes harmful In-Reply-To: <201101281528.14584.ctibirna@giref.ulaval.ca> References: <201101281528.14584.ctibirna@giref.ulaval.ca> Message-ID: 2011/1/28 Cristian Tibirna > Hello > > We use PETSc-2.3.3-p15 (long story...) embedded inside a large FE code from > which it would be difficult for me to extract exact PETSc code. Sorry for > this. But we have a rather ugly problem and I would like to know just if it > is > our use or there might be some intrinsic problem. > > What we do, in principle, is > 1) create a KSP, a Mat and some Vec > 2) start a loop (e.g. a home-made nonlinear solver) > inside the loop: > 2.1) reassemble Mat either through one of two variants: > a) sometimes the nonzero structure is constant > only the assembled values are changing > b) sometimes the nonzero structure changes too, > which brings us to create and use a new Mat > 2.2) reassemble the rhs Vec > 2.3) we do a KSPSetOperators with it > 2.4) solve the system and get the solution (or correction) > > NOTE: As you see, the KSP is created only once then reused. > NOTE: the point b) -- the main issue of this email -- consists most usually > of > a MatPtAP operation, which naturally gives us a new Mat with new non-zero > structure. > > If we do the above with the variant a), all works perfectly. > if we do the variant b), the solution is completely aberant. > That should work. This is not really enough of a description for us to debug the problem. I suggest using -pc_type lu -ksp_type preonly for an exact solution, and then check the solution by doing a MatMult(). Usually, this problem occurs because you forget to call KSPSetOperators(). Matt > The only workaround we found was to recreate the KSP at each iteration in > the > loop. Then the solution vector is correctly computed. > > What gives? > > -- > Cristian Tibirna (1-418-) 656-2131 / 4340 > Laval University - Quebec, CAN ... http://www.giref.ulaval.ca/~ctibirna > Research professional at GIREF ... ctibirna at giref.ulaval.ca > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL:
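For reference, a compressed sketch of the loop being discussed, with the KSPSetOperators() call made explicit (hypothetical routine names; the MatStructure flag belongs to the PETSc 2.3.x/3.x calling sequence used in this thread):

/* Sketch of the rebuild-and-solve loop; BuildOperator() stands in for the application's assembly. */
#include <petscksp.h>

extern PetscErrorCode BuildOperator(PetscInt step, Mat *A, Vec b);   /* hypothetical */

PetscErrorCode RunLoop(KSP ksp, Vec b, Vec x, PetscInt nsteps)
{
  Mat            A;
  PetscInt       step;
  PetscErrorCode ierr;

  PetscFunctionBegin;
  for (step = 0; step < nsteps; step++) {
    /* variant (a) refills an existing matrix; variant (b) creates a new one (e.g. via MatPtAP) */
    ierr = BuildOperator(step, &A, b);CHKERRQ(ierr);
    /* whenever A is a new matrix, or its nonzero structure changed, the flag must say so */
    ierr = KSPSetOperators(ksp, A, A, DIFFERENT_NONZERO_PATTERN);CHKERRQ(ierr);
    ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);
  }
  PetscFunctionReturn(0);
}

Run with -pc_type lu -ksp_type preonly, as suggested above, a wrong answer from such a loop would point at stale operators rather than at the preconditioner.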
From spam.wax at gmail.com Fri Jan 28 16:06:51 2011 From: spam.wax at gmail.com (Hamid M.) Date: Fri, 28 Jan 2011 17:06:51 -0500 Subject: [petsc-users] PLAPACK question/issues Message-ID: Hello As was suggested on this list, I configured petsc with --download-plapack: ./config/configure.py PETSC_ARCH=intel_R_mpich2 --download-plapack --with-debugging=0 --with-blas-lapack-dir=/opt/intel/mkl/10.1.1.019/lib/em64t/ --with-mpi-dir=/opt/mpich2/mpich2-1.1-intel-11.1 --with-gnu-compilers=0 Trying to run the PLAPACK tests on our cluster fails. I am in contact with Robert van de Geijn (the author) of PLAPACK and he thinks these issues are related to compiling the code on a 64bit machine. So I was wondering if anyone in the PETSc community is using PLAPACK in their projects and if they have encountered such problems. For example, after configuring and building PETSc I ran the test in 'externalpackages/PLAPACKR32-hg/EXAMPLES/LU' and got this: Fatal error in MPI_Scatterv: Other MPI error, error stack: MPI_Scatterv(344)..................: MPI_Scatterv(sbuf=0x10f8680, scnts=0x10f02c0, displs=0x10a8fc0, MPI_DOUBLE_COMPLEX, rbuf=0x1526770, rcount=5760, MPI_DOUBLE_COMPLEX, root=0, comm=0xc4000002) failed MPIR_Scatterv(133).................: MPIC_Recv(83)......................: MPIC_Wait(405).....................: MPIDI_CH3I_Progress(150)...........: MPID_nem_mpich2_blocking_recv(1074): MPID_nem_tcp_connpoll(1667)........: state_commrdy_handler(1517)........: MPID_nem_tcp_recv_handler(1413)....: socket closed thanks in advance, Hamid
From gdiso at ustc.edu Fri Jan 28 21:27:53 2011 From: gdiso at ustc.edu (Gong Ding) Date: Sat, 29 Jan 2011 11:27:53 +0800 Subject: [petsc-users] How to symmetrical the pattern of an unsymmetrical matrix References: <59025EE265CF46D5B906CF32B79662DE@cogendaeda> Message-ID: <5D66B2A90D8D4B5D9C63752995A5F5C2@cogendaeda> > 2011/1/26 Gong Ding > >> I have unsymmetrical jacobian matrix in MPIAIJ format (it is nearlly >> symmetric, I guess). I'd like to pad it to symmetrical pattern by just add 0 >> to corresponding matrix entry, which is required to some matrix partition >> step. > How are you obtaining the matrix? The best way is to preallocate those > extra zeros. Yes, that is the problem. I have to build the pattern of the symmetrical matrix for memory allocation from the unsymmetrical matrix. I don't know how to do it with the petsc API. Any good idea? After building the symmetrical matrix, things are easy. > A traditional purely algebraic way is to add the transpose (zeroed in this > case), but transpose is a bad operation to perform in parallel so I would > try to avoid it.
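A sketch of the kind of preallocation being described, for the sequential case (hypothetical names; the application is assumed to know its couplings as an edge list, and an MPIAIJ matrix needs the same counting split into MatMPIAIJSetPreallocation's diagonal and off-diagonal arrays):

/* Sketch: preallocate a SeqAIJ matrix with the symmetrized pattern and reserve the
   padded entries with explicit zeros. */
#include <petscmat.h>

PetscErrorCode CreatePaddedMatrix(PetscInt n, PetscInt nedges,
                                  const PetscInt *ei, const PetscInt *ej, Mat *A)
{
  PetscInt       *nnz, i, e;
  PetscErrorCode ierr;

  PetscFunctionBegin;
  ierr = PetscMalloc(n*sizeof(PetscInt), &nnz);CHKERRQ(ierr);
  for (i = 0; i < n; i++) nnz[i] = 1;            /* diagonal entry         */
  for (e = 0; e < nedges; e++) {                 /* count (i,j) and (j,i)  */
    nnz[ei[e]]++;
    nnz[ej[e]]++;
  }
  ierr = MatCreate(PETSC_COMM_SELF, A);CHKERRQ(ierr);
  ierr = MatSetSizes(*A, n, n, n, n);CHKERRQ(ierr);
  ierr = MatSetType(*A, MATSEQAIJ);CHKERRQ(ierr);
  ierr = MatSeqAIJSetPreallocation(*A, 0, nnz);CHKERRQ(ierr);
  for (e = 0; e < nedges; e++) {
    /* an explicit 0.0 creates the entry in the structure; real values can overwrite it later */
    ierr = MatSetValue(*A, ei[e], ej[e], 0.0, INSERT_VALUES);CHKERRQ(ierr);
    ierr = MatSetValue(*A, ej[e], ei[e], 0.0, INSERT_VALUES);CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(*A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(*A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = PetscFree(nnz);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

The point, echoed in the replies below, is that the symmetrized pattern has to be known at preallocation time; the padded entries are then reserved by inserting explicit zeros alongside the real values.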
entering handlecgirequest HTTP_COOKIE user BF53A419-E94B-4046-B92B-A57987B3489B command standard dir filename solar_choose_data_0.html fitsdatafile N/A lang en mission choosedata movie3dfilename myfilename solar_main.html service solar sessionid userlang en view2dbundle view3dbundle handlecgirequest avant p -------------- next part -------------- An HTML attachment was scrubbed... URL: From gdiso at ustc.edu Sat Jan 29 07:27:03 2011 From: gdiso at ustc.edu (Gong Ding) Date: Sat, 29 Jan 2011 21:27:03 +0800 Subject: [petsc-users] How to symmetrical the pattern of anunsymmetrical matrix References: <59025EE265CF46D5B906CF32B79662DE@cogendaeda> Message-ID: OK, I had gather all the pattern of A+A^T by my code. Can I call MatSeqAIJSetPreallocation again for enlarge preallocated memory? And then I can add 0 to some entry to make the matrix symmetrical. > 2011/1/26 Gong Ding > >> I have unsymmetrical jacobian matrix in MPIAIJ format (it is nearlly >> symmetric, I guess). I'd like to pad it to symmetrical pattern by just add 0 >> to corresponding matrix entry, which is required to some matrix partition >> step. > > > How are you obtaining the matrix? The best way is to preallocate those > extra zeros. > > A traditional purely algebraic way is to add the transpose (zeroed in this > case), but transpose is a bad operation to perform in parallel so I would > try to avoid it. > From knepley at gmail.com Sat Jan 29 07:58:48 2011 From: knepley at gmail.com (Matthew Knepley) Date: Sat, 29 Jan 2011 07:58:48 -0600 Subject: [petsc-users] How to symmetrical the pattern of anunsymmetrical matrix In-Reply-To: References: <59025EE265CF46D5B906CF32B79662DE@cogendaeda> Message-ID: On Sat, Jan 29, 2011 at 7:27 AM, Gong Ding wrote: > OK, I had gather all the pattern of A+A^T by my code. > Can I call MatSeqAIJSetPreallocation again for enlarge preallocated > memory? > And then I can add 0 to some entry to make the matrix symmetrical. > Matrices cannot be resized. You want to move that symmetrization code to the point where you originally preallocate the matrix (you do not need values to do this, so this is fine). Then insert the zeros along with the regular values. Matt > > 2011/1/26 Gong Ding > > > >> I have unsymmetrical jacobian matrix in MPIAIJ format (it is nearlly > >> symmetric, I guess). I'd like to pad it to symmetrical pattern by just > add 0 > >> to corresponding matrix entry, which is required to some matrix > partition > >> step. > > > > > > How are you obtaining the matrix? The best way is to preallocate those > > extra zeros. > > > > A traditional purely algebraic way is to add the transpose (zeroed in > this > > case), but transpose is a bad operation to perform in parallel so I would > > try to avoid it. > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Sat Jan 29 08:03:59 2011 From: knepley at gmail.com (Matthew Knepley) Date: Sat, 29 Jan 2011 08:03:59 -0600 Subject: [petsc-users] best PETSC solver for long nozzles In-Reply-To: <4D43EB60.2020208@inria.fr> References: <4D43EB60.2020208@inria.fr> Message-ID: On Sat, Jan 29, 2011 at 4:26 AM, Stephen Wornom wrote: > Geometry: Tube with L/D is >> 1 > Characteristics: > -Unstructured partitioned mesh with 500K vertices globally. > -The tube wall is heated, thus solution evolves in the x-direction. 
> -X is the dominant flow direction. > -Seek steady solution using the time advancing scheme with a low Mach > number preconditionner. > *Question*: Which *PETSC* solver would give the best convergence? Stephen > p.s. > What are you currently using? I see two main points a) Make sure your mesh partitioning is compatible with this dominant flow direction. It sounds like any decent partition should be, but check it. b) Probably more important is a good preconditioner for heat transfer coupled to incompressible flow. Have you tried block preconditioning using FieldSplit. Treating the whole thing as a black box is usually pretty terrible. An optimal preconditioner is described in the 4th talk by Marc Spiegelman here: http://www.bu.edu/pasi/materials/ Matt > My present solver does not take into consideration that the flow has a > dominant flow direction (x). > Thus the convergence is very slow, 30000 time steps using 64-processors > with a CFLmax= 50 applied in each cell. At 15000 time steps the 0-L/2 > portion of the tube is converged with an additional 15000 time steps needed > to converge the L/2-L part of the tube. > > -- > stephen.wornom at inria.fr > 2004 route des lucioles - BP93 > Sophia Antipolis > 06902 CEDEX > > Tel: 04 92 38 50 54 > Fax: 04 97 15 53 51 > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From irfan.khan at gatech.edu Sun Jan 30 21:51:10 2011 From: irfan.khan at gatech.edu (Khan, Irfan) Date: Sun, 30 Jan 2011 22:51:10 -0500 (EST) Subject: [petsc-users] MatZeroRowsIS for non-square matrices In-Reply-To: <731142550.216280.1296445768728.JavaMail.root@mail8.gatech.edu> Message-ID: <1408432990.216334.1296445870944.JavaMail.root@mail8.gatech.edu> How will "MatZeroRowsIS()" work for rectangular matrices with rows > cols? Particularly, for a given row, what will be the column number chosen for specifying the constant value? Thank you Irfan -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Sun Jan 30 21:58:01 2011 From: knepley at gmail.com (Matthew Knepley) Date: Sun, 30 Jan 2011 21:58:01 -0600 Subject: [petsc-users] MatZeroRowsIS for non-square matrices In-Reply-To: <1408432990.216334.1296445870944.JavaMail.root@mail8.gatech.edu> References: <731142550.216280.1296445768728.JavaMail.root@mail8.gatech.edu> <1408432990.216334.1296445870944.JavaMail.root@mail8.gatech.edu> Message-ID: On Sun, Jan 30, 2011 at 9:51 PM, Khan, Irfan wrote: > How will "MatZeroRowsIS()" work for rectangular matrices with rows > cols? > Particularly, for a given row, what will be the column number chosen for > specifying the constant value? > There will be no "diagonal" value set for row > numCols. Matt > Thank you > Irfan > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... 
From irfan.khan at gatech.edu  Sun Jan 30 21:51:10 2011
From: irfan.khan at gatech.edu (Khan, Irfan)
Date: Sun, 30 Jan 2011 22:51:10 -0500 (EST)
Subject: [petsc-users] MatZeroRowsIS for non-square matrices
In-Reply-To: <731142550.216280.1296445768728.JavaMail.root@mail8.gatech.edu>
Message-ID: <1408432990.216334.1296445870944.JavaMail.root@mail8.gatech.edu>

How will MatZeroRowsIS() work for rectangular matrices with rows > cols?
In particular, for a given row, which column number will be chosen for
specifying the constant value?

Thank you
Irfan

From knepley at gmail.com  Sun Jan 30 21:58:01 2011
From: knepley at gmail.com (Matthew Knepley)
Date: Sun, 30 Jan 2011 21:58:01 -0600
Subject: [petsc-users] MatZeroRowsIS for non-square matrices
In-Reply-To: <1408432990.216334.1296445870944.JavaMail.root@mail8.gatech.edu>
References: <731142550.216280.1296445768728.JavaMail.root@mail8.gatech.edu>
 <1408432990.216334.1296445870944.JavaMail.root@mail8.gatech.edu>
Message-ID: 

On Sun, Jan 30, 2011 at 9:51 PM, Khan, Irfan wrote:

> How will MatZeroRowsIS() work for rectangular matrices with rows > cols?
> In particular, for a given row, which column number will be chosen for
> specifying the constant value?

There will be no "diagonal" value set for row > numCols.

   Matt

-- 
What most experimenters take for granted before they begin their experiments
is infinitely more interesting than any results to which their experiments
lead.
-- Norbert Wiener

From aldo.bonfiglioli at unibas.it  Mon Jan 31 03:34:36 2011
From: aldo.bonfiglioli at unibas.it (Aldo Bonfiglioli)
Date: Mon, 31 Jan 2011 10:34:36 +0100
Subject: [petsc-users] best PETSc solver for long nozzles
In-Reply-To: <4D43EB60.2020208@inria.fr>
References: <4D43EB60.2020208@inria.fr>
Message-ID: <4D46822C.2050408@unibas.it>

Stephen Wornom wrote:
> - Seek a steady solution using a time-advancing scheme with a
>   low-Mach-number preconditioner.

When using low-speed preconditioning, it may be necessary to take the
low-Mach-number preconditioner into account in the linear system to be solved
at each Newton iteration. See, e.g.:

http://dx.doi.org/10.1016/j.cam.2006.04.068
http://proceedings.fyper.com/eccomascfd2006/documents/398.pdf

Aldo

Dr. Aldo Bonfiglioli
Dip.to di Ingegneria e Fisica dell'Ambiente (DIFA)
Universita' della Basilicata
V.le dell'Ateneo lucano, 10
85100 Potenza ITALY
tel: +39.0971.205203
fax: +39.0971.205160

From SJ_Ormiston at UManitoba.ca  Mon Jan 31 15:25:32 2011
From: SJ_Ormiston at UManitoba.ca (Ormiston, Scott J.)
Date: Mon, 31 Jan 2011 15:25:32 -0600
Subject: [petsc-users] PETSc equivalent to PGMRES
Message-ID: <4D4728CC.1080507@UManitoba.ca>

We have been using Yousef Saad's PGMRES with the ILUT preconditioner, with a
drop tolerance of 0.01 and lfil of 100.

What specific settings/calls are needed in PETSc to achieve the equivalent of
this previous solution scheme?

Scott Ormiston

From knepley at gmail.com  Mon Jan 31 16:17:21 2011
From: knepley at gmail.com (Matthew Knepley)
Date: Mon, 31 Jan 2011 16:17:21 -0600
Subject: [petsc-users] PETSc equivalent to PGMRES
In-Reply-To: <4D4728CC.1080507@UManitoba.ca>
References: <4D4728CC.1080507@UManitoba.ca>
Message-ID: 

On Mon, Jan 31, 2011 at 3:25 PM, Ormiston, Scott J.
<SJ_Ormiston at umanitoba.ca> wrote:

> We have been using Yousef Saad's PGMRES with the ILUT preconditioner, with a
> drop tolerance of 0.01 and lfil of 100.
>
> What specific settings/calls are needed in PETSc to achieve the equivalent
> of this previous solution scheme?

PETSc does not have ILUT built in (it has ILU(k)). You can try Euclid through
the Hypre package for ILUT.

   Matt

-- 
What most experimenters take for granted before they begin their experiments
is infinitely more interesting than any results to which their experiments
lead.
-- Norbert Wiener
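For reference, the hypre route can be selected entirely at run time, roughly
as follows (a sketch; "your_app" stands for your own executable, and PETSc
must have been configured with hypre, e.g. via --download-hypre). Hypre's
Euclid is a parallel ILU(k), while its dual-threshold, ILUT-style
factorization is the separate "pilut" type:

  ./your_app -ksp_type gmres -pc_type hypre -pc_hypre_type euclid
  ./your_app -ksp_type gmres -pc_type hypre -pc_hypre_type pilut

Running either one with -help lists the tuning options each variant exposes
(fill level, drop tolerance, and so on); the exact option names depend on the
PETSc and hypre versions installed.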
From hzhang at mcs.anl.gov  Mon Jan 31 16:21:30 2011
From: hzhang at mcs.anl.gov (Hong Zhang)
Date: Mon, 31 Jan 2011 16:21:30 -0600
Subject: [petsc-users] PETSc equivalent to PGMRES
In-Reply-To: <4D4728CC.1080507@UManitoba.ca>
References: <4D4728CC.1080507@UManitoba.ca>
Message-ID: 

Scott:

> We have been using Yousef Saad's PGMRES with the ILUT preconditioner, with a
> drop tolerance of 0.01 and lfil of 100.
>
> What specific settings/calls are needed in PETSc to achieve the equivalent
> of this previous solution scheme?

You can use SuperLU's ILUT through PETSc, for example:

src/ksp/ksp/examples/tutorials> ./ex2 -pc_type ilu -pc_factor_mat_solver_package superlu -mat_superlu_ilu_droptol 0.01
Norm of error 0.000544426 iterations 2

Running an example with '-help | grep superlu' lists all available options:

 -mat_superlu_ilu_droptol <0.0001>: ILU_DropTol (None)
 -mat_superlu_ilu_filltol <0.01>: ILU_FillTol (None)
 -mat_superlu_ilu_fillfactor <10>: ILU_FillFactor (None)
 -mat_superlu_ilu_droprull <9>: ILU_DropRule (None)
 -mat_superlu_ilu_norm <2>: ILU_Norm (None)
 -mat_superlu_ilu_milu <0>: ILU_MILU (None)

Note: you must configure PETSc with SuperLU. See
http://www.mcs.anl.gov/petsc/petsc-as/miscellaneous/external.html

Hong
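Since the question asked for calls as well as options, here is a rough sketch
of the same setup done in code. The helper name is illustrative and an
already-created KSP is assumed; note that PCFactorSetMatSolverPackage() was
renamed PCFactorSetMatSolverType() in later PETSc releases, and newer releases
also add a leading PetscOptions argument to PetscOptionsSetValue(), so adjust
to your version:

  #include <petscksp.h>

  /* GMRES with SuperLU's ILU as the factorization back end, mirroring
     -pc_type ilu -pc_factor_mat_solver_package superlu
     -mat_superlu_ilu_droptol 0.01 on the command line. */
  PetscErrorCode UseSuperLUILU(KSP ksp)
  {
    PC             pc;
    PetscErrorCode ierr;

    ierr = KSPSetType(ksp, KSPGMRES); CHKERRQ(ierr);
    ierr = KSPGetPC(ksp, &pc); CHKERRQ(ierr);
    ierr = PCSetType(pc, PCILU); CHKERRQ(ierr);
    ierr = PCFactorSetMatSolverPackage(pc, "superlu"); CHKERRQ(ierr);
    /* the SuperLU drop tolerance is picked up from the options database */
    ierr = PetscOptionsSetValue("-mat_superlu_ilu_droptol", "0.01"); CHKERRQ(ierr);
    ierr = KSPSetFromOptions(ksp); CHKERRQ(ierr);
    return 0;
  }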